User's Guide
- Overview
- Browser requirements
- Logging in
- Listing the test cases
- Writing test cases
- Setting the test case level
- Test case construction and execution tips
- Further help
Most of the information about writing test cases is adapted from the help file for Test Case Manager, a freeware program developed by Pierce Business Systems.
Overview
Open Test Manager is a tool for managing test cases. As a user you work with Open Test Manager through a web interface, where you may add and update test cases as well as interact with other testers in your team. One person in your team must take responsibility for running the Open Test Manager server on their computer. If that person is you, have a look at the Administrator's Guide.
Once the server is up and running, its URL should be distributed to all the testers in the group. As a user, you access the server by entering this URL in your web browser.
Browser requirements
The only requirement for using Open Test Manager is a standard web browser. As all generated application web pages are valid XHTML 1.1, most modern browsers should work fine:
- Internet Explorer (supported but lacks certain features)
- Konqueror
- Mozilla Firefox
- Opera
- Safari
Be sure to use the latest version of your browser for the best and most reliable experience.
Several convenience features of the web interface, such as table sorting, are only available if JavaScript is enabled in your browser.
Logging in
All work with Open Test Manager goes through a server, the address of which you obtain from the person running the Open Test Manager server. To access the server, enter the server address into the location bar of your browser.
You should be greeted by a login page.
Screenshot of the login page
After entering your user name and password, given to you by the tester in your group who created your account, you will be taken to the main page:
Screenshot of the main page
Listing the test cases
At any time while working with Open Test Manager, you can view the list of test cases by following the Test Cases link in the menu at the top right of the page.
Screenshot of the test case list page
Click on a test case name to work with it and see more detailed information on it.
Note that you can sort the table by name, status, area, or responsible tester by clicking on the appropriate column header.
Writing test cases
You enter a new test case by clicking on the Create new test case link available on the main page, which will lead you to a form where you can enter the data for the new test case.
Some general guidelines for authoring and entering test cases are given below.
Do’s When Writing Test Cases:
- Do have one "Verify" statement per test case (at the objective line).
- Do write each case to be executed standalone. It shouldn’t depend on other test cases.
- Do break down the app and write steps to the functionality. This eliminates constant rework of the test cases when code changes.
- Do think in terms of re-usable scripts when writing your test cases.
Do Not’s When Writing Test Cases:
- Do not use the words "correctly", "as it should", "properly" or other value statements. Define the expected behavior.
- Do not use test files or other attachments unless clearly helpful. If so, try to have the creation of the test file be part of the test case. Also, Open Test Manager gives you the option to save the file as an attachment to the test case — use it with care so as not to bloat the file server (test data, etc. are valid uses).
- Do not give setup or repro steps unless they are necessary. In other words, if the test case should start off at a well known form, then just list that one step in the setup steps (start at form X). You do not need to list all the steps to get to that form too, unless it is an obscure dialog and it would help new testers.
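Open Test Manager does not check these guidelines for you, but the first "do not" above lends itself to a simple lint. The following is a minimal Python sketch (entirely hypothetical, not part of Open Test Manager) that flags vague value statements in an expected-behavior text:

```python
# Hypothetical lint for vague value statements in expected-behavior text,
# following the "do not" guideline above. Not part of Open Test Manager.
VAGUE = ("correctly", "as it should", "properly")

def vague_words(text):
    """Return the vague value statements found in the given text."""
    lowered = text.lower()
    return [w for w in VAGUE if w in lowered]

print(vague_words("The dialog closes and the record is saved"))  # → []
print(vague_words("The form behaves correctly"))  # → ['correctly']
```

The second example would be rejected under the guideline: instead of "behaves correctly", spell out the expected behavior, as in the first example.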
Setting the test case level
When the tester first fills out a test case, it is crucial to assign the appropriate Level. The test case level is a breakdown of the priority of execution of the test cases. Breaking the test cases down properly is very important for organizing your testing approach. If you are approaching the alpha milestone, a test manager can set a target of running all smoke tests and critical path tests; any failure in those two levels of test cases can result in a rejection of the alpha build. Final release builds can apply the same criteria, but also include the acceptance criteria test cases. Having the test cases broken down this way also gives you more meaningful summary reports.
- Smoke Test.
These test cases are run when testing first receives a new build of an application. They verify that further testing is possible. These must all pass for a build to be considered testable.
Less than 5% of your test cases should be smoke tests. You should have at least a few standard tests that you run through every build to ensure it is testable.
- Critical Path.
These test cases are run on every build that testing certifies as testable (i.e. it has passed the smoke tests). They verify that core functionality works exactly as indicated. These must all pass for a build to be considered stable. These are the valid test cases, meaning that they test exactly what the specs indicate the app should do. If there are multiple pathways through an application (click menus, click buttons, etc.), then the critical path is the most traversed pathway through the application.
Roughly 20-50% of your test cases should be critical path tests; the more thoroughly you test, the lower this percentage will be (with more test cases at the acceptance criteria and suggested levels).
- Acceptance Criteria.
These test cases need to be run at least once during the entire test cycle for a release; they do not need to be repeated on every build. They verify the minimum requirements for the application to be released. These test cases (plus the smoke test and critical path cases) must all pass for a build to be considered ready for release. They include items such as stress, performance, bug regression, and invalid test cases (tests for implied issues, such as a numeric field not accepting alphabetic characters). If there are multiple pathways through an application (click menus, click buttons, etc.), then this level of test case comprises the non-standard, non-critical paths through the application.
Roughly 40-70% of your test cases should be acceptance criteria tests; the more thoroughly you test, the higher this percentage will be.
- Suggested.
These are test cases that would be nice to execute, but may be omitted due to time constraints. Typically, these test cases would have a low visibility of occurrence to users.
Roughly 0-30% of your test cases should be suggested tests; the more thoroughly you test, the higher this percentage will be.
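The recommended ranges above can be checked mechanically. Here is a minimal Python sketch (a hypothetical helper, not part of Open Test Manager) that takes the level of each test case in a suite and reports which levels fall outside the guideline ranges:

```python
# Hypothetical sanity check of a suite's level distribution against the
# guideline ranges above. Not part of Open Test Manager.
from collections import Counter

# Guideline ranges from the text, as fractions of all test cases.
RANGES = {
    "smoke": (0.0, 0.05),
    "critical path": (0.20, 0.50),
    "acceptance criteria": (0.40, 0.70),
    "suggested": (0.0, 0.30),
}

def check_distribution(levels):
    """Return the levels whose share of the suite is outside the range."""
    counts = Counter(levels)
    total = len(levels)
    out_of_range = []
    for level, (low, high) in RANGES.items():
        share = counts.get(level, 0) / total
        if not low <= share <= high:
            out_of_range.append(level)
    return out_of_range

# Example: a suite of 100 test cases with a healthy breakdown.
suite = (["smoke"] * 4 + ["critical path"] * 30
         + ["acceptance criteria"] * 50 + ["suggested"] * 16)
print(check_distribution(suite))  # → []
```

Note that the ranges overlap and do not have to sum to 100%; the check only flags clear outliers, such as a suite where more than 5% of the cases are smoke tests.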
Test case construction and execution tips
TODO: Rewrite and add matching functionality to Open Test Manager.
- If you do not have time to fully build all test case details, you can get by with using all the defaults and filling out a full title. You can then execute test cases rapidly from the test case list page. Where details are required, drop in and fill out the reproduction steps and other details. Use this as a way to get started if you are in a hurry, catching up later on finalizing test case entry.
- Avoid assigning test cases to parent tree branches in the Test Specification Areas. Instead, break the parent branch down into sub-branches and assign test cases to the lowest-level tree branches only.
- As a general rule, try to assign no more than 30 test cases per branch of the tree. If you get more than 30, attempt to re-think better ways of breaking down your tree.
- Be sure to always fill out the test case description in as much detail as possible from the start. This will benefit you by saving time later, letting you skip building test cases for certain areas if need be, making it easy to distribute test cases among testers, etc.
- Test case titles should start with the word 'Verify', such as 'Verify tab order on form X'.
- Test case titles should be detailed enough to stand on their own (given the context of the test specification area in which they are located), but not overly detailed. A good rule of thumb is 4-10 words in length.
- Do not be afraid to move test cases and re-arrange the test specification as your knowledge of the project grows. It is inevitable that you see better ways to organize your test breakdown as you test the system.
- You can speed up testing by printing out a summary sheet (test case titles only), then testing from the printout and writing the results on it. Once testing of a section is completed, go to the test case list page and update ranges of test cases, marking them as Pass, Fail, or Untested accordingly. If you require more detail, print out the Detail Test Case Report. If you have two computers, you can walk through test cases in Open Test Manager on one, and test the app on the other.
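The two title guidelines above (start with 'Verify', 4-10 words) can be sketched as a simple check. This is a hypothetical Python helper, not something Open Test Manager performs for you:

```python
# Hypothetical check of a test case title against the guidelines above
# (starts with 'Verify', 4-10 words). Not part of Open Test Manager.
def title_problems(title):
    """Return a list of guideline violations for the given title."""
    problems = []
    if not title.startswith("Verify"):
        problems.append("should start with 'Verify'")
    if not 4 <= len(title.split()) <= 10:
        problems.append("should be 4-10 words long")
    return problems

print(title_problems("Verify tab order on form X"))  # → []
print(title_problems("Tab order"))  # fails both guidelines
```

The word-count bounds are the rule of thumb from the text; adjust them if your team settles on a different convention.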
Further help
If you are having problems with Open Test Manager, or just wish to discuss it, you are welcome on our discussion board.