Tuesday, June 8, 2010

How much detail should test cases have...

This is a crucial aspect of test case design. Although the answer varies from situation to situation, here are some guidelines that may help you decide the optimal level of detail to keep in your test cases:

1. Test cases should cover positive, negative, and boundary conditions. For a new requirement, try to cover full functional testing; for a patch or hotfix, focus more on assessing the impact on existing functions.
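As a minimal sketch of what positive, negative, and boundary cases look like for one field, here is a Python example built around a hypothetical `validate_age` rule (the 1-120 range is an assumption for illustration, not from the post):

```python
# Hypothetical example: validating an "age" field accepted in the range 1-120.
def validate_age(age):
    """Return True if age is an acceptable integer between 1 and 120."""
    return isinstance(age, int) and not isinstance(age, bool) and 1 <= age <= 120

# Positive case: a typical valid value.
assert validate_age(30) is True
# Negative cases: wrong type and out-of-range value.
assert validate_age("thirty") is False
assert validate_age(-5) is False
# Boundary cases: the edges of the valid range and just outside them.
assert validate_age(1) is True
assert validate_age(120) is True
assert validate_age(0) is False
assert validate_age(121) is False
```

The boundary cases (0, 1, 120, 121) are where off-by-one defects typically hide, which is why they deserve explicit rows of their own rather than being folded into "valid" and "invalid" buckets.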

2. Wherever possible, plan database testing with SQL-level checks
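A minimal sketch of what a SQL-level check can look like, using Python's built-in `sqlite3` with an in-memory database and a hypothetical `orders` table as a stand-in for the real schema:

```python
import sqlite3

# In-memory SQLite database with a hypothetical "orders" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, amount REAL)")
conn.executemany("INSERT INTO orders (status, amount) VALUES (?, ?)",
                 [("paid", 10.0), ("paid", 25.5), ("open", 5.0)])

# Database-side assertion: no order may have a negative amount.
bad = conn.execute("SELECT COUNT(*) FROM orders WHERE amount < 0").fetchone()[0]
assert bad == 0

# Cross-check an aggregate the UI would display against the raw rows.
total_paid = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE status = 'paid'").fetchone()[0]
assert total_paid == 35.5
```

Checks like these catch data-integrity problems that a purely UI-driven test would miss, because they validate what is actually stored rather than what is displayed.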

3. Keep the number of steps to a minimum and make most of the tests data driven

4. Keep test data combinations in a tabular format and keep the table compact for easier maintenance and consistency
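Points 3 and 4 together can be sketched as one short, generic procedure driven by a compact table of rows. The login rules below are hypothetical, chosen only to show the shape:

```python
# Sketch of a data-driven test: one generic step executed once per table row.
def attempt_login(username, password):
    """Stand-in for the system under test: only one account is valid."""
    return username == "admin" and password == "s3cret"

# Tabular test data: (username, password, expected_result)
test_table = [
    ("admin", "s3cret", True),    # valid credentials
    ("admin", "wrong",  False),   # bad password
    ("",      "s3cret", False),   # missing username
    ("guest", "",       False),   # missing password
]

for username, password, expected in test_table:
    assert attempt_login(username, password) is expected, (username, password)
```

Adding a new scenario becomes a one-line change to the table rather than a new multi-step script, which is exactly the maintenance win the guideline is after.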

5. Use test design techniques such as DOE and orthogonal arrays (also pairwise testing) for scenarios with large numbers of combinations and data parameters
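To illustrate the pairwise idea, here is a greedy Python sketch (the parameter names and values are made up): instead of running every full combination, it keeps picking the combination that covers the most not-yet-covered value pairs, until every pair of parameter values appears in at least one test.

```python
from itertools import combinations, product

# Hypothetical parameters for illustration only.
params = {
    "browser": ["Chrome", "Firefox", "IE"],
    "os": ["Windows", "Linux"],
    "locale": ["en", "de", "fr"],
}

names = list(params)
all_combos = [dict(zip(names, vals)) for vals in product(*params.values())]

def pairs_of(combo):
    """All (parameter, value) pairs covered by one full combination."""
    return set(combinations(sorted(combo.items()), 2))

# Greedy 2-way selection: cover every value pair with far fewer tests.
uncovered = set().union(*(pairs_of(c) for c in all_combos))
suite = []
while uncovered:
    best = max(all_combos, key=lambda c: len(pairs_of(c) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(all_combos), "exhaustive combinations")   # 3 * 2 * 3 = 18
print(len(suite), "pairwise-selected tests")        # noticeably fewer
```

This is only a sketch of the technique; in practice a dedicated pairwise/orthogonal-array tool will produce smaller, better-balanced suites, but the payoff is the same: pair coverage at a fraction of the exhaustive cost.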

6. Identify and keep screenshots from test execution for future reference and audits

7. Try to keep test cases modular, with cross-references between different scripts for easier maintenance
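One way to picture modular test cases with cross-references is shared steps that higher-level scenarios reuse instead of restating. The step names and IDs below are illustrative, not from the post:

```python
# Common steps live in one place and are referenced by scenario scripts.
def login(session, user):
    """Shared step (e.g. cross-referenced as TC-COMMON-01)."""
    session["user"] = user
    return session

def add_to_cart(session, item):
    """Shared step (e.g. cross-referenced as TC-COMMON-02)."""
    session.setdefault("cart", []).append(item)
    return session

def checkout_scenario():
    """Scenario script that reuses the shared steps instead of repeating them."""
    session = login({}, "alice")
    add_to_cart(session, "book")
    assert session["cart"] == ["book"]
    return session

checkout_scenario()
```

When the login flow changes, only the shared step is updated; every scenario that cross-references it picks up the fix, which is the maintenance benefit the guideline describes.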

8. Where related specifications are available, reference them instead of repeating the scenarios, e.g. for UI testing keep style guides; for configuration testing, the Configuration Baseline Document (CBD); for report validation, the Report Mapping document; and so on, for consistency

9. Have different sets (and levels) of test cases focusing on functional testing (validation of functional requirements), platform testing (validity of different environment combinations), and installation qualification for upgrades (upgrading from an older version to a new one), as applicable to your situation

10. For regression testing, focus more on positive and business-scenario-level details

11. The following criteria should also be considered when deciding the level of detail needed in test cases:
         a. The kind of lifecycle you are following, e.g. iterative vs. V-model vs. Agile
         b. The criticality and priority of the modules
         c. The type of work being planned: internal experts doing the testing vs. outsourced testing
         d. The level of available requirements and design specs
         e. The level of testing performed: functional, security, automation, performance
