MSF for CMMI Process Improvement

Activity:

Develop Tests (CMMI Level 3: VAL 1.3)

Participating Roles

Responsible:

Tester

Consult:

Architect

Overview

Entry Criteria

    When:

    • A test task is assigned to the tester for the current iteration.

    Dependencies:

    • Test Cases: The test cases are written and stored with the requirements and scenarios.
    • Requirements: The functional, operational, or quality of service requirements are validated and have test cases written.
    • Scenarios: The scenarios are validated and have test cases written.
    • Threat Model: The threat model identifies expected avenues of attack.
    • Task: A test task is assigned to the tester.
    • Test Approach: Identifies test data and conditions for tests.

    Sub-Activities

    1. Determine Type of Test

    • For each test case, examine the test to determine what kind of test is necessary. Test cases for a quality of service requirement will generally require performance, security, stress, or load tests to be written. Test cases for a scenario will generally require validation tests.
    • After you determine which tests are necessary, write the appropriate test as an automated or manual test.

    2. Write Performance Tests

    • Optional
    • Performance tests measure a product’s response time and ensure the product meets established quality of service requirements.
    • The objective of the test must be clearly identified. For example, there should be no significant difference in product response time for users with different internet connection speeds. The outcome of the test must be clearly documented and expressed as a range of values.
    • Document any test result deviations that depend on the test configuration, such as a typical user configuration as opposed to a simulated lab configuration. Document external conditions that might affect the test timing.
    • Map out the test conditions including prerequisites and the scenario being timed. Review the scenario to determine areas where performance is critical and where it is not. Work with the architects to understand the performance model used to instrument the code.
    • Itemize the test steps so all testers who execute the performance tests perform each step the same way. Failure to itemize the test steps can result in incorrect measurement of performance. Use automation wherever possible.
    • Update the test approach worksheet with any test data or other considerations for the test.
    • Check in the performance tests.
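In Team System these tests are authored in Visual Studio; as a language-neutral sketch, a performance test that states its objective as a concrete budget might look like the following Python, where `fetch_report` and the 0.5-second budget are illustrative assumptions:

```python
import time

def fetch_report():
    """Stand-in for the product operation being timed (hypothetical)."""
    time.sleep(0.01)
    return "ok"

def test_report_response_time():
    # The outcome is expressed as a concrete range: under 0.5 s (assumed budget).
    start = time.perf_counter()
    result = fetch_report()
    elapsed = time.perf_counter() - start
    assert result == "ok", "operation must still succeed"
    assert elapsed < 0.5, f"response took {elapsed:.3f}s, budget is 0.5s"

test_report_response_time()
print("performance test passed")
```

Automating the timing this way ensures every tester's run measures the same itemized sequence of steps.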

    3. Write Stress Tests

    • Optional
    • Stress tests determine a product's breaking points and push the application past its upper limits as resources are saturated. They are used to identify the application's upper load boundaries where application response has degraded to an unacceptable level or the application has failed entirely.
    • The objective of the stress test must be understood and the outcome of the test expressed in concrete terms, such as an acceptable range of values. Map out the test environment, test conditions including prerequisites, the number of virtual users, scenario being timed, and resources to be monitored. Understand the distribution of activities and simultaneous scenarios.
    • Create stress test scenarios. Within a stress test scenario, create the test mix, load profile, and simulated environment. The test mix consists of the tasks that are performed and the frequency of the tasks. Do not insert think times or delays between tasks.
    • In the load profile, determine how many users are tested in a constant or stepped load. Determine the simulated environment such as the network emulation speed and types of browsers used. Add any unit tests to simulate additional load. This is the unit test under load.
    • Record the steps that users perform. Include validations to check that results returned are accurate under stress. Use data binding to make the tests dynamic using a .NET data source such as SQL Server, Microsoft Excel, or Microsoft Access.
    • Check in the stress tests.
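As a language-neutral sketch of the ideas above, the following Python drives a hypothetical `place_order` operation with a stepped load of virtual users, no think time between tasks, and result validation under stress; the operation, user counts, and request counts are all illustrative assumptions:

```python
import concurrent.futures
import threading

lock = threading.Lock()
results = {"ok": 0, "fail": 0}

def place_order(order_id):
    """Stand-in for the product operation under stress (hypothetical)."""
    return order_id * 2

def virtual_user(requests):
    try:
        for i in range(requests):
            # No think time between tasks: requests are issued back to back,
            # and each result is validated for accuracy under stress.
            assert place_order(i) == i * 2
        with lock:
            results["ok"] += 1
    except Exception:
        with lock:
            results["fail"] += 1

# Stepped load profile: each step doubles the number of virtual users.
for users in (5, 10, 20):
    with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
        for _ in range(users):
            pool.submit(virtual_user, 50)

print(results)  # → {'ok': 35, 'fail': 0}
```

A real stress test would keep raising the step until response degrades unacceptably or the application fails, which marks the upper load boundary.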

    4. Write Load Tests

    • Optional
    • A load test is a type of performance test. Load tests help make sure the product meets its quality of service requirements under load conditions. Load tests measure application performance for typical user scenarios.
    • The objective of the load test must be understood and the outcome of the test expressed in concrete terms such as an acceptable range of values. Map out the test environment, test conditions including prerequisites, the number of virtual users, scenario being timed, and resources to be monitored. Understand the distribution of activities and simultaneous scenarios.
    • Create load test scenarios. Within a load test scenario, create the test mix, load profile, and simulated environment. The test mix consists of the tasks that are performed, the time between tasks, and the frequency of the tasks. Insert think times, the delays between responses, to model real users.
    • In the load profile, determine how many users are tested in a constant or stepped load. Determine the simulated environment such as the network emulation speed and types of browsers used. Add any unit tests to simulate additional load. This is the unit test under load.
    • Record the steps that users perform. Include validations to check that results returned are accurate under load. Use data binding to make the tests dynamic using a .NET data source such as SQL Server, Microsoft Excel, or Microsoft Access.
    • Check in the load tests.
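Unlike a stress test, a load test weights tasks by frequency and inserts think time between them. A minimal sketch of such a test mix in Python, with an assumed task mix and a hypothetical `run_task` stand-in for the product:

```python
import random
import time

# Hypothetical test mix: task name -> relative frequency (assumed values).
TEST_MIX = {"browse_catalog": 6, "add_to_cart": 3, "checkout": 1}

def run_task(name):
    """Stand-in for driving one user task against the product."""
    return f"{name}: ok"

def user_session(iterations, think_time=0.001):
    # Expand the mix so random.choice honors the relative frequencies.
    weighted = [task for task, freq in TEST_MIX.items() for _ in range(freq)]
    log = []
    for _ in range(iterations):
        log.append(run_task(random.choice(weighted)))
        time.sleep(think_time)  # think time between tasks, unlike a stress test
    return log

log = user_session(20)
print(len(log))  # → 20
```

Each virtual user in the load profile would run a session like this, with a constant or stepped user count.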

    5. Write Security Tests

    • Optional
    • Security tests or penetration tests utilize the threats found in the threat modeling process to simulate an adversary’s attempt to achieve specific malicious goals in the product. This form of testing can be divided into three parts: exploration, flaw identification, and exploitation.
    • Identify a product’s entry points and functionality for the protection of assets. Use an informed testing approach, gathering information from the threat model to determine the expected avenues of attack. Prioritize the entry points and cross reference the entry points with the trust levels. Create environments and test configurations for each of the trust levels.
    • Write tests that utilize directed or semi-random attacks to attempt to access an asset. Directed attacks are aimed at bypassing specific security measures. For example, look to acquire a session identifier and modify the account number in a URL. Semi-random attacks may use fuzzing or the manipulation of a data format or protocol to test boundary conditions or elicit errors from the product. Test limits such as buffer sizes, integer rollovers, negative numbers, and buffer lengths.
    • Add tests to exploit any weaknesses found to attempt access to assets. Some of these tests will have to be exploratory rather than fixed. Take into consideration the amount of time required to figure out how to exploit weaknesses to access assets. While unauthorized entry into the system is a bug, access to protected assets presents the strongest case for fixing these bugs.
    • Update the test approach worksheet with any new data requirements or considerations.
    • Check in the security tests.
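A semi-random boundary attack of the kind described above can be sketched as follows; the entry point, its 9-digit account-number limit, and the probe values are all illustrative assumptions rather than part of the MSF guidance:

```python
# Semi-random boundary probes for a hypothetical account-number entry point:
# empty input, zero, a negative number, an oversized value, integer-rollover
# candidates, and an injection-style string.
BOUNDARY_INPUTS = [
    "", "0", "-1", "9" * 64,
    str(2**31), str(2**31 - 1),
    "'; DROP TABLE accounts;--",
]

def validate_account_number(raw):
    """Stand-in for the product's input validation (assumed 9-digit limit)."""
    if not raw.isdigit():
        raise ValueError("rejected")
    value = int(raw)
    if not 0 < value < 10**9:
        raise ValueError("rejected")
    return value

def fuzz():
    accepted = []
    for raw in BOUNDARY_INPUTS:
        try:
            validate_account_number(raw)
            accepted.append(raw)  # probe got through: a candidate bug to exploit
        except ValueError:
            pass  # the entry point rejected the probe, as expected
    return accepted

accepted = fuzz()
print(accepted)  # → [] when every probe is rejected
```

Any probe that is accepted becomes a candidate for the exploratory exploitation tests described above.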

    6. Write Validation Tests

    • Optional
    • Validation tests make sure the system functionality being built matches what is specified in the scenarios.
    • Identify the test data required for the test case. If new data is identified, update the test approach worksheet with the new data.
    • Identify all constraints and boundary conditions for the tests called for in the test task. Determine if the tests can be automated. Identify procedural steps for the scenario flow.
    • Write the test documentation for manual test cases.
    • Check in the validation tests.
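An automated validation test pairs test data with the outcomes the scenario specifies and exercises the boundary conditions. A minimal Python sketch, where the discount scenario, codes, and expected values are all hypothetical stand-ins for data from the test approach worksheet:

```python
def apply_discount(subtotal, code):
    """Stand-in for the scenario under validation (hypothetical discount flow)."""
    rates = {"SAVE10": 0.10, "SAVE25": 0.25}
    if code not in rates:
        raise KeyError("unknown code")
    if subtotal <= 0:
        raise ValueError("subtotal must be positive")
    return round(subtotal * (1 - rates[code]), 2)

# Test data with the expected outcomes the scenario specifies (assumed values).
cases = [
    ((100.0, "SAVE10"), 90.0),   # typical scenario flow
    ((100.0, "SAVE25"), 75.0),   # second discount tier
]
for (subtotal, code), expected in cases:
    assert apply_discount(subtotal, code) == expected

# Boundary conditions: invalid subtotals must be rejected, not discounted.
for bad in (0, -5):
    try:
        apply_discount(bad, "SAVE10")
        raise AssertionError("accepted an invalid subtotal")
    except ValueError:
        pass

print("validation cases passed")
```

Scenario steps that cannot be automated this way go into the manual test documentation instead.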

    Exit Criteria

    Tests are written and checked into version control.

    Test approach is updated with any new data or considerations.

    © 2005 Microsoft Corporation. All rights reserved.

    MSF for CMMI Process Improvement: Build 050707