Test Plan According to IEEE Standard 829-1998 - Software Testing Foundations: A Study Guide for the Certified Tester Exam (2014)


A. Test Plan According to IEEE Standard 829-1998

This appendix describes the contents of a test plan according to IEEE Standard 829-1998. It can be used as a guide to prepare a test plan.

Test Plan Identifier

Uniquely specify the name and version of the test plan. The identifier must make it possible to refer clearly and precisely to this document, as distinct from other project documents. A standard for document identification is often given by rules set by the project archive or by the organization’s central document management. Depending on the size of the project organization, the identifier may be more or less complex. The minimum information is the name of the test plan, its version, and its status.

Introduction

The introduction should give a short summary of the project background.

Its intent is to help those involved in the project (customer, management, developer, and tester) to better understand the contents of the test plan.

A list of documents used in developing this plan or referred to should be included in this chapter. This typically includes policies and standards, such as industry standards, company standards, project standards, customer standards, the project authorization (possibly the contract), project plan and other plans, and the specification.

Test Objects or Items

This section should give a short overview of the parts and components of the product to be tested, as well as a list of the test items, including their version/revision level, their transmittal media, and their specifications. To avoid misunderstandings, there should also be a list of system parts not subject to testing.

Features to Be Tested

This section should identify all functions or characteristics of the system to be tested. It should refer to the test specification and assign the features to test levels or test objects.

Features Not to Be Tested

To avoid misunderstandings and prevent unrealistic expectations, this section should describe which aspects of the product shall not or cannot be tested (for example, due to resource constraints or technical reasons).

Hint

§ Because the test plan is prepared early in the project, this list will be incomplete. Later it may turn out that some components or features cannot be tested after all. The test manager should then issue warnings in the status reports.

Test Approach or Strategy

This section should describe the test objectives, if possible based on a risk analysis. The analysis shows which risks arise if faults are not found due to a lack of testing. From this it can be derived which tests must be executed and which are more or less important. This ensures that testing concentrates on the important topics.

Building on this, the test methods to be used are selected and described. It must be clearly visible whether and why the chosen methods are able to achieve the test objectives, considering the identified risks and the available resources.

Acceptance Criteria (Test Item Pass/Fail Criteria)

After all tests for a test object have been executed, it must be determined, based on the test results, whether the test object can be released and delivered. Acceptance criteria or test exit criteria are defined to this end.

The criterion “defect free” is not very useful in this context because testing cannot prove that a product has no faults. Criteria therefore usually combine “number of tests executed,” “number of faults found,” and “severity of faults found.”

For example: at least 90% of the planned tests have been executed correctly and no class 1 faults (crashes) have been found.

Such acceptance criteria can vary between the test objects. The actual definition of the criteria should be made dependent on the risk analysis; that is, for uncritical test objects, acceptance criteria can be weaker than for safety-critical test objects, for example. Thus, the test resources are concentrated on important system parts.
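Such combined criteria lend themselves to an automatable check. The sketch below evaluates the example exit criteria mentioned above (at least 90% of planned tests executed, no class 1 faults); the data structures, field names, and thresholds are illustrative assumptions, not prescribed by the standard.

```python
# Hypothetical sketch of an exit-criteria check; the fault record format
# and the default thresholds are illustrative assumptions.

def exit_criteria_met(planned, executed, faults, min_exec_ratio=0.90, max_class1=0):
    """Return True if the test object may be released."""
    exec_ratio = executed / planned if planned else 0.0
    class1_count = sum(1 for f in faults if f["severity"] == 1)
    return exec_ratio >= min_exec_ratio and class1_count <= max_class1

open_faults = [{"id": "F-17", "severity": 2}, {"id": "F-23", "severity": 3}]
print(exit_criteria_met(planned=200, executed=185, faults=open_faults))
```

Because the thresholds are parameters, stricter values can be set for safety-critical test objects, in line with the risk-based weighting described here.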

Suspension Criteria and Resumption Requirements

Aside from acceptance criteria, criteria are also needed that indicate when to suspend or cancel the tests.

It may be that a test object is in such bad shape that there is no chance it will be accepted, even after an enormous amount of testing. To avoid such wasted effort, we need criteria that lead to early termination of useless testing. The test object is then returned to the developers without executing all tests.

Analogously, criteria for resuming or continuing the tests are needed. The responsible testers will typically execute an entry test (smoke test); only after it passes without trouble should the real testing begin.
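A resumption check of this kind can be sketched as a simple gate: full testing starts only after every entry (smoke) test has passed. The test names and results below are purely illustrative stand-ins.

```python
# Hypothetical sketch of a resumption gate; the smoke checks are stand-ins
# for real entry tests of the delivered test object.

def smoke_tests():
    # Each entry is (name, passed); in practice these would call real checks.
    yield ("application starts", True)
    yield ("login works", True)
    yield ("main screen renders", True)

def may_resume_testing():
    """Return True only if every smoke test passed."""
    failed = [name for name, passed in smoke_tests() if not passed]
    if failed:
        print("Stay suspended; smoke tests failed:", ", ".join(failed))
        return False
    return True
```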

Hint

§ Criteria should involve only metrics that can be collected regularly, easily, and reliably (for example, because they are gathered automatically by the test tools in use). The test manager should then list and interpret this data in every test report.

Test Documentation and Deliverables

In this section, we describe which data and results the test activities will deliver, in which form, and to whom these results will be communicated. This means not only the test results in a narrow sense (for example, incident reports and test protocols) but also planning and preparation documents such as test plans, test specifications, schedules, documents describing the transmittal of test objects, and test summary reports.

Hint

§ In a test plan, only formal documentation is mentioned. However, informal communication should not be forgotten. Especially in projects that are already in trouble, or in very stressful phases (for example, the release week), an experienced test manager should try to communicate directly with the people involved more than usual. This is not meant to conceal bad news; rather, it helps ensure that the right consequences are drawn from any bad news.

Testing Tasks

This section contains a list of all tasks necessary for the planning and execution of the tests, including an assignment of responsibilities. The status of the tasks (open, in progress, delayed, done) must be followed up. This point in the test plan is part of the normal project planning and follow-up and should therefore be reported in the regular project or test status reports, which are referred to here.

Test Infrastructure and Environmental Needs

This section lists the elements of the test infrastructure that are necessary to execute the planned tests. This typically includes test platform(s), tester workplaces and their equipment, test tools, development environment (whatever is necessary for the testers), and other tools (email, Internet access, Office software packages, etc.).

Hint

§ The test manager should consider the following aspects: acquisition of the unavailable parts of the previously mentioned “wish list,” and questions about the budget, administration, and operation of the test infrastructure, the test objects, and the tools. Often, this requires specialists, at least for some time; they may come from other departments or from external providers.

Responsibilities and Authority

How is testing organized with respect to the project? Who has what authority, availability, etc.? Possibly the test personnel must be divided into different test groups or levels. Which people have which tasks?

Hint

§ Responsibilities and authority may change during the course of the project. Therefore, the list of responsibilities should be presented as a table, maybe an appendix to the test plan.

Staffing and Training Needs

This section specifies the staffing needs for implementing the plan (roles, qualifications, capacity, and when they are needed, including planning for vacations, etc.). This planning covers not only the test personnel; it should also include personnel for administering the test infrastructure (see above), as well as developers and customers to be involved in testing.

Plans for training to provide necessary skills should be included.

Schedule

This section describes an overall schedule for the test activities, with the major milestones. The plan must be coordinated with the project plan and maintained there. Regular consultation between the project manager and the test manager must be implemented. The test manager should be informed about delays during development and must react by changing the detailed test plan. The project manager must react to test results and, if necessary, delay milestones because extra correction and testing cycles must be executed.

Hint

§ The test manager must ensure that the test and quality assurance activities are included in the project plan. He or she must not run an independent “state within the state.”

Risks and Contingencies

The section about the test approach addresses risks in the test object or its use. This section, in contrast, addresses risks within the testing project itself: risks in implementing the test plan, and risks resulting from not implementing desired activities because it was already clear during planning that there would be no resources for them in the concrete project.

The minimum should be a list of risks to be monitored at certain points in time (for example, in connection with the regular test status reports) in order to find measures to minimize them.

Approval

This section contains a list of the persons or organizational units that will approve the test plan or need to be informed. Approval should be documented by signature. It should also be documented that parties have been informed after major changes, especially changes of strategy and/or personnel.

Hint

§ The respective groups or organizational units are typically the development group(s), project management, the project steering committee, software operators, software users, customers (clients), and, naturally, the test group(s).

Depending on the project situation, the intent of the approval described here may be different.

The ideal situation is, “You approve that the mentioned resources will be financed and used in order to test this system as described here in a reasonable way.”

The often-more-usual situation is, “Because of the lack of resources shown, tests can only be performed in an insufficient manner, testing the very most important parts. However, you approve this and accept that based on this, release decisions will be made, which may have a high risk.”

Glossary (Not Part of IEEE 829-1998)

Testing has no tradition for using standardized terminology. Thus, the test plan should contain an explanation of the testing terms used in the project. There is a high danger that different people will have different interpretations of testing terms. For example, just ask several people involved in the project for the definition of the term load testing.

Test Plans According to IEEE Standard 829-2008

This appendix describes the contents of a test plan according to IEEE Standard 829-2008. It can be used as a guide to prepare a test plan.

This version of the IEEE 829-2008 standard [IEEE 829-2008] differentiates between →Master Test Plan and →Level Test Plan. An existing test plan created according to IEEE 829-1998 can be converted to conform to an IEEE 829-2008 master test plan and level test plans—for example, by using an appropriate table of cross-references. The 2015 syllabus will use IEEE 829-2008 as reference for the Certified Tester – Foundation Level examination.

The Master Test Plan

The objective of a master test plan is to describe the overall test approach in a project or an organization. It describes all the different test levels, test activities, and test tasks to be done and their relationship. Level test plans, on the other hand, describe what is to be done in one test level. The older IEEE 829 standard had only one test plan, and it was not clear if it applied to a whole project or one level.

The template from the standard

1. Introduction

1.1. Document identifier

1.2. Scope

1.3. References

1.4. System overview and key features

1.5. Test overview

1.5.1 Organization

1.5.2 Master test schedule

1.5.3 Integrity level schema

1.5.4 Resources summary

1.5.5 Responsibilities

1.5.6 Tools, techniques, methods, and metrics

2. Details of the Master Test Plan

2.1. Test processes including definition of test levels

2.1.1 Process: Management

2.1.2 Process: Acquisition

2.1.3 Process: Supply

2.1.4 Process: Development

2.1.4.1 Activity: Concept

2.1.4.2 Activity: Requirements

2.1.4.3 Activity: Design

2.1.4.4 Activity: Implementation

2.1.4.5 Activity: Test

2.1.4.6 Activity: Installation/checkout

2.1.5 Process: Operation

2.1.6 Process: Maintenance

2.1.6.1 Activity: Maintenance test

2.2. Test documentation requirements

2.3. Test administration requirements

2.4. Test reporting requirements

3. General

3.1. Glossary

3.2. Document change procedures and history

Master Test Plan Identifier

Specify the name and version of the test plan. The identifier must make it possible to clearly and precisely refer to this document distinct from other project documents. The minimum information to be used is the name of the test plan, its version, and its status.

1. Introduction

The introduction should give a short summary of the project background.

Its intent is to help the readers of the plan (customer, management, developer, regulating authorities, and tester) to better understand the contents of the test plan.

The introduction should describe the entire test effort, including the test organization, the test schedule, and the integrity schema. A summary of required resources, tools and techniques might also be included in this section.

1.2 Scope

Describe the purpose, goals, and scope of the test effort.

Identify the project(s) or product(s) for which the plan is written and the specific processes and products covered by the test effort. Describe what is included and excluded, as well as assumptions and limitations. It is important to define clearly the limits of the test effort to control expectations.

1.3 References

Include here a list of all applicable reference documents. The references are separated into “external” references, which are imposed from outside the project, and “internal” references, which are imposed from within the project. This section may also be placed at the end of the document, for example in chapter 3.

Referenced documents should include policies and standards, such as industry standards, company standards, project standards, customer standards, the project authorization (possibly the contract), project plan and other plans, and the specification.

1.4 System Overview and Key Features

This section should present the mission or business purpose of the system or software product under test as well as a short overview of the features, parts, and components of the product to be tested; a list of the test items including their version/revision level; their transmittal media and their specification. To avoid misunderstandings, there should also be a list of system parts not subject to testing, i.e., an overall summary of the “features not to be tested” chapter in the test plan according to the older standard.

1.5 Test Overview

In this section, describe the test organization, the overall test schedule, the integrity level scheme used to control testing, the major test resources and responsibilities, and the tools, techniques, and methods to be applied.

1.5.1 Organization

Describe the relationship of the test processes to other development and supporting processes. It may be beneficial to include an organization chart. Describe how those working on testing and on other tasks shall communicate.

1.5.2 Master Test Schedule

Outline an overall schedule for the test activities, with the major milestones. The test plan must be coordinated with the project plan and maintained throughout the project. Regular consultation between the project manager and the test manager must be implemented. The test manager should be informed about delays during development and must react by changing the test plan. The project manager must react to test results and, if necessary, delay milestones because extra correction and testing cycles must be executed.

To handle changes and iterations, describe the task iteration policy for the re-execution of test tasks and any dependencies.

1.5.3 Integrity Level Scheme

Describe how integrity levels are identified and how they govern the testing effort. The plan should document the assignment of integrity levels to individual documents and components as well as how integrity levels are used to control the testing tasks.
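One way to make such a scheme operational is a simple mapping from components to integrity levels, and from levels to the testing rigor they require. The four-level scale and the example components below are illustrative assumptions, not taken from the standard.

```python
# Hypothetical sketch of an integrity level scheme; the scale and the
# component assignments are illustrative.

RIGOR_BY_LEVEL = {
    4: "full independent test with formal reviews and all exit criteria",
    3: "systematic test with documented coverage targets",
    2: "standard functional test",
    1: "smoke test only",
}

COMPONENT_LEVELS = {"flight_control": 4, "billing": 3, "logging": 1}

def required_rigor(component):
    """Look up the testing rigor that a component's integrity level demands."""
    return RIGOR_BY_LEVEL[COMPONENT_LEVELS[component]]
```

Documenting the assignment as data, rather than prose, makes it easy to review and to use for controlling the testing tasks.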

1.5.4 Summary of Necessary Resources

Describe the needed test resources, including staffing, facilities, tools, and special procedural requirements like security, access rights, and documentation control. Include a description of training needs.

1.5.5 Responsibilities and Authority

How is testing organized with respect to the organization and the project? Who has what authority, availability, etc.? Possibly the test personnel must be divided into different test groups or levels. Which people have which tasks? Who shall provide support to testing?

Hint

§ Responsibilities and authority may change during the course of the project. Therefore, the list of responsibilities should be presented as a table, maybe an appendix to the test plan.

1.5.6 Tools, Techniques, Methods, and Metrics

Describe documents, hardware and software, test tools, techniques, methods, and test environment necessary for the test process. Describe the techniques to be used to identify and capture reusable testware (for regression testing). Include information regarding acquisition, training, support, and qualification for each tool, technology, and method, at least for everything new to the organization.

Document the metrics to be used by the test effort, and describe how these metrics will be collected, evaluated, and used to support the test objectives.

More details about topics regarding specific test levels may be included in level test plans.

2. Details of the Master Test Plan

This section describes the test processes, test documentation requirements, and test reporting requirements for the entire test effort.

2.1 Test Processes, Including Definition of Test Levels

Identify test activities and tasks to be performed for each of the test processes and document those test activities and tasks. Provide an overview of the test activities and tasks for all development life cycle processes. Identify the test levels, including any “special” tests like security, usability, performance, stress, recovery, and regression testing.

If the test processes are already defined by an organization’s standards, a reference to those standards could be substituted for the contents of this section.

2.1.1 through 2.1.6 “Life cycle” Processes, i.e., Activities and Tasks

There may be up to six subsections here, for the life cycle processes Management, Acquisition, Supply, Development, Operation, and Maintenance. Normally for a development project, there is only the subsection about development.

This section contains a list of all activities and tasks necessary for the planning and execution of the tests, including an assignment of responsibilities. The status of all these tasks (not started, in progress, delayed, done) must be followed up.

Address the following eight topics for each test activity:

a) Test tasks: Identify the test tasks to be performed as well as the degree of intensity and rigor in performing and documenting the task.

b) Methods: Describe the methods and procedures for each test task, including tools. Define the criteria for evaluating the test task results.

c) Inputs: Identify the required inputs for the test task. Specify the source of each input. For any test activity and task, any of the inputs or outputs of the preceding activities and tasks may be used.

d) Outputs: Identify the required outputs from the test task. The outputs of the management of the test and of the test tasks will become inputs to subsequent processes and activities, as appropriate.

e) Schedule: Describe the schedule for the test tasks. Establish specific milestones for initiating and completing each task, for the receipt of each input, and for the delivery of each output.

f) Resources: Identify the resources for the performance of the test tasks. Specify resources by category (e.g., staffing, tools, equipment, facilities, travel budget, and training).

g) Risks (project risks) and Assumptions: Identify the risk(s) (e.g., schedule, resources, technical approach, or for going into production) and assumptions associated with the test tasks. Provide recommendations to eliminate, reduce, or mitigate risk(s). This section takes much of the information provided in the section “Risks and contingencies” of the old standard.

h) Roles and responsibilities: Identify for each test task the organizational elements that have the responsibilities for the execution of the task, and the nature of the roles they will play.
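To check that a plan section addresses all eight topics, the list above can be mirrored in a small record type. The field names below simply follow items a) through h); they are an illustration, not prescribed by the standard.

```python
# Hypothetical sketch: one record per test activity, with one field per
# topic a) through h), plus a completeness check.

from dataclasses import dataclass, fields

@dataclass
class TestActivity:
    tasks: str
    methods: str
    inputs: str
    outputs: str
    schedule: str
    resources: str
    risks_and_assumptions: str
    roles_and_responsibilities: str

def missing_topics(activity):
    """Return the names of topics left empty in the plan section."""
    return [f.name for f in fields(activity) if not getattr(activity, f.name).strip()]
```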

2.2 Test Documentation Requirements

In this section, we describe which data and results the test activities will deliver and in which form and to whom these results will be communicated. This not only means the test results in a narrow sense (for example, incident reports and test protocols), it also includes planning and preparation documents such as test plans, test specifications, schedules, documents describing the transmittal of test objects, and test summary reports.

Hint

§ In a test plan, only formal documentation is mentioned. However, informal communication should not be forgotten. Especially in projects that are already in trouble, or in very stressful phases (for example, the release week), an experienced test manager should try to communicate directly with the people involved more than usual. This is not meant to conceal bad news; rather, it helps ensure that the right consequences are drawn from any bad news.

2.3 Test Administration Requirements

This section should describe how the test will be administered in practice, during its execution.

2.3.1 Anomaly (defect) Resolution and Reporting

Describe the method of reporting and resolving anomalies (incidents, failures). This section of the plan defines the criticality levels for defects. Classification for software anomalies may be found in chapter 6 of this book. This section may also refer to a general standard way of defect handling in the organization.

2.3.2 Task Iteration Policy

Describe the criteria used to determine the extent to which a testing task is repeated after changes (e.g., re-reviewing, retesting, and regression testing after problems have been repaired). These criteria may include assessments of change extent and risk, integrity level, and effects on budget, schedule, and quality.
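Such criteria can be sketched as a small decision function over change extent and integrity level; the thresholds and scope names below are illustrative assumptions, not part of the standard.

```python
# Hypothetical sketch of a task iteration policy; the thresholds and the
# returned scope names are illustrative.

def retest_scope(changed_modules, total_modules, integrity_level):
    """Decide how much retesting a change triggers."""
    extent = changed_modules / total_modules
    if integrity_level >= 3 or extent > 0.25:
        return "full regression"
    if extent > 0.05:
        return "targeted retest plus regression of affected features"
    return "retest of repaired items only"
```

In a real policy, the assessment would also weigh the effects on budget, schedule, and quality, as the text notes.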

2.3.3 Deviation Policy

Describe the procedures and criteria used to deviate from the master test plan and level test plans. Identify the authorities responsible for approving deviations.

2.3.4 Control Procedures

Identify control procedures applied to the test activities. These procedures describe how the system and software products and test results will be configured, protected, and stored.

These procedures may describe quality assurance, configuration management, data management, or other activities if they are not addressed in other plans or activities. Describe any security measures necessary for the test effort.

2.3.5 Standards, Practices, and Conventions

Identify or reference the standards, practices, and conventions that govern testing tasks, unless they are already taken for granted in the organization.

2.4 Test Reporting Requirements

Specify the purpose, content, format, recipients, and timing of all test reports. Test reporting consists of Test Logs, Anomaly (failure, incident) Reports, Interim Test Status Report(s), Level Test Report(s), and the Master (or final) Test Report. Test reporting may also include other reports as deemed necessary.

3. General

This section includes general information and could just as well be put into chapter 1, onto the title page, or into a general place accessible to everyone in the project.

3.1 Glossary

Testing has no tradition for using standardized terminology. Thus, the test plan should contain an explanation of the testing terms used in the project. There is a high danger that different people will have different interpretations of testing terms. For example, just ask several people involved in the project for the definition of the term load testing.

Thus: Provide an alphabetical list of terms and acronyms that may require definition for the users of the plan with their corresponding definitions. You may also refer to a project glossary.

3.2 Document Change Procedures and History

This section should define the configuration management procedures to be followed for this document, if they differ from those for other documents. At a minimum, the change list and history should be included.

The Level Test Plan

This is a test plan for only one test level, like acceptance test plan, system test plan, etc.

The template from the standard

1. Introduction

1.1. Document identifier

1.2. Scope

1.3. References

1.4. Level in the overall sequence

1.5. Test classes and overall test conditions

2. Details for this level of test plan

2.1 Test items and their identifiers

2.2 Test Traceability Matrix

2.3 Features to be tested

2.4 Features not to be tested

2.5 Approach

2.6 Item pass/fail criteria

2.7 Suspension criteria and resumption requirements

2.8 Test deliverables

3. Test management

3.1 Planned activities and tasks; test progression

3.2 Environment/infrastructure

3.3 Responsibilities and authority

3.4 Interfaces among the parties involved

3.5 Resources and their allocation

3.6 Training

3.7 Schedules, estimates, and costs

3.8 Risk(s) and contingency(s)

4. General

4.1 Quality assurance procedures

4.2 Metrics

4.3 Test coverage

4.4 Glossary

4.5 Document change procedures and history

It can be seen that chapters 2 and 3 of a level test plan contain many of the points the old standard’s test plan contained. Guidance for these sections is given earlier in this appendix, under the headings of the old standard (829-1998). Section 3.4 is new in this standard, as are sections 4.1 through 4.3. Sections 4.4 and 4.5 may be handled the same way as in the master test plan.

Guidance for section 3.4

Describe the communication between the individuals and groups identified in section 3.5. This includes what needs to be communicated, how, and when. A figure that illustrates the flow of information and data may be included.

Guidance for sections 4.1 through 4.3

4.1 Quality assurance procedures

Identify the means by which the quality of the testing processes and products will be assured. Include or reference procedures for how problems in testing will be tracked and resolved. A general quality assurance plan or standard procedure may be referenced, if one exists.

4.2 Metrics

Identify the specific measures that will be collected, analyzed, and reported. The metrics specified here are those that only apply to this particular test level (the global metrics are described in Master Test Plan section 1.5.6). This may be a reference to where metrics are documented in a Quality Assurance Plan or to a generally used measurement program.

4.3 Test coverage

Specify the requirement(s) for test coverage. The type of coverage that is relevant varies by the level of test. See chapter 5 in this book for details. For example, unit test coverage is often expressed in terms of percentage of code tested (white box test coverage), and system test coverage can be a percentage of requirements tested (black box test coverage).
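Both coverage figures reduce to the same simple ratio; the counts below are illustrative, as the real numbers would come from a coverage tool or a requirements traceability matrix.

```python
# Hypothetical sketch: computing the two coverage figures mentioned above
# from illustrative counts.

def coverage_percent(covered, total):
    """Coverage as a percentage; 0.0 if there is nothing to cover."""
    return 100.0 * covered / total if total else 0.0

statement_cov = coverage_percent(covered=940, total=1000)  # white box, unit test
requirement_cov = coverage_percent(covered=57, total=60)   # black box, system test
print(f"statement coverage:   {statement_cov:.1f}%")
print(f"requirement coverage: {requirement_cov:.1f}%")
```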