ISO/IEC DIS 33063
ISO/IEC DIS 33063: Information technology — Process assessment — Process assessment model for software testing

ISO/IEC DIS 33063:2026(en)

ISO/IEC / JTC1/SC7 WG 10

Secretariat: BIS

Date: 2025-12-23

Information technology – Process assessment – Process assessment model for software testing

© ISO/IEC 2026

All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of the requester.

ISO copyright office

CP 401 • Ch. de Blandonnet 8

CH-1214 Vernier, Geneva

Phone: +41 22 749 01 11

Email: copyright@iso.org

Website: www.iso.org

Published in Switzerland

Contents

Foreword v

Introduction vi

1 Scope 1

2 Normative references 1

3 Terms and definitions 1

4 The process assessment model 2

4.1 Introduction 2

4.2 Structure of the process assessment model 2

4.2.1 General 2

4.2.2 Processes 3

4.2.3 Process dimension 5

4.2.4 Quality dimension 6

4.3 Assessment indicators 6

5 The process dimension 7

5.1 General 7

5.2 Organizational test process group 8

5.2.1 OT.1 Organizational test process 8

5.3 Test management process group 9

5.3.1 TM.1 Test strategy and planning process 9

5.3.2 TM.2 Test monitoring and control process 10

5.3.3 TM.3 Test completion process 11

5.4 Dynamic test process group 12

5.4.1 DT.1 Test design and implementation process 12

5.4.2 DT.2 Test environment and data management process 13

5.4.3 DT.3 Test execution process 14

5.4.4 DT.4 Test incident reporting process 15

6 The quality dimension 15

Annex A (informative) Assessment guidelines 17

A.1 General assessment guideline (informative) 17

A.2 Process application guideline (normative) 17

A.3 Guideline on the use of additional processes from other PAM (informative) 19

Annex B (informative) Information product characteristics 20

B.1 Information product categories 20

B.2 Information product description 21

Annex C (informative) Additional processes 25

C.1 Static testing process group 25

C.1.1 STAT.1 Work product review process 25

C.1.2 STAT.2 Static analysis process 26

C.2 Test management process group 27

C.2.1 TM.4 Problem resolution management process 27

Annex D (informative) Supplementary process definition 29

D.1 Static analysis process 29

Bibliography 30

Foreword

ISO (the International Organization for Standardization) is a worldwide federation of national standards bodies (ISO member bodies). The work of preparing International Standards is normally carried out through ISO technical committees. Each member body interested in a subject for which a technical committee has been established has the right to be represented on that committee. International organizations, governmental and non-governmental, in liaison with ISO, also take part in the work. ISO collaborates closely with the International Electrotechnical Commission (IEC) on all matters of electrotechnical standardization.

The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the various types of ISO documents should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. ISO shall not be held responsible for identifying any or all such patent rights. Details of any patent rights identified during the development of the document will be in the Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents).

Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.

For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html.

This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 7, Software and systems engineering.

This second edition cancels and replaces the first edition (ISO/IEC 33063:2015), which has been technically revised.

The main changes are as follows:

— adaptation to the updated version of ISO/IEC/IEEE 29119-2:2021, Software and systems engineering – Software testing – Part 2: Test processes;

— adaptation to the updated version of ISO/IEC 33020:2019, Information technology – Process assessment – Process measurement framework for assessment of process capability.

A list of all parts in the ISO 330xx series can be found on the ISO website.

Any feedback or questions on this document should be directed to the user’s national standards body. A complete listing of these bodies can be found at www.iso.org/members.html.

Introduction

The ISO/IEC 330xx set of standards covering the domain of process assessment is based on a view of assessment that establishes an architecture of three components:

— Process models that define processes, the entities that are the subject of an assessment;

— Measurement frameworks that provide scales for evaluating specified attributes; and

— A specification of the process to be followed in conducting assessments.

This standard provides an example of a process assessment model for software testing for use in performing a conformant assessment in accordance with the requirements of ‘ISO/IEC 33002 – Process Assessment – Requirements for performing process assessments’.

An integral part of conducting an assessment is to use a process assessment model (PAM) related to a process reference model (PRM) and conformant with the requirements defined in ISO/IEC 33004.

A process reference model cannot be used alone as the basis for conducting a consistent and reliable assessment of process capability since the level of detail is not sufficient.

Therefore:

— the description of the process purpose and process outcome(s) provided by the process reference model needs to be supported with a comprehensive set of indicators of process performance; and

— the capability levels and process attributes defined in ISO/IEC 33020 and its associated rating scale need to be supported with a set of indicators of process capability.

Used in this way, and in conjunction with a documented process, consistent and repeatable ratings of process capability are possible.

The ISO/IEC 33063 standard, a process assessment model for software testing, contains a set of indicators to be considered when interpreting the intent of the process reference model. These indicators may also be used when implementing a process improvement program or to help evaluate and select an assessment model, methodology and/or tools.

The process reference model defined in ‘ISO/IEC/IEEE 29119-2 - Software and Systems Engineering — Software Testing — Part 2: Test Processes’ has been used as the basis for the ISO/IEC 33063 process assessment model for software testing.

Within ISO/IEC 33063:

— Clause 4 provides a detailed description of the structure and key components of the process assessment model, which introduces two dimensions: a process dimension and a quality dimension; assessment indicators are also introduced in this clause;

— Clause 5 addresses the process dimension. It uses process definitions from ISO/IEC/IEEE 29119-2 to identify a process reference model. The processes of the process reference model are described in the process assessment model in terms of purpose and outcomes. The process assessment model expands the process reference model process definitions by including a set of process performance indicators called base practices for each process. The process assessment model also defines a second set of indicators of process performance by associating work products with each process;

— Clause 6 provides a brief description of the quality dimension of the process assessment model;

— Annex A provides a guideline on how the planning and scoping of an assessment are done with this process assessment model for software testing;

— Annex B provides selected characteristics for typical information products to assist the assessor in evaluating the quality level of processes;

— Annex C introduces additional process areas for the process assessment model;

— Annex D provides the additional process reference model process definition that is used by the PAM processes in Annex C;

— Bibliography contains a list of informative references.

NOTE As the processes described in this model are generic, when practically applied in an assessment they have to be applied to the different test levels or test types encountered in the project to be assessed. The multiple applications of the processes have to be documented in the assessment scope. This document also provides a guideline on the use of additional processes from other process assessment models.

The International Organization for Standardization (ISO) draws attention to the fact that it is claimed that compliance with this document may involve the use of a patent.

ISO takes no position concerning the evidence, validity and scope of this patent right.

The holder of this patent right has assured ISO that he/she is willing to negotiate licences under reasonable and non-discriminatory terms and conditions with applicants throughout the world. In this respect, the statement of the holder of this patent right is registered with ISO. Information may be obtained from the patent database available at www.iso.org/patents.

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights other than those in the patent database. ISO shall not be held responsible for identifying any or all such patent rights.

Information technology – Process assessment – Process assessment model for software testing

1 Scope

This document:

— defines a process assessment model that meets the requirements of ISO/IEC 33004 and that supports the performance of an assessment of process capability using the process measurement framework defined in ISO/IEC 33020. The process assessment model provides indicators for guidance on the interpretation of the process purposes and outcomes as defined in ISO/IEC/IEEE 29119-2 and the process attributes as defined in ISO/IEC 33020;

— provides guidance, by example, on the definition, selection and use of assessment indicators.

A process assessment model comprises a set of indicators of process performance and process capability. The indicators are used as a basis for collecting the objective evidence that enables an assessor to assign ratings, following the requirements of ISO/IEC 33002. The set of indicators included in this standard is neither intended to be an all-inclusive set nor is it intended to be applicable in its entirety. Subsets that are appropriate to the context and scope of the assessment should be selected.

The process assessment model in this standard is directed at assessment sponsors and competent assessors who wish to select a model, and associated documented process method, for assessment (for either capability determination or process improvement).

Any process assessment model for Software Testing meeting the requirements defined in ISO/IEC 33004 concerning models for process assessment may be used for assessment. Different models and methods may be needed to address differing business and testing needs. This assessment model is provided as an exemplar of a model meeting all the requirements expressed in ISO/IEC 33004.

2 Normative references

The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.

ISO/IEC/IEEE 29119‑1, Software and systems engineering — Software testing — Part 1: General concepts

ISO/IEC/IEEE 29119‑2, Software and systems engineering — Software testing — Part 2: Test processes

ISO/IEC 33001, Information technology — Process assessment — Concepts and terminology

ISO/IEC 33004, Information technology — Process assessment — Requirements for process reference, process assessment and maturity models

ISO/IEC 33020, Information technology — Process assessment — Process measurement framework for assessment of process capability

3 Terms and definitions

For the purposes of this document, the terms and definitions given in ISO/IEC 33001, ISO/IEC/IEEE 29119-1 and ISO/IEC/IEEE 29119-2 apply.

ISO and IEC maintain terminological databases for use in standardization at the following addresses:

— ISO Online browsing platform: available at https://www.iso.org/obp

— IEC Electropedia: available at https://www.electropedia.org/

4 The process assessment model

4.1 Introduction

ISO/IEC 33063 provides a process assessment model (PAM) for software testing.

The process assessment model is a two-dimensional model. In one dimension, the process dimension, the processes are defined and classified into process categories together with the set of assessment indicators of process performance. In the other dimension, the quality dimension, for each process attribute in the process measurement framework a set of process quality indicators is defined for the selected process quality characteristic.

Figure 1 — Structure of the process assessment model

Figure 1 shows the process assessment model as a two-dimensional model: the process dimension with its relationship to ISO/IEC/IEEE 29119-2, Software and systems engineering – Software testing – Part 2: Test processes, and the quality dimension in relationship to a process measurement framework. The process assessment model for software testing defined in ISO/IEC 33063 is conformant with the ISO/IEC 33004 requirements for a process assessment model, and can be used as the basis for conducting an assessment of software testing processes.

4.2 Structure of the process assessment model

4.2.1 General

This clause describes the detailed structure of the process assessment model and its key components.

This process assessment model expands upon the process reference model by adding the definition and use of assessment indicators. Assessment indicators comprise indicators of process performance and process capability. These are defined to support the assessor's judgment of the performance and capability of an implemented process.

Clause 5, together with its associated Annex B, describes the components of the process dimension. Clause 6 describes the quality dimension.

ISO/IEC 33004 requires that processes included in a process reference model satisfy the following criteria:

"The fundamental elements of a process reference model are the set of descriptions of the processes within the scope of the model. These process descriptions shall meet the following requirements:

a) a process shall be described in terms of its purpose and outcomes;

b) the set of process outcomes shall be necessary and sufficient to achieve the purpose of the process;

c) process descriptions shall not contain or imply aspects of the process quality characteristic beyond the basic level of any relevant process measurement framework conformant with ISO/IEC 33003."

As the processes are directly derived from ISO/IEC/IEEE 29119-2 and applied without change, these requirements are satisfied.

The process assessment model includes process groups defined in ISO/IEC/IEEE 29119-2 which are:

— the Organizational test process;

— the Test management processes;

— the Dynamic test processes.

The static test process group is added in informative Annex C, supported by a supplementary process definition in Annex D that expands the current process reference model, since these processes are to be assessed when considering industry practice.

The quality dimension, comprising a set of process attributes for a selected process quality characteristic, is incorporated as a process measurement framework together with a set of process quality indicators.

NOTE ISO/IEC 33020 provides a process measurement framework for the assessment of process capability which can be incorporated into this document. ISO/IEC 33020 also includes a set of process quality indicators for each process attribute in the process measurement framework.

4.2.2 Processes

Figure 2 lists the processes from ISO/IEC/IEEE 29119-2 that are included in the process dimension of the process assessment model for software testing.

Figure 2 — Process groups and processes of PRM, ISO/IEC/IEEE 29119-2

Test processes in this process assessment model are classified into the organizational test (OT) process group, the test management (TM) process group and the dynamic test (DT) process group, exactly following the structure given in the process reference model.

The organizational test process group includes a single process performed for the creation and maintenance of organizational test specifications, such as organizational test policies, organizational test strategies, and other organization-wide specifications.

This group includes the process listed in Table 1.

Table 1 — Organizational test process group

Process Identification | Process name | Source
OT.1 | Organizational test process | ISO/IEC/IEEE 29119-2

The test management process group consists of processes that cover the management of testing. The processes contain practices that may be used by anyone who manages the whole test project or a particular test level, or test type within the project.

This group includes the processes listed in Table 2.

Table 2 — Test management process group

Process Identification | Process name | Source
TM.1 | Test strategy and planning process | ISO/IEC/IEEE 29119-2
TM.2 | Test monitoring and control process | ISO/IEC/IEEE 29119-2
TM.3 | Test completion process | ISO/IEC/IEEE 29119-2

The dynamic test process group consists of processes that prepare and maintain the test environment; design, implement and execute the tests; and/or report the incidents resulting from the test execution.

The dynamic test process group includes the processes listed in Table 3.

Table 3 — Dynamic test process group

Process Identification | Process name | Source
DT.1 | Test design and implementation process | ISO/IEC/IEEE 29119-2
DT.2 | Test environment and data management process | ISO/IEC/IEEE 29119-2
DT.3 | Test execution process | ISO/IEC/IEEE 29119-2
DT.4 | Test incident reporting process | ISO/IEC/IEEE 29119-2

As illustrated in Annex A.2, the processes within the test management process group and the dynamic test process group are generic. Within the context of an assessment, they have to be applied to the whole project, different test levels, or test types, depending on the characteristics of the project to be assessed.

For example, processes within the test management process group can be applied to the management of projects (e.g. master test level), of test levels such as unit testing or acceptance testing, or of test types such as security testing and performance testing. The processes within the dynamic test process group can, for example, be applied to integration testing, performance testing, security testing or other test levels/test types.

NOTE Application of the test management processes is unique for each situation with distinct characteristics; hence Annex A.2, Process application guideline, is normative.

For practical purposes, three processes are added in informative Annex C. Of these additional processes, the problem resolution management process (TM.4) is from ISO/IEC 15504-5, the work product review process (STAT.1) is from ISO/IEC 20246, and the static analysis process (STAT.2) is from Annex D. These additional processes in Annex C are included for the practical assessment of software test processes, reflecting industry practice.

The guideline on the use of additional processes from other process assessment models is given in Annex A.3.

4.2.3 Process dimension

For the process dimension, all the processes in Figure 2 are included within the process dimension of the process assessment model. The processes are classified into process groups. There are four process groups: organizational test process, test management process, dynamic test process, and static test process. The process groups and their associated processes are described in Clause 5 and in Annex C.

Each process in the process assessment model is described in terms of a purpose statement. These statements contain the unique functional objectives of the process when performed in a particular environment. A list of specific outcomes is associated with each of the process purpose statements, as a list of expected positive results of the process performance.

Satisfying the purpose statement of a process represents the first step in building a level 1 process capability where the expected outcomes are observable.

4.2.4 Quality dimension

For the quality dimension, the minimum requirement is that the process is performed, i.e. the implemented process achieves its process purpose and the expected outcomes are observable.

Process attributes are features of a process that can be evaluated on a scale of achievement, providing a measure of the quality of the process, and are applicable to all processes.

Further details on the quality dimension can be found in Clause 6.

4.3 Assessment indicators

A process assessment model is based on the principle that the quality of a process can be assessed by demonstrating the achievement of process attributes on the basis of evidence related to assessment indicators.

There are two types of assessment indicators: process performance indicators and process quality indicators. Process performance indicators address the process purpose and outcomes of each process in the process dimension. Process quality indicators demonstrate the achievement of the process attributes in the quality dimension.

The process performance indicators are:

— Base practice (BP);

— Information products (IP).

The performance of base practices (BPs) provides an indication of the extent of achievement of the process purpose and process outcomes. Information products (IPs) are either used or produced (or both) when performing the process. Information products that are the key outputs of the process are primarily used as performance indicators.

Annex B provides the list of information products (IP) associated with the processes in Clause 5. The information products are grouped by category and mapped to their processes by process ID.

Process quality indicators depend on the process quality characteristic of interest. The minimum requirement is that at least one of the process attributes shall comprise the achievement of the defined process purpose and process outcomes for the process; this is termed the process performance attribute (see ISO/IEC 33003, Clause 4.2.1).

The process performance and process quality indicators represent types of objective evidence that might be found in an instantiation of a process and therefore could be used to judge achievement of quality. Figure 3 shows how the assessment indicators are related to process performance and process quality.

Figure 3 — Assessment indicators

5 The process dimension

5.1 General

This clause defines the processes and the process performance indicators of the process assessment model. The processes in the process dimension can be directly mapped to the processes defined in the process reference model.

The processes are classified (for the purpose of this process assessment model) into process groups which are listed in Clause 4.

The individual processes are described in terms of process name, process purpose, and process outcomes as defined in ISO/IEC/IEEE 29119-2.

In addition, the process performance indicators of the process assessment model provide information in the form of:

a) base practices for the process providing a definition of the tasks and activities needed to accomplish the test process purpose and fulfil the process outcomes; each base practice is associated with one or more process outcomes;

b) information products that are the key outputs of the process, and are related to one or more process outcomes; and

c) characteristics associated with each information product.

The process purposes, outcomes, the base practices and the information products associated with the processes are included in this clause. The information product characteristics are contained in Annex B. The base practices and information products constitute the set of indicators of process performance.

The associated information products listed in this clause may be used when reviewing potential inputs and outputs of an organization's process implementation.

The associated information products provide objective guidance for outputs to look for and objective evidence supporting the assessment of a particular process.

A documented assessment process and assessor judgment is needed to ensure that process context (application domain, business purpose, development and testing methodology, size of the organization, etc.) is explicitly considered when using this information.

This assessment process should not be considered as a checklist of what each organization must have, but rather as an example and starting point for considering whether, given the context, the information products are necessary and contribute to the intended purpose of the process.

NOTE Consideration of assessing the additional test process areas in Annex C, such as the work product review process (STAT.1), the static analysis process (STAT.2) and the problem resolution management process (TM.4), may be required to guarantee the assessment of all the test processes.

5.2 Organizational test process group

5.2.1 OT.1 Organizational test process

Process ID

OT.1

Process name

Organizational test process

Process purpose

The purpose of the organizational test process is to develop, monitor conformance to, and maintain organizational test specifications, such as the organizational test policy and organizational test practices document.

Process outcomes

As a result of the successful implementation of the organizational test process:

a)   The requirements for organizational test specifications are identified.

b)   The organizational test specifications are developed.

c)   The organizational test specifications are agreed by stakeholder(s).

d)   The organizational test specifications are made accessible.

e)   Conformance to the organizational test specifications is monitored.

f)   Updates to organizational test specifications are agreed to by stakeholders.

g)   Updates to the organizational test specifications are made.

NOTE   Updates to the organizational test specifications will only be made when needed.

Base practices

OT.1.BP1: Develop organizational test specification. Develop an organizational test specification such as an organizational test policy or organizational test practices, processes, procedures and other assets. [Outcome: a, b, c]

OT.1.BP2: Monitor and control use of organizational test specification. Monitor and control usage of organizational test specification to determine whether it is being used effectively. [Outcome: d, e]

OT.1.BP3: Update organizational test specification. Update the organizational test specification by reviewing feedback. [Outcome: e, f, g]

Information products

Organizational test policy [Outcome: a, b, c, d, e, f, g]

Organizational test practices document [Outcome: a, b, c, d, e, f, g]

5.3 Test management process group

5.3.1 TM.1 Test strategy and planning process

Process ID

TM.1

Process name

Test strategy and planning process

Process purpose

The purpose of the test strategy and planning process is to develop, agree, record and communicate to relevant stakeholders the scope and approach that will be taken to testing, enabling early identification of resources, environments and other requirements of testing.

Process outcomes

As a result of the successful implementation of the test strategy and planning process:

a) The scope of the testing is analysed and understood.

b) The stakeholders who will participate in designing the test strategy and the test planning are identified and informed.

c) Risks that can be treated by testing are identified, analysed and classified with an agreed level of risk exposure.

d) Test strategy, test environment, test tool and test data needs are identified.
EXAMPLE Tools, special equipment, test environment, office space.

e) Staffing and training needs are identified.

f) Each activity is scheduled.

g) Estimates are calculated and evidence to justify the estimates is recorded.
EXAMPLE Cost, staff, and timeline estimates.

h) The test plan is agreed to and distributed to all stakeholders.

Base practices

NOTE Base practices in this process relate to the project level test planning and also particular test planning of a test level (unit/integration/system/acceptance test plan) or a test type (e.g. performance/security/usability test plan).

TM.1.BP1: Understand context. Understand and document the context and the software testing requirements through reviewing the related documents (e.g. software development plan, related test basis, etc.) and identifying and interacting with the relevant stakeholders. [Outcome: a]

NOTE This activity should be an on-going activity throughout the lifetime of the project and the tasks in this activity can, in principle, be carried out in any order.

TM.1.BP2: Organize test plan development. Organize activities for test plan development, ensuring early involvement of testing in the software development life cycle. Normally test planning starts with development planning and completes during the requirements analysis phase. [Outcome: b]

TM.1.BP3: Identify and analyze risks. Identify, classify, evaluate and document risks that are related to the project and/or product, which can be treated by software testing. [Outcome: c]

TM.1.BP4: Identify risk treatment approaches. Identify and document appropriate means of treating the risks (such as test levels, test types, test techniques and test completion criteria). [Outcome: c]

TM.1.BP5: Design test strategy. Design and document the test strategy and standard test process to be undertaken, considering:

— overall test planning at the project level or test planning for particular test level(s) and/or test type(s);

— functional and non-functional testing requirements;

— early involvement of test planning and design activities in the development life cycle.

Also include the strategy to be undertaken with regard to:

— selected test levels and test types;

— test deliverables;

— test design techniques;

— entry and exit criteria;

— test completion criteria;

— degree of independence;

— metrics to be collected;

— test data requirements;

— justified deviations from the organizational test practices.

An initial estimate of the required resources to perform the complete set of actions described in the test strategy should also be produced. [Outcome: d, e, f, g]

NOTE Where an organizational test process is available, the existing process may be tailored to fit the project context and risk.

TM.1.BP6: Determine staffing and scheduling. Identify the roles and skills that will be required to carry out the testing described in the test strategy. Each test activity in the test strategy should be scheduled. [Outcome: e, f, h]

NOTE Where appropriate, identify recruitment and/or training needs.

TM.1.BP7: Record test plan. Document the identified risks, the test strategy, all test decisions, the final test estimates and the test schedule, and create the test plan. [Outcome: h]

TM.1.BP8: Gain consensus on test plan. Issue the draft test plan for review and approval by stakeholders. [Outcome: h]

TM.1.BP9: Communicate test plan and make available. Publish the test plan in a suitable form so that it is accessible to all stakeholders. [Outcome: h]

Information products

Test strategy [Outcome: a, b, c, d]

Test plan [Outcome: a, b, c, d, e, f, g, h]

5.3.2 TM.2 Test monitoring and control process

Process ID

TM.2

Process name

Test monitoring and control process

Process purpose

The purpose of the test monitoring and control process is to determine whether testing progresses in accordance with the test plan and with organizational test specifications (e.g. the organizational test policy and the organizational test practices). It also initiates control actions as necessary and identifies necessary updates to the test plan (e.g. revising completion criteria or identifying new actions to compensate for deviations from the test plan).

The process is also used to determine whether testing progresses in accordance with higher level test plans, such as the project test plan, and to manage the testing performed at specific test levels (e.g., system testing) or for specific test types (e.g., performance testing).

Process outcomes

As a result of the successful implementation of the test monitoring and control process:

a) The means of collecting suitable measures to monitor test progress and changing risk are set up.

b) Progress against the test plan is monitored.

c) New and changed test-related risks are identified, analysed and necessary action(s) invoked.

d) Necessary control actions are identified.

e) Necessary control actions are communicated to the relevant stakeholders.

f) The decision to stop testing is approved.

g) Test progress and changes to the risks are reported to stakeholders.

Base practices

TM.2.BP1: Set up. Identify suitable measures for monitoring test progress against the test plan and define means of identifying new and changing risks. [Outcome: a]

TM.2.BP2: Monitor. Monitor the progress of test processes (e.g. unit test, system test, performance test, usability test) against the test plan, then identify and document any divergence of actual testing from planned testing and analyse any new risks. [Outcome: b, c]

TM.2.BP3: Control. Undertake the testing activities documented in the test plan and any control directives received from higher-level management processes. Identify and take corrective action to deal with any discrepancy between planned progress and actual progress. [Outcome: d, e, f]

TM.2.BP4: Report. Document and communicate testing progress against the test plan to stakeholders. [Outcome: g]
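The monitor-and-control loop of TM.2 can be sketched in a few lines: compare actual execution against the plan and flag milestones that need a control action. The milestone names, the 10 % tolerance and the message format below are assumptions for illustration, not requirements of this document.

```python
# Illustrative sketch of TM.2 monitoring: compare actual test execution
# against the test plan and flag divergences needing control actions.
def monitor_progress(planned: dict, actual: dict, tolerance: float = 0.1):
    """Return control-action messages for milestones behind plan."""
    actions = []
    for milestone, planned_cases in planned.items():
        executed = actual.get(milestone, 0)
        shortfall = (planned_cases - executed) / planned_cases
        if shortfall > tolerance:  # divergence beyond the assumed tolerance
            actions.append(f"{milestone}: {executed}/{planned_cases} executed - corrective action needed")
    return actions

plan = {"week1": 50, "week2": 80}
done = {"week1": 48, "week2": 60}
for action in monitor_progress(plan, done):
    print(action)  # week2: 60/80 executed - corrective action needed
```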

Information products

Test status report [Outcome: a, b, c, d, e, f, g]

5.3.3 TM.3 Test completion process

Process ID

TM.3

Process name

Test completion process

Process purpose

The purpose of the test completion process is to make available useful test assets for later use, leave the test environment in a satisfactory condition and record and communicate the results of the testing to relevant stakeholders. Test assets include test plans, test case specifications, test scripts, test tools, test data and test environment infrastructure.

Process outcomes

As a result of the successful implementation of the test completion process:

a) Test assets are either archived or passed directly to the relevant stakeholders.

b) The test environment is in its agreed state (e.g. so that it is available for any following testing).

c) The test completion report is recorded.

d) The test completion report is approved.

e) The test completion report is communicated to relevant stakeholders.

Base practices

TM.3.BP1: Archive test assets. Identify test assets that may be useful in the future or are expected to be reused at a later date, then make them available using appropriate means and archives. [Outcome: a]

TM.3.BP2: Clean up test environment. Restore the test environment to a pre-defined state on completion of all testing activities. [Outcome: b]

TM.3.BP3: Identify lessons learned. Document the lessons learned (e.g. the outcomes of a lessons-learned meeting) in the test completion report and communicate them to the relevant authorities. [Outcome: c, e]

TM.3.BP4: Report test completion. Summarize the information collected during the project into a test completion report, obtain approval of the report and distribute it to the relevant stakeholders. [Outcome: c, d, e]

NOTE Relevant information from test plans, test results, test status reports, test completion reports, incident reports, etc. can be used.

Information products

Test completion report [Outcome: a, b, c, d, e]

5.4 Dynamic test process group

5.4.1 DT.1 Test design and implementation process

Process ID

DT.1

Process name

Test design and implementation process

Process purpose

The purpose of the test design and implementation process is to derive the test procedures that will be executed during the test execution process. As part of this process the test basis is analysed, and a test model, test coverage items, test cases and test procedures are derived.

Process outcomes

As a result of the successful implementation of the test design and implementation process:

a) The test basis for each test item is analysed.

b) A test model is created.

c) The test coverage items are identified.

d) Test cases are derived.

e) Test procedures are created.

Base practices

DT.1.BP1: Create test model. Analyse the test basis, identify the characteristics of the test item that are to be tested based on the test strategy, and create the test model for the test item. Record the traceability between the test basis and the test model. [Outcome: a, b]

DT.1.BP2: Identify test coverage items. Identify the test coverage items to be exercised from the test model by applying test design techniques (e.g. statement testing, branch testing, decision testing) to achieve the test completion criteria specified in the test plan. Record the traceability between the test basis, test model and test coverage items. [Outcome: c]

DT.1.BP3: Derive test cases. Derive one or more test cases by determining pre-conditions, selecting input values and, where necessary, actions to exercise the selected test coverage items, and by determining the corresponding expected test results. Record the traceability between the test basis, test model, test coverage items and test cases. [Outcome: d]

DT.1.BP4: Create test procedures. Derive test procedures by ordering test cases according to dependencies described by pre- and post-conditions and other testing requirements, such as risks to be treated by testing. Record the traceability between the test basis, test model, test coverage items, test cases and test procedures (and/or automated test scripts). [Outcome: e]
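The traceability chain recorded across DT.1.BP1 to DT.1.BP4 (test basis to test model to test coverage items to test cases to test procedures) can be sketched as nested mappings. The identifiers below (REQ-7, TC-12, TP-3, etc.) are invented for illustration.

```python
# Illustrative traceability chain per DT.1: test basis -> test model ->
# coverage items -> test cases; procedures reference ordered test cases.
traceability = {
    "REQ-7": {                       # test basis item (a requirement)
        "model": "state-machine-1",  # test model element
        "coverage_items": {
            "transition-open-close": ["TC-12", "TC-13"],  # test cases
        },
    },
}
procedures = {"TP-3": ["TC-12", "TC-13"]}  # ordered test cases per procedure

def cases_for_requirement(req_id: str) -> set:
    """All test cases tracing back to a given test basis item."""
    entry = traceability[req_id]
    return {tc for cases in entry["coverage_items"].values() for tc in cases}

def procedures_for_requirement(req_id: str) -> set:
    """All test procedures exercising a given test basis item."""
    cases = cases_for_requirement(req_id)
    return {pid for pid, tcs in procedures.items() if cases & set(tcs)}

print(sorted(cases_for_requirement("REQ-7")))  # ['TC-12', 'TC-13']
print(procedures_for_requirement("REQ-7"))     # {'TP-3'}
```

Recording the chain in both directions is what lets an assessor check outcome coverage: given any requirement, the derived cases and procedures are recoverable, and vice versa.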

Information products

Test model specification [Outcome: a, b]

Test coverage item [Outcome: c]

Test case specification [Outcome: d]

Test procedure specification [Outcome: e]

Traceability information [Outcome: a, b, c, d, e]

Test data requirements [Outcome: e]

Test environment requirements [Outcome: e]

5.4.2 DT.2 Test environment and data management process

Process ID

DT.2

Process name

Test environment and data management process

Process purpose

The purpose of the test environment and data management process is to establish and maintain the required test environment and test data and to communicate their status to all relevant stakeholders.

Process outcomes

As a result of the successful implementation of the test environment and data management process:

a) The test environment is set up in a state ready for testing.

b) The status of the test environment is communicated to all relevant stakeholders.

c) The test environment is maintained.

d) The test data is prepared and is in a state ready for testing.

e) The status of the test data is communicated to all relevant stakeholders.

f) The test data is maintained.

Base practices

DT.2.BP1: Establish test environment. Plan the gathering of environment requirements and the design and implementation of the test environment. Communicate the status of the test environment to the relevant stakeholders, such as the testers and the test manager. [Outcome: a, b]

NOTE Where appropriate, set up test tools to support the testing and install and configure test items.

DT.2.BP2: Prepare test data. Based on the test plan, the detailed requirements generated as a result of the test design and implementation process, and the scale/formality of the testing, plan the preparation of the test data and prepare the test data. Communicate the status of the test data to the relevant stakeholders, such as the testers and the test manager. [Outcome: d, e]

DT.2.BP3: Maintain test environment. Maintain the required test environment for immediate and continuous use. A test environment defect log may be established, managed and tracked to keep the environment in a state ready for testing. [Outcome: c]

DT.2.BP4: Maintain test data. Maintain the test data as defined by the test data requirements. Communicate changes to the status of the test data to the relevant stakeholders. [Outcome: f]
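The test data readiness report of DT.2 amounts to tracking the status of each test data requirement and deriving an overall readiness flag. A minimal sketch, with invented requirement names and status values:

```python
# Illustrative readiness check per DT.2: each test data requirement has a
# status; the readiness report lists anything not yet ready for testing.
requirements = {
    "customer-records": "ready",
    "edge-case-dates": "in preparation",
    "bulk-load-set": "ready",
}

def readiness_report(reqs: dict) -> dict:
    """Derive a test data readiness report from requirement statuses."""
    pending = [name for name, status in reqs.items() if status != "ready"]
    return {"ready_for_testing": not pending, "pending": pending}

report = readiness_report(requirements)
print(report)  # {'ready_for_testing': False, 'pending': ['edge-case-dates']}
```

The same shape applies to the test environment readiness report, with environment requirements in place of data requirements.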

Information products

Test environment readiness report [Outcome: a, b, c]

Test environment [Outcome: a, c]

Test environment updates (where applicable) [Outcome: c]

Test data readiness report [Outcome: d, e, f]

Test data [Outcome: d, f]

Test data updates (where applicable) [Outcome: f]

5.4.3 DT.3 Test execution process

Process ID

DT.3

Process name

Test execution process

Process purpose

The purpose of the test execution process is to execute the test procedures created in the test design and implementation process in the prepared test environment and to record the results.

Process outcomes

As a result of the successful implementation of the test execution process:

a) The test procedures are executed.

b) The actual results are recorded.

c) The actual and expected results are compared.

d) The test results are determined.

Base practices

DT.3.BP1: Execute test procedure(s). Execute one or more test procedures in the prepared test environment and maintain the test execution log and/or updates to record details of the test execution results. [Outcome: a, b]

DT.3.BP2: Compare test results. Compare the actual and expected results for each test case in the test procedure and determine the test result of executing the test cases in the test procedure. [Outcome: c, d]

DT.3.BP3: Record test execution. Record the test execution, as specified in the test plan. [Outcome: d]
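DT.3 as a whole (execute, log actual results, compare against expected results, determine pass/fail) can be sketched as a small harness. The test item (`buggy_add`) and all identifiers are hypothetical.

```python
# Illustrative DT.3 sketch: execute test cases, record actual results in an
# execution log (BP1), and compare actual vs. expected to determine the
# test result (BP2).
import datetime

def run_procedure(test_cases, system_under_test):
    log, results = [], {}
    for case_id, (inputs, expected) in test_cases.items():
        actual = system_under_test(*inputs)                     # execute
        log.append((case_id, datetime.datetime.now(), actual))  # record actual result
        results[case_id] = "pass" if actual == expected else "fail"  # compare
    return log, results

# Hypothetical test item: an addition function with an injected defect.
def buggy_add(a, b):
    return a + b + (1 if a == 0 else 0)

cases = {"TC-1": ((2, 3), 5), "TC-2": ((0, 4), 4)}
log, results = run_procedure(cases, buggy_add)
print(results)  # {'TC-1': 'pass', 'TC-2': 'fail'}
```

The execution log (`log`) corresponds to the test execution log information product; `results` corresponds to the test result record.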

Information products

Actual result [Outcome: a, b]

Test result [Outcome: c, d]

Test execution log [Outcome: a, b]

5.4.4 DT.4 Test incident reporting process

Process ID

DT.4

Process name

Test incident reporting process

Process purpose

The purpose of the test incident reporting process is to report to the relevant stakeholders those incidents requiring further action that are identified as a result of test execution. In the case of a new test, this will require an incident report to be created. In the case of a retest, this will require the status of a previously raised incident to be updated, but may also require a new incident report to be raised where further incidents are identified.

Process outcomes

As a result of the successful implementation of the test incident reporting process:

a) Test results are analysed.

b) New incidents are confirmed.

c) New incident report details are created.

d) The status and details of previously raised incidents are determined.

e) Previously raised incident report details are updated as appropriate.

f) New and/or updated incident reports are communicated to the relevant stakeholders.

Base practices

DT.4.BP1: Analyse test results. Analyse the test results. Where a test result relates to a previously raised incident, update the incident details accordingly. Where the test result indicates that a new issue has been identified, determine whether it is an incident that requires reporting, an action item that will be resolved without incident reporting, or an item requiring no further action. Assign action items to the appropriate person for resolution. [Outcome: a, b, c]

DT.4.BP2: Create/update incident report. Create an incident report in the case of a new incident and/or update the status of a previously raised incident in the case of a retest. Communicate the status of new and/or updated incidents to the relevant stakeholders. [Outcome: d, e, f]
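The create-or-update logic of DT.4 can be sketched as two small operations on an incident store. The incident fields, identifiers and status values below are illustrative assumptions, not defined by this document.

```python
# Illustrative DT.4 sketch: create an incident report for a new failure
# (BP2, new test), or update a previously raised incident on retest.
incidents = {}

def report_incident(incidents, case_id, description, severity="medium"):
    """Create a new incident report and return its identifier."""
    inc_id = f"INC-{len(incidents) + 1}"
    incidents[inc_id] = {"case": case_id, "description": description,
                         "severity": severity, "status": "open"}
    return inc_id

def update_on_retest(incidents, inc_id, passed: bool):
    """Update a previously raised incident based on the retest result."""
    incidents[inc_id]["status"] = "closed" if passed else "reopened"

inc = report_incident(incidents, "TC-2", "off-by-one in addition")
update_on_retest(incidents, inc, passed=True)
print(incidents[inc]["status"])  # closed
```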

Information products

Incident report [Outcome: a, b, c, d, e, f]

6 The quality dimension

A process assessment model incorporates a process measurement framework conformant with the requirements of ISO/IEC 33003 and is expressed as a process quality characteristic with a defined set of process attributes. At minimum, a process measurement framework includes the process quality attribute of process performance, which is needed to demonstrate that the process achieves its expected process outcomes. Other process quality attributes may be added beyond the process performance attribute.

NOTE 1 ISO/IEC 33020 provides a process measurement framework for the assessment of process capability which can be incorporated into this document. ISO/IEC 33020 also includes a set of process quality indicators for each process attribute in the process measurement framework.

The assessment indicators are used as a basis for collecting objective evidence to support an assessor's judgement in assigning ratings of the performance and quality of an implemented process. The set of indicators defined in this document is not intended to be all-inclusive, nor applicable in its entirety. Subsets appropriate to the context and scope of the assessment should be selected, and potentially augmented with additional indicators.

A process assessment is conducted according to a documented assessment process. A documented assessment process will identify the rating method to be used in rating process attributes and identify or define the aggregation method to be used in determining ratings.

NOTE 2 ISO/IEC 33020 includes a process attribute rating scale, a process attribute rating method and an aggregation method which can provide a suitable basis for incorporation into any documented assessment process.
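As a hedged illustration of such a rating scale, the sketch below maps an achievement percentage onto the N-P-L-F ordinal scale defined in ISO/IEC 33020; the percentage bands shown are those commonly cited for that framework and should be verified against the measurement framework actually incorporated into the assessment.

```python
# Illustrative process attribute rating on the N-P-L-F ordinal scale
# (bands as defined in ISO/IEC 33020; verify against the framework used).
def rate_attribute(achievement_percent: float) -> str:
    if achievement_percent <= 15:
        return "N"   # not achieved
    if achievement_percent <= 50:
        return "P"   # partially achieved
    if achievement_percent <= 85:
        return "L"   # largely achieved
    return "F"       # fully achieved

print(rate_attribute(60))  # L
print(rate_attribute(90))  # F
```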


Annex A
(informative)

Assessment guidelines

A.1 General assessment guideline (informative)

This process assessment model for software testing builds upon ISO/IEC/IEEE 29119-2, Software and systems engineering – Software testing – Part 2: Test processes. Within that standard the test process is described in a generic way using a multi-layer approach; see Figure 2 in 4.2.2.

The top level is the Organizational Test Process group, which consists of a single process, the Organizational Test Process. The second level is the Test Management process group (TM); the three processes Test Strategy and Planning Process, Test Monitoring and Control Process and Test Completion Process belong to this group. The bottom layer is the Dynamic Test Process group (DT); within this group four processes are defined: Test Design and Implementation Process, Test Environment and Data Management Process, Test Execution Process and Test Incident Reporting Process.

As can be seen in Figure 2, no test levels or test types are prescribed within ISO/IEC/IEEE 29119-2. As every project is different, the testing needs also differ. In a project which develops a safety-related system, considerable emphasis will be placed on testing with several test levels and test types, whereas a project which develops a web application might do less testing.

In order to accommodate all projects, the generic description of test processes was chosen. This is an important concept in ISO/IEC/IEEE 29119 and needs to be understood in order to understand the standard. The process assessment model described in this document follows the same concept. For each of the generic processes defined in ISO/IEC/IEEE 29119-2, a corresponding process description in the process assessment model is defined.

However, this generic approach has a high impact on the planning and scoping of an assessment.

In case a project has two test levels, e.g. unit testing and system testing, the test processes defined in the Test Management process group (TM) and in the Dynamic Test process group (DT) will have to be "applied" twice. Test design and implementation and the test environment will not be the same for unit testing and system testing. Test planning, monitoring and control, and test completion may also be done differently for the two test levels and therefore be assessed separately.

In addition, conformant with the requirements of ISO/IEC 33002, the lead assessor may decide together with the assessment sponsor to include further processes from other process assessment models.

A.2 Process application guideline (normative)

Within the scope definition of the assessment, the lead assessor together with the sponsor shall specify which processes will be assessed and to which test level(s) and test type(s) they are applied.

In case a project has two test levels, e.g. unit testing and system testing, and a non-functional test type, e.g. reliability testing, the test processes defined in the Test Management process group (TM) and in the Dynamic Test process group (DT) will have to be "applied" three times. The Test Management process group (TM) also needs to be applied for the project test, including the project (master) test plan process. The following processes would be assessed and rated separately:

— OT.1 Organizational test process

— TM.1 Test strategy and planning process [for Project Test]

— TM.2 Test monitoring and control process [for Project Test]

— TM.3 Test completion process [for Project Test]

— TM.1 Test strategy and planning process [for Unit Testing]

— TM.2 Test monitoring and control process [for Unit Testing]

— TM.3 Test completion process [for Unit Testing]

— DT.1 Test design and implementation process [for Unit Testing]

— DT.2 Test environment and data management process [for Unit Testing]

— DT.3 Test execution process [for Unit Testing]

— DT.4 Test incident reporting process [for Unit Testing]

— TM.1 Test strategy and planning process [for System Testing]

— TM.2 Test monitoring and control process [for System Testing]

— TM.3 Test completion process [for System Testing]

— DT.1 Test design and implementation process [for System Testing]

— DT.2 Test environment and data management process [for System Testing]

— DT.3 Test execution process [for System Testing]

— DT.4 Test incident reporting process [for System Testing]

— TM.1 Test strategy and planning process [for Reliability Testing]

— TM.2 Test monitoring and control process [for Reliability Testing]

— TM.3 Test completion process [for Reliability Testing]

— DT.1 Test design and implementation process [for Reliability Testing]

— DT.2 Test environment and data management process [for Reliability Testing]

— DT.3 Test execution process [for Reliability Testing]

— DT.4 Test incident reporting process [for Reliability Testing]
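The enumeration above follows a simple rule: the TM processes are applied once for the project test and once per test level or type, while the DT processes are applied once per test level or type only. A sketch generating that list (the labels mirror the list above and are illustrative):

```python
# Illustrative generation of the process instances to assess per A.2:
# TM applies to the project test plus each level/type; DT applies to
# each level/type only.
TM = ["TM.1", "TM.2", "TM.3"]
DT = ["DT.1", "DT.2", "DT.3", "DT.4"]

def process_instances(levels_and_types):
    instances = ["OT.1 [Organization]"]
    instances += [f"{p} [for Project Test]" for p in TM]
    for scope in levels_and_types:
        instances += [f"{p} [for {scope}]" for p in TM + DT]
    return instances

scope = ["Unit Testing", "System Testing", "Reliability Testing"]
print(len(process_instances(scope)))  # 25 = 1 OT + 3 project TM + 3 x 7
```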

When the lead assessor performs the rating of the Test Design and Implementation Process and the Test strategy and planning process, for example, they have to be rated three times, for unit testing, system testing and reliability testing, as seen in Figure A.2.

Figure A.2 — Example of process rating

In the case shown in Figure A.2, the application context shall be clearly scoped if the lead assessor is to perform the rating once for each of the two processes, for example, partially for the Test Design and Implementation Process and fully for the Test strategy and planning process. This means that the Test Design and Implementation Process is partially achieved, and the Test strategy and planning process is fully achieved, only within the scope of unit testing, system testing and reliability testing. Here, the Test strategy and planning process for the project test, the so-called Project (or Master) Test strategy and planning process, can be assessed together or separately based on the lead assessor's judgement.

A.3 Guideline on the use of additional processes from other PAMs (informative)

Conformant with the requirements of ISO/IEC 33002, the lead assessor may decide together with the assessment sponsor to include further processes from other process assessment models, other standards, and self-defined processes (Annex E in this case). For example, if the assessment sponsor wants to assess the defect resolution capability, the process "SUP.9 Problem resolution management process" from the process assessment model in ISO/IEC 15504-5 may be added. Other examples of additional processes are shown in Annex D. They include the Work Product Review Process (from ISO/IEC 20246) and the Static Analysis Process (from Annex E).

Other candidates for additional processes include the incident management process, the configuration management process, and other processes that the lead assessor, with the help of the assessment sponsor, considers necessary for a practical test process assessment.


Annex B
(informative)

Information product characteristics

B.1 Information product categories

This annex describes the categories of information product. The category description characterizes the distinctive nature or features of the category.

Category

Category description

agreement

Mutual acknowledgement of terms and conditions under which a working relationship is conducted. [ISO/IEC/IEEE 15288:2015]

data

Representation of facts, concepts, or instructions in a manner suitable for communication,interpretation, or processing by humans or by automatic means. [ISO/IEC/IEEE 24765]

description

Planned or actual concept, function, design, or object. [ISO/IEC/IEEE 15289]

plan

Systematic course of action for achieving a declared purpose, including when, how, and by whomspecific activities are to be performed. [ISO/IEC/IEEE 15289]

policy

Clear and measurable statements of preferred direction and behavior to condition the decisions madewithin an organization. [ISO/IEC/IEEE 15289]

procedure

An ordered series of steps to perform a process, activity, or task. [ISO/IEC/IEEE 15289]

product

Result of a process. [ISO/IEC/IEEE 15288]

Note: Includes systems that are both products and services, and systems elements such as softwareand hardware products

record

Set of related data items treated as a unit to state results achieved or to provide evidence of activitiesperformed. [ISO/IEC/IEEE 15289 and ISO 9000]

registry

Book or system for keeping an official list or record of work products and the associated informationitems. [ISO/IEC/IEEE 24765]

Note: Repository and library items can be recorded in registries to enable better management andgovernance of these items

report

Results of activities such as investigations, assessments, and tests. A report communicates decisions. [ISO/IEC/IEEE 15289]

request

Defined course of action or change to fulfill a need. [ISO/IEC/IEEE 15289]

specification

Identifies, in a complete, precise, and verifiable manner, the requirements, design, behavior, or otherexpected characteristics of a system, service, or process. [ISO/IEC/IEEE 15289]

B.2 Information product description

The information items associated with the processes in Clause 5 and Annex C are described in this annex. The descriptions are exemplary.

Information product

Information product description

Category

Output of

Actual result

—   comparison actual vs. expected result

record

DT.3

Analysis report

—   object of analysis

—   person conducting the analysis

—   analysis criteria used

—   analysis results

—   aspects of correctness to analyse

report

TM.4

Evaluation report

—   purpose of evaluation

—   method used for evaluation

—   requirements used for the evaluation

—   assumptions and limitations

—   context and scope information

—   evaluation result

report

TM.4

Incident report

—   context of incidence occurrence

—   originator

—   timing information

—   description of incident

—   severity

—   priority

—   risk

—   status

report

DT.4

STAT.1

STAT.2

Issue log

—   a list of issues identified during individual review

record

STAT.1

Organizational test policy

—   test lifecycle management policy

—   test lifecycle reference models

—   test process improvement policy

—   resource allocation plan

—   training strategy

policy

OT.1

Organizational test practices

—   decision-making strategy

—   risk treatment strategy

—   configuration management strategy

—   information management strategy

—   integration strategy

—   test strategy

—   measurement strategy

policy

OT.1

Organizational test specification

—   test policy

—   test strategy

—   test mission

—   organizational test process

specification

OT.1

Problem management plan

—   problem resolution activities including identification, recording, description and classification

—   problem resolution approach: evaluation and correction of the problem

—   problem tracking

—   any timing constraints

—   mechanism to collect and distribute problem resolutions

plan

TM.4

Problem status report

—   summary of problem records

—   status of problem solving

report

TM.4

Project test plan

—   master test plan

—   specific level test plan

—   validation & verification plan

—   staffing plan

—   test resource/environment plan

—   monitoring and control

—   test completion

—   entry and exit conditions

—   project and product risk identification& treatment

—   schedule and estimation

plan

TM.1

Review plan

—   review technique

—   review schedule and owner

—   review level and type

—   review process

plan

STAT.1

Review report

—   context of the review

—   coverage of the review

—   required corrective actions

report

STAT.1

Static analysis completion report

—   static analysis completion criteria

—   static analysis summary

—   static analysis measurement technique

report

STAT.2

Static analysis environment readiness report

—   status of each static analysis environment requirement

—   static analysis entry conditions

report

STAT.2

Static analysis item

—   functional and non-functional aspects of the system and software

—   customer requirements

—   product specification

—   software process and structure

—   data type and structure

—   storage or repository

specification

STAT.2

Static analysis methods and rules

—   rule definition mechanism

—   rule sets

—   analysis tool operation

—   analysis techniques such as semantic analysis, cyclomatic complexity, rule checking, etc.

specification

STAT.2

Static analysis result

—   compliance deviations

—   summary of deviations

record

STAT.2

Test case

—   pre-conditions

—   input

—   execution conditions

—   expected results

—   specific test coverage items

specification

DT.1

Test completion report

—   test completion criteria

—   testing summary

—   test measurement technique

report

TM.3

Test coverage item

—   acceptance criteria

—   test objectives

—   safety and security conditions

specification

DT.1

Test data

—   test item

—   storage environment

—   creation and distribution schedules

—   data type & size

data

DT.2

Test data readiness report

—   status of each data requirement

report

DT.2

Test data requirements

—   test data type

—   test data conditions

—   input and output data

—   data parameter

—   data storage

specification

DT.1

Test environment

—   development environment

—   working environment (hardware and software system)

—   installed system

—   integrated system

description

DT.2

Test environment readiness report

—   status of each environment requirement

—   test entry conditions

report

DT.2

Test environment requirements

—   test environment

—   test data

—   communication plan with related team

—   test automation requirement

specification

DT.1

Test execution log

—   operational configuration

—   test cases executed

—   test period

—   execution time stamp

—   execution owner

record

DT.3

Test model

—   requirement statements, equivalence partitions, state transition diagram, use case description, decision table

specification

DT.1

Test plan

—   master test

—   detail level test

—   level test process and procedure

—   test input and output

—   test progress

—   test measurement

—   test staff

—   test schedule and estimation

plan

TM.1

Test procedure

—   test design & implementation

—   test environment

—   test tool

—   test case/suite

—   operator task training

—   assembly sequence

—   fault diagnosis sequence

—   acceptance conditions

procedure

DT.1

Test result

—   comparison actual vs. expected result

—   decision pass/fail

record

DT.3

Test status report

—   planned cost, resources, schedule

—   actual cost against project plan

—   actual or estimated labor costs

—   actual or estimated material costs

—   actual or estimated test costs

—   actual measured achievement

—   milestone completion

report

TM.2

Test strategy

—   specific level or instantiated test strategy

—   risk mitigation strategy

—   test technique approach

plan

TM.1

Traceability information

—   mapping information between requirement, test case/suite list, and coverage

—   configuration information

description

DT.1


Annex C
(informative)

Additional processes

C.1 Static testing process group

C.1.1 STAT.1 Work product review process

Reviewing to check testability and deriving test cases during the course of a review is a core part of software testing. The work product review process is from ISO/IEC 20246, Software and systems engineering – Work product reviews.

Process ID

STAT.1

Process name

Work product review process

Process purpose

The purpose of the work product review process is to provide a structured but flexible framework from which review processes (both formal and informal) may be tailored for specific contexts and purposes.

Process outcomes

As a result of successful implementation of the work product review process:

a)   defects/issues in the work product are identified.

b)   quality characteristics of the work product are evaluated.

c)   reviewers have gained knowledge about the work product.

d)   consensus on decisions made has been reached.

e)   new ideas have been generated.

f)   updates to the work product are made.

g)   participants have identified potential improvements in their working practices.

NOTE   The successful implementation of this process also includes planning and initiating reviews.

Base practices

STAT.1.BP1: Plan review. Define the scope of the review, comprising the purpose, the work product to be reviewed, the quality characteristics to be evaluated, areas to focus on, exit criteria, standards, effort and timeframes. [NOTE]

STAT.1.BP2: Initiate review. Distribute the required review materials to review participants; the review leader should communicate the scope and characteristics of the review to the review participants. [NOTE]

STAT.1.BP3: Perform individual review. Each reviewer should perform a review to identify issues with the work product. [Outcome: a, b, c]

STAT.1.BP4: Communicate and analyse issues. Communicate and analyse identified issues to assign each a status based on the subsequent action to be taken. [Outcome: b, d, e]

STAT.1.BP5: Fix and report issues. Action issues whose status requires a change to the information product and report the results of the review. [Outcome: f, g]

Information products

Review plan [Outcome: NOTE]

Issue log [Outcome: a, b, c]

Incident report [Outcome: d, e]

Review report [Outcome: f, g]

C.1.2 STAT.2 Static analysis process

Static analysis has been widely adopted in industry as one of the major practices for detecting defects in an efficient manner. This process is from the "E.1 Static analysis" process defined in Annex E.

Process ID

STAT.2

Process name

Static analysis process

Process purpose

The purpose of the static analysis process is to identify all test items, define the static analysis method, perform the static analysis and report the outcomes.

Process outcomes

As a result of successful implementation of the static analysis process:

a)   the static analysis items are selected;

b)   the static analysis method is chosen;

c)   if appropriate, the static analysis rules are selected;

d)   the tools and environment are ready;

e)   the static analysis is performed;

f)   the static analysis results are recorded;

g)   incidents are resolved as part of the Incident Reporting Process.

Base practices

STAT.2.BP1: Select test items for static analysis. Select the artefacts that constitute the test item for static analysis based on the strategy defined in the test plan (e.g., test completion criteria, features to be tested). Identify resource skills, training needs and availability to undertake the scheduled analysis. [Outcome: a]

NOTE Selection result can be documented as a separate document or inside the Sub-process Test Plan for static testing.

STAT.2.BP2: Select static analysis method and, if appropriate, the rules. Select the static analysis method and the rules that support the static analysis, as specified in the test plan. [Outcome: b, c]

STAT.2.BP3: Set up tools and environment. Identify, procure and set up any tools needed to support static analysis. [Outcome: d]

STAT.2.BP4: Execute static analysis. Undertake the analysis of the test item using the specified rules and tools as documented in the test plan. [Outcome: e]

STAT.2.BP5: Record analysis outcome. Record the results of the static analysis undertaken; results should include details of the rules applied and where each rule has been broken. [Outcome: f]

STAT.2.BP6: Record issues. Pass the recorded issues (potential incidents) to the Incident Reporting Process. [Outcome: g]

Information products

Static analysis methods and rules [Outcome: b, c]

Static analysis environment readiness report [Outcome: d]

Static analysis result [Outcome: e, f]

Incident report [Outcome: g]

Static analysis completion report [Outcome: g]
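The execute-and-record steps (BP4 and BP5) amount to applying each selected rule to the test item and logging the rule identifier and location wherever a rule is broken. The sketch below is a hypothetical illustration, not a tool this document specifies: the rule set, rule IDs and result format are invented for the example.

```python
import re

# Hypothetical rule set selected in BP2: rule ID -> (pattern, message).
RULES = {
    "R001": (re.compile(r"\beval\("), "use of eval is forbidden"),
    "R002": (re.compile(r"\t"), "tab characters are not allowed"),
}

def analyse(lines, rules=RULES):
    """Apply each rule to every line and record where it is broken (BP4, BP5)."""
    results = []
    for lineno, line in enumerate(lines, start=1):
        for rule_id, (pattern, message) in rules.items():
            if pattern.search(line):
                results.append({"rule": rule_id, "line": lineno, "message": message})
    return results

source = ["x = eval(user_input)", "y = 1", "\tz = 2"]
violations = analyse(source)
```

Each recorded violation carries the rule applied and the location where it was broken, as BP5 requires; BP6 would then pass these records to the Incident Reporting Process.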

    1. Test management process group
      1. TM.4 Problem resolution management process

Problem resolution management is integral to the testing processes for improving software quality. Without proper defect triage and resolution, the actual quality of the test items cannot be improved. The problem resolution management process is taken directly from the “SUP.9 Problem resolution management” process in the process assessment model of ISO/IEC 15504-5 (withdrawn).

Process ID

TM.4

Process name

Problem resolution management

Process purpose

The purpose of the problem resolution management process is to ensure that all discovered problems are identified, analyzed, managed and controlled to resolution.

Process outcomes

As a result of successful implementation of the problem resolution management process:

a)   a problem management strategy is developed;

b)   problems are recorded, identified and classified;

c)   problems are analyzed and assessed to identify acceptable solution(s);

d)   problem resolution is implemented;

e)   problems are tracked to closure;

f)   the status of all problem reports is known.

NOTE   Problem resolution management may initiate a change request.

Base practices

TM.4.BP1: Develop problem resolution strategy. Determine the problem resolution strategy for ensuring that problems are described, recorded, analyzed, and corrected. [Outcome: a]

TM.4.BP2: Identify and record the problem. Each problem is uniquely identified and recorded. [Outcome: b]

TM.4.BP3: Provide initial support and classification. Provide initial support and feedback on reported problems and classify problems according to severity. [Outcome: b]

NOTE Classification of problems may be in terms of criticality, urgency, relevance etc.

TM.4.BP4: Investigate and diagnose the cause of the problem. Analyze problems in order to identify the cause of the problem. [Outcome: c]

NOTE A problem may be a known error or may impact applications installed on multiple platforms.

TM.4.BP5: Assess the impact of the problem to determine solution. Assess the impact of the problem to determine appropriate actions, and to determine and agree on a solution. [Outcome: c]

TM.4.BP6: Execute urgent resolution action, where necessary. If the problem warrants immediate resolution pending an actual change, obtain authorization for an immediate fix. [Outcome: d]

TM.4.BP7: Raise alert notifications, where necessary. If the problem is of high severity and impacts other systems or users, an alert notification may need to be raised, pending a fix or change. [Outcome: d, f]

TM.4.BP8: Implement problem resolution. Implement problem resolution actions to resolve the problem and review the implementation. [Outcome: d]

TM.4.BP9: Initiate change request. Initiate change requests for diagnosed errors. [Outcome: e]

TM.4.BP10: Track problem status. Track to closure the status of identified problems. [Outcome: e, f]

Information products

Problem management plan [Outcome: a]

Problem record [Outcome: b]

Analysis report [Outcome: c]

Evaluation report [Outcome: c]

Problem status report [Outcome: f]
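Tracking a problem to closure (BP2, BP10) can be modelled as a record whose status only advances through an agreed lifecycle, so that the status of every problem report is always known (outcome f). The status names and transition table below are hypothetical assumptions for illustration; this document does not define a problem lifecycle.

```python
from enum import Enum

class Status(Enum):
    RECORDED = 1   # uniquely identified and recorded (BP2)
    ANALYSED = 2   # cause diagnosed, impact assessed (BP4, BP5)
    RESOLVED = 3   # resolution implemented and reviewed (BP8)
    CLOSED = 4     # tracked to closure (BP10)

# Allowed forward transitions; an empty set means the record is final.
TRANSITIONS = {
    Status.RECORDED: {Status.ANALYSED},
    Status.ANALYSED: {Status.RESOLVED},
    Status.RESOLVED: {Status.CLOSED},
    Status.CLOSED: set(),
}

class ProblemRecord:
    def __init__(self, problem_id, severity):
        self.problem_id = problem_id  # unique identification (BP2)
        self.severity = severity      # classification by severity (BP3)
        self.status = Status.RECORDED

    def advance(self, new_status):
        """Move the record forward; reject transitions outside the lifecycle."""
        if new_status not in TRANSITIONS[self.status]:
            raise ValueError(f"illegal transition {self.status} -> {new_status}")
        self.status = new_status

p = ProblemRecord("PR-42", severity="high")
p.advance(Status.ANALYSED)
p.advance(Status.RESOLVED)
p.advance(Status.CLOSED)
```

Enforcing the transition table is one way to guarantee that a problem status report can always state where each record sits in the lifecycle.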


  1. (informative) Supplementary process definition
    1. Static analysis process

Process ID STAT.2

Process name Static analysis process

Process purpose

The purpose of the static analysis process is to identify all test items, define the static analysis method, perform the static analysis and report the outcomes.

Process outcomes

As a result of successful implementation of the static analysis process:

a) the static analysis items are selected;

b) the static analysis method is chosen;

c) if appropriate, the static analysis rules are selected;

d) the tools and environment are ready;

e) the static analysis is performed;

f) the static analysis results are recorded;

g) incidents are resolved as part of the Incident Reporting Process.

NOTE This process description is modified based on the “static analysis preparation” process and “Perform static analysis” process of PRM, ISO/IEC CD1 29119-2:2011, Systems and Software Engineering — Software Testing — Part 2: Test Processes.

Bibliography

NOTE The following documents contain definitions and may provide general guidance to terms in the indicator set.

[1] ISO 9001:2015, Quality management systems — Requirements

[2] ISO 9004:2018, Quality management — Quality of an organization — Guidance to achieve sustained success

[3] ISO/IEC/TR 9294:2005, Information technology — Guidelines for the management of software documentation

[4] ISO 10007:2017, Quality management — Guidelines for configuration management

[5] ISO/IEC/IEEE 12207:2017, Systems and software engineering — Software life cycle processes

[6] ISO/IEC/IEEE 15288:2023, Systems and software engineering — System life cycle processes

[7] ISO/IEC/IEEE 15289:2019, Systems and software engineering — Content of life-cycle information items (documentation)

[8] ISO/IEC/TS 33061:2021, Information technology — Process assessment — Process assessment model for software life cycle processes

[9] ISO/IEC 25000:2014, Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Guide to SQuaRE

[10] ISO/IEC/IEEE 29119‑3:2021, Software and systems engineering — Software testing — Part 3: Test documentation

[11] ISO/IEC/IEEE 29119‑4:2021, Software and systems engineering — Software testing — Part 4: Test techniques

[12] ISO/IEC 33002:2015, Information technology — Process assessment — Requirements for performing process assessment

[13] ISO/IEC 15504‑5:2012, Information technology — Process assessment — Part 5: An exemplar software life cycle process assessment model (withdrawn)
