
34.1 Purpose of verification and validation

The purpose of verification is to check the correctness of a solution, or part of a solution, to confirm that it complies with the specified design. Verification should be aimed at detecting problems, faults or failures.

The purpose of validation is to ensure the right problem is being addressed and the solution (or part of a solution) is likely to fulfil the user needs when operating in its intended environment. Validation should be aimed at demonstrating user and stakeholder satisfaction.

34.2 Key points

  • Start verification and validation early; the later problems are identified, the more costly they are to resolve.
  • Involve users and stakeholders in validation where their consent is needed for achieving the desired outcomes.
  • Finalise the verification and validation strategies alongside the high-level design.
  • The people designing the solution should be involved when developing the verification and validation strategies and planning the activities.
  • Set limits on time and cost to prevent costs escalating and the schedule slipping.

34.3 Why manage verification and validation?

Verification and validation provide the evidence base for confident decision-making throughout work. They demonstrate that outputs comply with specifications, that the solution satisfies policy or business objectives, and that user and stakeholder needs are being met. They also provide the basis for supplier payment and give confidence that what has been built performs a useful purpose.

Well thought-through verification and validation strategies can lead to significant savings without compromising the quality of the outputs or their utility. Striking the balance between the cost and extent of verification and validation comes down to risk: for example, the risk and impact of a solution failing in operations, or of a clunky solution which people are then reluctant to adopt.

Good discipline in the choice and use of methods for determining user needs and requirements, solution design, development and integration can drastically reduce the number of faults or problems detected and therefore the cost of verification and validation. In other words, quality assurance tends to have greater effect on quality than quality control (see Chapter 30: Quality management).

34.4 What is verification and validation?

The Project delivery glossary defines verification as:

An activity that ensures that a solution (or part of) is complete, accurate, reliable and matches its design specification.

The Project delivery glossary defines validation as:

An activity that ensures a solution (or part of) meets the needs of the business. Validation ensures that business requirements are met even though these might have changed since the original design.

Traditionally, verification and validation are thought of as being done towards the end of development, such as when the outputs are tested and accepted, or once the whole solution is built, for example, operating on a trial or shadow basis before it goes into full operation. However, verification and validation can, and should, be undertaken throughout requirements-gathering and the design and build of a solution so that:

  • the user needs and requirements reflect what is truly wanted
  • the solution meets the government’s policy objectives and/or the sponsoring organisation’s strategic objectives
  • the different solution options can be assessed robustly in terms of their quality and utility
  • the component parts of the solution can be proved as compliant prior to integration into the whole solution
  • the materials used in physical outputs can be proved fit for purpose
  • the interfaces among the different parts of the solution can be proved to be effective

Verification and validation should be iterative and should evolve in line with the solution as work progresses.

34.5 Who manages verification and validation?

People undertaking a project delivery role require at least an awareness of verification and validation and, in particular, of its implications for their own role.

For a programme or project, the senior responsible owner is accountable for ensuring a verification and validation management framework is in place and that it is effective.

The programme or project manager, as appropriate, is responsible for ensuring the verification and validation strategies and methods are being managed and used.

The verification and validation strategies and methods should be defined and owned by a solution specialist working with a designated component specialist for each output or group of closely-related outputs. The responsibilities normally follow the solution hierarchy (sometimes called a system hierarchy or product breakdown structure). The role titles for people who manage verification and validation differ widely, depending on the type of output and methodology used.

For the programme or project manager to fulfil their responsibilities, they need to have access to verification and validation records and to understand the implication the results have on the completion of the work and the likelihood of achieving the objectives. The senior responsible owner is accountable for the programme or project achieving its objectives at an acceptable level of risk and therefore needs to be kept updated on issues that could threaten the achievement of those objectives.

34.6 How to manage verification and validation

34.6.1 What to consider in verification and validation

34.6.1.1 Involving the users and stakeholders

Careful consideration should be given to ensuring that users and stakeholders are engaged in verification and validation activities where their opinions have a direct impact on the utility of the solution. In many cases it is not possible to engage all users and stakeholders and so representatives often need to be selected. Users and stakeholders should have been engaged in determining the user needs and requirements and so it is appropriate that they are satisfied these needs have been appropriately interpreted and met. Some development approaches have user involvement built into their methodologies (such as some agile approaches) but other approaches need specific activities planned for the work.

34.6.1.2 Monitoring the work and maintaining records

Each chapter of Part F: Solution delivery emphasises the need to have measures to monitor the progress of the work. Verification and validation measures are a vital part of that monitoring. Even if the schedule and spend profiles of the plan look to be on target, the level of problems or defects detected could indicate that the plan is either not realistic or the current rate of progress is unlikely to persist. There are likely to be many techniques used for monitoring verification and validation, and the programme or project manager needs to collate the results from these into a holistic view of the overall status and to help identify problematic areas.

34.6.1.3 Choosing the right techniques

The same techniques can be used for both verification and validation, although their purposes are very different. A common verification technique is testing for functionality, but this can be too expensive, inappropriate or not always feasible for all deliverables and outputs. Other techniques are sampling (for example, when concrete cube samples are taken to verify the concrete is strong enough in a construction project) or demonstration (such as trials for a new marine vessel or aircraft). In service transformation, user-based verification and validation are usually used before wider operational trials.

Many of the techniques can be used early in the life cycle, well before the solution, or any part of it, has been built, often on digital simulations. In iterative or agile work, verification and validation are carried out on an ongoing basis as an integral part of development.

Table 34.1 Examples of verification and validation techniques
Each of the following techniques can serve verification (to show compliance with the specification and detect errors) or validation (to prove stakeholder satisfaction regarding operational capability).

Inspection: examination of a component relying on the human senses or simple methods of measurement and handling. Inspection is generally non-destructive, and typically includes the use of sight, hearing, smell, touch and taste, simple physical manipulation, measurement and, where necessary, mechanical and electrical gauging. No tests are necessary.
Analysis: based on analytical evidence using mathematical or probabilistic calculation, and/or logical reasoning to show theoretical compliance.
Simulation: performed on models or mock-ups to verify features and performance. Can be physical (for example, ‘model office’ environments) or digital, with the latter used increasingly in the form of ‘digital twins’ and virtual simulation in infrastructure, operational and service delivery settings.
Analogy: based on evidence from components similar to the submitted component, or on experience feedback. Analogy should only be used if the submitted component is similar in design, manufacture and use; more stringent verification criteria were used previously; and the intended operational environment is the same as, or less rigorous than, that of the similar component.
Demonstration: demonstration (or operational or field testing) of the correct operation of the submitted component against operational and observable characteristics without, or with minimal, test equipment. It generally consists of a set of tests to show that the component is fit for its purpose or that people can perform their work as part of, or when using, the component.
Test: performed to confirm that functional, measurable characteristics, operability, supportability or performance capability is satisfactory when subjected to controlled conditions that are real or simulated. Testing often uses special test equipment or instrumentation to obtain accurate quantitative data for analysis.
Sampling: verification of characteristics using a representative sample. The number, tolerance and other characteristics should be specified to reflect the population the sample represents.

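The sampling technique described above can be sketched as a simple acceptance check. This is a minimal illustration in Python; the specified strength, tolerance, acceptance rules and cube results are all invented for the example and are not from any real project or standard:

```python
# Sketch: verifying concrete strength from a representative sample of cubes.
# All figures and rules below are hypothetical.
SPECIFIED_STRENGTH_MPA = 40.0  # assumed design specification
TOLERANCE_MPA = 1.0            # assumed allowable shortfall for any single cube

cube_results_mpa = [41.2, 42.5, 39.8, 43.1, 40.6]  # invented test results

mean_strength = sum(cube_results_mpa) / len(cube_results_mpa)
weakest = min(cube_results_mpa)

# Two simple acceptance rules: the mean meets the specification, and no
# single cube falls below the specification minus the tolerance.
compliant = (mean_strength >= SPECIFIED_STRENGTH_MPA
             and weakest >= SPECIFIED_STRENGTH_MPA - TOLERANCE_MPA)

print(f"mean={mean_strength:.2f} MPa, weakest={weakest:.1f} MPa, compliant={compliant}")
```

Real acceptance criteria for sampled materials are set by the relevant design codes; the point of the sketch is only that the sample, its tolerances and the pass/fail rules are specified in advance.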
Testing is most common in digital delivery, where specific tests serve to verify that a particular requirement has been met. These can be automated or manual. For example:

  • functional tests verify the solution does what it is designed to do
  • user acceptance tests validate that the user need has been met
  • penetration testing verifies security
  • performance testing verifies response times, and stress testing verifies capacity
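The first two bullets, and the distinction between verification and validation, can be illustrated with a minimal Python sketch. The fee calculator, its specification and the acceptance criterion are invented for the example:

```python
def calculate_fee(days_overdue: int) -> float:
    """Hypothetical solution component: a late-return fee calculator.

    Invented specification: 50p per day overdue, capped at £10.
    """
    return min(days_overdue * 0.50, 10.00)

# Functional (verification) test: does the output match the design specification?
def test_fee_matches_specification():
    assert calculate_fee(4) == 2.00     # 4 days at 50p
    assert calculate_fee(30) == 10.00   # cap applied

# Acceptance (validation) check: does the behaviour meet the stated user need,
# here the invented need "a user is never charged more than the £10 cap"?
def test_fee_never_exceeds_cap():
    assert all(calculate_fee(d) <= 10.00 for d in range(0, 365))

test_fee_matches_specification()
test_fee_never_exceeds_cap()
print("all checks passed")
```

The verification test asks "did we build it to the specification?"; the validation check asks "does the specification, as built, satisfy the user need?". A solution can pass the first and still fail the second if the specification itself misread the need.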

34.6.1.4 Applying verification across the life cycle

Verification can, and should, be applied to any aspect of the management of the programme or project, to verify the management documentation or the delivery of the solution to make sure it is designed and/or built correctly. The earlier errors are detected, the easier they are to rectify. For example, a poorly drafted or ambiguous user requirement which is not discovered until the whole solution is being accepted could lead to the wrong solution being built. Table 34.2 includes some examples.

Table 34.2 Examples of applying verification
Management deliverable: to make sure the deliverable conforms to its product description.
Stakeholder or user requirement: to make sure the syntax and grammatical rules are followed and that the description is unambiguous, consistent, complete, feasible, traceable and verifiable.
Solution architecture: to check that the chosen modelling techniques or methods have been applied and used consistently and correctly.
Solution component: to check that the chosen design principles, models, constraints and technologies have been applied and used consistently and correctly.
Solution: to check that the solution’s actual characteristics meet the specified requirements as described in the requirements and design documents.

34.6.1.5 Applying validation across the life cycle

Validation can, and should, be applied to any aspect of the management of the delivery of the solution to ensure that it is usable and works in an operational setting. Decisions to validate should be based on predefined acceptance criteria.
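Predefined acceptance criteria can be held in a checkable form, so that the validation decision is mechanical rather than ad hoc. A minimal sketch in Python; the criteria, thresholds and measured values are invented for the example:

```python
# Each acceptance criterion pairs a plain-language description with a
# pass/fail check against measured results. All figures are hypothetical.
acceptance_criteria = [
    ("95th percentile page load completes within 2 seconds",
     lambda m: m["p95_load_s"] <= 2.0),
    ("Task completion rate in user trials is at least 90%",
     lambda m: m["completion_rate"] >= 0.90),
]

# Invented results from an operational trial.
measured = {"p95_load_s": 1.7, "completion_rate": 0.93}

results = {description: check(measured)
           for description, check in acceptance_criteria}
validated = all(results.values())

for description, passed in results.items():
    print(f"{'PASS' if passed else 'FAIL'}: {description}")
print(f"validated: {validated}")
```

Agreeing the criteria before the trial starts avoids the temptation to redefine success after the results are in.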

There is little point in a solution complying with the design specifications if what has been designed does not meet the users and stakeholders’ needs or if it does not work in its proposed environment. Similarly, there is little point in a plan which is ‘right’ in terms of content but does not lead to the achievement of the objectives. Table 34.3 includes some examples.

Table 34.3 Examples of applying validation
Management deliverable: to make sure the deliverable meets the needs of those using it for management oversight, supervision and decision-making.
System requirement or specification: to make sure the description of the requirement is justified and relevant to the users’ and other stakeholders’ needs or expectations.
Design document: to make sure the content is justified and relevant to the users’ and other stakeholders’ needs or expectations and contributes to achieving the business objectives and the operational scenarios.
Solution: to demonstrate that the product, service or enterprise satisfies its user and other stakeholder requirements, business objectives and operational scenarios.

The involvement of users in validation is important and often built into delivery approaches, such as in agile delivery. Commonly used approaches include user acceptance testing, operational acceptance testing, operational trials and pilots.

34.6.2 Preparing for verification and validation

34.6.2.1 Overview

Verification and validation are specialist disciplines and the approach, documentation and tools used depend on the approach chosen for each output. Regardless of the approach adopted, it is vital for project delivery practitioners to understand the context for the work so they can define and agree the management approaches to be used with those designing, developing and integrating the solution.

34.6.2.2 Make sure strategies are robust

The verification and validation strategies, like the integration strategy (see Chapter 33: Solution development and integration), are set when the high-level design of the solution is chosen. Once agreed, the high-level design is the major constraint on managing any future issues or defects. These strategies should be reflected in the detailed plan for the work (see Chapter 16: Planning). Care should be taken to assess all these strategies to make sure they are consistent and cover all the phases of the solution’s life cycle, paying particular attention to the:

  • timing of the delivery of solution components: will they be delivered in time, and what is the impact if they are late?
  • transition arrangements, especially what needs to be in place before they can happen
  • timing of outputs, and associated operational, organisational or societal changes
  • time taken and availability of verification and validation resources
  • allowance for rework in case of defects, problems or validation failures

34.6.2.3 Prepare the management information systems

Management information systems, whether automated or manual, should be set up to track the results of the planned verification and validation activities. Specialist applications are usually used for specific types of output, but the results need to be collated to give an overview of how well work is proceeding at programme or project level. Overall reporting should be designed to track the results in real time as part of doing the work, or it risks being seen as an administrative burden and falling into disuse (see Chapter 18: Reporting).
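One way to collate specialist results at programme or project level is a simple register keyed by solution component. A hedged sketch in Python; the record shape, component names and data are assumptions for illustration, not a prescribed format:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class VandVRecord:
    component: str    # position in the solution hierarchy (assumed names)
    activity: str     # e.g. "test", "inspection", "user trial"
    kind: str         # "verification" or "validation"
    compliant: bool   # result recorded against predefined criteria

# Invented records, as might be fed in from specialist tools.
records = [
    VandVRecord("payment-api", "test", "verification", True),
    VandVRecord("payment-api", "user trial", "validation", False),
    VandVRecord("case-portal", "inspection", "verification", True),
]

# Programme-level rollup: compliance counts per kind of activity, giving
# the manager a holistic view and pointing at problematic areas.
summary = Counter((r.kind, r.compliant) for r in records)
for (kind, compliant), count in sorted(summary.items()):
    status = "compliant" if compliant else "non-compliant"
    print(f"{kind}: {status} = {count}")
```

In practice the rollup would also track trends over time, since a rising defect rate can signal an unrealistic plan even when schedule and spend look on target.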

34.6.3 Key activities in verification and validation

34.6.3.1 Overview

Verification and validation comprise a set of similar activities undertaken throughout the life of a programme or project which are repeated through the solution hierarchy. The activities are summarised in Figure 34.1.

Flowchart illustrating a verification and validation framework, outlining the roles of the Solution Specialist (managing the entire process) and Component Specialist (managing specific parts). It depicts the steps involved, from developing the framework to closing it, including preparation, execution, and result management. The framework incorporates feedback loops, quality registers, and reporting mechanisms, while interacting with other processes like traceability management, solution design, development, and transition.
Figure 34.1 An overview of the key verification and validation activities and their primary relationships

34.6.3.2 Develop and maintain the verification and validation management framework

The approach to verification and validation should be defined including any processes, methods, tools and techniques to be used. This forms part of the quality management framework (see 30.6 on how to manage quality) and the overall governance and management framework for the work (see Chapter 4: Governance and management). The important aspects of this activity are discussed in more detail in 34.6.2 preparing for verification and validation. The framework should be maintained to address relevant feedback from its use.

34.6.3.3 Oversee verification and validation

It is essential to maintain an overall understanding of how development and integration work is progressing and what the prevailing risks are. Defects and problems identified during the work should be investigated to see if there is any pattern or underlying cause. The progress against plan, resources used, and overall results should be analysed to assess the likelihood of achieving the objectives of the programme or project.

The management framework for verification and validation should be monitored to make sure it remains effective and appropriate as the work proceeds.

34.6.3.4 Prepare for verification and validation

The main activities are described in 34.6.2 preparing for verification and validation. Those for management deliverables, user needs and requirements should start at the beginning of the programme or project. Those relating to the delivery of the solution should sit alongside the high-level design.

34.6.3.5 Perform verification and validation

Finalise and agree the detailed plans and techniques to be used before verification and validation itself starts, including the actions to be taken and the expected results. Acquire the necessary resources, perform the actions in accordance with the plan, and record each result as either compliant or non-compliant.

34.6.3.6 Manage the results of verification and validation

Analyse the results, comparing actual outcomes to expected outcomes. Take the necessary action to resolve non-compliant results, which can mean rework to resolve defects, or changes to the design or even the requirements (see Chapter 17: Controlling). Coordinate with the designers to make sure that any resultant actions are fully defined and plans are amended accordingly.

34.6.3.7 Close the verification and validation management framework

Once the programme or project has been completed, the management framework should be either merged into the management framework for the solution or closed, retaining information and data in accordance with the sponsoring organisation’s information retention policy (see Chapter 24: Information and data management).

Updates

Page permissions updated for public launch.

First published for closed beta consultation.
