The economics of software testing

Opinion

The challenge facing any software development manager is how to balance the natural tension that exists between time, cost and quality. On top of this, programmes also have to balance the three organisational elements of corporate strategy, delivery mechanisms for change and the business-as-usual environment. Quite a balancing act.

Programmes are often time- or cost-constrained, and there is a temptation to perceive quality activities as blockers to delivery or as a costly, if necessary, evil. Experience, however, suggests that investing top-down effort in quality actually improves the chances of delivery by reducing rework and driving down the ultimate cost and duration of the programme. It also ensures the right focus on acceptance criteria and on communicating the required project outputs, programme outcomes and, ultimately, benefits realisation.

A practical and measurable example of this is the varying cost of defects across the software lifecycle. The universal challenge for software development projects is to detect and remove defects quickly and cheaply, reducing the risk they pose to the functioning of the product. Defects are introduced at a number of points during the software development lifecycle, and the closer to the point of insertion a defect is removed, the more efficient the solution. In addition, the cost of defect removal increases exponentially as the development lifecycle progresses. It is therefore advantageous to detect and fix defects as early in the lifecycle as possible.

Relative cost of defects

The following chart is a summary of the relative cost of fixing defects detected at various stages in the software development lifecycle process in a typical project. The cost of defect removal increases exponentially as the development lifecycle progresses. In addition, the later defects are found and fixed, the greater the risk to the business they pose.

Relative cost of defects

Source: StickyMinds.com, "Calculating the Economics of Inspections", by Ed Weller

Defect insertion and detection points

Defects are introduced to a system at a number of points during the software development lifecycle. The following chart is a model of a typical development lifecycle that illustrates the usual points of defect insertion and detection. The left-most line here shows at which points in the project lifecycle the defects are introduced. The second line illustrates the traditional points at which defects are detected and removed.

Detection and insertion points

Source: Six Sigma Software Metrics Part 2, by David L. Hallowell

A project should aim to push these two lines together as much as possible. One way of doing this is to subject the requirements and design documentation to a structured, formal review and testing process early in the project lifecycle. This will reduce defects related to ambiguous, unverifiable, contradictory or incomplete requirements before they have even been committed to code. Furthermore, unclear or ambiguous requirements often lead to defect tennis between business analysts, testers and developers, which is hardly a good use of valuable project time.

Signed-off business requirements that are clear, unambiguous and have measurable, testable acceptance criteria are much more likely to be developed to reflect the business' intent.

Market research indicates that 30% of a project budget is consumed by rework activities, meaning those things that should have been done or communicated properly the first time. A focus on top-down quality with measurable acceptance criteria should look to reduce this figure. Research also indicates that of that rework, 70% is related to the requirements. In other words, 21% of the project budget is spent on correcting and removing defects originating in the requirements and design documents.
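The arithmetic behind that 21% figure can be sketched as a short calculation. The 30% and 70% inputs are the research figures quoted above; the function name is illustrative:

```python
# Illustrative calculation of the rework figures quoted in the text.

def requirements_rework_share(rework_fraction: float,
                              requirements_share_of_rework: float) -> float:
    """Fraction of the total project budget spent fixing defects
    that originated in the requirements and design documents."""
    return rework_fraction * requirements_share_of_rework

share = requirements_rework_share(0.30, 0.70)
print(f"{share:.0%} of the project budget")  # → 21% of the project budget
```

Simple multiplication, but it makes the point: nearly a quarter of the budget traces back to defects that predate any code.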

A requirements testing process was successfully implemented for a large UK banking programme. On average, there were 3.5 defects per requirement; these were fixed and prevented from moving forward into the later stages of development and testing. Had these defects not been found and fixed at this stage then, based on the return on investment model below, they would have cost a factor of 45 times more to find and fix.
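That return on investment can be sketched as a quick model. The 3.5 defects per requirement and the 45x cost factor come from the programme described above; the requirement count and the unit fix cost are assumptions purely for illustration:

```python
# Illustrative sketch of the requirements-review ROI described above.
# defects_per_requirement and cost_factor are the figures from the text;
# n_requirements and unit_cost are assumed for illustration only.

def avoided_cost(n_requirements: int,
                 defects_per_requirement: float = 3.5,
                 cost_factor: float = 45.0,
                 unit_cost: float = 1.0) -> float:
    """Downstream cost avoided by fixing requirement defects at the
    point of insertion rather than a factor of 45 later."""
    defects = n_requirements * defects_per_requirement
    fix_now = defects * unit_cost
    fix_later = fix_now * cost_factor
    return fix_later - fix_now

# A hypothetical 200-requirement programme would expect 700 requirement
# defects; fixing them in review avoids 44 cost units per defect:
print(avoided_cost(200))  # → 30800.0
```

Whatever the absolute unit cost, the model shows the saving scales linearly with the number of defects caught in review.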

The cost of testing process omissions

The following illustrates a requirements testing process that was implemented but not followed through. Defects were detected in the requirements, but the business analysis team was busy writing the next phase of requirements and was directed not to update and fix the earlier documents. As a result, a key area of the system was ambiguously defined and was developed according to the interpretation of a third-party offshore developer. This led to a defect that rendered the product unusable. The development team were asked for an estimate to fix the issue and did an excellent impression of a car mechanic looking at a wreck, scratching their chins and shaking their heads. Two weeks were needed to fix the problem as it rippled through the system. This delayed the £20m programme at the end of its lifecycle, precisely the point at which it could not afford to slip and there was no chance of making up the time. One hour of a business analyst's time would have been enough to clarify the ambiguous area and avoid this impact to the programme.

Cost of testing oversight

Quality costs manifest themselves in a variety of ways. Some of these are positive activities, such as prevention and appraisal, and others are manifested as impacts to the business in terms of internal and external failures. The key to a successful programme is to invest early in preventative activities, minimising rework and the risk of downstream business impact and associated costs.

Avoid quality shortcuts - reduce rework and business impact costs

- Be realistic with scope within the constraints of cost and time tolerances

- Prioritise desired features and anticipated benefits

- Exercise rigour with requirements quality and clarity before committing to development

- Use risk-based prioritisation to focus efforts - work smart

- Ensure regular stakeholder engagement and feedback

- Enforce acceptance criteria and phase containment between test phases

- Adopt a best-practice testing methodology such as the SQS Vicars model

Placing quality at the top of the delivery triangle and driving it top-down through a programme minimises the need for rework and gives a programme the best chance of meeting its cost and time objectives. It also ensures the right focus on acceptance criteria and communication of required project outputs, programme outcomes and benefits realisation.

By Carl Allen, director of programme management at SQS Group


This was first published in June 2008

 
