Clear and present danger: why CW refused to give up on Chinook

The loss of Chinook ZD576, with its four crew and 25 VIP passengers, was more than a helicopter crash. It led to a notorious and flagrant injustice. The pilots were blamed, not because anyone had any specific, certain knowledge of their failings, but because human error seemed to be the cause in the absence of any other explanation.

It was a tidy conclusion; but it overlooked the dearth of hard evidence - and the density of pre-crash evidence of potentially serious problems with the aircraft's safety-critical Fadec engine control software.

Yet the injustice done to the pilots is not the only reason Computer Weekly has campaigned to have the verdict re-examined.

The most worrying single facet of the crash is that the Ministry of Defence and the RAF assumed that a lack of evidence of malfunction pointed to operator error as the cause.

What if a software problem caused the accidental firing of a missile that destroyed a town? What if a design error in a software-controlled train set off a complex sequence of events that caused a fatal crash? What if dozens of people were killed in a fire in a tunnel because software-controlled sprinklers failed?

The chances are that if software caused any of these accidents, we would never know. This is because when software fails, or it contains coding or design flaws, and these defects cause a major accident, there will be no signs of any software-related deficiency in the wreckage.

And only the manufacturer will understand its system well enough to identify any flaws in its design, coding or testing. Yet no commercial manufacturer can be expected to implicate itself in a major software-related disaster. So, if software kills large numbers of people it is highly likely that the cause of the accident will never be known.

This is especially likely to be the case if the software has failed in no obvious way, such as when a coding error has set off a chain of complex events that cannot be replicated after a disaster.

But after a major accident, convention dictates that someone must be blamed. Step forward the vulnerable equipment operators: the pilots, keyboard clerks, train drivers or anyone who cannot prove their innocence.

This is particularly so because the manufacturer, in proving its equipment was not at fault, may have millions of pounds at its disposal. It will also have the goodwill of the customer, which bought the highly specialised equipment and relies on the manufacturer's support for its maintenance.

In contrast, individuals - the system operators - may have minimal resources: no access to the manufacturer's commercially sensitive information, none of the manufacturer's knowledge of how the systems work, and little money for expert reports and advice.

Therefore, the weakest link after a major fatal accident will always be the operators - particularly if they are dead.

That is why the loss of Chinook ZD576 is so much more than a helicopter crash. To accept the verdict against the pilots is to accept that it is reasonable to blame the operators if the cause of a disaster is not known.

If we accept this dangerous principle we may as well say to manufacturers of safety-critical software, "We recognise you will try to do a good job, but if you create poorly designed, haphazardly developed and inadequately tested safety-critical computer systems that kill people, we acknowledge that you will never be held to account."

The road to justice

1994 (2 June) Chinook ZD576 crashes, killing all on board. An RAF Board of Inquiry (BOI) is formed.

1994 Unknown to crash investigators, the MoD draws up a legal case against the suppliers of the Chinook's Fadec software.

1995 Two air marshals find that the pilots of Chinook ZD576 were grossly negligent - despite an inconclusive BOI report.

1995 The MoD wins its legal case over the Fadec system after claiming it contained design flaws and was "not airworthy".

1996 A Scottish Fatal Accident Inquiry says there is insufficient evidence to blame the pilots. The MoD stands by its air marshals.

1997 An MoD report leaked to Channel 4 News says the Fadec software had 486 anomalies. Experts urge a software rewrite but the MoD refuses.

1997 A US Army Chinook flips over and rights itself at 250 feet. The cause remains a mystery.

1998 The Commons Defence Committee accepts the MoD's assurance that the Chinook Mk2 had no serious software problems.

1999 Computer Weekly publishes RAF Justice, a 140-page report on a cover-up of the Chinook's software problems. MPs call for a new inquiry.

1999 The MoD tells the Commons that Computer Weekly's findings are "nothing new".

1999 In a handwritten note, Tony Blair says he has taken a personal interest in the crash. He supports the verdict against the pilots.

1999 The House of Lords debates whether the issues raised in RAF Justice justify reopening the RAF Board of Inquiry into the crash. No vote is taken.

2000 Computer Weekly asks Parliament's Public Accounts Committee (PAC) to investigate the airworthiness of the Chinook Mk2.

2000 After an inquiry, the PAC accuses the MoD of "unwarrantable arrogance" and says the verdict against the pilots is unsustainable.

2000 The Times calls for a new inquiry and points to the results of a three-year investigation by Computer Weekly.

2001 Lord Chalfont wins a strongly opposed fight for the House of Lords to form a committee to investigate the crash.

2002 The Lords report.
