Lessons to be learned from Chinook tragedy

Since 1997, Computer Weekly has studied the IT-related lessons from the crash of Chinook helicopter ZD576 on the Mull of Kintyre in 1994. Some of the mistakes made by the Ministry of Defence and the RAF after the accident are similar to those that dog many large-scale computer projects. After talking to experts and reviewing the hundreds of e-mails about the crash sent in by Computer Weekly readers, Tony Collins lists the top six general lessons that the IT industry can learn from the accident. These interrelated points highlight the difficulty of establishing the truth after a major incident.

Do not blame the person operating the equipment simply because the disaster has no other obvious cause
After the crash of Chinook ZD576, the equipment operators - the pilots - were blamed largely because there was no alternative explanation. But the absence of any evidence of the supplier's negligence does not prove end-user negligence.

After a major incident, suppliers may have large resources to establish that they were not to blame. So it becomes easy to blame the keyboard clerks or system operators - or the pilots in the case of Chinook ZD576. Les Hatton, professor of software reliability at Kent University, who has made a study of the Chinook crash, said, "If a [helicopter] rotor fails, there is clear evidence of failure for investigators. With software, there is frequently no clear evidence of failure. It is too easy to blame the pilots in this case."

Do not launch personal attacks on your critics merely because you dislike their criticism
The MoD has sought to discredit those who have tried to shed light on possible causes of the crash of Chinook ZD576, in particular, Malcolm Perks, who was a Fadec engine control system expert at Rolls-Royce, and Robert Burke, a squadron leader who was a Chinook unit test pilot.

The fathers of the deceased pilots of Chinook ZD576 were also criticised by the MoD, as was the Scottish sheriff Stephen Young, whose independent Fatal Accident Inquiry did not find the pilots negligent. In addition, critics have been attacked in general terms by the MoD in its letters to MPs. It has suggested that campaigners for a new inquiry have been motivated more by emotion than by the facts.

Similarly, some major organisations have sacked staff after they made unwelcome, candid criticisms of failing IT projects.

The person in command should hear both sides of the story - from both sides
In large organisations, bad news about an impending disaster may be filtered out and so fail to pass up the chain of command. In the end the person in charge - the chief executive - may not be aware of a disaster until it is all too obvious.

For example, an independent report on an air traffic control system at Swanwick in Hampshire said the reporting of bad news to top-level management was rendered more palatable by qualifying every negative statement with a positive one.

In the case of the Chinook crash, the prime minister, Tony Blair, believes that the pilots of ZD576 were negligent. But he has heard the arguments for and against pilot negligence only from the MoD and the RAF, which have a fixed view.

Be on top of the project, or you will be almost entirely in the hands of the supplier if there is a disaster
When a user company passes day-to-day control of a project to its IT supplier and then suffers a serious software-related problem or even a major software-related fatal accident, there may be no sure-fire way of establishing what has gone wrong or why.

This is because manufacturers cannot be expected to indict themselves after a major incident by identifying defects in their product or management of a project.

The problem with software is that only the manufacturer may understand it well enough to know what has caused or contributed to a crash; and even if the supplier wants to tell the whole truth, will its lawyers let it?

One solution may be to employ independent experts to scrutinise the manufacturer throughout a project - it may be too late to employ them after a disaster.

If a mistake has been made, admit it, learn the lessons and move on
If the MoD and the top echelons of the RAF had admitted that a mistake had been made over the verdict of gross negligence against the pilots of Chinook ZD576, the crash would not have led to years of intense media interest.

This media scrutiny has tied up resources within the MoD and much debating time in the House of Commons and the House of Lords. It has exacerbated the anguish of the families of the pilots and the relatives of the passengers of Chinook ZD576. And all of the campaigning has apparently had only one purpose: to get a government department to admit that a mistake has been made.

The House of Commons Public Accounts Committee published in full Computer Weekly's evidence that the MoD continued to issue statements after they had been shown to be incorrect, quoted selectively from accident reports, withheld relevant information, and mixed undisputed facts with disputed speculation.

None of this would have been necessary if the MoD and the RAF had put in place a credible system for ensuring that the decisions of their most senior personnel could be subject to genuinely independent assessment.
