At some point in their career, an airline pilot is likely to need to break the rules to escape a tricky situation - at least according to a few experienced flyers I know.
A few years ago the Leeds football team had a narrow escape at Southend airport. The engine of their aircraft caught fire during take-off. According to the rule book the pilot should have circled and landed, but he took a risk, broke the rules and landed immediately.
A fly-by-wire system would have overruled him, as one did an Airbus pilot trying to stop his aircraft from aquaplaning while landing in a heavy storm. The sensors judged the plane to be still airborne, when in fact it was cushioned on water and out of control. A crash was unavoidable.
Programmers cannot think of everything, so accidents involving life-critical software are inevitable. Yet ironically, when people do die, denial that software was at fault usually takes over. The software running any failed system should be among the first factors examined, and it should remain a high priority throughout the investigation.
Life-critical software demands clear lines of responsibility and accountability. Computer Weekly's coverage of a five-year-old software-related Chinook crash seeks lasting, positive procedural change. That work shows that, above all, two rules need breaking after such mishaps: secrecy and obfuscation.