
How Prospect Theory helps us understand software disasters

By understanding the human psychology behind IT system failures that directly impact people, we can better protect ourselves

Over the past few weeks, software failures have been brought to the forefront of public consciousness. In March, for instance, IT problems left Greggs, Sainsbury's, Tesco and McDonald's unable to take payments. 

The balancing of risk and reward is a key aspect in engineering, including in software engineering. In the words of Engineering Council UK’s guidance on risk: “Risk is inherent in the activities undertaken by engineering professionals, meaning that members of the profession have a significant role to play in managing and limiting it.” 

However, humans don’t necessarily behave rationally when making decisions under risk. In 1979, the psychologists Daniel Kahneman and Amos Tversky wrote the paper Prospect Theory: An Analysis of Decision under Risk. This work was cited in the decision to award Kahneman the 2002 Nobel Prize in Economics. 

Backed by controlled studies, Prospect Theory posits that humans feel the pain of losses far more keenly than the pleasure of equivalent gains. For example, experiments indicated that the pain of losing $1,000 could only be compensated by the pleasure of winning $2,000. 
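
To make the asymmetry concrete, here is a minimal sketch in Python, purely for illustration and not taken from the book. It uses the value function and the parameter estimates (diminishing sensitivity of about 0.88, loss aversion of about 2.25) reported in Tversky and Kahneman's later 1992 work; the dollar figures are just examples.

# Illustrative sketch of the prospect theory value function.
# The functional form and the parameters (ALPHA for diminishing sensitivity,
# LAMBDA for loss aversion) follow Tversky and Kahneman's 1992 estimates;
# the dollar amounts below are examples only.

ALPHA = 0.88   # larger gains and losses feel progressively less different
LAMBDA = 2.25  # a loss weighs roughly 2.25 times as much as an equal gain

def subjective_value(outcome: float) -> float:
    """Perceived value of a monetary gain (positive) or loss (negative)."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * ((-outcome) ** ALPHA)

print(round(subjective_value(2000)))   # winning $2,000 feels like roughly +803
print(round(subjective_value(-1000)))  # losing $1,000 feels like roughly -982

With these estimates, a $1,000 loss is barely offset even by a $2,000 gain, consistent with the roughly two-to-one asymmetry the experiments found.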

Over the past few months, I’ve studied a number of catastrophic software failures in detail, including those which have led to fatal car crashes, killer radiation overdoses in hospitals and miscarriages of justice. Specifically, I looked into how these disasters materialised into real-life harm and why they weren’t prevented earlier. 

Today, this work has been published in a new book, How to protect yourself from killer computers, making these investigations open to the public for the first time. 

Whilst writing this book, I’ve realised that Prospect Theory goes some way to helping us understand why things go wrong. Robert Prentice, a professor at the University of Texas at Austin, describes the relationship as follows: “Prospect theory describes how people tend to take much greater risks to avoid losing things compared to the risks they would’ve taken to gain those things in the first place. Sometimes, to avoid a loss, we consciously decide to lie. We cover up what might be a simple accidental mistake because we don’t wish to suffer the consequences of making that mistake.” 

This concept was expertly illustrated by Derren Brown in his Netflix special The Push, which uses an escalation of commitment to take a participant from a minor unethical act to committing what they believe to be murder. 

Another psychological factor is "normalcy bias", whereby people refuse to believe that a disaster is unfolding and act as though everything is normal, as happened during the Tenerife airport disaster in 1977. The "bystander effect", meanwhile, means people are less likely to help others when other people are present. 

This is compounded by the risk of retaliation. In November 2023, Computer Weekly covered research I had conducted with the research agency Survation, which found that 53% of software engineers had suspected wrongdoing in the workplace, and that 75% of those who reported wrongdoing faced retaliation after speaking up. Among those who did not speak up, the leading reasons given were fear of retaliation from management (59%), followed by fear of retaliation from colleagues (44%). 

The impact of Prospect Theory can also be seen in the general public’s attitude to software, even before ITV’s Mr Bates vs The Post Office drama took to TV screens in 2024. Between 29 September and 8 October 2023, I worked with Survation to ask a representative sample of British adults what mattered most to them in computer systems. Respondents were most likely to say that data security, data accuracy and avoiding serious bugs mattered to them "to a great extent" when using software systems, while getting the latest features quickly ranked as the least important of the 10 dimensions measured. 

In the new book, I explored cases whereby a single binary digit flipping (e.g. a 1 becoming a 0), caused by cosmic rays, would be enough to trigger a potentially fatal outcome in a computer system. It is therefore impossible to foresee every problem that could emerge. However, the solution lies in what my friend Piet van Dongen often speaks about at technology conferences: resilience engineering. 
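
To illustrate how dramatic such a bit flip can be, here is a short hypothetical Python sketch; it is not one of the cases from the book, and the value, units and bit chosen are arbitrary. Flipping a single exponent bit in the 64-bit representation of a stored reading is enough to transform it beyond recognition.

import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit in the IEEE 754 representation of a 64-bit float."""
    (as_int,) = struct.unpack("<Q", struct.pack("<d", value))   # reinterpret the float as raw bits
    as_int ^= 1 << bit                                          # flip the chosen bit
    (flipped,) = struct.unpack("<d", struct.pack("<Q", as_int)) # reinterpret the bits as a float
    return flipped

reading = 2.0                      # a hypothetical stored sensor reading
corrupted = flip_bit(reading, 61)  # one flipped exponent bit...
print(reading, "->", corrupted)    # ...turns 2.0 into roughly 2.7e154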

Instead of considering computer systems as purely technical systems, we can think of them as socio-technical systems, where both humans and technology play a role in safety. In an operating theatre, it isn’t just the technology that plays a safety role, but also the doctors, nurses and other healthcare staff. As one case study I investigated found, in the cockpit of a plane it is the fast-thinking intervention of the pilots that can save lives when computers go wrong. 

In other words, humans play a key role in preventing disasters. This is why it is so essential that software engineers and others working in technology feel psychologically safe to raise the alarm when things go wrong, and that we act when they do. By learning about our own cognitive biases, we can avoid becoming trapped in escalating unethical behaviour, or staying silent when it matters most. 

For software engineers and others working with IT, Prospect Theory teaches us that we may not be fully rational in balancing the competing forces of risk and reward in our professional decisions. We may well find it harder to walk away from a job that we would never have taken had we known the circumstances before accepting it. Loss aversion may place us in a position where we fear stopping and raising the alarm about wrongdoing, and instead carry on as normal, even if the long-term consequences of carrying on, for us and for others, may be far worse. 

By being aware of this bias, we can more objectively discharge our responsibilities as engineers to balance the competing forces of risk and reward. This is, ultimately, how we can help ensure that new technology serves the interests of humanity. 

Dr Junade Ali CEng FIET is an experienced technologist with an interest in software engineering management, computer security research and distributed systems. How to protect yourself from killer computers is now available on Amazon in e-book and paperback formats.
