Over the past few weeks, software issues have been thrust to the forefront of public consciousness. In March, for example, Greggs, Sainsbury’s, Tesco and McDonald’s were all unable to accept payments because of software problems.
The balance between risk and reward is a key aspect of engineering, including software engineering. As the Engineering Council of the United Kingdom puts it in its guidance on risk: “Risk is inherent in the activities undertaken by engineering professionals, which means that members of the profession have a significant role to play in managing and limiting risk.”
However, people do not necessarily behave rationally when making decisions under risk. In 1979, the psychologists Daniel Kahneman and Amos Tversky published the paper Prospect theory: An analysis of decision under risk, which was cited in the decision to award Kahneman the 2002 Nobel Prize in Economics.
Backed by controlled studies, the theory holds that people feel the pain of losses far more keenly than the pleasure of equivalent gains. For example, experiments have shown that the pain of losing $1,000 can only be offset by the satisfaction of winning $2,000.
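To make that asymmetry concrete, here is a minimal sketch of the value function Tversky and Kahneman later estimated, with a curvature of 0.88 and a loss-aversion coefficient of about 2.25; the function name and the printed figures are purely illustrative, not results from the experiments cited above.

```python
# A minimal sketch of the prospect theory value function, using the parameter
# estimates from Tversky and Kahneman's 1992 follow-up work (alpha = beta = 0.88,
# lambda = 2.25). Illustrative only: the $1,000/$2,000 figure above is a rule of
# thumb, not an output of this exact model.

def subjective_value(outcome: float,
                     alpha: float = 0.88,
                     beta: float = 0.88,
                     loss_aversion: float = 2.25) -> float:
    """Return the felt value of a monetary gain (outcome >= 0) or loss (outcome < 0)."""
    if outcome >= 0:
        return outcome ** alpha
    return -loss_aversion * ((-outcome) ** beta)


if __name__ == "__main__":
    print(subjective_value(1000))   # pleasure of winning $1,000   (~436)
    print(subjective_value(-1000))  # pain of losing $1,000        (~-982, over twice as intense)
    print(subjective_value(2000))   # a $2,000 win (~803) still does not quite offset that pain
```

Under these particular parameters, a win of roughly $2,500 would be needed to fully offset the pain of a $1,000 loss, which only reinforces the point: losses loom far larger than gains.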
Over the past few months, I’ve looked in depth at a number of catastrophic software failures, including those that have led to fatal car accidents, fatal radiation overdoses in hospitals, and miscarriages of justice. In particular, I investigated how these disasters materialized in real life and why they were not prevented earlier.
As I was writing my book, I realized that prospect theory goes a long way towards explaining why things go wrong. Robert Prentice, a professor at the University of Texas at Austin, describes the relationship this way: “Prospect theory describes how people tend to take much greater risks to avoid losing things than the risks they would take to gain those things in the first place. Sometimes, in order to avoid a loss, we consciously choose to lie. We cover up what could have been a simple accidental mistake because we don’t want to suffer the consequences of that mistake.”
This concept was expertly illustrated by Derren Brown in his Netflix special The Push, which uses escalation of commitment to move a participant from a minor unethical act to what they believe is committing murder.
Another psychological factor is the “normalcy bias”, where people refuse to believe that a disaster is underway and act as if everything is normal, as happened in the 1977 Tenerife airport disaster. The “bystander effect”, meanwhile, shows that people are less likely to help others when other people are present.
Added to that is the risk of retaliation. In November 2023, Computer Weekly covered research I conducted with Survation, which found that 53% of software engineers had suspected workplace misconduct, and that 75% of those who reported misconduct faced retaliation after speaking up. The leading reason given by those who did not speak up was fear of reprisal from management (59%), followed by fear of reprisal from colleagues (44%).
The influence of prospect theory can also be seen in public attitudes towards software, even before ITV’s drama Mr Bates vs The Post Office arrived on TV screens in 2024. Between 29 September and 8 October 2023, working with Survation, I asked a representative sample of British adults what they thought was most important in computer systems. The public was most likely to say that data security, data accuracy and avoiding serious errors mattered to them “to a great extent” when using software systems, with getting the latest features quickly being the least important of the 10 dimensions measured.
In the book, I explore cases where a single bit flip (a 1 becoming a 0, for example), caused by cosmic rays in the atmosphere, was enough to produce a potentially fatal outcome in a computer system. Events like these show that it is impossible to predict every problem that might arise. A way forward, however, lies in something my friend Piet van Dongen often talks about at technology conferences: resilience engineering.
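To make that failure mode concrete, here is a small, hypothetical sketch of such an upset: a single bit in a stored value is inverted and a small number becomes a very different one. The flip_bit helper and the commanded_throttle variable are invented for illustration and are not taken from any of the case studies in the book.

```python
# A hypothetical illustration of a single bit flip (a "single-event upset"),
# such as one induced by cosmic radiation. Not drawn from any real case study.

def flip_bit(value: int, bit: int) -> int:
    """Return `value` with the given bit inverted (0 -> 1 or 1 -> 0)."""
    return value ^ (1 << bit)


commanded_throttle = 12                        # intended value: 0b00001100
corrupted = flip_bit(commanded_throttle, 7)    # bit 7 flips: the value becomes 140

print(f"intended:  {commanded_throttle:3d} ({commanded_throttle:#010b})")
print(f"corrupted: {corrupted:3d} ({corrupted:#010b})")
```

Defences such as error-correcting memory and range checks on inputs are the kinds of technical mitigations commonly deployed against upsets like this, working alongside the human safeguards described below.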
Rather than thinking of computer systems as purely technical systems, we can think of them as socio-technical systems, in which both people and technology play a role in keeping them safe. In the operating room, it is not only the technology that plays a safety role, but also the doctors, nurses and other healthcare staff. As one case study I researched revealed, in the cockpit of an airplane, the quick-thinking intervention of a pilot can save lives when the computers go wrong.
In other words, people play a key role in preventing disasters, and that’s why it’s so important that software engineers and others who work in technology feel psychologically safe to raise the alarm when things go wrong and that we act when they do. By learning about our own cognitive biases, we can ensure that we don’t find ourselves trapped in escalating unethical behavior or not speaking up when it’s most important to do so.
Prospect theory teaches those of us who are software engineers, or who otherwise work with IT, that we may not be entirely rational when balancing the competing forces of risk and reward in our professional decisions. It may be harder for us to walk away from a job that we would never have accepted had we known the circumstances beforehand. Loss aversion can leave us afraid to stop and raise the alarm about wrongdoing, preferring instead to carry on as normal, even when the long-term consequences for ourselves and others are far worse if we continue.
By being aware of this bias, we can carry out our responsibilities as engineers more objectively, balancing the competing forces of risk and reward. That, ultimately, is how we can ensure that new technology serves the interests of humanity.