Researchers improve system reliability by reducing potential for human error

Interdisciplinary project aims for high dependability in computer systems.

Researchers at Newcastle University are leading a project that aims to make computers as reliable as possible by examining the way that people use them and reducing the margin for human error.

Researchers and the IT industry have made progress in achieving high dependability in computer hardware and software, but the ways computer systems interact with people and business or social organisations often fail disastrously, causing huge financial losses. The Dependability: an Interdisciplinary Research Collaboration (Dirc) project aims to understand the problems that arise in attempting to build systems involving complex interactions between computers and human beings.

Denis Besnard, research associate at Newcastle University's centre for software reliability, described the project as looking at the design and use of large, distributed computer systems.

He said, "Hardware is becoming more and more reliable, software developers are doing more and more work on reliability, but we have to make the relationship between humans and machines as natural as we can."

Besnard, whose PhD in psychology focused on human error, believes that the Dirc project will have a knock-on effect of making systems more robust. He said, "My background is in ergonomics and the interaction between humans and machines. I feel that making the dialogue between humans and machines more reliable will also make the systems more reliable."

When he was working on his PhD at the University of Aix-en-Provence in southern France, Besnard had a strong interest in studying how humans interact with one another. Now, however, he has responsibility for examining how humans interact with machines in "large socio-technical systems".

Besnard's work with the Dirc project covers three main areas:

Design resilience. Sub-system failures can be frequent, and coping with failure can require effort. Failure frequency data helps to direct design and deployment processes.

Evolution. The requirements on the system, context of use and common practice can change significantly over time. Mechanisms are needed to detect change and realign socio-technical systems to changed circumstances.

Configuration. During and after deployment, designers need to understand how well the system is suited to use, and users need to understand aspects of the design in order to configure the system - this is the way a system is "tuned" to meet its dependability requirements.

Business applications

Hacking is one example of where this research can be applied in a business context. Computer systems in the financial sector, for example, can be made more secure by analysing the habits of hackers, according to Besnard. He said, "Hackers are not always experts - for example, they sometimes re-use the same tools to hack into systems. If you discover the strategies that they use, you can make your systems more secure."

Besnard's work has led him to conclude that even expert hackers sometimes re-use the same methods of breaking into systems, although he acknowledged that it is a tougher task to catch them. He said, "They are often more sophisticated, so there is a lot of work to do in that area."

The first three years of the Dirc initiative have focused primarily on the academic side of the project. So far, researchers have produced 250 academic publications, and four conferences have been organised around the project.

Now, however, as the project enters its second stage, researchers will be looking to work much more closely with industry. Specifically, this is likely to include organisations in the aviation, electronics, healthcare and financial sectors.

Besnard said the project would now focus on uses that were specific to those industries. For example, the research could look at how a bank could make electronic transactions more secure.

The Dirc project's findings are likely to feed back into how computer systems are designed in the future. Besnard said, "There are still areas where the relationship between system designers and end-users could be improved."

He said users want reliability, ease of use and security from their systems. Besnard used the example of next-generation cockpit systems in planes, which have been made safer by taking into account how pilots interact with systems such as autopilot.

"The next generation of cockpit systems will have to anticipate the likely areas for human error, such as 'mode confusion', which is when pilots key information into systems which are in the wrong mode."
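Mode confusion of the kind Besnard describes can be illustrated with a toy sketch (a hypothetical example for illustration only, not code or data from the Dirc project): the same numeric entry means something quite different depending on which mode the system is in, so a pilot who believes the autopilot is in one mode can unknowingly command something very different.

```python
# Toy illustration of "mode confusion" (hypothetical, not Dirc code):
# the same keypad entry "33" is interpreted differently depending on
# the active descent mode of a simplified autopilot.

def interpret_entry(entry: str, mode: str) -> str:
    """Interpret a pilot's numeric entry according to the current mode."""
    value = int(entry)
    if mode == "FPA":   # flight-path-angle mode: entry is tenths of a degree
        return f"descend at {value / 10:.1f} degrees"
    if mode == "VS":    # vertical-speed mode: entry is hundreds of ft/min
        return f"descend at {value * 100} ft/min"
    raise ValueError(f"unknown mode: {mode}")

# The pilot intends a gentle 3.3-degree descent...
print(interpret_entry("33", "FPA"))  # descend at 3.3 degrees
# ...but if the system is actually in vertical-speed mode, the same
# keystrokes command a steep 3,300 ft/min descent.
print(interpret_entry("33", "VS"))   # descend at 3300 ft/min
```

A system designed with the pilot in mind would make the active mode unambiguous at the point of entry, which is exactly the kind of human-machine mismatch the researchers are studying.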

In the broader context of IT, the results of the Dirc project will mean that computer systems more closely reflect users' needs and practices. Besnard said, "One of the possible outcomes of this is more secure computing for users when they browse the internet, send e-mails or purchase goods online."

The £8m, six-year Dirc initiative began in 2000. It is supported by the Engineering and Physical Sciences Research Council and government body British Trade International. Encompassing computing science, psychology, sociology and statistics, the project involves 50 researchers from five UK universities. In addition to Newcastle, Edinburgh, York, Lancaster and City University in London are also involved.
