Letting your computer know how you feel

Brunel University researchers aiming to improve the user experience are investigating computer systems that recognise and respond to emotion

Imagine if a computer could sense if a user was having trouble with an application and intuitively offer advice. The irritating paperclip that embodies Microsoft's Office Assistant could be a thing of the past. The software industry has tried to make applications more intelligent but many fall far short of being genuinely useful. However, this could be about to change.

Kate Hone, a lecturer in the department of information systems and computing at Brunel University, is the principal investigator in a project that aims to evaluate the potential for emotion-recognition technology to improve the quality of human-computer interaction. Her study is part of a larger area of computer science called affective computing, which examines how computing relates to, arises from and can influence human emotion. Hone described her research at Brunel as a human factors investigation. She said, "We are trying to build a system that recognises emotion to support human-computer interaction."

The project, called Eric (Emotional Recognition for Interaction with Computers), has three main goals. First, Hone is looking at the extent to which people will naturally express emotions when they know they are interacting with an emotion-detecting computer. Second, she wants to identify the conditions under which emotion detection can lead to improvements in system usability. And third, the research aims to provide "human factors" guidelines on the deployment of emotion-recognition technology that can help the developers of such systems to meet the needs of real users.

"If you take facial expressions, which are a universal means of communication, you can use algorithms to detect if the face is looking unhappy," Hone said. Because it recognises facial expressions, she said, Eric could be used in applications ranging from home entertainment to education.

For instance, she said in e-learning, the computer could detect how puzzled a pupil was. Potentially, the computer would be able to deal with user frustration by offering help when appropriate. The technology could be applied to computer games, interactive horror movies (are you scared enough?), or on a website, where the level of usability could be gauged.

A computer equipped with a video camera can capture the image of a human face. One of the problems the researchers at Brunel need to overcome is how to map facial muscles. According to Hone, a technique known as the Facial Action Coding System (FACS) is one of the most highly developed methods for coding facial expression. It is based on the analysis of small facial movements that are visible to human observers and discriminable from each other.
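To illustrate how FACS-style coding might feed an emotion classifier, the sketch below maps action-unit (AU) intensities to coarse emotion labels. The AU numbers follow the published FACS scheme (AU6 is the cheek raiser, AU12 the lip corner puller, and so on), but the intensity values, threshold and rule set are invented for illustration; this is not a description of Eric's actual method.

```python
# Hypothetical sketch: classifying a coarse emotion from FACS action-unit
# (AU) intensities in the range [0, 1]. The rules below are illustrative,
# not a real trained classifier.

def classify_expression(aus: dict[int, float], threshold: float = 0.5) -> str:
    """Return a coarse emotion label from AU intensities."""
    active = {au for au, strength in aus.items() if strength >= threshold}
    if {6, 12} <= active:        # AU6 cheek raiser + AU12 lip corner puller
        return "happy"
    if {1, 4, 15} <= active:     # inner brow raiser + brow lowerer + lip depressor
        return "sad"
    if {4, 7} <= active:         # AU4 brow lowerer + AU7 lid tightener
        return "frustrated"
    return "neutral"

print(classify_expression({6: 0.8, 12: 0.9}))  # -> happy
print(classify_expression({4: 0.7, 7: 0.6}))   # -> frustrated
```

A real system would estimate the AU intensities from video frames; the point here is only that FACS reduces an expression to a small set of measurable muscle movements that rules (or a learned model) can then interpret.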

"Many of the approaches used in speech recognition can be applied to recognising emotion through facial recognition," Hone said. For example, both are described as "natural" modes of communication, with the implication that what is natural in the human-human context should also improve human interactions with computers.

Both technologies also face similar challenges. For instance, just as the characteristics of speech vary from person to person, so do the characteristics of emotional expression. This means systems must either be trained to work for individual users (as in speaker-dependent speech recognition) or attempt to work for all users (speaker- or user-independent).
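The user-dependent approach mentioned above can be sketched as a simple calibration step: measure each user's neutral face first, then express later frames as deviations from that personal baseline. The feature names and values below are invented for illustration and do not come from the Eric project.

```python
# Hypothetical sketch of user-dependent calibration: build a per-user
# baseline from neutral frames, then measure expressions as deviations
# from it, so that individual differences in resting faces cancel out.

def calibrate(neutral_frames: list[dict[str, float]]) -> dict[str, float]:
    """Average feature values over neutral frames to get a per-user baseline."""
    keys = neutral_frames[0].keys()
    n = len(neutral_frames)
    return {k: sum(frame[k] for frame in neutral_frames) / n for k in keys}

def normalise(frame: dict[str, float], baseline: dict[str, float]) -> dict[str, float]:
    """Express a frame's features as deviations from the user's baseline."""
    return {k: frame[k] - baseline[k] for k in frame}

baseline = calibrate([{"mouth_curve": 0.1}, {"mouth_curve": 0.3}])
deviation = normalise({"mouth_curve": 0.7}, baseline)
print(round(deviation["mouth_curve"], 3))
```

A user-independent system would skip the calibration and rely on features that behave consistently across people, trading per-user accuracy for convenience, much as speaker-independent speech recognisers do.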

Similarly, while speech recognition faces the problem of detecting discrete words within the continuous stream of speech, emotion recognition technology will face the problem of detecting discrete emotional states within a constantly varying input signal. Early speech recognisers were "isolated-word" recognisers and relied upon users deliberately pausing between utterances.
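One simple (and purely illustrative) way to pull discrete states out of a constantly varying signal is hysteresis: accept a new per-frame label only after it has held steady for several consecutive frames, so momentary flickers are suppressed. The sketch below is an assumption about how such smoothing might look, not the project's algorithm.

```python
# Hypothetical sketch: collapsing a noisy per-frame label stream into a
# sequence of stable state changes. A label becomes the current state only
# once it has persisted for `min_run` consecutive frames.

def segment(labels: list[str], min_run: int = 3) -> list[str]:
    """Return the sequence of stable states detected in a label stream."""
    states: list[str] = []
    run_label, run_len = None, 0
    for label in labels:
        if label == run_label:
            run_len += 1
        else:
            run_label, run_len = label, 1
        if run_len == min_run and (not states or states[-1] != label):
            states.append(label)
    return states

stream = ["neutral"] * 4 + ["happy"] + ["neutral"] * 2 + ["happy"] * 5
print(segment(stream))  # -> ['neutral', 'happy']  (the single-frame blip is ignored)
```

This is directly analogous to the pauses that isolated-word recognisers relied on: both impose artificial boundaries on an input that is naturally continuous.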

During the summer, Brunel University will be running a simulation designed to evaluate how people respond to computers that recognise emotion. Hone said that while it is possible to recognise facial expressions when the subject is not moving, processing is not yet fast enough for real-time recognition of facial expressions.

What is affective computing?
Affective computing can be defined as "computing that relates to, arises from, or deliberately influences emotion". A number of different types of research are encompassed within this term. For instance, some artificial intelligence researchers in the field of affective computing are interested in how emotion contributes to human and, by analogy, computer problem solving or decision making; others are concerned with enabling human-human communication of emotion through the medium of computer networks. Underlying many of these different strands of research is work on understanding the nature of emotion and how it should be represented.


CV: Kate Hone
Kate Hone is a lecturer in the department of IS and computing at Brunel University. She has previously been a lecturer in the school of computer science and IT at the University of Nottingham. Her PhD was on the human factors of speech recognition systems. She also holds degrees in experimental psychology and work design and ergonomics. She has previously received EPSRC Fast Stream funding for an 11-month project investigating user interactions with speech recognition systems. She is currently supervising two PhD students and is involved in the Millennium Homes project.

http://www.brunel.ac.uk/~csstksh/eric.htm
