Computers that understand emotions

December 27, 2010

Professor Peter Robinson and "Charles Babbage" (University of Cambridge)

University of Cambridge researchers are exploring the role of emotions in human-computer interaction.

“We’re building emotionally intelligent computers, ones that can read my mind and know how I feel,” Professor Peter Robinson says. “Computers are really good at understanding what someone is typing or even saying. But they need to understand not just what I’m saying, but how I’m saying it.”

The research team is collaborating closely with Professor Simon Baron-Cohen’s team in the University’s Autism Research Centre. Because those researchers study the difficulties that some people have in understanding emotions, their insights help to address the same problems in computers.

Facial expressions are an important way of understanding people’s feelings. One system tracks features on a person’s face, calculates the gestures that are being made and infers emotions from them. It gets the right answer over 70% of the time, which is as good as most human observers.
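The article does not describe the team’s actual algorithm, but the pipeline it outlines (track facial features, compute the gestures being made, infer an emotion) can be sketched in a few lines. The landmark indices, features, and centroid values below are invented placeholders for illustration, not the Cambridge system:

```python
# Minimal sketch of a face-tracking -> gesture -> emotion pipeline.
# All names and numbers are hypothetical illustrations.
import numpy as np

# Hypothetical landmark indices in a tracked (N x 2) face mesh;
# image coordinates, so y grows downward.
LEFT_BROW, RIGHT_BROW = 0, 1
LEFT_MOUTH, RIGHT_MOUTH = 2, 3
UPPER_LIP, LOWER_LIP = 4, 5

def gesture_features(landmarks: np.ndarray, neutral: np.ndarray) -> np.ndarray:
    """Turn raw landmark positions into gesture features by comparing
    against a neutral-face calibration frame."""
    delta = landmarks - neutral
    brow_raise = -(delta[LEFT_BROW, 1] + delta[RIGHT_BROW, 1]) / 2
    mouth_corner_lift = -(delta[LEFT_MOUTH, 1] + delta[RIGHT_MOUTH, 1]) / 2
    mouth_open = landmarks[LOWER_LIP, 1] - landmarks[UPPER_LIP, 1]
    return np.array([brow_raise, mouth_corner_lift, mouth_open])

# Toy nearest-centroid classifier: each emotion is a point in feature
# space (values made up for the example).
CENTROIDS = {
    "happy":     np.array([0.1, 0.8, 0.3]),
    "surprised": np.array([0.9, 0.1, 0.9]),
    "neutral":   np.array([0.0, 0.0, 0.1]),
}

def infer_emotion(landmarks: np.ndarray, neutral: np.ndarray) -> str:
    features = gesture_features(landmarks, neutral)
    return min(CENTROIDS, key=lambda e: np.linalg.norm(features - CENTROIDS[e]))
```

In a real system the classifier would be trained on labeled examples; the 70% figure quoted above refers to that kind of statistical inference, not to fixed hand-set values.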

Other systems infer emotions from the way that something is said by analyzing speech intonation, and from body posture and gestures.
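As a rough illustration of what intonation analysis involves, a sketch like the following summarizes a recording’s pitch and energy contours; the feature set and the crude autocorrelation pitch estimate are assumptions for the example, not the researchers’ method:

```python
# Illustrative prosody features for inferring emotion from speech.
import numpy as np

def prosody_features(signal: np.ndarray, sr: int, frame_len: int = 1024) -> dict:
    """Per-frame energy plus a crude autocorrelation pitch estimate;
    their means and ranges summarize the intonation contour."""
    pitches, energies = [], []
    for start in range(0, len(signal) - frame_len, frame_len):
        frame = signal[start:start + frame_len]
        energies.append(float(np.mean(frame ** 2)))
        ac = np.correlate(frame, frame, mode="full")[frame_len - 1:]
        lag = int(np.argmax(ac[32:])) + 32   # skip near-zero lags
        pitches.append(sr / lag)             # rough pitch in Hz
    p, e = np.array(pitches), np.array(energies)
    return {"pitch_mean": p.mean(), "pitch_range": np.ptp(p),
            "energy_mean": e.mean(), "energy_range": np.ptp(e)}
```

An unusually wide pitch range, for example, can signal excitement or surprise, which is the kind of cue such systems learn to associate with emotions.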

Ian Davies, one of the research students in Professor Robinson’s team, is looking at applications of these technologies in command and control systems. “Even in something as simple as a car, we need to know if the driver is concentrating or confused, so that we can avoid overloading him with distractions from a mobile phone, the radio, or a satellite navigation system.”
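A minimal sketch of the kind of rule Davies describes might gate non-critical alerts on an estimated driver state. The priority labels and thresholds here are invented for illustration, not taken from the project:

```python
# Hypothetical gating rule for in-car alerts based on driver state.

def should_deliver(alert_priority: str, concentration: float,
                   confusion: float) -> bool:
    """alert_priority: 'safety', 'navigation', or 'phone'.
    concentration and confusion are estimated levels in [0, 1],
    e.g. from facial-expression or gaze analysis."""
    if alert_priority == "safety":
        return True                        # never suppress safety warnings
    if concentration > 0.7 or confusion > 0.5:
        return False                       # defer phone, radio, sat-nav prompts
    return True
```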

Merely understanding emotions is not enough. Professor Robinson wants computers to express emotions as well, whether as cartoon animations or physical robots.

PhD student Tadas Baltrušaitis, another team member, works on animating figures to mimic a person’s facial expressions, while fellow PhD candidate Laurel Riek is experimenting with a robotic head modeled on Charles Babbage, which appears in the accompanying film. “Charles has two dozen motors controlling ‘muscles’ in his face, giving him a wide range of expressions,” Robinson explains. “We can use him to explore empathy, rapport building, and co-operation in emotional interactions between people and computers.”
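To picture how such a head might be driven, here is a hypothetical sketch in which an expression is simply a set of target positions for named facial “muscle” motors; the motor names, values, and send_command interface are all assumptions, not details of the Charles Babbage robot:

```python
# Sketch of driving a motorized face: an expression maps "muscle"
# motors to normalized target positions. Names and values invented.

EXPRESSIONS = {
    "smile":    {"mouth_corner_left": 0.8, "mouth_corner_right": 0.8,
                 "brow_inner": 0.1},
    "surprise": {"brow_inner": 0.9, "brow_outer": 0.9, "jaw": 0.6},
}

def set_expression(name: str, send_command) -> None:
    """Send each motor its target position in [0, 1] via
    send_command(motor_name, position)."""
    for motor, position in EXPRESSIONS.get(name, {}).items():
        send_command(motor, position)
```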

Adapted from materials provided by the University of Cambridge