
Voice assistance technology gets emotional

Many people who use Alexa get frustrated by her lack of understanding. But by 2022, voice technology will have evolved to understand emotion

Speaker-based virtual personal assistants (VPAs) are set to evolve to understand human emotions, according to analyst Gartner.

The company predicted that by 2022, personal devices will know more about an individual’s emotional state than his or her own family. “AI is generating multiple disruptive forces that are reshaping the way we interact with personal technologies,” it said.

“Emotion AI systems and affective computing are allowing everyday objects to detect, analyse, process and respond to people’s emotional states and moods to provide better context and a more personalised experience,” said Roberta Cozza, research director at Gartner.
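To make the idea concrete, the detect-analyse-respond loop Cozza describes can be sketched as a toy program. This is purely illustrative and not any vendor's actual emotion AI: real systems model acoustic, linguistic and contextual signals, whereas the keyword matching, emotion categories and canned replies below are all invented for the example.

```python
import re

# Hypothetical keyword lists standing in for a real emotion model.
EMOTION_KEYWORDS = {
    "frustrated": {"annoying", "useless", "again", "wrong"},
    "happy": {"great", "thanks", "awesome", "love"},
    "sad": {"tired", "lonely", "miss", "sorry"},
}

def detect_mood(utterance: str) -> str:
    """Return the emotion whose keywords best match the utterance."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    scores = {emotion: len(words & keys)
              for emotion, keys in EMOTION_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

def respond(utterance: str) -> str:
    """Adapt a canned reply to the detected mood, as an
    emotion-aware assistant might personalise its answer."""
    replies = {
        "frustrated": "Sorry about that. Let me try a different answer.",
        "happy": "Glad to help!",
        "sad": "I'm here if you need anything else.",
        "neutral": "Here is what I found.",
    }
    return replies[detect_mood(utterance)]
```

For example, `respond("this is useless, wrong again")` picks the apologetic reply because the utterance scores highest for frustration, while a neutral query falls through to the default answer.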

But the technology is still being developed and existing technology has little understanding of users and how they feel. “We are still at the early stage of what will be an emotional system,” said Cozza.

However, commercial prototypes are now becoming available, she said. For instance, New York-based startup Emoshape has developed its own CPU optimised to handle emotional data. The technology has the potential to change computer games, virtual reality and augmented reality applications, she said.

According to Cozza, some users are starting to see value in voice technology. Compared with Amazon's rival Alexa, Google Assistant "offers better query relevancy, but it needs to evolve to support emotion", she said.

This is an area where all the major players, including Google, Amazon, Microsoft and Apple, are investing to respond to future competition, said Cozza. This is because current technology lacks context about the user. "The next level of VPA will be to give more personalised answers or be more personal," she added.

This area of product development is being driven by startups and more specialised service providers, such as in healthcare, said Cozza.

The current wave of emotion AI systems is being driven by the proliferation of VPAs and other AI-based technology for conversational systems. As a second wave emerges, AI technology will add value to more and more customer experience scenarios, including educational software, video games, diagnostic software, athletic and health performance, and autonomous cars, according to Gartner.

In the near term, Amazon appears to be extending Alexa from a home-based device for consumers to the enterprise. At the end of last year, CTO Werner Vogels announced Alexa for Business, saying the technology has the potential to make it easy to manage the office environment.

Cozza said this means VPAs could find their way into meeting rooms, integrating with software such as Outlook and PowerPoint, introducing speakers and working in the background to facilitate meetings.
