Emotion Artificial Intelligence Will Personalize Interactions

By Annette Zimmermann

Have you ever wondered whether one day technology will be able to identify emotions, just as humans can? What if your smart fridge could tell how you feel and suggest foods that match your emotion?

Unrealistic? Inconceivable? No. Artificial intelligence (AI) and affective computing are starting to make this possible. Devices enriched with AI, depth-sensing and natural-language processing technologies are beginning to process, analyze and respond to human emotions.

In the future, more and more smart devices will be able to capture human emotions and moods in relation to certain data and facts, and to analyze situations accordingly.

An Emotion-Sensing Future Approaches

Emotion-sensing systems will appear in devices as a result of the rise of intelligent agents, such as virtual assistants. Current examples of intelligent agents include Apple’s Siri, Microsoft’s Cortana and Google Assistant. They use the technological approaches of natural-language processing and natural-language understanding, but they don’t currently perceive human emotions. Artificial emotional intelligence (“emotion AI”) will change that. The next steps for these systems are to understand and respond to users’ emotional states, and to appear more human-like, in order to enable more comfortable and natural interaction with users.

Annette Zimmermann. Image Credit: Gartner

An intelligent agent can be anything that can perceive its environment through sensors and act on that perception through actuators. Personal assistance robots (PARs), such as Qihan Technology’s Sanbot and SoftBank Robotics’ Pepper, are being “humanized” by training them to distinguish between, and react to, humans’ varying emotional states. The aim is for PARs to respond with body language and verbal responses appropriate to the emotions of the humans they interact with. If, for example, Pepper detects that someone is disappointed with an interaction, the intention is that it will respond apologetically.
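
In software terms, that loop is simple to state. The Python sketch below shows the perceive-then-act cycle, assuming a hypothetical classify_emotion() perception step and canned replies; it is an illustration of the idea, not Pepper's or Sanbot's actual programming interface.

def classify_emotion(sensor_frame: bytes) -> str:
    """Stand-in for a trained emotion classifier over camera/microphone data."""
    return "disappointed"  # stubbed perception result, for illustration only

RESPONSES = {
    "disappointed": "I'm sorry that didn't go as hoped. Let me try again.",
    "happy": "Great, glad that worked!",
    "neutral": "How else can I help?",
}

def respond(sensor_frame: bytes) -> str:
    """Perceive the user's emotional state, then act on that perception."""
    emotion = classify_emotion(sensor_frame)              # sense
    return RESPONSES.get(emotion, RESPONSES["neutral"])   # act

print(respond(b"frame"))  # -> the apologetic reply, as in the Pepper example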

Emotion AI Is Already Here

Future smart devices will be better at analyzing and responding to users’ emotions, thanks to AI systems that use deep-learning technology to measure the facial and verbal expression of emotion. These systems will play an increasingly important role in how humans interact with machines.
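
To make this concrete, here is a minimal Python sketch of how facial and vocal emotion estimates might be fused into a single reading. The per-channel probability scores and the 60/40 weighting are illustrative assumptions, not the output of any particular deep-learning system.

def fuse_emotion_scores(face_probs: dict, voice_probs: dict,
                        face_weight: float = 0.6) -> str:
    """Weighted average of two per-emotion probability distributions."""
    emotions = face_probs.keys() & voice_probs.keys()
    combined = {
        e: face_weight * face_probs[e] + (1 - face_weight) * voice_probs[e]
        for e in emotions
    }
    return max(combined, key=combined.get)  # most likely emotion overall

face = {"anger": 0.1, "joy": 0.7, "sadness": 0.2}    # hypothetical model output
voice = {"anger": 0.2, "joy": 0.5, "sadness": 0.3}   # hypothetical model output
print(fuse_emotion_scores(face, voice))  # -> "joy"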

The first steps are already being taken. The video game “Nevermind,” for example, uses “emotion-based biofeedback” technology from Affectiva to detect the player’s mood and adjust its levels and difficulty accordingly. Picture Oliver playing “Nevermind” on his console. He has been playing for 20 minutes, and the further he gets into the game, the darker the mood and the more difficult the logic puzzles become. The thriller senses Oliver’s anxiety, as well as the moments when he relaxes, and adjusts its levels based on his mood.
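
A toy version of that biofeedback loop can be written in a few lines of Python. The 0-to-1 anxiety score, target level and step size below are invented for illustration; Affectiva's actual SDK is not shown.

def adjust_difficulty(level: float, anxiety: float,
                      target: float = 0.5, step: float = 0.1) -> float:
    """Ease off when the player is too anxious; ramp up when they relax."""
    if anxiety > target:
        level -= step   # player is stressed: make the puzzles easier
    else:
        level += step   # player is relaxed: darken the mood, raise difficulty
    return min(max(level, 0.0), 1.0)

level = 0.4
for anxiety in (0.3, 0.4, 0.7, 0.8):   # simulated readings over a session
    level = adjust_difficulty(level, anxiety)
print(round(level, 2))  # -> 0.4: the game has settled near the player's comfort zone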

In another field, in-car systems are emerging that adapt the responsiveness of braking systems to the driver’s perceived level of anxiety. Picture Jeanne, who is having a stressful morning: she has had to drive the kids to school after they missed the bus, and she is now on her way to the doctor because Clara, her newborn daughter, is unwell. She is short-tempered and agitated at the wheel. The car detects her anxious mood and, as she approaches a busy crossroads, makes the brakes more responsive to avoid an abrupt stop.
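
The control logic implied here is a mapping from a detected anxiety level to a brake-responsiveness gain. The Python sketch below assumes a clamped linear interpolation with invented gain values; it does not describe any real braking system.

def brake_gain(anxiety: float, base_gain: float = 1.0,
               max_gain: float = 1.5) -> float:
    """Interpolate brake responsiveness between base_gain and max_gain."""
    anxiety = min(max(anxiety, 0.0), 1.0)  # clamp the sensor estimate to [0, 1]
    return base_gain + anxiety * (max_gain - base_gain)

print(brake_gain(0.0))  # calm driver     -> 1.0
print(brake_gain(0.8))  # agitated driver -> 1.4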

In both cases, the video game and the car are equipped with visual sensors and AI-based software that track the user’s emotions in real time.

The Healthcare and Automotive Industries Are Driving Adoption of Emotion AI

Organizations in the automotive and healthcare industries are prominent among those evaluating whether, and how far, to adopt emotion-sensing features.

As the example above shows, car manufacturers are exploring in-car emotion-detection systems. These systems will detect the driver’s moods and be aware of their emotions, which, in turn, could improve road safety by helping to manage the driver’s anger, frustration, drowsiness and anxiety.

In the healthcare arena, emotion-sensing wearables could monitor the mental health of patients 24/7 and alert doctors and caregivers instantly, if necessary. They could also watch over isolated elderly people and children. Such devices would allow doctors and caregivers to track patterns in a patient’s mental health over time, and to decide when and how to communicate with the people in their care.
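
One plausible shape for such a wearable's alerting logic is a rolling average compared against a threshold, as in the Python sketch below; the stress scores, window size and notify() hook are all illustrative assumptions, not a medical-device design.

from collections import deque

class StressMonitor:
    """Alert a caregiver when the rolling average stress level stays high."""

    def __init__(self, threshold: float = 0.7, window: int = 5):
        self.threshold = threshold
        self.readings = deque(maxlen=window)

    def record(self, stress: float) -> bool:
        self.readings.append(stress)
        average = sum(self.readings) / len(self.readings)
        if len(self.readings) == self.readings.maxlen and average > self.threshold:
            self.notify(average)  # sustained elevation, not a single spike
            return True
        return False

    def notify(self, average: float) -> None:
        print(f"Alert caregiver: sustained stress level {average:.2f}")

monitor = StressMonitor()
for reading in (0.8, 0.75, 0.9, 0.85, 0.8):  # simulated sensor stream
    monitor.record(reading)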

Current platforms for detecting and responding to emotions are mainly proprietary and tailored to a few isolated use cases. The technology has also been used by many global brands in recent years for product and brand-perception studies. We can expect technology and media giants to team up and enhance their capabilities in the next two years, and to offer tools that will change lives for the better.

The author is a Research Vice President at Gartner.

