Artificial intelligence is already making our devices more personal, from simplifying daily tasks to increasing productivity. Emotion AI (also called affective computing) will take this to new heights by helping our devices understand our moods. That means we can expect smart refrigerators that interpret how we feel (based on what we say and how we slam the door) and then suggest foods to match those feelings. Our cars could even know when we’re angry, based on our driving habits.
Humans use non-verbal cues, such as facial expressions, gestures, and tone of voice, to communicate a range of feelings. Emotion AI goes beyond natural language processing by using computer vision and voice analysis to detect those moods and emotions. Voice of the customer (VoC) programs will leverage emotion AI technology to perform granular and individual sentiment analysis at scale. The result: Our devices will be in tune with us.
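As an illustration of the computer-vision half of that pipeline, the sketch below detects a face with OpenCV's bundled Haar cascade and hands the cropped region to an expression classifier. The `classify_expression` function is a hypothetical placeholder standing in for a trained model, not any vendor's actual system.

```python
# A minimal sketch of vision-based emotion detection: find a face, classify
# its expression. The classifier below is a stand-in, not a real model.
import cv2

# Load OpenCV's bundled frontal-face Haar cascade.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_pixels) -> str:
    # Hypothetical placeholder: a real system would run a trained
    # facial-expression model here and return a label such as
    # "happy", "angry", or "neutral".
    return "neutral"

frame = cv2.imread("frame.jpg")  # one frame grabbed from a camera feed
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
    mood = classify_expression(frame[y:y + h, x:x + w])
    print(f"face at ({x}, {y}): estimated mood is {mood}")
```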
Conversational services
Digital giants — including Google, Amazon, Apple, Facebook, Microsoft, Baidu, and Tencent — have been investing in AI techniques that enhance their platforms and ecosystems. We are still at “Level 1” when it comes to conversational services such as Apple’s Siri, Microsoft’s Cortana, and Google Assistant. However, the market is set to reach new levels in the next one to two years.
Nearly 40 percent of smartphone users employ conversational systems on a daily basis, according to a 2017 Gartner survey of online adults in the United States. These services will not only become more intelligent and sophisticated at processing verbal commands and questions but will also grow to understand users’ emotional states and contexts.
Today, there are a handful of smartphone apps and connected home devices that can capture a user’s emotions. Additional prototypes and commercial products exist as well, for example Emoshape’s connected home hub, Beyond Verbal’s voice recognition app, and the connected home VPA Hubble. Large technology vendors such as IBM, Google, and Microsoft are investing in this emerging area, as are ambitious startups.
At this stage, one of the most significant shortcomings of such systems is a lack of contextual information. Adding emotional context by analyzing data points from facial expressions, voice intonation, and behavioral patterns will significantly enhance the user experience.
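One plausible way to combine those signals is late fusion: each modality produces a probability distribution over emotions, and a weighted average yields the final estimate. In the sketch below, all scores and weights are invented for illustration; a production system would learn them from data.

```python
# A hedged sketch of "emotional context" via late fusion of per-modality
# emotion estimates. Scores and weights are made up for illustration.
EMOTIONS = ("happy", "neutral", "angry")

def fuse(modalities: dict[str, dict[str, float]],
         weights: dict[str, float]) -> dict[str, float]:
    """Weighted average of per-modality probability distributions."""
    fused = {e: 0.0 for e in EMOTIONS}
    total = sum(weights[m] for m in modalities)
    for name, dist in modalities.items():
        for e in EMOTIONS:
            fused[e] += weights[name] * dist[e] / total
    return fused

# Hypothetical outputs from three analyzers: face, voice, behavior.
scores = fuse(
    {
        "face":     {"happy": 0.1, "neutral": 0.3, "angry": 0.6},
        "voice":    {"happy": 0.2, "neutral": 0.2, "angry": 0.6},
        "behavior": {"happy": 0.3, "neutral": 0.5, "angry": 0.2},
    },
    weights={"face": 0.5, "voice": 0.3, "behavior": 0.2},
)
print(max(scores, key=scores.get))  # -> "angry"
```

Weighting the face modality most heavily here reflects a common design choice, since facial expression is often the strongest single cue, but the right weights depend on the device and setting.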
Wearables and connected cars
In the second wave of development for emotion AI, we will see value brought to many more areas, including educational software, video games, diagnostic software, athletic and health performance, and autonomous cars. Developments are underway in all of these fields, but 2018 will see many products come to market, along with a growing number of new projects.
Beyond smartphones and connected-home devices, wearables and the connected car will collect, analyze, and process users’ emotional data via computer vision, audio, or sensors. The captured behavioral data will allow these devices to adapt or respond to a user’s needs.
Technology vendors, including Affectiva, Eyeris, and Audeering, are working with automotive OEMs to develop new in-car experiences that monitor driver behavior in order to offer assistance, encourage safe driving, and enhance the ride.
There is also an opportunity for more specialized devices, such as medical wristbands that can anticipate a seizure a few minutes before the actual event, facilitating early response. Special apps developed for diagnostics and therapy may be able to recognize conditions such as depression or help children with autism.
Another important area is the development of anthropomorphic qualities in AI systems — such as personal assistant robots (PARs) that can adapt to different emotional contexts or individuals. A PAR will develop a “personality” as it has more interactions with a specific person, allowing it to better meet the user’s needs. Vendors such as IBM, as well as startups like Emoshape, are developing techniques to lend such anthropomorphic qualities to robotic systems.
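As a toy illustration of that kind of adaptation, the sketch below keeps an exponentially decaying tally of the emotions a PAR observes across interactions, so its dominant impression of a user shifts gradually over time. The class and smoothing factor are assumptions made for the example, not any vendor's method.

```python
# A toy model of a PAR's accumulating "personality" profile: an exponential
# moving average over the emotions observed in each interaction.
from collections import defaultdict

class AffectProfile:
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha                # how fast new interactions shift the profile
        self.scores = defaultdict(float)  # emotion -> long-run weight

    def observe(self, emotion: str) -> None:
        """Decay all weights, then reinforce the emotion just observed."""
        for e in self.scores:
            self.scores[e] *= (1 - self.alpha)
        self.scores[emotion] += self.alpha

    def dominant(self) -> str:
        return max(self.scores, key=self.scores.get)

profile = AffectProfile()
for mood in ["neutral", "happy", "happy", "angry", "happy"]:
    profile.observe(mood)
print(profile.dominant())  # -> "happy"; the PAR would adapt its tone accordingly
```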
VoC will help brands understand their consumers
Beyond enhancing robotics and personal devices, emotion AI can be applied in customer experience initiatives, such as VoC programs. A fleet of vendors already offers sentiment analysis by mining billions of data points on social media platforms and user forums. Some of these programs are limited to distinguishing between positive and negative sentiment, while others are more advanced and can attribute nuanced emotional states, though so far only in the aggregate.
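For a concrete, if simplified, picture of aggregate-level analysis, the sketch below runs a batch of invented customer comments through Hugging Face's off-the-shelf sentiment pipeline, standing in for a vendor's analytics stack, and tallies positive versus negative labels.

```python
# A minimal sketch of aggregate VoC sentiment analysis. The comments are
# invented; a real program would mine social platforms and forums at scale.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # default English sentiment model

comments = [
    "The new fridge suggested comfort food after a rough day. Love it.",
    "The voice assistant keeps misunderstanding me.",
    "Setup was painless and the app feels polished.",
]

# Aggregate tally: the positive/negative level most tools operate at today.
tally = {"POSITIVE": 0, "NEGATIVE": 0}
for result in classifier(comments):
    tally[result["label"]] += 1
print(tally)
```

Moving from this aggregate tally to the granular, individual analysis described earlier would mean keying results to specific customers and swapping in a finer-grained emotion model.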
We are still at an early stage when it comes to enhancing VoC programs with emotion AI. Technology providers will have to take a consultative approach with their clients, most of whom will be new to the concept of emotion AI. While there are only a few isolated use cases for emotion AI at the moment, we can expect it to eventually offer tools that transform virtually every aspect of our daily lives.
Annette Zimmermann is the research vice president at Gartner, a research and advisory company.