Over the past two weeks, emotions have run high around the evolution and use of emotion artificial intelligence (AI), which includes technologies such as voice-based emotion analysis and computer vision-based facial expression detection.
Video conferencing platform Zoom came under fire after saying it might soon include emotion AI features in its sales-targeted products. The nonprofit advocacy group Fight for the Future published an open letter to the company, calling Zoom’s possible offering a “major breach of user trust,” “inherently biased” and “a marketing gimmick.”
Meanwhile, Intel and Classroom Technologies are working on tools that use AI to detect the mood of children in virtual classrooms. This has led to media coverage with unfortunate headlines such as “Emotion-Tracking Software Could Ding Your Kid for Looking Bored in Math.”
Finally, Uniphore, a conversational AI company with headquarters in Palo Alto, California, and India, is enjoying unicorn status after announcing $400 million in new funding and a $2.5 billion valuation in February. In January 2021, the company acquired Emotion Research Lab, which uses “advanced facial emotion recognition and eye-tracking technology to capture and analyze interactions over video in real-time to enhance engagement between people.”
Last month, Uniphore introduced its Q for Sales solution, which “leverages computer vision, tonal analysis, automatic speech recognition and natural language processing to capture and make recommendations on the full emotional spectrum of sales conversations to boost close rates and performance of sales teams.”
But Timnit Gebru, the computer scientist famously fired from Google who founded an independent AI ethics research institute in December 2021, was critical of Uniphore’s claims on Twitter. “The trend of embedding pseudoscience into ‘AI systems’ is such a big one,” she said.
What does this kind of pushback mean for the enterprise? How can organizations calculate the risks and rewards of investing in emotion AI? Experts maintain that the technology can be useful in specific use cases, particularly when it comes to helping customers and supporting salespeople.
Commitment to transparency is key
But, they add, an emotion AI investment requires a commitment to transparency. Organizations also need a full understanding of what the tools can and can’t do, as well as careful consideration of potential bias, data privacy and ROI.
Today’s evolving emotion AI technologies “may feel a little bit more invasive,” admitted Annette Zimmerman, a vice president and analyst at Gartner who specializes in emotion AI. “For the enterprise, I think transparency needs to be the top priority.” In December 2021, Zimmerman published a Gartner Competitive Landscape report for the emotion AI space. She pointed out that since the pandemic, organizations are “seeking to add more empathy in customer experiences.”
However, organizations also need to be sure the technology works and that the system is trained in a way that introduces no bias, she told VentureBeat. “For example, computer vision is very good at detecting obvious emotions like happiness and deep frustration,” she explained. “But for more subtle things like irony, or slightly annoyed versus very angry, the model needs to be trained, particularly on geographic and ethnic differences.”
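One practical safeguard implied by Zimmerman’s point is disaggregated evaluation: measuring a classifier’s accuracy per demographic group rather than only in aggregate, so gaps surface before deployment. Here is a minimal sketch, with toy data standing in for a real, demographically labeled evaluation set:

```python
# Illustrative bias check: compare an emotion classifier's accuracy across
# demographic groups. The groups, labels and predictions are toy values;
# a real audit would use a held-out, demographically labeled dataset.
from collections import defaultdict

samples = [
    # (group, true_label, predicted_label)
    ("group_a", "happy", "happy"),
    ("group_a", "angry", "angry"),
    ("group_a", "irony", "happy"),   # subtle emotion misread
    ("group_b", "happy", "happy"),
    ("group_b", "angry", "neutral"), # misread for this group
    ("group_b", "irony", "neutral"),
]

hits, totals = defaultdict(int), defaultdict(int)
for group, truth, pred in samples:
    totals[group] += 1
    hits[group] += int(truth == pred)

for group in totals:
    acc = hits[group] / totals[group]
    print(f"{group}: accuracy {acc:.0%}")  # large gaps flag potential bias
```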
Emotion AI could become key differentiator
Zimmerman, who highlighted Uniphore in her competitive landscape report, wrote that combining computer vision and voice-based emotion analytics “could become a key differentiator for the company.”
In an emailed comment to VentureBeat, Patrick Ehlen, vice president of AI at Uniphore, said, “it’s important to call out that meeting recordings and conversational intelligence applications have become mainstream in today’s business world.” The company’s intent with Q for Sales, he continued, “is to make virtual meetings more engaging, balanced, interactive and valuable for all parties.”
There are a few ways “we ensure there is no creepiness,” he added. “We ask for consent before the call begins, we don’t profile people on calls and we don’t perform facial ID or facial recognition.” In addition, he explained, all participants can opt in rather than just opt out, with complete two-party consent at the beginning of each video meeting.
Ehlen also wanted to address “confusion about whether we are claiming to have developed AI that ‘detects emotions’ or knows something about people’s internal emotional states.” This is not Uniphore’s claim at all, he said: “Rather, we are reading the signals people sometimes use to communicate about their emotions, using combinations of facial expressions and tone of voice, for example.” The phrase ‘Nice day, isn’t it?’ he explained, “might appear to communicate one thing if you only consider the text by itself, but if it comes with a sarcastic tone of voice and a roll of the eyes, this communicates something else.”
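Ehlen’s sarcasm example maps neatly onto what practitioners call late fusion: each channel is scored separately, then the scores are combined. The sketch below is purely illustrative; the scores, weights and simple weighted average are invented for demonstration and do not describe Uniphore’s actual models or pipeline:

```python
# Illustrative late-fusion sketch: combining text, tone and facial cues.
# All scores, weights and the fusion rule are invented for demonstration.

def fuse_signals(text_score: float, tone_score: float, face_score: float,
                 weights=(0.4, 0.35, 0.25)) -> float:
    """Weighted late fusion of per-channel sentiment scores in [-1, 1]."""
    w_text, w_tone, w_face = weights
    return w_text * text_score + w_tone * tone_score + w_face * face_score

# "Nice day, isn't it?": positive words, but sarcastic delivery.
text_score = 0.8    # text alone reads as positive
tone_score = -0.7   # flat, sarcastic prosody (hypothetical detector output)
face_score = -0.6   # eye-roll detected (hypothetical detector output)

fused = fuse_signals(text_score, tone_score, face_score)
print(f"fused sentiment: {fused:+.2f}")  # negative: sarcasm flips the read
```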
AI-driven emotional analysis is increasingly sophisticated
Sentiment analysis for text and voice has been around for years: Any time you call a customer service line or contact center and hear “this call is being recorded for quality assurance,” for example, you’re experiencing what has become highly sophisticated, AI-driven conversational analysis.
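To make the text side of this concrete, here is a deliberately naive lexicon-based scorer. Production contact-center systems use trained language and acoustic models rather than word lists, but the shape of the computation, utterance in, score out, is the same:

```python
# Toy lexicon-based sentiment scorer: a deliberately simple stand-in for
# the trained models used in real conversational-analysis systems.
POSITIVE = {"great", "thanks", "helpful", "resolved", "happy"}
NEGATIVE = {"frustrated", "angry", "cancel", "terrible", "waiting"}

def sentiment(utterance: str) -> float:
    """Return a score in [-1, 1] from word counts alone."""
    words = utterance.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

print(sentiment("I've been waiting an hour and I'm frustrated"))  # -1.0
print(sentiment("Thanks, that was helpful"))                      #  1.0
```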
Zimmerman also highlighted Boston-based Cogito in Gartner’s Competitive Landscape as “a pioneer in audio-based emotion AI technology, providing real-time emotion analytics for call agent support/coaching, as well as stress-level monitoring.”
The company first provided AI solutions to the U.S. Department of Veterans Affairs, analyzing the voices of military veterans with PTSD to determine if they need immediate help. Then, it moved into the contact center space with an AI-driven sentiment analysis system that analyzes conversations and guides customer service agents in the moment.
“We offer real-time guidance in understanding how the call is going and the caller’s psychological state,” said Josh Feast, CEO of Cogito. “For instance, what’s the experience like for the parties on the call? What are fatigue levels? How is receptivity or motivation?”
The solution then provides the agent with specific cues, perhaps advising them to adjust the conversation’s pitch or speed, or flagging that the other party is distressed. “That provides an opportunity to show some empathy,” he said.
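Mechanically, this kind of in-the-moment coaching amounts to mapping live conversational features onto short prompts. Here is a hedged sketch; the feature names, thresholds and cue wording are invented for illustration and are not Cogito’s actual features or rules:

```python
# Hypothetical real-time cue generator for call agents.
# Features, thresholds and wording are all assumptions for illustration.

def agent_cues(speaking_rate_wpm: float, caller_sentiment: float,
               agent_talk_share: float) -> list[str]:
    """Map live call features to short coaching prompts."""
    cues = []
    if speaking_rate_wpm > 170:
        cues.append("Slow down: your pace is above the comfortable range.")
    if caller_sentiment < -0.5:
        cues.append("Caller sounds distressed: acknowledge and show empathy.")
    if agent_talk_share > 0.7:
        cues.append("You're dominating the call: pause and let the caller speak.")
    return cues

for cue in agent_cues(speaking_rate_wpm=185, caller_sentiment=-0.6,
                      agent_talk_share=0.75):
    print(cue)
```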
What enterprises need to know before investing in emotion AI
Give emotion AI C-level attention
“Executives need to know that emotion AI has great possibilities along with great responsibilities,” said Theresa Kushner, data and analytics practice lead at NTT DATA Services. “Managing these complicated AI algorithms is something that needs C-level attention and can’t be delegated to data scientist teams or to operations staff. They’ll need to understand the level of commitment that implementing and operationalizing a controversial technology such as emotion AI requires and be closely involved to ensure it doesn’t get out of hand.”
Consider the ROI
When talking to different vendors, make sure they really demonstrate the ROI, said Zimmerman: “You need to understand the benefit of investing in this particular technology – does it help me to increase customer satisfaction? Or does it help me to increase retention and reduce churn?” Uniphore’s Ehlen added that organizations should also look for a solution that can bring an immediate ROI. “Solutions in this realm should be able to help augment human interactions in real time and then become more intelligent and bespoke over time,” he explained.
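Zimmerman’s retention-and-churn framing lends itself to a simple back-of-the-envelope model. Every figure below is a placeholder assumption to swap for your own numbers; the point is the structure of the calculation, not the values:

```python
# Back-of-the-envelope ROI model for an emotion AI deployment.
# Every number here is an assumption to replace with your own figures.

annual_tool_cost = 250_000        # license + integration (assumed)
customers = 50_000
churn_reduction = 0.01            # 1 point of churn saved (assumed)
revenue_per_customer = 600        # annual revenue per customer (assumed)

customers_retained = customers * churn_reduction
revenue_saved = customers_retained * revenue_per_customer
roi = (revenue_saved - annual_tool_cost) / annual_tool_cost

print(f"customers retained: {customers_retained:.0f}")   # 500
print(f"revenue saved:      ${revenue_saved:,.0f}")      # $300,000
print(f"ROI:                {roi:.0%}")                  # 20%
```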
Understand the algorithm and data collection
Questions about data collection and integration with other vendor solutions should always be top of mind, said Kushner. When it comes to emotion AI specifically, organizations should make sure the technology doesn’t violate any of their ethical boundaries. “Consider asking if they can explain the AI algorithm that generates this emotional response. What data do they use for the emotional side of emotion AI? How is it collected? What will we have to collect to enrich that dataset?” It’s also important to understand the technology’s real capabilities and limitations, Ehlen added: “Is it single mode or multimode AI? Siloed or fused? This will determine the level of context and accuracy that you can eventually derive.”
Implement a test and learn framework
These days, emotion AI technology has evolved to the point that organizations are deploying large-scale projects. “That requires thinking carefully about change management, setting up a steering committee and, critically, implementing some type of test and learn framework,” Feast said, which can lead to new use case ideas. “For example, we have customers who tested our technology to give agents real-time guidance, but they also realized they could use it to signal when agents are getting tired and need a break.”
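At its core, a test-and-learn framework means comparing a pilot cohort against a control on a metric that matters, such as close rate, and checking whether the difference is real rather than noise. A minimal sketch with invented counts, using a standard two-proportion z-test:

```python
# Minimal test-and-learn check: did the pilot group's close rate beat the
# control group's? The call counts below are invented for illustration.
from math import sqrt

def two_proportion_z(success_a: int, n_a: int,
                     success_b: int, n_b: int) -> float:
    """Two-proportion z-test statistic for rates success_a/n_a vs success_b/n_b."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Pilot: 140 closes out of 500 calls; control: 110 out of 500 (assumed).
z = two_proportion_z(140, 500, 110, 500)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```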
Balancing emotion AI’s risks and rewards
According to Gartner’s Zimmerman, emotion AI technology adoption still has a long way to go, particularly when it comes to Big Tech. “I assumed that, given some of the technology advances that Amazon has revealed and some discussions that Google has had, many more devices would have this functionality, but they don’t. I think from a technology perspective they could do it, but maybe it is the privacy issues.”
Enterprise customers, too, have to weigh the risks and rewards of emotion AI. Kushner pointed out that a business may want to know how customers really feel about their interactions with an online call center and employ emotion AI technology to find out. “But this risks alienating a customer if the emotion AI technology didn’t represent the customer’s feelings appropriately and customer support responds in a way that doesn’t fit the emotion the customer had expressed,” she said.
To strike the right balance, said Uniphore’s Ehlen, vendors and customers alike need to build on trust, which, in turn, is built on open communication and choice. “We are openly addressing what our solution can do and being clear on what it cannot do,” he said. “We are giving customers the choice to integrate this tool into their engagements or not. For those who do opt in, we follow industry best practices for data privacy and protection.”
The bottom line, said Feast, is that to succeed with emotion AI, enterprises need to make the technology’s use a win-win-win: “With every use case, I think organizations need to ask themselves: ‘Is it good for the enterprise? Is it good for employees? Is it good for consumers?’”