In 2022, Your Digital Devices Will Understand Your Emotions

SparkAmpLab Editorial Team
October 22, 2020

The idea that our devices will one day be able to detect our emotional state is no longer just science fiction. Gartner predicts that by 2022, 10% of personal devices will have emotion AI capabilities. If that holds, personal devices will soon know more about an individual's emotional state than anyone in their immediate social circle.

To some, this will come as no surprise given our increasing reliance on technology. Like any addiction, our digital addiction carries harmful effects we’re fully aware of, and yet we can’t seem to resist it.

But what would it be like if our devices could actually understand our emotions and interact with us based on that understanding? This is what Rana el Kaliouby, founder and CEO of Affectiva and a pioneer of emotion AI, is seeking to answer.

As el Kaliouby puts it in her TED talk, “Today's technology has lots of I.Q., but no E.Q.; lots of cognitive intelligence, but no emotional intelligence. So that got me thinking, what if our technology could sense our emotions? What if our devices could sense how we felt and reacted accordingly, just the way an emotionally intelligent friend would?” 

The Egyptian-American CEO, named to Fortune’s 40 Under 40, has led the development of a facial and vocal analysis technology dubbed ‘Human Perception AI’. It can detect nuanced human emotions as well as complex cognitive states, behaviors, activities, and interactions. The Boston-based startup has applied the technology to a wide range of uses, from digital marketing to automated driving.

According to the company, a total of 9.7 million faces have been analyzed so far using the Human Perception AI technology.

Affectiva Automotive AI

The success of the company’s ‘Automotive AI’ software, launched in 2018, is largely responsible for its latest funding round of $26 million, led by mobility technology company Aptiv with participation from Motley Fool Ventures and Japan-based CAC. According to the founder, the funding was used to grow the team and to strengthen the company’s machine learning and engineering capabilities.

Automotive AI was built around the company’s original ‘Emotion AI’, which has since snowballed into what the company dubs ‘Human Perception AI’. This more advanced technology can detect a driver’s state of alertness or drowsiness by processing visual and audio information. Most self-driving cars can sense what’s happening outside the car, but since accidents often stem from in-cabin situations, understanding what’s going on inside the vehicle is just as important if AI is to help prevent them.

For instance, if a child passenger were to suddenly start screaming and throwing things around the car, the driver is likely to be distracted. If emotion AI can interpret that behavior, it can interface with the vehicle’s safety systems and take control until the driver can return their focus to the road, reducing the risk of an accident.
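To give a rough sense of how that kind of decision logic might look, here is a minimal sketch in Python. Everything in it, the state fields, the thresholds, and the function names, is invented for illustration; it is not Affectiva's actual SDK or the logic of any shipping system.

```python
from dataclasses import dataclass

@dataclass
class CabinState:
    driver_drowsiness: float   # 0.0 (fully alert) to 1.0 (asleep)
    driver_distraction: float  # 0.0 (eyes on road) to 1.0 (fully distracted)
    cabin_noise: float         # 0.0 (quiet) to 1.0 (chaotic)

def choose_intervention(state: CabinState) -> str:
    """Map a perceived cabin state to a graduated safety response."""
    if state.driver_drowsiness > 0.8:
        return "engage_driver_assistance"   # hand control to the safety systems
    if state.driver_distraction > 0.6 or state.cabin_noise > 0.7:
        return "alert_driver"               # audible or haptic warning
    return "no_action"

# A screaming child raises cabin noise and, in turn, driver distraction.
print(choose_intervention(CabinState(driver_drowsiness=0.1,
                                     driver_distraction=0.7,
                                     cabin_noise=0.9)))  # -> alert_driver
```

A real vehicle would respond far more conservatively and in more layers, but the basic pattern, perceive the cabin, classify the risk, escalate the intervention, is the same.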

Affectiva’s game-changing role in the gaming industry 

In addition to revolutionizing our driving experience, Affectiva’s emotion-sensing technology can also change the way we play games, if it hasn’t already. If you have played Flying Mollusk’s Nevermind, you’ll know what I mean.

The technology can measure a gamer’s emotional state by reading facial expressions, allowing adaptive games to respond to players’ emotions. If a gamer becomes stressed or fearful whilst playing a horror game, for example, the game might get even scarier. Thanks to Affectiva’s technology, gamers can now play games that tailor emotion-based experiences to them. Such technology is certain to give game developers plenty of room for experimentation in the future.
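Here is a similarly minimal sketch of how a game loop might feed a per-frame fear score into its scare intensity. The scoring function below is a random stand-in for a real facial-expression model, and none of the names come from Affectiva’s SDK or from Nevermind itself.

```python
import random

def read_fear_score() -> float:
    """Stand-in for a real facial-expression model; returns fear in [0, 1]."""
    return random.random()

def update_scare_intensity(current: float, fear: float) -> float:
    """The more fearful the player appears, the scarier the game becomes,
    eased gradually and clamped to [0, 1]."""
    target = 0.3 + 0.7 * fear                      # baseline plus fear-driven boost
    return min(1.0, 0.9 * current + 0.1 * target)  # smooth toward the target

intensity = 0.3
for frame in range(5):
    intensity = update_scare_intensity(intensity, read_fear_score())
    print(f"frame {frame}: scare intensity = {intensity:.2f}")
```

The smoothing step matters: reacting instantly to every flicker of expression would make the game feel erratic, so the intensity drifts toward the player’s emotional state rather than snapping to it.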

The future of emotion AI requires caution

As exciting as these developments sound, they are already a cause for concern for some. An Accenture report acknowledges that emotion AI will be a forceful influence in the future and that the technology will change the relationship between businesses and consumers. “Emotional AI will not only offer new metrics to understand people; it will redefine products and services as we know them,” the report’s authors warned.

Caution is wise, notes the report. The use of AI to interpret sensitive human emotions and turn those emotions into data comes with risks.

After all, our emotions are deeply personal, and most of us aren’t willing to give away our feelings quite so easily. Experts have warned about exploitation, emotional manipulation, and invasion of privacy. How safe will people feel knowing that the devices they use comprehend everything about their inner state? And will users willingly accept that knowledge being turned into data points for companies to commodify?

As the Accenture report points out, technology, communications, and platform companies must manage emotion AI responsibly. Regardless, emotion AI is growing rapidly and will likely reach your devices sooner than you think.
