The ability to register the emotional responses of customers or potential customers from their facial expressions, or from the words they speak or write, is a growing business and another instance of how AI is bringing new capability to the table.
Affectiva calls its market "emotion AI," while EMRAYS describes its market as "emotion recognition." Both use powerful AI-driven software to measure the emotions people register when they view an ad or write a response.
Affectiva spun out of MIT’s Media Lab in 2009, co-founded by Dr. Rosalind Picard and Dr. Rana el Kaliouby, now the CEO. Dr. Picard, an engineer, had published the book Affective Computing in 1997. The book was instrumental in starting a new field that now has its own journal, international conference and professional society. Dr. Picard continues on the faculty of MIT, as a manager of Affectiva, as a writer and as an active inventor.
The two met when Dr. Picard visited Cambridge University, where Dr. el Kaliouby was conducting research on bringing emotional intelligence to devices and digital experiences. She was working on a prototype algorithm that could read and respond to a user’s emotions in real time. The two hit it off. They applied for a grant from the National Science Foundation to create wearable glasses that could serve as a social-emotional aid for people with autism. They eventually got the grant, which brought Dr. el Kaliouby to the MIT Media Lab.
Industry sponsors of the lab asked the two to try to apply their emotion recognition technology to their industries. Toyota wanted it to detect drowsy drivers. Procter & Gamble wanted to see how people reacted to new shower gel scents. Fox wanted to understand how viewers engage with TV content. Discussions on how to respond to those requests led to the spinout Affectiva, actively supported by the MIT community.
Customers for Affectiva’s first products, released in 2010, were concentrated in media, advertising and market research, including Kantar Millward Brown, a multinational market research firm. “They were the early adopters,” said Gabi Zijderveld, chief marketing officer of Affectiva, in a meeting at the recent AI World in Boston. “We could measure frame by frame how someone was reacting to advertisements, for instance,” she said.
Today the company’s products are used by over 1,400 brands to gather insights and analytics on consumer emotional engagement. Affectiva describes its product line as an emotion AI science platform that uses computer vision, deep learning and the world’s largest emotion data repository.
Three years ago, Affectiva released an emotion recognition software development kit (SDK) that enabled companies to “emotion-enable” their applications. It also released a mobile SDK; Affectiva is now supported on seven platforms.
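Affectiva’s actual SDK interfaces are not detailed in this article. As a minimal sketch of what “emotion-enabling” an application can involve, the Python example below runs a hypothetical frame-by-frame facial emotion classifier over an ad video with OpenCV; the `model` object and its `predict` method are assumptions for illustration, not Affectiva’s API.

```python
# Illustrative sketch only -- not Affectiva's SDK. Assumes a hypothetical
# `model` with a predict() method that maps a face crop to emotion scores.
import cv2  # OpenCV, for video decoding and face detection

def analyze_ad(video_path, model):
    """Score each frame of an ad for facial emotion, as an SDK might."""
    face_finder = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(video_path)
    timeline = []  # one dict of emotion scores per detected face, per frame
    while True:
        ok, frame = capture.read()
        if not ok:  # end of video
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_finder.detectMultiScale(gray, 1.3, 5):
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            # Hypothetical call returning e.g. {"joy": 0.8, "disgust": 0.1, ...}
            timeline.append(model.predict(face))
    capture.release()
    return timeline
```

A frame-level timeline like this is the kind of output that lets an analyst see exactly where in an ad viewers smiled, winced or looked away.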
The company has raised $26.3 million in five rounds.
Interest in Emotion Recognition Coming from New Markets
Today, more interest is coming from clients in education and healthcare interested in using AI to record emotions. “In the past eight to 10 months, we have seen a dramatic increase in queries from tier one companies. We are the leaders in emotion AI; they are coming to us,” Zijderveld said.
Interest is also coming from the self-driving car field, for what is called “driver state monitoring.” Several proof of concept projects are underway with self-driving car manufacturers, Zijderveld said.
During a demo at AI World, the Affectiva software looked at my face (I survived) and could measure sadness, surprise, anger, disgust and joy based on facial expressions (or non-expressions). “We have analyzed two billion faces from 87 countries,” Zijderveld said. [Ed. note: the population of the earth was 7.4 billion in 2016.] Subjects can be paid $25 for five hours of facial research.
Market research is still the largest revenue generator for Affectiva and automotive is the primary growth market, Zijderveld said. “Our product road map is based on building out this [automotive] part of the business,” she said.
EMRAYS Deciphers Emotions from Text in Multiple Languages
With roots in academia and business, EMRAYS has a focus on text. “We are the multilingual emotional recognition software and analysis company,” said Michiel Maandag, co-founder and CMO of EMRAYS, based in the Netherlands. “Our technology analyzes text in multiple languages to predict the natural, unbiased emotional responses of readers.”
EMRAYS has four founders, all current officers: Ilia Zaitsev, CEO and Chief Science Officer, a computational linguist with a PhD from St. Petersburg State University, Russia, who now uses machine learning techniques to analyze language, developing models to extract emotion and sentiment from text; Paul Tero, CTO and Chief Data Officer, with a computer science degree from UC Berkeley and a master’s in Evolutionary and Adaptive Systems from the University of Sussex; Cliff Crosbie, CXO, a former retail and marketing executive at Nike, Nokia, IKEA and Apple; and Michiel Maandag, CMO, formerly brand director at Nokia and now an independent marketing professional working with category-leading brands.
The company offers four products today: a Software-as-a-Service (SaaS) product for bloggers, journalists and writers in general to research and check the emotional impact of their output; a social listening product that applies emotional analysis; ad placement for media agencies and brands; and an API and software development kit (SDK) for developers.
Originally named Emotions.Tech, the company changed its name to EMRAYS in July 2017. That year, EMRAYS became part of NVIDIA’s Inception program, teaming with the independent UK search provider Mojeek to create an emotional search engine. The idea is to search according to how the user wants the results to make them feel.
EMRAYS CTO Paul Tero stated in a press release, “The ability to analyze digital content according to the emotion it provokes can redefine our relationship with technology. Imagine online advertisers able to position their adverts on emotionally appropriate pages.”
Understanding emotions from text requires a lot of processing power. EMRAYS turned to GPUs from NVIDIA to power deep learning in order to rank, list and search web pages according to their emotional content. “We need that acceleration to keep up with the complexities of human emotion,” Tero stated.
Mojeek users can now search the web and select results for emotions including love, laughter, surprise, anger or sadness. To focus on the reader’s emotional reaction, the EMRAYS search tool does not simply count the number of positive or negative words in a text or parse the tone of the writer. Instead, the company listens to millions of reactions on social media each day and uses this data to train artificial neural networks. The networks learn to predict what kind of emotional reaction a piece of written content might prompt in a human reader. EMRAYS says it has analyzed over one billion emotional data points.
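EMRAYS has not published its architecture, so the following is only a minimal sketch of the general approach described above: training a classifier on reaction-labeled text rather than counting positive and negative words. The tiny dataset is invented, and the TF-IDF-plus-logistic-regression pipeline is a deliberately simple stand-in for the GPU-trained neural networks EMRAYS reports using.

```python
# Minimal sketch of reaction-labeled emotion prediction -- not EMRAYS' model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: each text is paired with the social-media reaction
# it drew from readers (these labels stand in for millions of real reactions).
texts = [
    "Local shelter dog finally finds a forever home",
    "Comedian's teleprompter mishap on live TV goes viral",
    "Council approves surprise tax increase overnight",
    "Beloved children's author passes away at 92",
]
reactions = ["love", "laughter", "anger", "sadness"]

# Featurize the text and fit a classifier that predicts the reader's reaction.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, reactions)

# Predict the likely emotional reaction to unseen content.
print(model.predict(["Kitten rescued from storm drain reunited with family"]))
```

The key design point is the source of the labels: the target is the reader’s observed reaction rather than a dictionary of positive and negative words, which is what separates this approach from classic sentiment analysis.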
In this way, social media platforms such as Facebook and Twitter produce a volume of information each day that provides EMRAYS with training data to help improve the accuracy of its neural networks. “We based our system on NVIDIA GPUs because they allow us to process each page’s metadata tags in less than a millisecond, about 50 times faster than with CPUs,” Tero stated.
EMRAYS today can process emotional data in English, Norwegian, Dutch, Swedish and Russian.
The company is early-stage and self-funded, with some successful proof-of-concept work done with big brands in the agency and corporate space.
Test users, especially journalists, are invited to access EMRAYS’ test site to see how the software works. For writers, “It’s like a grammar checker for the emotions,” said co-founder and CMO Michiel Maandag in an interview with AI Trends.
For media agencies that have relied on sentiment analysis, EMRAYS says it has a better approach. “With sentiment analysis, you really don’t know what it means if the outlook is positive. We can actually understand the real emotions behind it, because we know how text affects people,” Maandag said.
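To make that distinction concrete, here is a small illustration (with invented numbers) of why a per-emotion breakdown is more actionable than a single sentiment score: two “negative” texts can call for very different responses depending on whether readers are sad, angry or merely surprised.

```python
# Invented example: a single sentiment score vs. an emotion breakdown.
def report(text, sentiment, emotions):
    """Print a one-number sentiment next to a per-emotion distribution."""
    print(f"{text!r}")
    print(f"  sentiment: {sentiment:+.2f} (positive/negative only)")
    for name, score in sorted(emotions.items(), key=lambda kv: -kv[1]):
        print(f"  {name:<9}{score:.2f}")

# Both stories score as 'negative', but the emotions behind them differ.
report("Hometown team loses the final in the last second",
       sentiment=-0.7, emotions={"sadness": 0.6, "anger": 0.3, "surprise": 0.1})
report("Airline cancels flights, strands thousands at the gate",
       sentiment=-0.7, emotions={"anger": 0.7, "surprise": 0.2, "sadness": 0.1})
```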
Other Players in Emotion Recognition
Last year the consumer-research company Nielsen bought Innerscope, which uses biometrics such as brain scans and galvanic skin response (GSR) to measure subconscious emotional responses to media and marketing.
Startup company Emotient used AI to predict “attitudes and actions based on facial expressions.” Apple acquired Emotient in 2016.
According to an article in Hubspot, Emotient has a patent for a method of collecting and labeling up to 100,000 facial images a day, supporting a computer’s ability to recognize facial expressions. It is reasonable to believe that Emotient’s emotion recognition technology will start appearing in iPhones and iPads, and could then be used as a platform for more targeted and dynamic engagement when users are in their browsers.
From Wikipedia: nViso provides real-time emotion recognition for Web and mobile applications. Visage Technologies AB offers emotion estimation as part of its Visage SDK for marketing and scientific research. Eyeris is an emotion recognition company that works with embedded systems manufacturers, including automakers, as well as with social robotics companies and video content creators.
How the Software is Being Used
Companies using emotion analysis to test audience reaction to their marketing include Unilever, P&G, Mars, Honda, Kellogg and Coca-Cola, according to Hubspot. For example, Kellogg’s used Affectiva’s software to help choose ads for its Crunchy Nut cereal, with the goal of generating high engagement rates with the audience. Viewers were shown multiple versions of a commercial featuring animals. A version of the ad featuring a snake produced the most laughs, but low engagement rates on a second viewing.
The facial recognition software revealed that an alternative version of the ad featuring an alien produced the desired engagement levels. Kellogg’s therefore decided to roll out the alien-based ad instead, helping to drive the cereal’s sales.
Emotion AI and emotion recognition are positioned to mature and grow as AI markets, with their capabilities likely to extend to a wide range of computer interactions. Get ready to relax into having your emotions probed, measured and documented routinely.
- By John P. Desmond, AI Trends Editor