Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions.
Well, artificial emotional intelligence, also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an “a,” not an “e”). In psychology, “affect” describes the experience of feeling or emotion.
If you’ve seen “Solo: A Star Wars Story”, then you’ve seen the poster child for artificial emotional intelligence: L3-37.
Lando Calrissian’s droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion. Lando (played by Donald Glover) is also injured during the getaway.
The “woke robot” demonstrates the ability to simulate empathy by interpreting the emotional state of a human, adapting its behavior to him, and giving an appropriate response to those emotions.
Now, this example might lead some video marketers and advertisers to think that emotion AI is science fiction. But it is very real.
A number of companies are already working to give computers the capacity to read our feelings and react, in ways that have come to seem startlingly human. This includes Affectiva, an emotion measurement technology company that spun out of MIT’s Media Lab in 2009, and Realeyes, an emotion tech company that spun out of Oxford University in 2007.
So, how do their technologies help brands, agencies, and media companies improve their advertising and marketing messages? Let’s tackle this question by examining how affective computing works.
How Does Artificial Emotional Intelligence Work?
Brands know emotions influence consumer behavior and decision making. So, they’re willing to spend money on market research to understand consumer emotional engagement with their brand content.
Affectiva uses a webcam to track a user’s smirks, smiles, frowns, and furrows, which indicate the user’s levels of surprise, amusement, or confusion.
It also uses the webcam to measure a person’s heart rate without any wearable sensor by tracking subtle color changes in the person’s face, which flushes slightly each time the heart beats.
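The idea behind camera-based pulse measurement can be sketched in a few lines. This is a minimal illustration of the general technique (often called remote photoplethysmography), not Affectiva’s actual implementation: given the average green-channel brightness of the face in each video frame, the dominant frequency of that signal approximates the pulse. The function name and the synthetic data are assumptions for illustration.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from per-frame mean green-channel values.

    The face flushes subtly with each heartbeat, so the dominant
    frequency of the detrended color signal approximates the pulse.
    """
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()          # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))   # frequency content
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible human pulse band: 0.7-4.0 Hz (42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic example: 30 fps video, 10 seconds, simulated 72 BPM pulse
np.random.seed(0)  # reproducible noise
fps, duration, bpm = 30, 10, 72
t = np.arange(fps * duration) / fps
green = 120 + 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t) \
        + 0.1 * np.random.randn(t.size)      # tiny flush + camera noise
print(round(estimate_heart_rate(green, fps)))  # → 72
```

Real systems add face detection, skin-region selection, and motion compensation on top of this core frequency analysis.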
Affectiva has turned this technology into a cloud-based solution that utilizes “facial coding” and emotion recognition software to provide insight into a consumer’s emotional responses to digital content. All a brand or media company needs is a panel of viewers with standard webcams and internet connectivity.
As viewers watch a video, Affectiva’s product, Affdex for Market Research, measures their moment-by-moment facial expressions of emotions. The results are then aggregated and displayed in a dashboard.
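To make the “aggregated and displayed in a dashboard” step concrete, here is a small sketch of how moment-by-moment scores from multiple panelists might be rolled up into a single timeline. The panelist names and scores are invented for illustration and are not Affdex output.

```python
from statistics import mean

# Hypothetical per-panelist traces: one smile-probability score per
# second of a 6-second ad (values are illustrative only).
panelists = {
    "p1": [0.1, 0.2, 0.6, 0.8, 0.7, 0.3],
    "p2": [0.0, 0.1, 0.4, 0.9, 0.8, 0.4],
    "p3": [0.2, 0.3, 0.5, 0.7, 0.6, 0.2],
}

# Aggregate moment by moment: the mean score per second across
# panelists is the kind of curve a dashboard plots over the timeline.
trace = [round(mean(scores), 2) for scores in zip(*panelists.values())]
peak_second = trace.index(max(trace))

print(trace)        # → [0.1, 0.2, 0.5, 0.8, 0.7, 0.3]
print(peak_second)  # → 3 (the moment that drew the strongest reaction)
```

A marketer reading the resulting curve can see which second of the ad triggered the strongest reaction and where attention fell off.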
Affdex for Market Research also provides video marketers and advertisers with norms that leverage Affectiva’s extensive emotion database and tie directly to outcomes such as brand recall, sales lift, purchase intent, and likelihood to share.
These norms benchmark a video or ad against ones from competitors – by geography, product category, media length, and repeat views. About one-third of the Fortune Global 100, including brands such as Kellogg’s and Mars as well as media companies like CBS, have used Affdex for Market Research to optimize their content and media spend.
By comparison, Realeyes uses webcams as well as computer vision and machine learning to measure how people feel as they watch video content online.
First, a brand, agency, or media company selects a specific geography and audience segment that it wants to test.
Next, the Realeyes system provides 300 target viewers, who watch the videos on their own devices whenever they choose.
Then, the system’s algorithms process and analyze facial expressions in the cloud and show results on a dashboard within 24 hours.
The reports provided by Realeyes combine both creative testing and media planning insights to enable video marketers and advertisers to understand how consumers feel about their video content.
This enables brands (such as Coca-Cola, Hershey’s, and Mars), agencies (like Ipsos, MarketCast, and Publicis), as well as media companies (such as Oath, Teads, and Turner) to optimize their content and target their videos at the right audiences.
Read the source article at SearchEngineJournal.com.