Emotion Tracking AI for Autonomous Vehicle Drivers Launched by Affectiva

Affectiva today announced the launch of its Automotive AI service, which lets creators of autonomous vehicles and other transportation systems track users’ emotional responses. The Automotive offering is Affectiva’s third service for tracking the emotional responses of product users and is part of a long-term strategy to build emotional profiles drawn from smart speakers, autonomous vehicles, and other platforms with a video camera or conversational interface.

An MIT Media Lab spinoff, Affectiva launched a voice analysis tool last year for the makers of AI assistants and social robots, but its computing services have been available to marketers and advertisers since 2010.

Affectiva can pick up emotions like joy, surprise, fear, or anger from a person’s face and things like laughter and arousal levels from their voice.

To develop its solution for cars, Affectiva has been working with OEMs, vehicle safety system providers like Autoliv, and robotaxi startup Renovo over the past 18 months, an Affectiva spokesperson told VentureBeat in an email.

Inside vehicles, Affectiva can use a combination of traditional RGB cameras and — starting today — near infrared cameras to create confidence scores based on things like the number of eyelid blinks per minute. It also uses AI models to identify drowsiness, yawning, and other signs of driver fatigue.
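The article does not describe how blink counts are converted into confidence scores. As a purely illustrative sketch (the function, thresholds, and weights below are assumptions, not Affectiva’s actual model), fatigue cues like blink rate and blink duration might be normalized and combined like this:

```python
# Hypothetical sketch: turning per-minute blink counts and eye-closure
# measurements into a driver-drowsiness confidence score. The baselines,
# weights, and scoring function are illustrative assumptions only.

def drowsiness_confidence(blinks_per_minute, avg_blink_duration_s):
    """Return a 0.0-1.0 drowsiness confidence score.

    Slow, frequent blinking is a commonly cited fatigue cue; each input
    is normalized against an assumed 'alert' baseline, clamped to [0, 1],
    and the two scores are combined as a weighted average.
    """
    # Assumed alert baselines: ~15 blinks/min, ~0.15 s per blink.
    rate_score = min(max((blinks_per_minute - 15) / 15, 0.0), 1.0)
    duration_score = min(max((avg_blink_duration_s - 0.15) / 0.35, 0.0), 1.0)
    return round(0.4 * rate_score + 0.6 * duration_score, 2)

# An alert driver scores near zero; slow, frequent blinks score high.
print(drowsiness_confidence(14, 0.12))  # → 0.0
print(drowsiness_confidence(28, 0.45))  # → 0.86
```

In a real system, scores like this would be smoothed over a time window and fused with the other signals the article mentions, such as yawning detection and near-infrared imaging for low-light conditions.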

Factors like weather conditions or whether the car is on a highway or residential street could also be used to determine when a driver should take control.

In semi-autonomous vehicles, Affectiva will focus on monitoring drivers to increase safety and facilitate the handoff between human and machine.

“It needs to know whether the driver is paying attention or not. Is the driver tired? Is the driver asleep? Is the driver watching a movie? It really needs to understand the state of the driver, or the copilot in this case, so that it can safely transfer control back to a human driver,” CEO Rana el Kaliouby told VentureBeat in an interview.

Read the source article at VentureBeat.com.