Forget Reading the Road, This Automotive AI Reads You

by Tony Kontzer

Stories abound about the obstacles that developers of self-driving cars must overcome — challenges related to sensors, machine learning, real-time data analysis and reaction times. But we haven’t heard much about the passengers.

Affectiva, maker of “emotion AI” software, wants to change that. The company is building a platform that would establish profiles of in-car occupants, with a goal of logging their emotional, physiological and mental states.

“It could be understanding a nod, or it could be understanding that there are children in the back and you want the car system to stop presenting content to them once they’ve fallen asleep,” said Abdo Mahmoud, Affectiva’s product manager, during a well-attended session at the GPU Technology Conference.

Mahmoud added that it could also be detecting non-verbal feedback from passengers and, say, slowing down to reduce someone’s anxiety.

To do this, Affectiva tracks 20 different facial expressions. It uses a combination of GPU-powered convolutional neural networks and support vector machines to interpret joy, sadness, disgust, anger and a host of other emotions. The software can also detect inattentiveness due to eating, texting or putting on makeup, as well as cognitive states such as boredom, confusion or frustration.
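Affectiva hasn’t published the details of its pipeline, but the two-stage design it describes — a convolutional network that turns a face image into features, followed by a support vector machine that labels the emotion — can be sketched in a few lines. Everything below is illustrative: the emotion list, the 128-dimensional embedding size and the randomly generated training data are assumptions, standing in for a real CNN and real labeled faces.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical label set; the article names joy, sadness, disgust and anger
# among "a host of other emotions."
EMOTIONS = ["joy", "sadness", "disgust", "anger", "surprise", "neutral"]

# Stand-in for stage 1: a CNN (not shown) would embed each face image
# into a fixed-length feature vector. Here we fake 300 such embeddings.
rng = np.random.default_rng(0)
n_samples, n_features = 300, 128  # 128-d embedding size is assumed
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, len(EMOTIONS), size=n_samples)

# Stage 2: an SVM maps feature vectors to emotion labels.
clf = SVC(kernel="rbf").fit(X, y)
pred = EMOTIONS[clf.predict(X[:1])[0]]
print(pred)  # one of the labels in EMOTIONS
```

The appeal of this split is that the expensive, GPU-powered part (the CNN) is trained once on a large image corpus, while the lightweight SVM on top can be retrained cheaply as new labeled expressions arrive.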

Mahmoud said the company now boasts the world’s largest emotion repository. It’s analyzed 5.3 million faces from 75 countries, and turned that into a database of more than a million images used to train the company’s network.

Affectiva also did a detailed study of 44 drivers so that it could collect real driving data and evaluate its technology. It discovered in the process that drivers have neutral expressions more than half of the time. Affectiva also found that understanding the context of a reaction is very important to categorizing expressions.

The company also has studied drowsiness by analyzing head movements, facial expressions, eye closures and blink rates as drivers tire. It’s data that “can be very good at predicting when an accident will happen,” Mahmoud said.
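Affectiva’s drowsiness model is proprietary, but one standard proxy built from exactly the signals Mahmoud lists — eye closures over time — is PERCLOS, the fraction of recent frames in which the eyes are mostly closed. The sketch below is a minimal illustration of that metric, not Affectiva’s method; the per-frame openness values and the 0.2 closure threshold are made up for the example.

```python
def perclos(eye_open_ratios, closed_threshold=0.2):
    """Fraction of frames where eye openness falls below a threshold.

    A standard drowsiness proxy (PERCLOS); inputs are per-frame
    eye-openness ratios in [0, 1] from a face tracker.
    """
    closed = [r < closed_threshold for r in eye_open_ratios]
    return sum(closed) / len(closed)

# Simulated per-frame eye-openness for an alert vs. a tiring driver.
alert = [0.8, 0.7, 0.9, 0.75, 0.1, 0.85]
drowsy = [0.3, 0.15, 0.1, 0.4, 0.05, 0.12]

print(perclos(alert))   # low closure fraction
print(perclos(drowsy))  # higher closure fraction signals drowsiness
```

A production system would combine a score like this with the head movements, expressions and blink rates the study tracked, smoothed over a sliding time window rather than a fixed list of frames.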

While the company has accumulated extensive data, that information would take on new life if it could be gathered from actual drivers in larger numbers, something Affectiva wants to do.

“Manifestations of expressions in the wild are much different than they are in the lab,” said Mahmoud. “We need to start putting cameras in cars now.”

Of course, collecting data on passengers’ emotional, physiological and mental states is touchy business. That’s especially true in regions such as Europe that have stringent privacy protections preventing such data from being stored on servers.

Mahmoud addressed that concern, saying that Affectiva’s models work without any cloud-based support. That makes it possible for data to remain inside the vehicle. Even then, the company is focused on getting permission from passengers before analyzing them.

“Capturing the permission of the person being recorded is an inherent assumption and a very important consideration,” he said.

Affectiva’s work is focused on the collection and categorization of facial expressions. But Mahmoud said the company is working with partners that can take that data and generate reactions, a problem it considers too complex to tackle on its own.

Those interested in trying out the software themselves can download the company’s free AffdexMe app or visit the company’s booth on the GTC exhibition floor.