Affectiva Launches A.I. Tech to Help Cars Sense Your Emotions

(Page 2 of 2)

…still needs to be worked out. But she says it probably would at least involve sending an alert to the ride-sharing company.

If autonomous vehicles become ubiquitous, Affectiva intends to shift its in-car sensing technology to focus on improving the riding experience. For example, if the software determines from people’s facial expressions or vocal cues that they’re becoming nauseous or uncomfortable, it could tell the vehicle’s computer to slow the car down, adjust the driving style, or even stop to take a break, el Kaliouby says.
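The behavior el Kaliouby describes amounts to mapping an inferred passenger state to a driving-policy action. A minimal sketch of that idea follows; the class, function, field names, and thresholds are all illustrative assumptions, not Affectiva's actual software or API.

```python
# Hypothetical sketch only -- names and thresholds are invented for illustration.
from dataclasses import dataclass


@dataclass
class PassengerState:
    # Discomfort inferred from facial expressions or vocal cues,
    # scaled from 0.0 (relaxed) to 1.0 (severe).
    discomfort: float


def driving_adjustment(state: PassengerState) -> str:
    """Map an inferred discomfort level to a driving-policy action."""
    if state.discomfort > 0.8:
        return "stop_for_break"        # severe: pull over for a break
    if state.discomfort > 0.5:
        return "reduce_speed"          # moderate: slow the car down
    if state.discomfort > 0.3:
        return "smooth_driving_style"  # mild: gentler acceleration and braking
    return "no_change"


# A moderately uncomfortable passenger triggers a slowdown.
print(driving_adjustment(PassengerState(discomfort=0.6)))  # reduce_speed
```

In a real system the discomfort score would come from a perception model and the actions would be requests to the vehicle's planner, but the thresholded mapping captures the slow-down / adjust-style / stop-for-a-break escalation described above.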

In “the reviews of the first iterations of autonomous vehicles, they’re pretty uncomfortable to ride,” she says. “No brand wants to be associated with that.”

Affectiva and other autonomous vehicle advocates believe the “vehicle is evolving to become the entertainment hub of the future,” el Kaliouby says. To that end, her company’s in-vehicle software could also be designed to automatically serve up videos, music, or other content based on people’s moods.
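Serving content based on mood is, at its simplest, a lookup from a detected mood label to a content choice with a neutral fallback. The sketch below is a hypothetical illustration; the mood labels and content picks are invented, not Affectiva's.

```python
# Hypothetical sketch only -- mood labels and content choices are illustrative.
CONTENT_BY_MOOD = {
    "stressed": "calming playlist",
    "bored": "comedy video",
    "happy": "upbeat music",
}


def recommend_content(mood: str) -> str:
    # Fall back to a neutral default when the detected mood isn't recognized.
    return CONTENT_BY_MOOD.get(mood, "ambient music")


print(recommend_content("stressed"))  # calming playlist
```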

The bigger picture here is that automakers, especially luxury brands like BMW, are starting to grapple with the prospect of driving getting taken out of the equation for people, el Kaliouby says.

Carmakers have “always been about the driving experience,” she says. “That’s becoming commoditized. They’re trying to figure out what their role is in this emerging ecosystem.”

One question is why vehicle passengers would want to rely on Affectiva’s software to sense that they want the car to slow down, change the cabin temperature, or play the latest hit song from Beyoncé. Wouldn’t it be easier or more reliable to tap a button on the dashboard or speak the command to a virtual assistant? El Kaliouby argues “it would be so much better if the car had the context of who you are, and it understood your emotional state.”

She says Affectiva’s software could also improve the interactions passengers have with in-vehicle virtual assistants.

“If you’re getting annoyed, you want this conversational interface to understand that and react accordingly,” she says. “It could acknowledge it’s not getting it right and try to do a better job.”
