
Neuroscience Pushes Autonomous Cars Forward – Backyard Brains CEO Discussion With TED & Lexus


Can you imagine riding in an autonomous car that knows your emotions and thoughts better than you do?

Neither can we, but our co-founder and CEO Dr. Greg Gage can, and he isn’t freaked out by the prospect. On the contrary, he’s quite optimistic about it. How come?

Human-centered technology is the key phrase here. As Senior TED Fellows, Greg and artist Sarah Sandman were invited by Lexus to give their two cents on the future of autonomous vehicles (AVs) operated by AI rather than humans. They both think there is a possibility of a car that is not driven by humans yet remains human-driven – or rather, in Greg’s vision, emotion-driven!

Can a Car Feel You?

Emotions are, Greg says in his latest TED video, one of the major evolutionary inventions that we’ve developed in order to better interact with one another. A human-centered autonomous vehicle should therefore be equipped to detect not only obstacles on the road and other external signals, but also the passenger’s state of mind. Are you tired? The seat’s already reclining into a bed and tucking you in with some chill-out music. Sensors pick up bodily signals such as blood pressure or EKG, so the car can give you exactly what you need before you even know you need it.
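To make that sensor-to-comfort loop a little more concrete, here is a minimal, purely illustrative sketch in Python. The sensor fields, thresholds, and cabin actions below are our own assumptions for the sake of the example – they are not part of any real vehicle API or of Greg’s talk.

```python
# Hypothetical sketch of the sensor-to-comfort loop described above.
# Sensor names, thresholds, and cabin actions are illustrative assumptions,
# not a real vehicle interface.

from dataclasses import dataclass


@dataclass
class PassengerVitals:
    heart_rate_bpm: float       # e.g. from a seat-embedded EKG sensor
    systolic_bp_mmhg: float     # e.g. from a wearable cuff
    eye_closure_ratio: float    # 0.0 (wide awake) to 1.0 (eyes closed)


def choose_cabin_response(vitals: PassengerVitals) -> list[str]:
    """Map the passenger's detected state to comfort actions (illustrative only)."""
    actions = []
    # A slow heart rate plus drooping eyes suggests the passenger is tired.
    if vitals.heart_rate_bpm < 60 and vitals.eye_closure_ratio > 0.6:
        actions += ["recline seat into bed mode", "play chill-out playlist"]
    # Elevated blood pressure and heart rate suggest stress.
    elif vitals.systolic_bp_mmhg > 140 and vitals.heart_rate_bpm > 100:
        actions += ["dim cabin lights", "lower cabin temperature slightly"]
    return actions


if __name__ == "__main__":
    tired = PassengerVitals(heart_rate_bpm=55, systolic_bp_mmhg=115, eye_closure_ratio=0.7)
    print(choose_cabin_response(tired))  # -> seat reclines, music starts
```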

This is not science fiction but merely smooth teamwork between the car’s brain and your own. It’s no secret, though, that this car – any car – will be a lot smarter than you are. But there’s no reason to feel bad about it! If you allow it insight into your emotions, it will go one step further: from technology for its own sake to technology for humans’ sake.

“Does it know what it thinks it knows?” (Source: TED video)

Devices can’t read our minds just yet. But Greg doesn’t think it’ll be all doom and gloom once they learn how. We reached out to him to hear a wee bit more about the broader impact of an AV that detects and interprets human emotion. “I think that advanced technology will be forced to interpret our emotional state to better communicate and make better predictions of our needs. It could even help us communicate better with other humans by modeling healthy behaviour within its interaction,” Greg told us while tinkering with the DIY beehive in his backyard. “In an autonomous car, the occupants would not be in direct control. So any detected emotional states of passengers (aggressive or otherwise) would be shielded from the control systems. I feel there could be some feedback in extreme situations, for example, if someone is in an emergency and needs to get to a doctor. But overall, this technology is mostly inward focused on the passenger’s experience.”

So the goal is to improve human-to-human interaction by way of an advanced car-to-human interplay. Plug us in!

Another thought leader and Senior TED Fellow, Sarah Sandman, envisions a car that we’ll happily leave our homes for – one that would fulfill our social as well as physical and emotional needs. As you’re dropping your kid off at school, you could be spending some quality time with them instead of keeping your eyes locked on the road. And how about having a cozy cup of coffee with other passengers instead of silently sitting next to each other on a train, everyone glued to their smartphone? Commuting could finally become communal!

https://www.youtube.com/watch?v=_tRhsR7oCMQ&feature=youtu.be

If we create a human-centered artificial intelligence, we won’t have to worry about evil robots chasing us around the scorched earth as they whip us into submission. How about (re)claiming our own future instead of envisioning dystopian havoc?

