Should Self-Driving Cars Have Drivers Ready To Take Over?

A member of the media tests a Tesla Motors Model S car with an Autopilot system. Regulators and manufacturers are debating whether self-driving cars should have a licensed driver inside as a safety precaution.
Bloomberg via Getty Images

The day when you'll be chauffeured to work by your car may not be far off.

Right now, the legal groundwork is being laid around the nation to make way for the self-driving car. NPR's Robert Siegel is talking to several key players this week about the emerging world of self-driving cars.

In the latest conversation, he spoke with Brian Soublet, deputy director and chief legal counsel for the California Department of Motor Vehicles — an agency that robotic car advocates have accused of squelching innovation before it even gets on the road.


Interview Highlights

On the California DMV's questions about self-driving cars

We're concerned about how safe the vehicles are. Could the vehicles operate in all the various weather conditions that we're used to? Will the sensors be able to detect the changes in the road surfaces? What would happen if there was an emergency failure of the autonomous technology? What would the vehicle be able to do? Will they obey all of the traffic laws? Who would have liability exposure if there was an accident?

On the department's proposed regulation that self-driving cars have a licensed driver inside

That is at least initially what we put out and what we are calling a draft of our deployment regulations. ... By statute, we haven't had any testing of a completely driverless vehicle. Our approach was that there needed to be a driver in it. What we would contemplate in the future is some testing that would involve a vehicle with no driver in it.

If you think about it, the person is the backup to the automated systems, the concern being: What happens if there is a failure of the technology? We don't want to see vehicles just stopping in the roadway. There has to be some contemplation of how the vehicle would be controlled so it doesn't become a danger to other motorists.

On the ethical questions of driverless cars

What is the vehicle going to do when it is faced with two bad choices? How is it going to make that decision? We don't necessarily have an answer for that. That is one of our troubling points when we deal with the manufacturers. How are you going to program the vehicle to make what we would consider to be the ethical choice? We often use the example of a shopping cart full of groceries versus the baby stroller with a child in it. How does the car know what to do in that scenario, especially if it's a no-win scenario, the car has got to hit one of them? How does the car know which is the right thing to hit?

On the vision for driverless cars

I think we're going to get there. One of the things that people need to realize is that the average age of a vehicle on the streets right now is about 11 years, so there is going to be a transitional phase when the highly automated vehicle is sharing the roadway with vehicles that aren't automated. I spoke to a junior high school class a couple of months ago, and I told the students, who were about 12 or 13 years old, that the first car they'd probably buy with their own money is going to be a highly automated vehicle.

On Wednesday, Siegel will talk to Chris Urmson, lead engineer on Google's self-driving car project. Siegel previously spoke with U.S. Transportation Secretary Anthony Foxx about the legality of the technology.

Copyright 2021 NPR. To see more, visit https://www.npr.org.