By Andrew Nusca
Posting in Cities
Google's self-driving cars reportedly had their first public collision, sparking questions about who's responsible when an autonomous vehicle goes berserk.
The folks at auto blog Jalopnik this afternoon posted what they say are photos of Google's first (update: second!) public self-driving car crash.
Here's what we know thus far:
- It happened near Google's Mountain View, Calif., headquarters.
- It's a minor collision -- apparently one autonomous Toyota Prius rear-ended another.
- Autonomous vehicles are legal in California.
- Google says the driver had switched the vehicle into manual mode.
- Earlier this year, the state legislature of Nevada created a special license that allows autonomous vehicles on the state's freeways.
Google's self-driving vehicles use a roof-mounted laser rangefinder, radar, cameras and detailed map data to navigate.
Here's our exclusive look at the project in a video from May:
And here's Stanford Artificial Intelligence Lab director Sebastian Thrun explaining the system, and the motive behind it, at TED 2011:
In his post, Jalopnik's Justin Hyde asks legitimate questions about potential fender-benders with autonomous cars. Who gets the ticket? How can we determine with accuracy whether the vehicle was in autonomous mode or not?
And how will the courts deal with these issues?
Related on SmartPlanet:
- Google's self-driving car
- Google's self-driving car, in action
- Will Google’s self-driving car lead to more sprawl?
- Google pushes to make self-driving cars street legal
- Self-driving cars: Will you trust a robot chauffeur?
- How just a few talking cars can end traffic jams
Aug 5, 2011
Well, I agree that it's quite difficult to know who should be blamed for an autonomous car accident, and how to determine who is at fault.
Autonomous vehicles are legal in California, so I imagine there must be rules and regulations for determining who is liable in an accident.
Who's to blame? Current law already makes that clear: it's the person at the end of the line. In this case, it wasn't the Google car. It was in the middle of a three-car pileup, and two other cars in the next lane were apparently also hit. But who pays? That is for the insurance companies to work out. These rules are all already in place. Who pays if there is nobody in the autonomous car? The owner's insurance company. If a manufacturer's product doesn't drive well, the state can disallow it from being street legal, the same as is currently done with the cars that people drive. We don't usually notice it, but in the US, cars are licensed for road use on a state-by-state basis. This will be no different. Instead of making it harder to get driverless cars, it will actually make it easier. First will come standards; then the insurance companies will start charging more for new drivers (they already do), then for drivers who may be found to have driven while impaired (think drunk, drugged, or on a cell phone), and finally they will charge more for any driver who is human. Meanwhile, the car autopilots will keep improving.
I assume these cars can use, or already do use, the same data-recording hardware that all new cars have today. It doesn't even have to be integrated with the autonomous driving hardware. It shouldn't be a problem to determine who or what was in control of the vehicle at the time of the crash unless the car is completely totaled. I understand there is a movement to require this data-recording hardware to be as sturdy as the black boxes on airplanes. As far as I know, tickets can only be issued to people. As long as the car can be immediately switched into manual mode -- by, for example, hitting the brakes -- the person sitting in the driver's seat will ultimately bear the responsibility. Although the Toyota braking issue of a couple of years ago (allegedly caused by a software bug) was proven to be bogus, it did provide a test case: people who got into an accident were still liable, and it was up to them to prove that faulty software was the cause.
A human driving one of Google's test cars banged into another car. This is really bad reporting. Go for your run; clear your head. ("Who" was driving is black-box data.) --- BTW, I think this is the second crash. One of their cars was rear-ended while stopped at a light, if I remember correctly.
I've never heard of a cruise control module getting a speeding ticket. The more interesting question will come when vehicles begin driving our streets with no human aboard. But if and when that happens we should have a convincing track record for autonomous cars with a human observer.
I appreciate your comment, but please do read the article first before making corrections! Right at the top: "Google says the driver had switched the vehicle into manual mode." As for the first vs. second debate, it seems you're right -- one of Google's autonomous cars was rear-ended by another driver at some point. (Here's one piece of coverage: http://www.scientificamerican.com/article.cfm?id=google-driverless-robot-car) I'll update the post.