Self-driving cars may not have hit the roads for everyday use yet, but they’re no longer the stuff of science fiction stories. Vehicles in Google’s Waymo project have racked up more than 4 million miles on actual roads, while many other companies (including Audi, GM, Mercedes, Uber, and even Apple) have some kind of autonomous vehicle research initiative underway.
Car manufacturers continue to slip new technology from these programs, such as autonomous emergency braking, into their latest models, while others push toward the release of a fully autonomous vehicle.
Every safety improvement and advance is welcome, but every once in a while something happens that should make us all stop and ask what’s going to happen—and who’s going to be responsible—when something goes wrong with one of these vehicles.
Crash Raises Questions
The dramatic rear-end crash of a Tesla Model S into a parked fire truck in California earlier this year raises exactly these kinds of alarm bells. Fortunately, there were no injuries. The driver of the Tesla said he was using the car’s Autopilot feature (although this has not yet been confirmed), which should have prevented such a crash. An earlier incident in Florida in 2016 led to a driver’s death when Autopilot failed to detect a tractor-trailer crossing the highway.
Tesla touts Autopilot as if it were already a fully self-driving system, but the fine print notes that not all features are available or allowed for use because of regulatory considerations. It’s not yet clear whether the California crash can be blamed on weaknesses in the system, improper use of its features by the driver, or some other issue.
Currently, all of these self-driving and driver-assist features are meant to be used only as instructed. Tesla, for example, tells drivers to use the system only on limited-access highways and to stay alert, and the system warns drivers when their hands are off the wheel for more than a few seconds.
Responsibility a Work in Progress
Who bears legal responsibility when one of these systems fails still needs to be worked out. Existing law would hold the operator or driver responsible, but going forward it seems likely that liability could be parceled out among owners, manufacturers, and even programmers.
Personal injury and product liability law could both come into play. In a bold step, Volvo has already declared that it will accept full liability for its self-driving vehicles when they hit the streets (which might be as early as 2019). Not all manufacturers have taken this step, and few laws have yet addressed the issue.
Car Accident Lawyer
Most people don’t spend much time thinking about the ongoing debate over liability for autonomous vehicles, and when you’ve been hurt in a car crash, it’s the last thing you want to focus on.
That’s the kind of detail you leave to the attorneys at Kohan & Bablove, Injury Attorneys, who put their time and energy into understanding the issues of automobile accident law so you don’t have to.
When you need help with your car accident case, call our experienced team at 1-844-404-2400 or contact us online to schedule a no-cost, no-obligation consultation to discuss your case. We’ve helped hundreds of clients reach successful settlements, and we can help you.
By Donna Hicks