Recent news reports have covered statistics on autonomous cars and their relative safety. NHTSA recently got into a well-publicized dispute over Tesla’s claim that its new Model 3 is the safest car ever: Tesla has repeatedly stated that Model 3 occupants have “the lowest probability of injury of all cars the safety agency has ever tested.” But the National Highway Traffic Safety Administration has told the manufacturer that it “is impossible to say based on the frontal crash results or overall vehicle scores whether the Model 3 is safer than other 5-Star rated vehicles.”
Autonomous vehicles and other safety innovations in trendsetting automobiles lead us to revisit the question: who is responsible when autonomous vehicles crash?
Tesla has been an innovator willing to make bold moves in auto technology. In fact, it announced last month that, by its own projections, its Autopilot feature will lead to lower insurance rates for consumers. Tesla acknowledges that drivers must manually enable the feature and that drivers “must maintain control and responsibility” for the vehicle, even when using the Autopilot system. “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” Tesla said in a statement after the first fatality in a Tesla Autopilot vehicle in 2017.
Autonomous vehicles – whether partially autonomous or fully self-driving – will eventually experience failures that result in injury or death.
“For years people have been saying that the technology is ready, and it’s one of my pet peeves, because no it’s not,” said Bryant Walker Smith, a law professor at the University of South Carolina and an expert on autonomous driving issues.
– CBS News
In Missouri – and, I would suspect, most states – product defect laws would govern liability in these crashes. Missouri’s product defect statute permits a victim to hold a manufacturer or seller liable for injuries caused when the product was in a defective condition and unreasonably dangerous when used as reasonably anticipated. If the autonomous system failed to properly control the vehicle, then the vehicle was in a defective and unreasonably dangerous condition, and the manufacturer or seller should be held liable.
Every person, business, and corporation, including every car maker, is responsible when it or its product injures or kills someone. In my experience trying auto defect cases across the country, every car maker I have litigated against ultimately says that it stands behind its car and will accept responsibility if there’s something wrong with it. Then it says there’s nothing wrong with it. But what about the times when fault is not so clear?
“There is going to be a moment in time when there’s going to be a crash and it’s going to be undetermined who or what was at fault,” said David Strickland, former head of the National Highway Traffic Safety Administration and now a partner at Venable LLP law firm in Washington. “That’s where the difficulty begins.”
Sorting out fault in such cases may ultimately require a detailed and complex forensic computer analysis. Was the code the problem? Was it an algorithm? Faulty hardware? Was the mapping software defective? Did a camera (or radar) fail? Was the highway not marked properly? In our practice we have uncovered a wide variety of parties responsible for deaths and serious injuries on American highways. Now that autonomous driving is treated as an eventuality, the legal system will be looking closely in the coming years at how the manufacturers of these cars are held responsible as collisions lead to casualties.