This weekend saw an accident that killed two occupants in a Tesla Model S reportedly driving in Autopilot mode, reenergizing legal issues once relegated to science fiction: When a self-driving car crashes, who is liable — the driver or the tech?
As vehicles hit the road with more sophisticated driver-assistance features, disputes over safety and liability are coming to the forefront, and attorneys are developing legal theories to better apportion liability between man and machine.
The frontier is rife with issues: Automobiles now gather far more data than ever before, which will affect the evidence-gathering and fact-finding phases of any litigation, shifting the focus away from eyewitness testimony to what’s recorded on the vehicles themselves — including vehicles that weren’t part of the accident.
Also, while the federal government regulates safety features, states regulate who is allowed to drive. A conflict between the two seems inevitable as technology starts to blur the definition of driver, attorneys said.
As the National Highway Traffic Safety Administration and the National Transportation Safety Board investigate the Tesla crash, experts say the question of liability will shift bit by bit from the person behind the wheel to the company that made the wheel.
Experts cite one framework that illustrates the nuanced range of liability involving driverless cars and trucks: a chart detailing the six levels of driving automation, published in 2019 by SAE International, formerly known as the Society of Automotive Engineers.
The chart serves as “the industry’s most-cited reference for automated-vehicle (AV) capabilities” and provides consumers with a standard for levels of driving automation, SAE says.
It features six tiers, from Level 0, with no automation, to Level 5, with full vehicle autonomy.
Across those levels, the burden of liability shifts between driver and automaker, experts say.
During litigation, questions might focus on whether the car made decisions it shouldn’t have — based on a reasonable person’s expectations or the analysis of an expert working from an industry standard. But one of the bigger questions in self-driving car accidents is whose regulations and laws come into play.
Does federal NHTSA regulation take the lead, treating the automated driving system as a “safety feature” that the agency governs? Or does state licensing law prevail, treating the system as if it were a person behind the wheel?
Last Saturday’s Tesla accident in Texas is likely to bring renewed attention to auto companies’ disclaimers and warnings about automated driving.
No one was in the driver’s seat of the Tesla at the time of the crash, according to local authorities; one of the car’s occupants was in the front passenger seat, the other in the back seat. Their deaths raise questions about whether the vehicle’s so-called Autopilot function was engaged, especially with no one behind the wheel. Tesla CEO Elon Musk said in a statement earlier this week that the car’s safety features require a driver behind the wheel when Autopilot is engaged.

The Texas accident might spur regulators to take a more active role, such as by making sure advertising about autonomous vehicles doesn’t mislead consumers into thinking they can let the car do all the driving just yet.