A Tesla on Autopilot killed two people. Is the driver guilty?

On December 29, 2019, a Honda Civic was stopped at the intersection of Artesia Boulevard and Vermont Avenue in Gardena. It was just after midnight. The traffic light was green.

As the car proceeded through the intersection, a 2016 Tesla Model S on autopilot exited a freeway, ran a red light, and collided with the Civic. The driver of the Civic, Gilberto Alcázar López, and his passenger, María Guadalupe Nieves-López, died instantly.

Nearly two years later, Los Angeles County prosecutors filed two counts of involuntary manslaughter against the driver of the Tesla, 27-year-old Kevin George Aziz Riad. Experts believe this is the first felony prosecution in the United States of a driver accused of causing a death while using a partially automated driver assistance system.

The case represents a milestone in the increasingly confusing world of automated driving.

“It’s a wake-up call for drivers,” said Alain Kornhauser, director of the self-driving vehicle program at Princeton University. “It certainly makes us suddenly aware that we are responsible, not only for our own safety but for that of others.”

Although automated capabilities are meant to help drivers, systems with names like Autopilot, SuperCruise and ProPilot can mislead consumers into thinking cars are capable of much more than they really are, Kornhauser said.

Yet even as fully autonomous cars are being tested on public roads, automakers, tech companies, engineering standard-setting organizations, regulators and lawmakers have failed to make clear to the public – and in some cases to one another – what the technical differences are, or who bears legal responsibility when people are injured or killed.

Riad, a driver for a limousine service, has pleaded not guilty and is out on bail while the case is pending. His attorney did not respond to a request for comment Tuesday.

If Riad is found guilty, “it’s going to give chills to everyone who owns one of these vehicles and realizes they’re responsible,” Kornhauser said. “Just like when I was driving a ’55 Chevy, I’m the one responsible for making sure it stays between the white lines on the road.”

After the deadly collision in Gardena, the National Highway Traffic Safety Administration opened an investigation to determine what had happened. Although court documents filed in Los Angeles do not mention Autopilot by name, the agency is expected to soon release findings indicating that the technology was activated.

A parade of similar investigations in the years since has continued to question the safety and reliability of automated driving features, including a broader investigation into as many as 765,000 Tesla cars made between 2014 and 2021.

Last year, the NHTSA ordered dozens of auto and technology companies to report crash data from automated vehicles in order to better monitor their safety.

No commercially available vehicle today can completely drive itself, the agency reported. Tesla’s Autopilot feature is classified as “Level 2” vehicle autonomy, meaning the vehicle can control steering and acceleration, but a human in the driver’s seat must be ready to take control at any time.

“Whether an automated driving system [Level 2] is activated or not, all available vehicles require the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicle,” said an NHTSA spokesperson. “Certain advanced driver assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of collisions that do occur, but as with all technology and equipment in motor vehicles, drivers must use them correctly and responsibly.”

The outcome of the case might set a precedent for an industry that finds itself torn between cutting-edge technology and basic human error.

Bryant Walker Smith, a professor at the University of South Carolina and an expert in the law related to automated motor vehicles, said that while companies routinely face civil liability, they “rarely face criminal liability for the design decisions they make.”

However, Smith takes issue with the assertion that drivers – and only drivers – are responsible at or below automation Level 2. For example, if the airbag in an otherwise sound car suddenly exploded and caused a crash, the driver would not be at fault, he said.

“In contrast, if an automaker sells a dangerous car, it could be held civilly liable. And, in theory at least, it could face criminal liability. However, this is a rare possibility,” he said, noting that when the state of Indiana prosecuted Ford for a crash involving its Pinto model more than four decades ago, the company was acquitted.

A representative for Tesla, which has dissolved its media relations department, could not be reached.

It is clear to many legal experts that the responsibility for Level 2 systems like Autopilot lies squarely with the driver, not with the companies that market technologies that can lead consumers to believe that features are more capable than they actually are.

But the California Department of Motor Vehicles is wrestling with confusion over Tesla’s Full Self-Driving feature, a cutting-edge version of Autopilot that claims to do exactly what the name says: provide full autonomy, to the point where no human is needed to drive.

But while other self-driving car developers like Waymo and Argo use test drivers who follow strict safety guidelines, Tesla is conducting its tests using its own customers, charging car owners $12,000 for the privilege.

And while the other autonomous technology companies are required to report crashes and system failures to the Department of Motor Vehicles under its test permit system, the agency has so far exempted Tesla from those requirements.

Following pressure from state lawmakers, prompted by alarming videos posted on YouTube and Twitter showing Full Self-Driving malfunctions, the DMV said earlier this month that it was “reviewing” its position on Tesla’s Full Self-Driving technology.

The agency is also conducting a review to determine whether Tesla is violating another DMV rule with Full Self-Driving, one that prohibits companies from marketing their cars as autonomous when they are not.

That review began eight months ago; the DMV described it in an email to The Times as “in progress.”

Amid the confusion surrounding automated vehicles, what is far less murky is the tragedy that follows from these crashes.

In 2020, Arizona authorities filed negligent homicide charges against the driver of an Uber SUV who struck and killed a pedestrian during a test of fully autonomous capabilities. The person killed in that collision, Elaine Herzberg, is believed to be the first pedestrian fatality involving a self-driving vehicle.

In Los Angeles, the families of López and Nieves-López have filed lawsuits against Riad and Tesla.

Arsen Sarapinian, an attorney for the Nieves family, said Tuesday that they are closely monitoring the criminal case, awaiting the results of the NHTSA investigative report and hoping that justice will be served.

But, Sarapinian said, “neither the pending criminal case nor the civil suit will bring Ms. Nieves or Mr. Lopez back to life.”
