Two people were killed in the 2019 crash when the Tesla abruptly left the freeway and ran a red light, according to reports.
The Civic’s occupants, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, were killed in the crash, it added. Riad, a 27-year-old limousine driver, has pleaded not guilty and is free on bail.
The crash is not the first serious one involving vehicle automation. Rafaela Vasquez, the backup driver in a fatal self-driving crash involving an Uber autonomous vehicle, was charged with negligent homicide in the 2018 crash in Tempe, Ariz., which killed a pedestrian pushing a bike across a dark thoroughfare.
The charges against Riad, however, reflect a likely first for a system available directly to consumers. The National Highway Traffic Safety Administration estimates Tesla Autopilot is available on 765,000 vehicles.
Autopilot is a driver-assistance program primarily used on highways, including features such as traffic-aware cruise control and Autosteer, which aims to keep cars within marked lane lines. The system is not autonomous, and is intended to be used with a fully attentive driver behind the wheel.
“While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car,” Tesla says on its website.
Tesla did not immediately respond to a request for comment. The company has argued that Autopilot is safer than typical car operation when crash statistics between the modes of driving are compared, and chief executive Elon Musk has called the system “unequivocally safer.”
The Washington Post could not immediately access the court documents, which were not available digitally. The Los Angeles Times also reported on them. Attempts to reach Riad’s attorneys were not immediately successful.
Tesla has come under fire from safety advocates who see the Autopilot branding as overstating the capabilities of the system. More recently Tesla has faced scrutiny after offering a more advanced feature suite, called “Full Self-Driving,” along with a software beta that expands the driver-assistance capabilities to city and residential streets.
Federal regulators have also recently homed in on Autopilot over reports of crashes while it was activated. Over the summer the National Highway Traffic Safety Administration, the country’s top federal auto safety regulator, launched a formal probe into a dozen crashes involving parked emergency vehicles while Autopilot was active. One person was killed and at least 17 people were injured in the crashes.
Earlier in the summer, the agency began requiring developers to report certain crashes involving driver-assistance systems within one day, such as those resulting in a hospital-treated injury or involving a pedestrian or cyclist.
There have been several fatal crashes while Autopilot was activated. In 2016, a speeding Tesla failed to register the side of a tractor-trailer against a brightly lit sky and collided with it, killing the Tesla’s driver.
An Apple engineer was killed in 2018 when his Tesla slammed into a highway barrier while on Autopilot. The driver had a game active on his iPhone at the time, according to safety investigators, who cited his overreliance on the system as a factor in the crash.
Riad and Tesla also face civil litigation in the crash. Maria Luz Nieves, the mother of victim Maria Guadalupe Nieves-Lopez, sued Riad and the limousine company run by his father in 2020 alleging wrongful death. Nieves alleged product liability — including negligence and design and manufacturing defects — on Tesla’s part.
“Plaintiff alleges that the Incident was caused, in whole or in part, by design and manufacturing defects, and inadequate warnings, of the 2016 Tesla Model S, including but not limited to its features and systems” such as Autopilot, according to the suit.
Nieves’ attorney, Arsen Sarapinian, said a jury trial in the civil suit was scheduled for July 2023, but declined to comment further because the case is running concurrently with the criminal proceedings.
SOURCE: The Washington Post