Multiple civil cases — and a federal investigation — contend that Tesla’s technology invites ‘drivers to overly trust the automation’
SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.
At least eight lawsuits headed to trial in the coming year — including two that haven’t been previously reported — involve fatal or otherwise serious crashes that occurred while the driver was allegedly relying on Autopilot. The complaints argue that Tesla exaggerated the capabilities of the feature, which controls steering, speed and other actions typically left to the driver. As a result, the lawsuits claim, the company lulled drivers into a false sense of security that led them to tragedy.
Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla. In Tennessee, an intoxicated man allegedly using Autopilot drives down the wrong side of the road for several minutes before barreling into an oncoming car, killing the 20-year-old inside.
Tesla maintains that it is not liable for the crashes because the driver is ultimately in control of the vehicle. But that contention is coming under increasing pressure, including from federal regulators. Late Thursday, the National Highway Traffic Safety Administration (NHTSA) launched a new review of Autopilot, signaling concern that a December recall failed to significantly curb misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”
Meanwhile, in a twist, Tesla this month settled a high-profile case in Northern California that claimed Autopilot played a role in the fatal crash of an Apple engineer, Walter Huang. The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.
“A reckoning is coming as more and more of these cases are going to see the light of a jury trial,” said Brett Schreiber, a lawyer with Singleton Schreiber who is representing the family of Jovani Maldonado, 15, who was killed in Northern California when a Tesla in Autopilot rear-ended his family’s pickup truck in 2019.
Tesla did not respond to multiple requests for comment on the lawsuits.
The outcomes of the cases could be critical for the company. Tesla’s stock has lost more than a third of its value since the beginning of the year. Last week, the company reported a steeper-than-expected 55 percent plunge in first-quarter profit as it struggles with falling sales of electric vehicles and stiff competition from China. To allay investors’ concerns, Musk has made lofty promises about launching a fully autonomous “robotaxi” in August. Soon, he said during Tuesday’s earnings call, driving a car will be like riding an elevator: You get on and get out at your destination.
“We should be thought of as an AI or robotics company,” Musk told investors. “If somebody doesn’t believe Tesla is going to solve autonomy, I think they should not be an investor in the company. But we will.”
Meanwhile, the company has defended itself in court documents by arguing that its user manuals and on-screen warnings make “extremely clear” that drivers must be fully in control while using Autopilot. Many of the upcoming court cases involve driver distraction or impairment.
Autopilot “is not a self-driving technology and does not replace the driver,” Tesla said in response to a 2020 case filed in Florida. “The driver can and must still brake, accelerate and steer just as if the system is not engaged.”
But the Huang case also potentially involved a distracted driver: Huang was allegedly playing a video game when his Tesla plowed into a highway barrier in 2018. Tesla has not said why it decided to settle the lawsuit, and details of the settlement have not been disclosed in court documents.
More fatal crash details emerge
Meanwhile, federal regulators appear increasingly sympathetic to claims that Tesla oversells its technology and misleads drivers. Even the decision to call the software Autopilot “elicits the idea of drivers not being in control” and invites “drivers to overly trust the automation,” NHTSA said Thursday, revealing that a two-year investigation into Autopilot had identified 467 crashes linked to the technology, 13 of them fatal.
NHTSA did not offer specific information about those crashes. But two fatal crashes from 2022 are detailed in lawsuits that have not been previously reported.
In Phoenix, Iwanda Mitchell, 49, was driving a Tesla in May 2022 when she struck a Toyota Camry that had stalled on the highway, according to court documents and dash-cam footage obtained by The Post. According to the Mitchell family’s lawyer, Jonathan Michaels with MLG Attorneys at Law, Autopilot and the car’s other safety features, including forward collision warning and automatic emergency braking, failed to take evasive action and prevent the Tesla from barreling into the stalled sedan.
Mitchell was then struck and killed by an oncoming vehicle when she got out of her car.
Tesla did not respond to a request for comment regarding this case. In response to the complaint in January 2024, Tesla said it denies the allegation and “has not yet had an opportunity to inspect” Mitchell’s vehicle.
About a month later in Sumner County, Tenn., Jose Roman Jaramillo Cortez drank two beers and three tequila shots after his shift at a local restaurant, then hopped into his Tesla Model 3, court documents say. He plugged his address into the Tesla’s GPS and flicked on Autopilot, the documents say.
According to the lawsuit filed in June 2023 and dash-cam footage obtained by The Post, the car then pulled onto the wrong side of the road. After driving south in a northbound lane for several minutes, the Tesla rammed into a car driven by Christian Malone, 20, who died from the impact. In its response to the complaint, Tesla said “the crash was caused by the negligence and/or recklessness of the driver.”
Trial dates for both cases will be set later next year, Michaels said.
In another case — set for trial in November in Key Largo, Fla. — a Tesla in Autopilot allegedly failed to detect an approaching T-intersection while its driver searched for a dropped phone. The Tesla barreled through flashing lights and a physical barricade before crashing into a vehicle parked on the side of the road, killing a woman and seriously injuring a man.
In court documents, Tesla has argued that the driver was ultimately responsible for the trajectory of the car. Tesla also states in user manuals that Autopilot may not operate as intended “when unable to accurately determine lane markings” or when “bright light is interfering with the camera’s view.”
When these cases head to trial, juries may be asked to consider whether Tesla’s many driver warnings are sufficient to spare the company from liability. Ross Gerber, CEO of Gerber Kawasaki Wealth and Investment Management, said the last thing the company needs is a highly publicized courtroom battle that focuses attention on such questions.
At a trial, “the defense would dig into the weeds … and it would become very clear that the perception of the Autopilot software was very different from the reality,” Gerber said. “Every day would be a headline, and it would be embarrassing.”
So far, Tesla has faced a jury only once over the role Autopilot may have played in a fatal crash. In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames. Lee died of his injuries, while his fiancée and her son were severely injured.
Because of the extensive damage to the car, Tesla said it could not be proved that Autopilot was engaged at the time of the crash. During the trial, Michael Carey, the attorney for Tesla, argued the technology was not at fault, and that the crash “is classic human error.” According to a toxicology report taken after the crash, Lee had alcohol in his system but it was within the legal limit in California.
“This case is not about Autopilot. Autopilot didn’t cause the crash,” Carey said during opening statements. “This is a bad crash with bad injuries and may have resulted from bad mistakes — but you can’t blame the car company when that happens. This is a good car with a good design.”
Ultimately, Tesla’s arguments prevailed, and a jury found the company not liable.
But the company appears to face headwinds in some other cases. Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal 2019 crash in Delray Beach, Fla., in which Jeremy Banner’s Tesla, operating in Autopilot, failed to register a semi-truck crossing its path. The car plowed under the truck at full speed, killing Banner on impact.
In the ruling, Scott said the family’s lawyers “sufficiently” presented evidence to reasonably seek punitive damages at trial, which could run into the millions of dollars.
The plaintiffs’ evidence included that Tesla “knew the vehicle at issue had a defective Autopilot system,” according to the order. Citing other fatal crashes involving Autopilot, Scott wrote that there is a “genuine” dispute over whether Tesla “created a foreseeable zone of risk that posed a general threat of harm to others.”
Tesla’s appeal of the ruling is pending.
Change in defense strategy?
As the spate of lawsuits churns forward, Tesla has shown a fresh willingness to settle such cases — despite Musk’s vow on Twitter in 2022 to never settle “an unjust case against us even if we will probably lose.”
In addition to settling the Huang case, Tesla “indicated” that it was open to discussing a potential settlement in the Riverside case as it was being presented to a jury last fall, said Michaels, the MLG lawyer who represented Lee’s family.
The month-long trial featured testimony from an accident reconstructionist, a top engineer at Tesla and a paramedic who responded to the crash and said it was among the most horrific accidents he had ever seen. Michaels said he declined to engage in settlement talks because he wanted to continue to “make this a really public issue.” He said he also “did not have confidence in our ability to come to an agreeable amount.”
Tesla and its lawyer in the case, Carey, did not respond to a request for comment.
After four days of deliberations, the jury decided the case in Tesla’s favor.
Though he lost, Michaels said the trial attracted media attention and gave other lawyers with cases against Tesla insight into the company’s defense strategy. Plus, he said, his law firm’s phone has since been blowing up with potential clients.
“We walked away from guaranteed money,” Michaels said, “but that wasn’t what it was about.”