Technology - August 1, 2025

Tesla Faces $242.5 Million in Damages for Fatal 2019 Autopilot Crash: Jury Determines Company Partly Liable

In a Miami court, a jury has found Tesla partially liable for a fatal 2019 crash involving its Autopilot system, ordering the automaker to pay $242.5 million in damages to the family of the deceased and an injured survivor.

The damages consist of $42.5 million in compensatory damages, with the remaining $200 million in punitive damages aimed at deterring similar conduct from Tesla. The jury assigned 33% liability to Tesla for the accident.

If Tesla does not appeal the decision, it would be responsible for the full $200 million in punitive damages, even though punitive damages in such cases are typically capped at three times compensatory damages.

Tesla has announced its intention to appeal the verdict. The plaintiffs’ lawyers expect Tesla to pay the entire $200 million, bringing total payments to approximately $242.5 million.

The trial, which started on July 14 in the Southern District of Florida, centered around determining accountability for a deadly crash in Key Largo, Florida. The victim was George McGee, a Tesla owner driving his Model S electric sedan while using the company’s Enhanced Autopilot system.

While operating the vehicle, McGee dropped his mobile phone and reached down for it. He testified during the trial that he believed Enhanced Autopilot would apply the brakes if an obstacle was detected. Instead, his Model S went through an intersection at over 60 miles per hour, striking a parked car and hitting its owners, who were standing on the other side of their vehicle.

Naibel Benavides, age 22, died at the scene; her body was found approximately 75 feet from the point of impact. Her boyfriend, Dillon Angulo, survived but sustained multiple broken bones, a traumatic brain injury, and lasting psychological trauma.

Brett Schreiber, counsel for the plaintiffs, stated in an email, “Tesla designed Autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans.” He added, “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way.”

Following the verdict, the plaintiffs’ families embraced each other and their lawyers, and Dillon Angulo was reportedly visibly emotional as he hugged his mother.

Tesla responded to CNBC by stating, “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.” The company added, “We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator – which overrode Autopilot – as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash.”

The verdict comes at a time when Elon Musk is trying to convince investors that Tesla can lead the way in autonomous vehicles, with self-driving systems considered safe enough for operation on public roads in the U.S. Tesla’s shares dipped 1.8% on Friday and have dropped 25% for the year, marking the largest decline among tech’s megacap companies.

The verdict could serve as a precedent for Autopilot-related lawsuits against Tesla, with around a dozen active cases currently underway that involve similar claims concerning incidents where Autopilot or Full Self-Driving (Supervised) was in use just before a fatal or injurious crash.

The National Highway Traffic Safety Administration initiated an investigation into possible safety defects in Tesla’s Autopilot systems in 2021. The agency later opened a second probe to evaluate the effectiveness of Tesla’s “recall remedy” for issues with the behavior of its Autopilot system, particularly around stationary first-responder vehicles.

The NHTSA has also warned Tesla that some of its social media posts may mislead drivers into thinking its cars can function as robotaxis, even though the owners’ manuals state that the cars require hands-on steering and a driver who is attentive and ready to steer and brake at all times.

A site tracking Tesla-involved collisions, TeslaDeaths.com, has reported at least 58 deaths resulting from incidents where Tesla drivers had Autopilot engaged just before impact.