Tesla’s controversial Autopilot driver assistance system has been involved in 736 crashes since 2019, the Washington Post reports. A similar report published last summer found fewer than 300 such crashes.
“When authorities first released a partial accounting of accidents involving Autopilot in June 2022, they counted only three deaths definitively linked to the technology. The most recent data includes at least 17 fatal incidents, 11 of them since last May,” the Post says, “and five serious injuries.”
Tesla’s Three Systems
Tesla sells three different partial automation systems.
Autopilot is standard on all Tesla models, though it has sometimes been an added-cost option. It includes an intelligent cruise control that matches the car's speed to that of surrounding traffic. A lane-centering function helps keep the vehicle in the center of its lane.
Enhanced Autopilot is a $6,000 option. It adds the ability to navigate highway on- and off-ramps and interchanges on top of what Autopilot can do. It also adds a self-parking system and a "summon" function that lets owners call the car to them at parking-lot speeds from a short distance away.
Full Self-Driving Capability is now a $15,000 option. Tesla says it will read and react to traffic lights and stop signs and steer through some turns with the driver's "active supervision." While most automakers test their partial automation systems on closed tracks, Tesla has rolled out Full Self-Driving to hundreds of thousands of drivers and asked them to sign waivers to use it. The company calls this approach "beta testing" – a development strategy common in the software world but almost unheard of in safety-critical systems.
The federal government forced changes to the Full Self-Driving system in February. But several investigations of all three systems are reportedly ongoing – including, according to Reuters, a criminal inquiry.
There Are No Self-Driving Cars
Manufacturers use misleading names to describe their early attempts at automated driving. Kelley Blue Book refers to these technologies as driver assistance systems because they require the driver’s active supervision at all times. But automakers market them under names like Autopilot (Tesla), Drive Pilot (Mercedes-Benz), ProPilot Assist (Nissan), and Pilot Assist (Volvo), which may imply that the driver can just let the car drive.
An Aggressive Push Toward a Self-Driving Future
Many automakers are developing systems they hope will lead to autonomous cars, but few have been as public about it as Tesla. The company has cut prices to the bone on its lowest-priced vehicles, and CEO Elon Musk has been open about the strategy: get as many Teslas on the road as possible so the company can make money on software downloads.
Musk has said that self-driving software’s success is “really the difference between Tesla being worth a lot of money or worth basically zero.”
Tesla has marketed the systems aggressively, making some misleading claims. In 2016, the company released a video allegedly showing a car using Autopilot to navigate traffic lights, with the caption, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
A Tesla engineer later testified under oath that the video was staged, with a driver controlling the car.
“Enormous Gulf” With Other Automakers in Accidents
Subaru had the second-highest number of crashes attributed to driver assistance systems in the Post's review of National Highway Traffic Safety Administration data – 23.
“The enormous gulf probably reflects wider deployment and use of automation across Tesla’s fleet of vehicles, as well as the wider range of circumstances in which Tesla drivers are encouraged to use Autopilot,” the Post reports.
Tesla has an army of loyal buyers who identify with the company – a deeper relationship with fans than most automakers can claim. That fan base includes a subset of owners who make videos of themselves testing the driver assistance systems and, sometimes, defeating their vehicles' failsafe features.
In one accident studied in detail by Post reporters, authorities claimed the car’s owner had “fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands.” Moving at 45 mph, the car struck a child stepping off a school bus.
Washington Post Report Concerns Autopilot
The Post's analysis concerned only the standard system – Autopilot. "NHTSA's data includes some incidents where it is 'unknown' whether Autopilot or Full Self-Driving was in use," the Post reports.
The 17 fatal crashes, the paper found, follow specific patterns. “Four involved a motorcycle. Another involved an emergency vehicle. Meanwhile, some of Musk’s decisions — such as widely expanding the availability of the features and stripping the vehicles of radar sensors — appear to have contributed to the reported uptick in incidents,” the Post reports.
More than a dozen accidents involved "reports of Teslas slamming into parked emergency vehicles while in Autopilot" – crashes in which flashing lights may have confused the cars' sensors into failing to register that an emergency vehicle was stationary.
Conflicting Claims, Limited Data
Musk has repeatedly claimed Tesla's driver assistance systems are safer than a human driver. But the company has blocked NHTSA from releasing data that would enable researchers to study the claim.
"In the section of the NHTSA data specifying the software version, Tesla's incidents read — in all capital letters — 'redacted, may contain confidential business information,'" the Post explains.
“Tesla is having more severe — and fatal — crashes than people in a normal data set,” former NHTSA senior safety adviser Missy Cummings told the Post.
“We need to understand if it’s due to actually worse crashes or if there’s some other factor such as a dramatically larger number of miles being driven with Autopilot on,” added Philip Koopman, a Carnegie Mellon University professor who has conducted research on autonomous vehicle safety for 25 years.