In 2019, a young couple were driving their Tesla Model 3 on an Indiana interstate. Derrick (25) and Jenna (23) Monet were using the sedan’s driver-assistance system, known as Autopilot, when, without warning, the Tesla veered into a parked fire truck.
Derrick fractured his neck, spine, and shoulder and broke a leg and several ribs. Jenna died of her injuries in the hospital.
This was one of more than a dozen such incidents since 2018. Tesla cars equipped with an advanced driver-assistance system (ADAS) have repeatedly crashed into first-responder vehicles, as if drawn to them like magnets.
In 2020, Ben Morris began to fear his car. Also a Tesla owner, Morris filed complaints with the National Highway Traffic Safety Administration (NHTSA) after his Tesla began phantom braking while he was using the ADAS. Morris feared he would be rear-ended in traffic when the car abruptly stopped with no other vehicles in front of it and for no apparent safety reason.
The NHTSA has received more than 750 complaints concerning phantom braking since last year. Safety regulators have opened formal investigations into these defects, calling into question how this modern marvel of automotive technology is being developed and marketed to consumers.
In May, the NHTSA sent a 14-page letter to Tesla requesting both consumer and field reports regarding phantom braking. Crash reports and detailed information on any property damage, injuries, and deaths were also requested. The investigation also examined whether the ADAS and automatic emergency braking systems were active during the incidents.
Initially, the agency’s investigation looked at 354 complaints of phantom braking in the Tesla Model 3 and Model Y, covering more than 416,000 vehicles manufactured between 2021 and 2022. At the time, there were no reports of injuries or collisions.
Consumers reported that the cars would repeatedly decelerate rapidly and without warning, putting them in danger of being rear-ended. These phantom braking events frequently occurred within the same drive cycle.
The NHTSA’s probe examined the vehicles’ autonomous features, including:
The agency also analyzed:
The NHTSA’s investigation appears to focus on how the sensors and cameras in Tesla’s automated systems detect specific areas and items, including:
This is the latest in a long history of battles between Tesla CEO Elon Musk and U.S. government agencies like the NHTSA. Musk has referred to the NHTSA and its attempts to regulate self-driving technology as the “fun police.”
The last three years have seen four formal investigations into Tesla’s autonomous systems, including Autopilot and Full Self-Driving, both of which require human supervision. One of these investigations is evaluating multiple incidents in which Tesla cars operating on Autopilot crashed into parked emergency vehicles.
Two more of these defect probes came after the National Transportation Safety Board (NTSB) pleaded with Tesla to continue testing and modifying the safeguards on its automated features and the self-driving software used in its vehicles.
In 2016, there were no substantial laws creating oversight for driver-assistance and self-driving software. That year, the driver of a Tesla Model S was killed when his car, operating on Autopilot, careened into the trailer of an 18-wheeler.
Two months later, the NHTSA released a set of guidelines. Regulatory oversight still has not been introduced, leaving Full Self-Driving (FSD) systems controlled solely by the automakers themselves.
The marketing of vehicles with FSD can also be misleading because Tesla cars are not actually capable of driving themselves. Tesla still charges $12,000 for the feature, even though human supervision is still required behind the wheel.
Tesla has blazed its own path with FSD, opting, unlike Ford and GM, not to install cameras behind the steering wheel to ensure drivers are paying attention to the road. Other automakers also limit the use of their comparable systems to roadways they have mapped and tested.
Tesla is not the only car company facing active investigations. Since the beginning of 2021, more than 20 vehicle models have been recalled. Complaints of phantom braking have also been filed against newer Honda models, an issue that could affect more than 1.7 million Honda vehicles.
Since 2016, the NHTSA has investigated every automaker whose vehicles carry automated driving systems that were either in use or suspected of being in use during a crash. The agency has logged 34 such crashes so far, with Tesla vehicles involved in 28 of them. These crashes have caused 15 fatalities, 14 of which occurred in Tesla-related collisions.
An autonomous vehicle crash can be quite different from an ordinary, everyday crash. When a self-driving car crashes, the determination of liability can seem like shifting sand: fault may lie with one of the involved parties, or with several different parties. These could include:
Because there are so many considerations for who is at fault in an autonomous vehicle crash, a complicated investigation may be needed. These investigations tend to last longer as various aspects of the incident are examined to truly discover how and why a crash happened.
To help with these investigations, the NHTSA now requires car companies, parts providers, and software suppliers to report any crash involving an autonomous driving system. A reportable collision must involve a death, a medically treated injury, a towed-away vehicle, a deployed airbag, or a bicyclist or pedestrian.
According to Consumer Reports, if entities like automakers and parts or software suppliers fail to report the incident within 24 hours of learning about it, they can be fined or subject to civil action.