By Robert Molloy, PhD, Director, NTSB Office of Highway Safety
The National Transportation Safety Board (NTSB) met on February 25 to consider the 2018 collision of a Tesla Model X, operating with partial driving automation, with a damaged crash attenuator in Mountain View, California. The car steered out of its travel lane and into a gore area, where it collided with the damaged highway safety hardware. The driver didn’t notice the errant path the vehicle had taken because he was interacting with a game application on his work phone.
It was a tragic event for both the driver and his loved ones, and the tragedy was compounded because the event was utterly preventable. In this crash, the driver behaved as if his partially automated vehicle were self-driving when it wasn’t. The driver’s resulting distraction, tragically, led to his death. But it’s rare that a crash is the result of a single factor. At the NTSB, we try to identify all the factors contributing to a crash so we can propose multiple methods to prevent a similar crash in the future. The NTSB doesn’t apportion blame or liability; we look for ways to prevent the next occurrence.
In this crash, we identified or reiterated several ways to prevent a similar tragedy:
- Because drivers using portable electronic devices while driving often crash, we recommended that device manufacturers find a way to lock people out of their devices while they’re driving.
- Because “Autopilot,” Tesla’s automated vehicle control suite, is designed for use only in certain conditions, we reiterated our recommendation that it be disabled when those conditions are not met.
- Because Tesla’s proxy measure for driver engagement—torque on the steering wheel—was previously found ineffective, we reiterated a recommendation that Tesla find an effective measure of driver engagement.
- Because this vehicle crashed into an object (a crash attenuator) that its system “did not detect, and [was] not designed to detect,” we recommended that the National Highway Traffic Safety Administration (NHTSA) rate collision avoidance systems under its 5-star rating program and incorporate such objects into its assessment.
- Because we found that misuse of Tesla’s automation was foreseeable, we recommended that NHTSA evaluate Tesla Autopilot-equipped vehicles to determine if the system’s operating limitations, foreseeability of driver misuse, and ability to operate the vehicle outside the intended operational design domain pose an unreasonable risk to safety, and to ensure that Tesla takes corrective action if safety defects are identified.
- Because the crash attenuator that the Tesla crashed into had not been repaired, and because lane markings were worn in the area of the crash, we made recommendations to state agencies responsible for maintaining highway infrastructure.
- Because Apple, the driver’s employer, had no distracted driving policy, we recommended that it adopt one.
- Because many other companies also don’t have such a policy, and because transportation accidents are a leading cause of workplace injury and death, we recommended that the Occupational Safety and Health Administration review and revise its distracted driving initiatives and add new enforcement strategies.
- Because ready access to data that fits defined parameters is essential for assessing crashes involving automated vehicle control, we reiterated recommendations to require standardized reporting of incidents, crashes, and vehicle miles traveled with such systems enabled. This reporting would also allow the NTSB and NHTSA to evaluate real data on the safety of level 2 automation, not just industry claims.
When we investigate a crash, we aren’t looking for a driver, a company, or an agency to blame; we’re looking for all the ways the next crash can be prevented. When prevention is the goal, those drivers, companies, and agencies are often happy to help make the changes needed to ensure safety. We hope all parties will heed the lessons learned from this tragic crash and take the steps we’ve recommended to increase the safety of the traveling public.