The National Transportation Safety Board (NTSB) says Tesla isn’t doing enough to prevent misuse of its Autopilot feature, and is calling the company out for it.
During a hearing on Tuesday about a March 2018 Tesla crash in which the driver was killed while misusing Autopilot, the NTSB said Tesla needs to do more to improve the feature’s safety.
According to multiple reports, the NTSB made Autopilot safety recommendations to six automakers — including Volkswagen, BMW AG, and Nissan — in 2017, and Tesla is the only one that has yet to respond.
The board also determined that the driver in the 2018 crash was playing a video game on his phone instead of watching the road. During Tuesday’s hearing, the NTSB said that while the driver was distracted and relying solely on Autopilot, the car’s forward-collision warning and automatic emergency braking system did not activate, according to CNBC.
“If you own a car with partial automation, [you do] not own a self-driving car. So don’t pretend you do,” said NTSB Chair Robert Sumwalt during the hearing.
Tesla’s Autopilot has come under scrutiny before, though mostly over drivers’ actions rather than Tesla’s technology. Drivers have fallen asleep behind the wheel and let Autopilot take control of the car, behavior the company explicitly warns against.
“While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times, and maintain control of your car. Before enabling Autopilot, the driver first needs to agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your vehicle,’” Tesla’s support page on Autopilot reads.
Tesla had its first Autopilot fatality in 2016, but the NTSB reported that the driver was at fault for not paying attention. However, the NTSB also said that Tesla “could have taken further steps to prevent the system’s misuse,” according to Reuters.
Digital Trends reached out to Tesla to comment on Tuesday’s hearing, as well as to find out what it is doing to ensure that drivers use the Autopilot feature properly. We will update this story when we hear back.