The driver of a Tesla Model S blames the company’s semi-autonomous Autopilot system for a recent crash on Interstate 5 in California. Tesla disputes that claim, however, saying the system worked as designed.
Arianna Simpson told Ars Technica that Autopilot was to blame for a crash she had on April 26. She was driving on I-5 with Autopilot engaged when the car in front stopped abruptly. She expected Autopilot to respond by applying the brakes, but it didn’t. By the time she braked herself, it was too late, and she rear-ended the other car. No one was hurt.
Yet Tesla says the vehicle’s data logs absolve Autopilot. The logs show Simpson touching the brake pedal before the collision, apparently at some point before she attempted to brake fully to a stop. Because manual control inputs deactivate the automated systems, a tap on the pedal could have inadvertently shut Autopilot off.
“Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times,” a Tesla statement said. Tesla still considers Autopilot to be in the beta stage, and advises drivers to take precautions like keeping their hands within reach of the wheel at all times.
Simpson is unhappy with how the company has handled the situation. She described herself as “super pro-Tesla and self-driving things in general,” but told Ars Technica that Tesla lacks empathy and was “pretty awful throughout the process.”
The Autopilot snafu comes shortly after a crash in Utah that a driver blamed on Tesla’s Summon autonomous-parking feature. The car drove itself into a parked trailer, and owner Jared Overton blamed a malfunction. Tesla countered that Overton ignored multiple warnings that Summon was activated, citing data logs as it did in the California crash.