Was negligence or bad marketing to blame in the latest Autopilot crash?

Tesla Model S crash in China
A 33-year-old car owner has blamed Tesla’s marketing department after his Model S struck a parked Volkswagen while operating in Autopilot mode. The fender-bender took place recently in Beijing, China.

Footage from a dashcam installed in the Model S shows the electric sedan traveling in the left lane of a highway in the Chinese capital. Cars ahead of it move right to avoid a Volkswagen Santana Vista stopped with its hazard lights on, but the Model S continues going straight and side-swipes it. Pictures taken after the crash show that the Model S’ entire left side is dented and scratched, though the damage is relatively minor.


After reviewing the vehicle logs, Tesla confirmed that Autopilot was engaged at the time of the crash, but accused the driver of not using the software properly. The company again warned that motorists need to keep both hands on the steering wheel at all times, and remain alert in order to take over in the event of an emergency. Luo Zhen, the driver, acknowledged that his hands weren’t on the steering wheel — in fact, he admitted that he was looking at either his phone or the S’ touchscreen at the time of the accident. However, he said Tesla told him the Model S was a self-driving car when he purchased it.


“Tesla’s sales department told me that this is an autopilot car, that’s how it was, and they demonstrated during the test drive. The demonstrator took his hands off the steering wheel and then took his feet off the accelerator and brake to demonstrate this function,” Luo recalled.



Four Tesla owners in Beijing, Shanghai, and Guangzhou told industry trade journal Automotive News that the Model S was presented to them as a self-driving car, and that the salesperson took their hands off the wheel during the test drive to show what the system is capable of. To complicate matters, Tesla allegedly uses the term “zidong jiashi,” which translates to “self-driving” and is also the Chinese name for the autopilot software found in commercial airliners.

While Tesla declined to comment on the allegations that Autopilot is presented in a misleading way in China, a spokesperson for the company emphasized to Digital Trends that the Model S is not a self-driving car and should not be operated as such under any circumstances.

“As clearly communicated to the driver in the vehicle,” the spokesperson said, “Autosteer is an assist feature that requires the driver to keep his hands on the steering wheel at all times, to always maintain control and responsibility for the vehicle, and to be prepared to take over at any time.”

Updated 8/11/2016 by Ronan Glon: Added official statement from Tesla.
