Consumer Reports joins in asking Tesla to turn off Autopilot, Tesla still says no

It’s beginning to feel like piling on. Six days after Consumer Watchdog sent a letter urging Tesla to turn off its semi-autonomous Autopilot feature, Consumer Reports has made the same demand, according to Electrek. In both cases, Tesla said no.

The Consumer Reports article, and much of the recent scrutiny of Tesla’s Autopilot and Autosteer, follows a fatal May 7 accident that occurred with Autopilot engaged, as well as at least two other crashes in which the drivers claimed Autopilot was turned on but failed to prevent a collision. In some cases, Tesla’s data logs for the vehicles involved showed either that Autopilot was not turned on or that alerts were sounded and displayed but ignored by the driver.

All Tesla vehicles with Autopilot hardware transmit data continuously to the manufacturer, whether Autopilot is engaged or not. Tesla has maintained since the feature’s introduction that Autopilot is a “beta” program and that drivers must be ready to take over at any time, with or without an alert from the system.

In its July 14, 2016, article “Tesla’s Autopilot: Too Much Autonomy Too Soon,” Consumer Reports called on Tesla to take four specific actions:

- “Disable Autosteer until it can be reprogrammed to require drivers to keep their hands on the steering wheel.”
- “Stop referring to the system as ‘Autopilot’ as it is misleading and potentially dangerous.”
- “Issue clearer guidance to owners on how the system should be used and its limitations.”
- “Test all safety-critical systems fully before public deployment; no more beta releases.”

Consumer Reports wrote that its experts “believe that these two messages—your vehicle can drive itself, but you may need to take over the controls at a moment’s notice—create a potential for driver confusion. It also increases the possibility that drivers using Autopilot may not be engaged enough to react quickly to emergency situations.”

Laura MacCleery, Consumer Reports’ vice president of consumer policy and mobilization, said, “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security.”

Consumer Reports also quoted a report from Google’s Self-Driving Car Project describing the handoff problem: “People trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax.”

Tesla has said it is writing a blog post to help owners better understand what Autopilot can and cannot do and how they should use it. Tesla also responded to Consumer Reports with the following statement, according to Electrek: “Tesla is constantly introducing enhancements, proven over millions of miles of internal testing, to ensure that drivers supported by Autopilot remain safer than those operating without assistance. We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”
