Tesla in recent days rolled out a long-awaited update to its Full Self-Driving (FSD) mode that gives its vehicles a slew of driver-assist features.
But in a stark warning to owners who’ve forked out for the premium FSD feature, Tesla said that the software is still in beta and therefore “may do the wrong thing at the worst time.” It insisted that drivers should keep their “hands on the wheel and pay extra attention to the road.”
Release Notes pic.twitter.com/VOi9DX0QFT
— Tesla Raj (@tesla_raj) July 10, 2021
The message comes after a number of fatal accidents over the years involving Tesla vehicles where the driver may not have been paying attention while the car was in Autopilot or FSD mode.
Critics of the California-based electric-car company have long said that using terms such as Autopilot and Full Self-Driving can lead some drivers into believing that their vehicle is fully autonomous, prompting them to take their eye off the road.
Drivers are supposed to keep their hands on the wheel at all times, and Tesla's vehicles incorporate a safety system that emits warnings and eventually slows the car to a halt if it detects that the driver is not touching the wheel. But videos online, as well as a recent test by a leading U.S. consumer organization, suggest the system can be easily tricked into thinking someone has their hands on the wheel or is in the driver's seat.
Consumer Reports carried out its test shortly after a Tesla Model S crashed into a tree in Spring, Texas, in April, killing two men inside. A police report said the two occupants were found in the back seat, with no one else in the vehicle. The suggestion was that the vehicle was operating in Autopilot (the owner hadn’t purchased the FSD feature) at the time of the crash, though Tesla chief Elon Musk said at the time that vehicle data logs indicated that the car was not in Autopilot mode when the accident occurred.
Notably, a preliminary report into the crash released by the National Transportation Safety Board said that in attempting to recreate the moments leading up to the fatal crash, its investigators were unable to engage the Autosteer element of Autopilot along the stretch of road where the accident happened. The report also said that security footage at the home of the Model S owner showed both men climbing into the front seats of the vehicle just minutes before the crash occurred. Several investigations into the accident are continuing.
The National Highway Traffic Safety Administration revealed in June that since 2016 it has launched 30 investigations into Tesla crashes involving 10 fatalities in which the driver assistance system may have been in use. To date, it has ruled out the involvement of Tesla’s Autopilot system in three of the crashes and published reports on two of the accidents.
Running preproduction software is both work & fun. Beta list was in stasis, as we had many known issues to fix.
Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid.
Safety is always top priority at Tesla.
— Elon Musk (@elonmusk) July 9, 2021
In a tweet posted last week, Musk said that the recent beta update for FSD mode “addresses most known issues,” but also warned that “there will be unknown issues.” He urged drivers to “please be paranoid,” adding, “Safety is always top priority at Tesla.”