Tiny red stickers managed to trick Tesla’s Autopilot software

The promise of autonomous vehicles includes the idea that cars with robotic pilots at the helm will be able to keep us safer. The more self-driving cars on the road, the lower the odds of a human-caused error. Of course, if those autonomous systems suffer from issues of their own, they simply put us at risk in other ways. Researchers at Tencent Keen Security Lab recently published a paper that shows some of the flaws in self-driving technology, specifically in the software used in Tesla's Autopilot.

The most troubling attack highlighted in the paper was also one of the simplest to execute. According to the paper, the researchers were able to trick Tesla's Autopilot into changing lanes and losing its position on the road by adding noise to lane markings. In one instance, small red stickers placed on the ground convinced the vehicle it needed to change lanes, an attack that could force a car into oncoming traffic. The attack worked in daylight and didn't require snow, rain, or any other conditions that would make the road harder for Autopilot to read.
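To make the failure mode concrete, here is a minimal, hypothetical Python sketch. It is not Tesla's pipeline: it simply fits a least-squares line to bright pixels in a synthetic top-down road image, the way a naive lane follower might, and shows the estimated heading shifting once a few small "sticker" blobs are added beside the real marking. Every name and number in it is invented for illustration.

```python
# Hypothetical sketch: why a few small marks can skew a vision-based
# lane estimate. A naive "detector" trusts every bright pixel equally
# and fits one line through them all.
import numpy as np

def lane_heading(image: np.ndarray) -> float:
    """Least-squares heading (radians) of bright pixels; 0.0 = straight."""
    ys, xs = np.nonzero(image > 0.5)      # every pixel the detector trusts
    slope = np.polyfit(ys, xs, 1)[0]      # fit x as a linear function of y
    return float(np.arctan(slope))

road = np.zeros((200, 100))
road[:, 48:52] = 1.0                      # one straight lane marking

print(f"clean heading:     {np.degrees(lane_heading(road)):+.2f} deg")

# Three small high-contrast patches ("stickers") placed off to one side.
for y, x in [(20, 70), (35, 75), (50, 80)]:
    road[y:y + 3, x:x + 3] = 1.0

print(f"perturbed heading: {np.degrees(lane_heading(road)):+.2f} deg")
```

Real lane detectors are far more robust than this toy, but the underlying issue is the same: the model has no notion of which bright marks are legitimate, which is exactly the gap the stickers exploited.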

Some of the hacks the security experts discovered sound like the stuff of movies. In one example, after bypassing several layers of protection designed to keep attackers out, the researchers wrote an app that allowed them to hijack Tesla's steering. With the app, an attacker could steer a vehicle using a video game controller or smartphone, and the commands would override both Autopilot and the steering wheel itself. The attack had some limitations on cars that had recently shifted from reverse to drive, but it could fully take over a car in park or in cruise control.
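For a sense of what steering a car with a game controller might look like at the lowest level, here is a purely hypothetical Python sketch: polling a controller axis and broadcasting periodic steering-angle frames on a virtual CAN bus through the python-can library. The arbitration ID, payload layout, and scaling are all invented, and the researchers' actual exploit chain involved far more than this, including the protection bypasses mentioned above.

```python
# Hypothetical sketch only: map a controller axis to periodic
# steering-angle frames on a virtual CAN bus. The ID and payload
# layout below are made up for illustration.
import struct
import time

import can  # pip install python-can

STEER_CMD_ID = 0x488  # invented arbitration ID, not a real Tesla ID

def encode_steering(angle_deg: float) -> can.Message:
    """Pack a steering angle into an 8-byte frame (fictional layout)."""
    raw = int(angle_deg * 10)              # 0.1-degree resolution
    payload = struct.pack(">h6x", raw)     # big-endian int16 + 6 pad bytes
    return can.Message(arbitration_id=STEER_CMD_ID,
                       data=payload, is_extended_id=False)

def read_controller_axis() -> float:
    """Placeholder for a gamepad read; returns a value in [-1.0, 1.0]."""
    return 0.25                            # stub: hold a slight right turn

if __name__ == "__main__":
    bus = can.interface.Bus(channel="vcan0", interface="socketcan")
    try:
        while True:
            angle = read_controller_axis() * 45.0  # axis -> +/-45 degrees
            bus.send(encode_steering(angle))
            time.sleep(0.02)               # 50 Hz command rate
    finally:
        bus.shutdown()
```

A real vehicle would not accept frames like these from an arbitrary device, which is why the layers of protection described above were the hard part of the attack.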

“This kind of attack is simple to deploy, and the materials are easy to obtain,” the researchers wrote. “Our experiments proved that this architecture has security risks and reverse lane recognition is one of the necessary functions for autonomous driving in non-closed roads. In the scene we build, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident.”

The researchers at Tencent Keen Security Lab informed Tesla of the issues, and Tesla reports that the problems have been addressed in recent security patches.

AJ Dellinger
AJ Dellinger is a freelance reporter from Madison, Wisconsin, with an affinity for all things tech. He has been published by…
Tesla’s Autopilot is in the hot seat again over driver misuse

Tesla isn't doing enough to prevent drivers from misusing its Autopilot feature, according to the National Transportation Safety Board (NTSB), which is calling the company out over it.

During a hearing on Tuesday about a March 2018 Tesla crash in which the driver died while misusing Autopilot, the NTSB said Tesla needs to do more to improve the feature's safety.

Andrew Yang broke Tesla’s one big Autopilot rule in campaign commercial

Democratic presidential candidate Andrew Yang has made the automation industry a theme of his campaign, but he could stand to learn a few things about how to use Autopilot.

Yang's latest presidential campaign commercial focuses on the automation industry and includes footage of him driving a Tesla. There's one catch, though: Behind the wheel of the Tesla Model X, he takes his hands off the wheel while Autopilot is engaged, which, if you know anything about Tesla's system, is a big no-no.

Hacker finds Tesla is working on a neighborhood-friendly Autopilot

Tesla's Autopilot suite of semiautonomous technology is a work in progress, and the company is putting a lot of effort into making it better and smarter every year. Autopilot-equipped Tesla models are about to learn how to recognize stop signs and traffic lights, according to a hacker known as Green, who cracks open the automaker's secret files as a hobby.

Posting on Twitter, Green explained that Tesla has expanded its repertoire of 3D assets with a stop sign on a pole and several traffic lights. If you need a brief refresher, the 3D assets are used to show the driver what the car is doing while it's traveling in Autopilot mode. For example, if your Model S is in a construction zone, the hardware that powers the system detects traffic cones and the software displays them on the instrument cluster. The technology shows lane markings, too.
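As a rough illustration of that detection-to-display flow, here is a short Python sketch under invented names: perception emits classed detections, and the cluster renderer looks up a display asset for each class, with the stop sign and traffic light as the newly added entries. The Detection type and asset names are hypothetical, not Tesla's actual schema.

```python
# Hypothetical sketch of the detection-to-display mapping described
# above. All class labels and asset names are invented.
from dataclasses import dataclass

# Class label -> on-screen 3D asset. The last two rows stand in for the
# additions Green reportedly spotted in the firmware.
ASSET_FOR_CLASS = {
    "traffic_cone": "cone.glb",
    "lane_marking": "lane_line.glb",
    "stop_sign": "stop_sign_on_pole.glb",
    "traffic_light": "traffic_light.glb",
}

@dataclass
class Detection:
    label: str   # perception class
    x: float     # meters ahead of the car
    y: float     # meters left (-) or right (+) of center

def render_cluster(detections: list[Detection]) -> None:
    """Print the asset each detection would place on the driver display."""
    for det in detections:
        asset = ASSET_FOR_CLASS.get(det.label)
        if asset is None:
            continue  # classes without an asset simply aren't drawn
        print(f"draw {asset:<22} at ({det.x:5.1f} m, {det.y:+5.1f} m)")

render_cluster([
    Detection("traffic_cone", 12.0, 1.5),   # construction-zone cone
    Detection("stop_sign", 40.0, 3.0),      # newly supported asset
    Detection("deer", 25.0, 0.0),           # no asset -> not rendered
])
```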
