
Tesla driver error caught on dashcam; autopilot needed help but didn’t get it

Might as well face it: dashcams are everywhere you drive. A Tesla owner learned that recently, perhaps to his embarrassment, when dashcam footage posted to Reddit contradicted an earlier post in which the driver claimed his Tesla on Autopilot hit a highway construction barrier after misreading the road, as reported by Electrek.

The driver posted, “I was driving in the left lane of a two-lane highway. The car is AP1 (first generation Autopilot) and I’ve never had any problems until today. Autopilot was on and didn’t give me a warning. It misread the road and hit the barrier. After the airbags deployed there was a bunch of smoke and my car rolled to a grinding stop. Thankfully no one was hurt and I walked away with only bruises.”

A comment on his own post suggested what might have happened, “LOOKS LIKE OP OMITTED TO POINT OUT THAT THERE WAS A CONSTRUCTION ZONE AND FAILED TO TAKE CONTROL WHEN THE CONSTRUCTION BARRIER SUDDENLY APPEARED.”

Three days later, another post in the same subreddit included the following gif with a comment that read in part, “This is probably one of the more interesting examples I have seen. Terrible road work design, this might be one of the most aggressive lane shifts I have ever seen. Being an AP1 car with only radar and the single front camera I can see how AP had no idea this was coming especially given the close following distance.”

Tesla not merging correctly, hits barrier.

This car had first-generation Autopilot, with fewer sensors than newer models. Even with later generations like the Tesla Model X P100D (which our own Alex K. described as “what cars will look like in the future”), however, Tesla has been consistent in stating that drivers using Autopilot must keep their hands on the steering wheel and stay alert to take control when needed.

Granted, the lane-change barrier appeared quickly with no warning, and the roadway's lane markers gave no indication of the shift. A human driver could easily be confused, and a driver-assistance system rated below Level 4 or 5 autonomy would be hard-pressed to handle this situation. In the end, however, the driver needs to stay aware and ready to take over, and that didn't happen here.

Car manufacturers, technology companies, and ride-hailing services remain convinced that self-driving cars are the future of commuting, in spite of problems like these. Companies like Google’s Waymo have bet billions on the technology and are even duking it out in court. Whichever company ends up owning self-driving tech may reshape the entire auto industry, despite the bumps encountered along the way.


Bruce Brown
Digital Trends Contributing Editor