
Stickers on street signs can confuse self-driving cars, researchers show

Engineers developing autonomous cars certainly have their work cut out as they try to perfect the technology to make the safest vehicles possible, but it’s often the unexpected issues that pop up along the way that can leave them scratching their heads.

A few months ago, for example, it was revealed that bird poop had been wreaking havoc with the sensors on autonomous cars, a direct hit blocking their ability to “see” and leaving the vehicle about as safe as a human driver tootling along with their eyes closed. While Waymo has overcome the poop problem by developing tiny water squirters and wipers that spring into action the moment the gloop hits the sensor, another issue has just reared its ugly head, and it clearly requires urgent attention if we’re ever to see self-driving technology rolled out in a meaningful way.


Interested in testing the all-important vision systems that help a car make sense of its surroundings and make decisions at speed, security researchers at the University of Washington recently tampered with a street sign (under lab conditions, of course) to see if it would confuse the technology.

It did.

The researchers said that by printing stickers and attaching them to street signs in particular patterns, they were able to confuse the cameras used by “most” autonomous vehicles, Car and Driver reported.

Rather worryingly, the team managed to fool a self-driving system into reading a regular “stop” sign as a 45-mph speed limit sign, simply by adding a few carefully placed stickers to it.

The alterations can be very small and go unnoticed by humans because the camera’s software uses a machine-learning algorithm to interpret the image, and that algorithm responds to patterns in a profoundly different way than the human eye does. The sign used in the test still clearly reads “stop,” yet the graffiti-like stickers are enough to trick the car into thinking it means something else.

The researchers suggest that if hackers are able to access the algorithm, they could use an image of the road sign to create a customized, slightly altered version capable of confusing the car’s camera.
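To make the idea a little more concrete, here is a minimal sketch of the kind of gradient-based trick such attacks build on. To be clear, this is not the University of Washington team’s actual method: the toy network, the class labels, and the step size (epsilon) are all illustrative assumptions, and with an untrained model the misclassification isn’t guaranteed. Real attacks iterate and add physical-world constraints so the perturbation survives printing, distance, and viewing angles.

```python
# A minimal sketch of a gradient-based ("fast gradient sign") image
# perturbation, NOT the UW researchers' actual method. The toy network,
# class labels, and epsilon are illustrative assumptions, and with an
# untrained model the misclassification is not guaranteed.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy sign classifier: class 0 = "stop", class 1 = "speed limit 45".
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),
)
model.eval()

image = torch.rand(1, 3, 32, 32)   # stand-in for a photo of a stop sign
target = torch.tensor([1])         # class the attacker wants: "45 mph"

# Targeted FGSM: nudge each pixel in the direction that lowers the loss
# for the attacker's target class, which raises that class's score.
image.requires_grad_(True)
loss = F.cross_entropy(model(image), target)
loss.backward()
epsilon = 0.05                     # small enough to pass for grime or stickers
adversarial = (image - epsilon * image.grad.sign()).clamp(0, 1).detach()

print("prediction before:", model(image).argmax(dim=1).item())
print("prediction after: ", model(adversarial).argmax(dim=1).item())
```

The point of the sketch is simply that a tiny, targeted nudge to the pixels, which a person would read as nothing more than grime, can be enough to push an image across a classifier’s decision boundary.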

The implications of such confusion aren’t hard to imagine. A self-driving car speeding through a stop sign it mistook for a speed limit sign could end up in the path of an oncoming vehicle, though in that scenario the self-driving tech in both cars should, in theory, detect the danger and prevent a catastrophic collision. In cases like these, tampering with street signs has the potential to cause huge amounts of chaos on the roads, even if it stops short of anything more serious.

But what happens if the entire sign is fake, put up by pranksters? That’s something that does happen from time to time. How will a driverless car be able to tell the difference between a fake sign and a genuine one? While the car’s mapping technology will add to its knowledge of its immediate surroundings, information on temporary signs for construction or incidents may have to be transmitted to driverless cars ahead of time to avoid problems. The technology could also take contextual information into account, prompting it to ignore, say, a (fake) 80-mph sign in a residential area.
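As a rough illustration of that last idea, here is a hedged sketch of a contextual sanity check: comparing a detected speed-limit value against what map data expects for the current road type. The zone categories, expected limits, and tolerance are invented for the example; a production system would fuse many more signals.

```python
# A hedged sketch of a contextual sanity check: compare a detected
# speed-limit value against what map data expects for the road type.
# The zones, expected limits, and tolerance are invented for illustration.
EXPECTED_LIMIT_MPH = {"residential": 25, "arterial": 45, "highway": 65}

def plausible_speed_limit(detected_mph: int, zone: str,
                          tolerance_mph: int = 15) -> bool:
    """Return False for readings far outside what the map expects."""
    expected = EXPECTED_LIMIT_MPH.get(zone)
    if expected is None:
        return True  # unknown zone: don't overrule the camera
    return abs(detected_mph - expected) <= tolerance_mph

# A (fake) 80-mph sign on a residential street gets flagged as suspect:
print(plausible_speed_limit(80, "residential"))  # False
print(plausible_speed_limit(45, "arterial"))     # True
```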

Trevor Mogg
Contributing Editor