
Could thermal-imaging sensors help make self-driving cars safer?


Recently we published a story about the history of notable car accidents involving self-driving and semi-self-driving vehicles. While a significant number of these do involve human error, there are still multiple examples of crashes in which the vehicle was unable to properly read its surroundings. Thermal imaging company FLIR Systems thinks one way to make autonomous vehicles safer would be to give them the ability to use thermal reading tech to better deal with challenging lighting and weather conditions.

To help drive adoption of these sensors, the company has made available an open-source dataset of 10,000 labeled infrared images showing how pedestrians, animals, bicycles, and other vehicles can be classified in difficult conditions, ranging from total darkness to fog, smoke, haze, and glare from the sun. Thermal cameras, combined with the dataset, can recognize objects more than 200 meters away, roughly four times the reach of typical car headlights.
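As a rough illustration of how such a dataset might be put to use, here is a minimal Python sketch that loads COCO-style bounding-box annotations for thermal images and tallies the labeled classes. The file name and annotation format are assumptions for the example; the actual layout of FLIR's release may differ.

```python
import json
from collections import Counter

# Hypothetical path and format; FLIR's actual dataset layout may differ.
ANNOTATION_FILE = "thermal_annotations.json"

def load_annotations(path):
    """Load COCO-style annotations: image list, bounding boxes, and class names."""
    with open(path) as f:
        data = json.load(f)
    categories = {c["id"]: c["name"] for c in data["categories"]}
    return data["images"], data["annotations"], categories

def count_labels(annotations, categories):
    """Count labeled objects per class (person, bicycle, car, dog, and so on)."""
    return Counter(categories[a["category_id"]] for a in annotations)

if __name__ == "__main__":
    images, annotations, categories = load_annotations(ANNOTATION_FILE)
    print(f"{len(images)} thermal frames, {len(annotations)} labeled objects")
    for name, count in count_labels(annotations, categories).most_common():
        print(f"{name}: {count}")
```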

“Navigation is limited by the perception accuracy provided by the vehicle sensor suite,” Mike Walters, vice president of micro-camera product management at FLIR Systems, told Digital Trends. “Visible cameras do not work well in a number of challenging lighting situations such as absolute darkness, driving into the sun, and in many types of fog — while thermal cameras, which principally see heat and not visible light, are unaffected by these adverse conditions. Therefore, thermal cameras provide more accurate situational awareness and perception under these conditions, which, in turn, improves safety and navigation.”

At present, thermal cameras are found in a number of luxury passenger cars. Some high-end automakers, such as Porsche and BMW, already fit vehicles with thermal imaging sensors made by FLIR. However, those sensors are not part of the standard suite powering today's most prominent self-driving vehicles. That may sound like an obvious oversight, but current self-driving cars don't rely on any single imaging system to see.

The self-driving cars being tested on the road right now figure out their surroundings using a combination of regular cameras, ultrasound, radar, lidar and more. Information from all of these sensors helps inform the decisions the car makes. In some cases, this array of sensors may lead to redundancy — but few people, either passengers or pedestrians, are ever going to complain about being too safe.
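To make the redundancy idea concrete, below is a toy Python sketch of late sensor fusion: detections from several sensors are merged, and an object missed by one sensor can still be confirmed by another. The Detection class, confidence threshold, and numbers are invented for illustration and don't reflect any production self-driving stack.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g. "camera", "radar", "lidar", "thermal"
    label: str         # e.g. "pedestrian", "vehicle"
    distance_m: float  # estimated range to the object
    confidence: float  # 0.0 to 1.0

def fuse(detections, min_confidence=0.5):
    """Toy late-fusion rule: keep any object reported by at least one sensor
    above the confidence threshold, taking the closest range estimate.
    Redundant sensors mean an object missed by one (say, a visible-light
    camera in fog) can still be confirmed by another (say, a thermal camera)."""
    fused = {}
    for d in detections:
        if d.confidence < min_confidence:
            continue
        if d.label not in fused or d.distance_m < fused[d.label].distance_m:
            fused[d.label] = d
    return list(fused.values())

# Example: the visible camera barely sees a pedestrian at night; thermal does.
readings = [
    Detection("camera", "pedestrian", 180.0, 0.2),
    Detection("thermal", "pedestrian", 185.0, 0.9),
    Detection("radar", "vehicle", 60.0, 0.8),
]
for obj in fuse(readings):
    print(f"{obj.label} at ~{obj.distance_m:.0f} m (via {obj.sensor})")
```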

Given that overlapping sensor suite, it's difficult to calculate how much of an improvement technology like FLIR's long-wave infrared (LWIR) thermal cameras could offer. Still, hopefully datasets such as the one released this week will give carmakers the opportunity to more easily build the algorithms that will let them find out.


Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…