Recently we published a story about the history of notable car accidents involving self-driving and semi-self-driving vehicles. While a significant number of these do involve human error, there are still multiple examples of crashes in which the vehicle was unable to properly read its surroundings. Thermal imaging company FLIR Systems thinks one way to make autonomous vehicles safer is to equip them with thermal imaging tech that can better handle challenging lighting and weather conditions.
To help with adoption of these sensors, the company has made available an open-source dataset of 10,000 labeled infrared images showcasing how pedestrians, animals, bicycles, and other vehicles can be classified using the tech in difficult conditions, ranging from total darkness to fog and smoke to haze and glare from the sun. Using thermal cameras, combined with the dataset, it's possible to recognize objects more than 200 meters away, roughly four times the range of typical car headlights.
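To give a sense of how a labeled detection dataset like this might be consumed, here is a minimal sketch that tallies object classes across annotated frames. The annotation structure, file names, and labels below are illustrative stand-ins, not FLIR's actual schema:

```python
from collections import Counter

# Hypothetical annotations for two thermal frames; the field names and
# labels here are illustrative stand-ins, not FLIR's actual schema.
annotations = [
    {"frame": "thermal_000001.jpeg",
     "objects": [{"label": "person", "bbox": [120, 80, 40, 90]},
                 {"label": "car", "bbox": [300, 100, 160, 70]}]},
    {"frame": "thermal_000002.jpeg",
     "objects": [{"label": "bicycle", "bbox": [60, 110, 50, 60]},
                 {"label": "person", "bbox": [90, 70, 35, 85]},
                 {"label": "dog", "bbox": [200, 140, 45, 30]}]},
]

# Tally how often each class appears across the labeled frames —
# a typical first step before training or evaluating a detector.
class_counts = Counter(
    obj["label"] for frame in annotations for obj in frame["objects"]
)

print(class_counts["person"])  # 2
```

A class histogram like this is usually the first sanity check on a detection dataset, since heavily imbalanced classes affect how a model should be trained.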
“Navigation is limited by the perception accuracy provided by the vehicle sensor suite,” Mike Walters, vice president of micro-camera product management at FLIR Systems, told Digital Trends. “Visible cameras do not work well in a number of challenging lighting situations such as absolute darkness, driving into the sun, and in many types of fog — while thermal cameras, which principally see heat and not visible light, are unaffected by these adverse conditions. Therefore, thermal cameras provide more accurate situational awareness and perception under these conditions, which, in turn, improves safety and navigation.”
At present, thermal cameras are found in a number of luxury passenger cars. Some high-end automakers, such as Porsche and BMW, already fit vehicles with thermal imaging sensors made by FLIR. However, they are not part of the standard suite of sensors helping power today's most prominent self-driving vehicles. That may sound like an obvious oversight, but current self-driving cars don't rely on any single imaging system to see.
The self-driving cars being tested on the road right now figure out their surroundings using a combination of regular cameras, ultrasound, radar, lidar and more. Information from all of these sensors helps inform the decisions the car makes. In some cases, this array of sensors may lead to redundancy — but few people, either passengers or pedestrians, are ever going to complain about being too safe.
As a result, it's difficult to calculate how much of an improvement technology like FLIR's thermal long-wave infrared (LWIR) cameras could offer. Still, datasets such as the one released this week should give carmakers the opportunity to more easily build the algorithms that will let them find out.