AdaSky uses military thermal imaging tech for self-driving cars

Stephen Edelstein/Digital Trends

Future self-driving cars will only be as good as their sensors, and those sensors tend to fall into three categories. Cameras, radar, and lidar dominate the industry, but Israeli startup AdaSky wants to introduce a fourth option. AdaSky believes thermal imaging — currently used by world militaries to find targets in low-visibility conditions — could help self-driving cars “see” in conditions that would blind other sensor types.

Thermal imaging — also known as infrared, for the slice of the light spectrum it uses — detects the heat emitted by people and objects. You may have seen its black-and-white images, with people illuminated by an alien glow from their body heat, on a news broadcast. Because it doesn’t rely on light in the visible spectrum, thermal imaging works in direct sunlight, rain, fog, or snow, where cameras or lidar might fail, Raz Peleg, AdaSky’s director of sales, told Digital Trends.

“When you’re talking defense, securing your country, your soldiers, you’re not thinking of ‘most of the time.’ You’re thinking of saving lives, of making sure that your mission is done, at all times,” Peleg, a former F-16 pilot, said. “Usually the enemy is thinking of corner cases — early morning, mist, harsh weather. It is proven that thermal sensing works in those corner cases.”

Whether on the battlefield or on the road, low visibility can be a real danger. A recent AAA study found that pedestrian detection systems don’t work well in the dark — exactly the time when the risk of collisions is elevated. Peleg claimed AdaSky’s tech could improve the performance of those systems. He said it’s impervious to the blinding flashes of oncoming cars’ headlights, and can “see” people partially obscured by objects, such as fences.

Riding along in a Ford Fusion Hybrid equipped with one of AdaSky’s prototype sensors, we see black-and-white images of what the sensor is “seeing” displayed on a laptop. People appear as glowing white shapes against the black background, as do the engines of other vehicles. Peleg taps a key, and the roadway is highlighted in green. AdaSky has a proprietary machine-learning algorithm that can tell where the road is, he explains, based on differences in temperature and the way different materials radiate heat.
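AdaSky’s algorithm itself is proprietary, but the basic idea Peleg describes — telling surfaces apart by how warm they read to the sensor — can be illustrated with a toy example. Everything below (the frame values, the temperature bands, the thresholds) is hypothetical and purely illustrative, not AdaSky’s method:

```python
# Toy illustration of temperature-based segmentation of a thermal "frame".
# A real thermal sensor produces a per-pixel temperature (or radiance) map;
# here a tiny grid of hypothetical Celsius readings stands in for one frame.
FRAME = [
    [8, 8, 9, 9, 8, 8],        # cool background (sky, foliage)
    [9, 34, 35, 9, 8, 9],      # warm blob: a pedestrian's body heat
    [15, 15, 16, 15, 16, 15],  # asphalt, radiating in a middle band
    [15, 16, 15, 16, 15, 16],
]

def classify(temp_c, road_band=(12, 25), warm_min=30):
    """Label one pixel by crude temperature bands (made-up thresholds)."""
    if temp_c >= warm_min:
        return "pedestrian"  # body heat stands out as a bright region
    if road_band[0] <= temp_c <= road_band[1]:
        return "road"        # the band the roadway tends to occupy
    return "background"

labels = [[classify(t) for t in row] for row in FRAME]
pedestrian_pixels = sum(row.count("pedestrian") for row in labels)
```

A production system would learn these distinctions from data rather than hard-coding bands, but the sketch shows why a heat map alone carries enough signal to separate people and road from background.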


Because the images are black and white, AdaSky’s sensor can’t read traffic lights, Peleg noted. In a production car, conventional cameras would take care of that. Peleg wouldn’t outright say that thermal imaging could replace other sensor types, but he was quick to note that AdaSky’s sensor will cost “hundreds” of dollars, compared to thousands for most current lidar units, and will be durable enough to last the lifetime of an average car. It’s also small enough to be packaged behind the grille of a car, similar to current radar and camera setups used for driver aids, Peleg said.

Thermal imaging is also “passive,” Peleg noted. Unlike radar and lidar, it doesn’t transmit a signal in order to detect nearby obstacles. It just picks up the heat generated by the people and things around it. That means a thermal imaging sensor will draw less power than other sensors, which could be an important consideration for electric cars, Peleg said.

“If you need to transmit something, radar or lidar, you’re consuming a lot of electricity.”

AdaSky plans to market its sensors for both self-driving cars and driver-assist systems, Peleg said. So far, the startup has a deal with an American vehicle manufacturer to demonstrate its tech on trucks. Asked whether that meant commercial trucks, Peleg pointed to a nearby pickup and said it would be something more like that. AdaSky will also focus on getting the public and regulators to trust its technology before attempting mass production, Peleg said.

“We are still in the education phase.”

Updated on October 23, 2019: Added AdaSky claim that sensors will last the lifetime of the average car.

Stephen Edelstein
Stephen is a freelance automotive journalist covering all things cars. He likes anything with four wheels, from classic cars…