
Smart algorithm presents new way of seeing, may aid medical imaging

All Photons Imaging: MIT Media Lab, Camera Culture Group
What does medical imaging have in common with self-driving cars?

A new imaging technology developed by researchers at the Massachusetts Institute of Technology could hold the answer. The technique may represent a breakthrough, as it permits researchers to recover visual information from light that has been scattered by interactions with the environment, such as dense fog, drizzle, or even human tissue. Visible light carries considerably more information than X-rays or ultrasound waves.

“One of the problems with X-rays is that it’s very hard to distinguish between different types of tissue,” Guy Satat, a graduate student at MIT’s Media Lab, told Digital Trends. “That’s why biopsies and more invasive procedures are needed. But if it was possible to create a system which could augment X-rays, or replace them in some cases, it would mean that some of these invasive procedures could be reduced or avoided altogether. That’s really the Holy Grail for our research.”


The same principle, Satat said, is true for foggy or drizzly conditions, which present current autonomous car technology with a major challenge.

“The problem with seeing through these weather conditions is that you also have an optical scattering effect,” he continued. “As a result of this optical scattering, it’s not possible to achieve the necessary contrast to distinguish between different objects the autonomous car is seeing.”

The MIT-developed technology is essentially a camera paired with a smart algorithm that uses both light intensity and arrival time to reconstruct what has scattered the photons in a particular way. By measuring how long individual photons take to reach the camera lens, the algorithm can reverse-engineer what the light passed through on its way there.
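The underlying intuition — that a photon's arrival time encodes how much it has scattered, so early photons carry the sharpest spatial detail — can be sketched with a toy time-of-flight model. This is purely an illustration of simple time gating, not MIT's actual All Photons Imaging algorithm (which models the full time profile rather than discarding late photons); the scene, decay constants, and blur model are all invented for the demo:

```python
import numpy as np

# Toy scene: a bright square on a dark background.
scene = np.zeros((32, 32))
scene[10:22, 10:22] = 1.0

# Simulate per-pixel photon arrival-time histograms (16 time bins).
# Ballistic photons (little scattering) arrive early and stay on-pixel;
# scattered photons arrive late and spread to neighbors (blur).
n_bins = 16
frames = np.zeros((n_bins, 32, 32))
blurred = scene.copy()
for t in range(n_bins):
    frames[t] = blurred * np.exp(-0.3 * t)  # intensity decays over time
    # Each later bin is more blurred: a simple 4-neighbor diffusion step.
    blurred = 0.6 * blurred + 0.1 * (
        np.roll(blurred, 1, 0) + np.roll(blurred, -1, 0)
        + np.roll(blurred, 1, 1) + np.roll(blurred, -1, 1)
    )

# Naive camera: integrates over all time bins, so the image washes out.
naive = frames.sum(axis=0)

# Time-gated recovery: weight early bins heavily, since early-arriving
# photons have scattered least and preserve the sharpest detail.
weights = np.exp(-1.0 * np.arange(n_bins))
recovered = (weights[:, None, None] * frames).sum(axis=0)

def contrast(img):
    """Ratio of mean intensity inside the square to outside it."""
    inside = img[10:22, 10:22].mean()
    outside = (img.sum() - img[10:22, 10:22].sum()) / (32 * 32 - 144)
    return inside / (outside + 1e-12)

print(f"contrast, all photons summed: {contrast(naive):.1f}")
print(f"contrast, early-weighted:     {contrast(recovered):.1f}")
```

Running this shows the early-weighted reconstruction achieves noticeably higher contrast than the naive sum — the same contrast Satat describes autonomous cars losing in fog.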

In addition to better visual recognition for self-driving cars, and medical imaging that doesn't require patients to undergo radiation doses or surgery, other applications for the technique could include helping satellites “see” through clouds.

As one long-term example he enjoyed discussing, Satat suggested a portable device capable of looking through fruit in supermarkets to tell how ripe it is.

Right now, the work is still in its relatively early research stages, but it certainly hints at some fascinating use cases going forward. What’s not to get excited about?

Luke Dormehl