What does medical imaging have in common with self-driving cars?
A new imaging technology developed by researchers at the Massachusetts Institute of Technology could hold the answer. The technique may represent a breakthrough, as it permits researchers to recover visual information from light that has been scattered by the environment, such as dense fog, drizzle, or even human tissue. Visible light carries considerably more information than X-rays or ultrasound waves.
“One of the problems with X-rays is that it’s very hard to distinguish between different types of tissue,” Guy Satat, a graduate student at MIT’s Media Lab, told Digital Trends. “That’s why biopsies and more invasive procedures are needed. But if it was possible to create a system which could augment X-rays, or replace them in some cases, it would mean that some of these invasive procedures could be reduced or avoided altogether. That’s really the Holy Grail for our research.”
The same principle, Satat said, is true for foggy or drizzly conditions, which present current autonomous car technology with a major challenge.
“The problem with seeing through these weather conditions is that you also have an optical scattering effect,” he continued. “As a result of this optical scattering, it’s not possible to achieve the necessary contrast to distinguish between different objects the autonomous car is seeing.”
The MIT-developed technology is essentially a camera paired with a smart algorithm that uses both light and time to create an image of whatever scattered the photons in a particular way. By working out how long it takes individual photons to reach the camera lens, the algorithm can reverse-engineer what the light passed through on its way there.
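The core time-of-flight idea is simple: a photon's arrival time reveals how far it traveled, and scattered photons arrive later because they take longer, indirect paths. The sketch below is purely illustrative of that principle, not MIT's actual algorithm; all function names and numbers are hypothetical.

```python
# Illustrative sketch of the time-of-flight principle: photons that scatter
# travel longer paths, so they arrive later than unscattered ones.
# Names and values here are hypothetical, for intuition only.

C = 299_792_458.0  # speed of light in m/s


def path_length(arrival_time_s: float) -> float:
    """Total distance a photon traveled, inferred from its time of flight."""
    return C * arrival_time_s


def direct_distance(arrival_time_s: float) -> float:
    """For an unscattered round trip (emit, reflect, return),
    the target distance is half the total path length."""
    return path_length(arrival_time_s) / 2.0


# A photon returning after 10 nanoseconds traveled roughly 3 m in total,
# putting an unscattered target about 1.5 m away. Photons arriving later
# than that must have been scattered along the way.
t = 10e-9  # seconds
print(round(direct_distance(t), 2))  # → 1.5
```

In practice the camera records a full histogram of arrival times per pixel, and the algorithm fits a scattering model to that distribution rather than a single photon's delay.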
In addition to better visual recognition systems for self-driving cars, and medical imaging that doesn’t require patients to undergo radiation doses or surgery, other applications for the technique could include helping satellite imagery “see” through clouds.
In one long-term example Satat enjoyed discussing, he noted that it could even be possible to have a portable device capable of looking through fruit in supermarkets to tell how ripe it is.
Right now, the work is still in its relatively early research stages, but it certainly hints at some fascinating use cases going forward. What’s not to get excited about?