A.I. can remove distortions from underwater photos, streamlining ocean research

Light behaves differently in water than it does in air, and that behavior creates the blur and green tint common in underwater photographs, as well as the haze that blocks out vital details. But thanks to research from an oceanographer and engineer and a new artificial intelligence program called Sea-thru, that haze and those distorted colors could soon disappear.

Besides putting a damper on the photos from that snorkeling trip, the inability to capture accurately colored photos underwater hinders scientific research at a time when concern for coral and ocean health is growing. That's why oceanographer and engineer Derya Akkaynak, along with Tali Treibitz of the University of Haifa, devoted their research to developing an artificial intelligence program that can recover scientifically accurate colors while removing the haze in underwater photos.

As Akkaynak points out in her research, imaging A.I. has exploded in recent years. Algorithms have been developed that can tackle everything from turning an apple into an orange to reversing manipulated photos. Yet, she says, the development of underwater algorithms still lags behind, because water obscures many of the visual cues in a scene that such algorithms rely on.

As light travels through water, it is both absorbed and scattered. Scattering creates what's called backscatter, a haze that prevents the camera from seeing the scene in full detail. Absorption, which varies by wavelength, is what prevents colors from reproducing accurately underwater.
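The physics described above is often summarized in a standard image-formation model: the true scene color fades exponentially with distance, while backscattered veiling light grows with distance. The sketch below simulates that degradation; the coefficient values are illustrative assumptions, not the exact parameters from the Sea-thru paper.

```python
import numpy as np

def underwater_image(J, z, beta_d, beta_b, B_inf):
    """Simulate how water degrades a scene (standard formation model).

    J      : true scene color, shape (H, W, 3), values in [0, 1]
    z      : camera-to-scene distance in meters, shape (H, W)
    beta_d : per-channel attenuation coefficients (direct signal)
    beta_b : per-channel backscatter coefficients
    B_inf  : veiling-light color at infinite distance, per channel
    """
    z = z[..., None]  # broadcast distance over the color channels
    direct = J * np.exp(-beta_d * z)                 # signal fades with distance
    backscatter = B_inf * (1 - np.exp(-beta_b * z))  # haze grows with distance
    return direct + backscatter

# Red light is absorbed fastest, so its coefficient is largest.
beta_d = np.array([0.35, 0.07, 0.05])  # illustrative values, not measured
beta_b = np.array([0.30, 0.10, 0.08])
B_inf = np.array([0.05, 0.35, 0.45])   # greenish-blue veiling light

scene = np.full((4, 4, 3), 0.8)        # a bright gray target
depth = np.full((4, 4), 5.0)           # 5 meters from the camera
observed = underwater_image(scene, depth, beta_d, beta_b, B_inf)
```

At 5 meters the simulated image has lost most of its red channel and picked up a green-blue cast, which is exactly the distortion the article describes.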


To tackle the problem, Akkaynak trained the software on sets of underwater images the team shot themselves, using gear that's readily available: a consumer camera, underwater housing, and a color card. First, she would find a subject. In particular, Akkaynak looked for coral with a lot of depth and dimension, since the farther away objects are underwater, the more obscured they become. She would then place the color card near the coral and photograph it from multiple distances and angles.

Using those images as a data set, the researchers built a program that mathematically removes the backscatter and adjusts the color, working at the pixel level. The resulting program, called Sea-thru, can correct the haze and color automatically. The software still requires multiple images of the same subject, because the process uses a known range map to estimate and remove the backscatter. The researchers say, however, that the color card is no longer a necessity.
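The role of the range map can be sketched by inverting the formation model pixel by pixel: estimate the backscatter at each pixel's distance, subtract it, then re-amplify the light the water absorbed. This is a simplified illustration with assumed, known coefficients; the actual Sea-thru method estimates them from the image itself.

```python
import numpy as np

beta_d = np.array([0.35, 0.07, 0.05])  # illustrative attenuation per channel
beta_b = np.array([0.30, 0.10, 0.08])  # illustrative backscatter per channel
B_inf = np.array([0.05, 0.35, 0.45])   # greenish veiling light

def restore(I, z, beta_d, beta_b, B_inf):
    """Per-pixel inversion: subtract estimated backscatter, undo attenuation."""
    zc = z[..., None]                    # broadcast range over color channels
    backscatter = B_inf * (1 - np.exp(-beta_b * zc))
    direct = np.clip(I - backscatter, 0.0, None)  # strip the haze
    return direct * np.exp(beta_d * zc)           # re-amplify absorbed light

# Round trip: degrade a known scene, then restore it using the range map.
scene = np.full((4, 4, 3), 0.6)
z = np.full((4, 4), 3.0)
zc = z[..., None]
observed = scene * np.exp(-beta_d * zc) + B_inf * (1 - np.exp(-beta_b * zc))
recovered = restore(observed, z, beta_d, beta_b, B_inf)
```

Without the per-pixel range map there is no way to know how much haze to subtract or how much absorbed color to restore at each point, which is why the method needs multiple views of the same subject.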

The resulting photos aren't the same as what could be produced with tools like Lightroom's dehaze slider and color-correction controls. "This method is not Photoshopping an image," Akkaynak told Scientific American. "It's not enhancing or pumping up the colors in an image. It's a physically accurate correction, rather than a visually pleasing modification."

The team’s goal is to use large volumes of image data for research, explaining that, without the program, much of the work that requires color and detail must be done manually, since too many details are obscured in the photographs. “Sea-thru is a significant step towards opening up large underwater datasets to powerful computer vision and machine learning algorithms, and will help boost underwater research at a time when our oceans are [under] increasing stress from pollution, overfishing, and climate change,” the research paper concludes.
