This camera eliminates the ocean waves so scientists can study coral

Seeing Beneath the Waves with Technology

Think space is the final frontier? As NASA researcher Ved Chirayath points out, scientists know more about the surfaces of the moon and Mars combined than they do about the ocean floor. With a camera and artificial intelligence, however, Chirayath is working to change that. Fluid Cam is a camera system that uses fluid lensing technology to eliminate the distortion of waves, letting scientists see more detail and study coastal ocean systems.

Fluid Cams use a mix of hardware, including custom optics, and software to remove the movement of the water from the image. Without the disruption of the waves, scientists viewing the image can see detail on the coral down to the centimeter, as well as whether the ocean floor there is rocky or sandy, NASA says. The camera can’t see the deepest ocean floors, but removing the waves lets researchers pick out details in clear, shallow water that they otherwise couldn’t, which makes the system ideal for looking at coral.
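NASA has not published the fluid lensing algorithm in this article, but the general idea of undoing wave distortion by combining many frames of the same scene can be sketched in a few lines of Python. The dewave function, the burst-of-frames workflow, and the library choices below are illustrative assumptions for a conceptual demo, not NASA’s implementation.

# Conceptual sketch only: combine a burst of frames so the moving distortion
# from surface waves averages out while the static seafloor stays sharp.
import numpy as np
from scipy.ndimage import shift
from skimage.registration import phase_cross_correlation

def dewave(frames: np.ndarray) -> np.ndarray:
    """Suppress wave distortion in a stack of grayscale frames (time, height, width)."""
    reference = frames[0]
    aligned = [reference]
    for frame in frames[1:]:
        # Estimate the apparent shift caused by the moving water surface...
        offset, _, _ = phase_cross_correlation(reference, frame)
        # ...and undo it before stacking.
        aligned.append(shift(frame, offset))
    # A per-pixel median rejects transient refraction artifacts better than a mean.
    return np.median(np.stack(aligned), axis=0)

# Example usage with 30 synthetic 256 x 256 frames.
burst = np.random.rand(30, 256, 256)
clean = dewave(burst)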

NASA has been experimenting with Fluid Cams since 2015, testing how the cameras perform and creating benchmarks for the camera system. The Fluid Cam has already flown on a drone (like that whale snot drone), but now NASA researchers face the next challenge: managing all that data. After all, the Fluid Cam streams 550MB per second off the camera (enough to fill the typical laptop in 200 seconds).
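For a sense of scale, a quick back-of-the-envelope calculation shows how fast that data piles up at the reported rate; the 30-minute flight duration below is an assumed example, not a NASA figure.

# Data volumes at Fluid Cam's reported 550MB/s.
RATE_MB_PER_S = 550

for label, seconds in [("200 s (fills a typical laptop)", 200),
                       ("assumed 30-minute drone flight", 30 * 60)]:
    total_gb = RATE_MB_PER_S * seconds / 1000
    print(f"{label}: ~{total_gb:,.0f} GB")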

Chirayath says the next step is to label the images already collected in order to develop an artificial intelligence program capable of going through the mass of images from Fluid Cam and collecting data on the corals. Once researchers can quickly assess all that data, the plan is to put a Fluid Cam on a satellite to monitor the health of coral reefs.
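The article doesn’t describe the AI pipeline itself, but the label-then-train workflow Chirayath describes is the standard supervised-learning pattern. The sketch below, with its hypothetical labeled_tiles folder and class names, shows one minimal way that could look in Python with PyTorch; it is an illustration, not the actual NASA system.

# Minimal sketch: labeled Fluid Cam image tiles train a classifier
# that can then tag new imagery automatically.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical layout: labeled_tiles/<class_name>/*.png,
# with classes such as "healthy_coral", "bleached_coral", "sand", "rock".
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("labeled_tiles", transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a pretrained backbone and retrain only the final layer.
model = models.resnet18(weights="DEFAULT")
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()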

The camera technology, however, could be used to understand more than just coral health. “We’re pushing new boundaries every day and we are going to be able to do new science at an unprecedented level,” Chirayath said. “We can actually take a multi-spectral light source and couple it with these cameras to create a whole type of new remote sensing that can be used even on Mars rovers to imaging satellites going by Pluto.”

The research is being funded by a grant from the Earth Science Technology Office.
