
This clever new technique could help us map the ocean floor — from the sky

Stanford University

A friend of mine who works in games design recently showed me a 3D model of the Earth, rendered in great detail using topographically accurate satellite data, so that we could soar through canyons and our respective neighborhoods at high speed like a pair of joyriding Supermen. “Let’s see if we can go underwater,” he said, exhilarated, as we flew out over the Pacific.


We couldn’t. The model, so stunningly accurate on land, apparently had zero data with which to model the undersea environment. It was an unrendered void beneath the water’s glassy surface, as if this were some subaquatic version of The Truman Show, and we had reached the end of the world.

Neither of us was particularly surprised. The shock would have been if the oceans had been rendered. Where would that information have come from? And how accurate would it have been? It would have meant the model’s creators knew something that even the world’s foremost oceanographers do not.

For all the justifiable excitement around exploring space in the 2020s (Elon Musk is “highly confident” that humans will be rocketing toward Mars by 2026), our planet’s oceans remain a largely uncharted and unknown domain that’s much closer to home. Water covers around 71 percent of Earth’s surface, with the freshwater stuff we drink accounting for a minuscule 3 percent of all that water, little more than a rounding error. But the overwhelming majority of the Earth’s oceans — up to 95 percent — are an unexplored mystery.

While we’re still a long way off from a Google Street View equivalent for the undersea world, a new project being carried out by researchers at Stanford University could pave the way for just such a thing in the future — and a whole lot more besides. Picture being able to fly an airplane over a stretch of water and see, with absolute clarity, what’s hiding beneath the waves.

It sounds impossible. As it turns out, it’s just really, really difficult.

The issue with lidar, the trouble with sonar

“Imaging underwater environments from an airborne system is a challenging task, but one that has many potential applications,” Aidan James Fitzpatrick, a graduate student in Stanford University’s department of electrical engineering, told Digital Trends.

The obvious candidate for this imaging job is lidar. Lidar is the bounced laser technology most famous for helping (non-Tesla) autonomous vehicles to perceive the world around them. It works by emitting pulsed light waves and then measuring how long they take to bounce off objects and return to the sensor. Doing this allows the sensor to calculate how far the light pulse traveled and, as a result, to build up a picture of the world around it. While self-driving cars remain the best-known use of lidar, it can be used as a powerful mapping tool in other contexts as well. For example, researchers used it in 2016 to uncover a long-lost city hidden beneath dense foliage cover in the Cambodian jungle.
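The arithmetic behind that time-of-flight idea is simple: halve the round-trip time of the pulse and multiply by the speed of light. As a rough illustration (a generic sketch, not any particular sensor’s implementation), it looks like this:

```python
# Minimal illustration of lidar time-of-flight ranging: the distance to a
# target is half the pulse's round-trip time multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_range(round_trip_time_s: float) -> float:
    """Return the estimated distance to a target, in meters."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse that returns 200 nanoseconds after it was fired implies a target
# roughly 30 meters away.
print(f"{lidar_range(200e-9):.2f} m")  # ~29.98 m
```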

 

Lidar isn’t appropriate for this kind of mapping, though. Although advanced, high-power lidar systems perform well in extremely clear waters, much of the ocean — especially coastal water — tends to be murky and opaque to light. As a result, Fitzpatrick said, much of the underwater imaging performed to date has relied on in-water sonar systems that use sound waves able to propagate through murky waters with ease.

Unfortunately, there’s a catch here, too. In-water sonar systems are mounted to, or towed by, a slow-moving boat. Imaging from the air, using an airborne vehicle, would be far more effective, since it could cover a much greater area in less time. But that has so far been out of reach, because sound waves cannot pass from air into water and then back again without losing 99.9999 percent of their energy.
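That startling figure follows from the acoustic impedance mismatch between air and water. Here is a back-of-the-envelope check using standard textbook impedance values I’ve assumed myself, not numbers taken from the Stanford paper:

```python
# Back-of-the-envelope check of the energy lost when sound crosses the
# air-water boundary. At normal incidence, the fraction of acoustic
# intensity transmitted is
#   T = 4 * Z1 * Z2 / (Z1 + Z2) ** 2,  where Z = density * sound speed.
# The impedance values below are assumed textbook figures.

Z_AIR = 1.2 * 343          # ~4.1e2 rayl (air at roughly room temperature)
Z_WATER = 1000.0 * 1480.0  # ~1.5e6 rayl (fresh water)

one_way = 4 * Z_AIR * Z_WATER / (Z_AIR + Z_WATER) ** 2
round_trip = one_way ** 2  # air -> water on the way in, water -> air on the way out

print(f"One-way transmission:    {one_way:.3%}")        # about 0.1%
print(f"Round-trip transmission: {round_trip:.6%}")     # about 0.0001%
print(f"Round-trip energy loss:  {1 - round_trip:.4%}") # ~99.9999%
```

The exact numbers depend on the angle of incidence and the state of the surface, but the scale of the problem is clear: only a tiny fraction of a percent of the sound makes it across the boundary in each direction, and a round trip compounds the loss.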

What comes to PASS

Consequently, while lidar and radar systems have mapped the entirety of Earth’s landscape (emphasis on the “land”), only around 5 percent of the world’s waters have been imaged and mapped in similar detail. That’s the equivalent of a world map that shows only Australia and leaves the rest dark, like some unexplored Age of Empires map.

“Our goal is to propose a technology which can be mounted on a flying vehicle to provide large-scale coverage while using an imaging technique that is robust in murky water,” Fitzpatrick said. “To do this, we are developing what we have coined a Photoacoustic Airborne Sonar System. PASS exploits the benefits of light propagation in air and sound propagation in water to image underwater environments from an airborne system.”

Stanford University

PASS works like this: First, a custom laser system fires a burst of infrared light that is absorbed within the first centimeter or so of water. Once that absorption occurs, the water thermally expands, creating sound waves that are able to travel down into the water.

“These sound waves now act as an in-water sonar signal that was remotely generated using the laser,” Fitzpatrick continued. “The sound waves will reflect off underwater objects and travel back toward the water surface. Some of this sound – only about 0.06 percent – crosses the air-water interface and travels up toward the airborne system. High-sensitivity sound receivers, or transducers, capture these sound waves. The transducers [then] convert the sound energy to electrical signals which can be passed through image reconstruction algorithms to form a perceptible image.”
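The quote above doesn’t spell out the reconstruction step, and the team’s own algorithm isn’t described here. One common way to turn echoes recorded by an array of receivers into an image, though, is delay-and-sum beamforming. The sketch below is a generic illustration of that idea only; the sound speed, sampling rate, and array geometry are all assumptions made for the demo:

```python
# A generic delay-and-sum reconstruction sketch (NumPy). This is not the
# Stanford team's algorithm, just an illustration of how echoes recorded by
# an array of receivers can be focused into an image.
import numpy as np

SOUND_SPEED = 1480.0  # m/s in water, assumed uniform
FS = 1_000_000        # receiver sampling rate in Hz (assumption)

def delay_and_sum(signals, receiver_x, grid_x, grid_z):
    """Focus recorded echoes onto an image grid.

    signals:    (num_receivers, num_samples) recorded waveforms
    receiver_x: x positions of the receivers along the surface (z = 0)
    grid_x/z:   1D coordinates of the image grid, in meters
    Returns a (len(grid_z), len(grid_x)) intensity image.
    """
    image = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            # One-way travel time from this pixel back to each receiver,
            # converted to sample indices (the transmit leg is ignored
            # for simplicity).
            delays = (np.hypot(receiver_x - x, z) / SOUND_SPEED * FS).astype(int)
            valid = delays < signals.shape[1]
            image[iz, ix] = signals[np.flatnonzero(valid), delays[valid]].sum()
    return image

# Tiny synthetic test: one point scatterer at (0.05 m, 0.3 m) seen by an
# eight-element receiver array; each receiver records a single spike at the
# scatterer's one-way travel time.
rx = np.linspace(-0.2, 0.2, 8)
signals = np.zeros((8, 2000))
for i, x in enumerate(rx):
    signals[i, int(np.hypot(x - 0.05, 0.3) / SOUND_SPEED * FS)] = 1.0

image = delay_and_sum(signals, rx,
                      np.linspace(-0.2, 0.2, 41), np.linspace(0.1, 0.5, 41))
iz, ix = np.unravel_index(image.argmax(), image.shape)
print("Brightest pixel at depth/x:", (round(0.1 + iz * 0.01, 2), round(-0.2 + ix * 0.01, 2)))
# Expect something close to (0.3, 0.05), the scatterer's depth and position.
```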

The things that lie beneath

So far, PASS is a work in progress. The team has demonstrated high-resolution, three-dimensional imaging in a controlled lab environment. But this, Fitzpatrick acknowledged, is in a “container the size of a large fish tank,” although the technology is now “close to the stage” where it could be deployed over a large swimming pool.

Stanford University

There is, of course, a slight difference between a large swimming pool and the entirety of Earth’s oceans, and this will require considerably more work. In particular, a big challenge to be solved before testing in larger, more uncontrolled environments is how to tackle imaging through water with turbulent surface waves. Fitzpatrick said that this is a head-scratcher, but it’s one that “surely has feasible solutions,” some of which the team is already working on.

“PASS could be used to map the depths of uncharted waters, survey biological environments, search for lost wreckages, and potentially much more,” he said. “Isn’t it strange,” he added, “that we have yet to explore the entirety of the Earth we live on? Maybe PASS can change this.”

Combining light and sound to solve the air-water interface problem would be a game changer. And after that? Bring on the army of mapping drones to finally help show us what lies beneath the ocean’s surface.

A paper describing the PASS project was recently published in the journal IEEE Access.

Luke Dormehl
Former Digital Trends Contributor