
How a clever photography trick is bringing Seattle’s shipwrecks to the surface

Kees Beemster Leverenz

Seattle is an isthmus.

On the east side of the city lies freshwater Lake Washington, while on the west you’ll find the salty waters of the Puget Sound. Created when a glacier inched its way across the land thousands of years ago, Lake Washington is home to algae, zooplankton, and some PCB-contaminated fish. Thanks to its ocean access, the Sound is occasionally visited by orcas.

At the bottom of these two bodies of water, however, the landscape starts to change. Divers have found swords, tequila bottles, bags of garbage, and old laptops. There are historically significant objects, too, like planes and shipwrecks.

Even for those with the gear and training to dive over 100 feet in frigid water, getting a sense of what these wrecks really look like can be a challenge. “The visibility is quite poor, so we’re not able to see very far,” Kees Beemster Leverenz told Digital Trends. “And on top of that, there isn’t almost any light that penetrates down past the first couple of dozen feet, maybe 70 feet or so.” Beemster Leverenz is a Microsoft software developer by day, diver by night and on many weekends. He’s part of the Global Underwater Explorers (GUE), a nonprofit that educates divers and helps conserve aquatic environments. Using photogrammetry, he hopes to bring some of these sunken vessels to the surface in the form of 3D models.

Mars attacked

In 2011, a team that included some GUE divers located the Mars in the Baltic Sea. Sunk during a battle in 1564, the Swedish warship could hold as many as 900 sailors. It’s massive and, thanks to the dark, cold Nordic waters, pretty well preserved. There’s no way to recover the 200-foot, three-masted ship, but researchers were excited to learn more about the famed wreck. Instead of sending a bunch of scientists 250 feet below, they devised a way to bring the ship to life with photogrammetry.

GUE needs four 33,000-lumen light bars just to make a dent in the dimness 100-plus feet from the surface.

By taking laser scans and thousands of photos of the planks, cannons, masts, and so on, Professor Johan Rönnby of Södertörn University and his team were able to capture the ship from every angle. Then, software pieces the photos together to make a 3D model that researchers can spin and zoom in on, giving them the ability to see details but also get a sense of how the ship looked when it was whole.

When Beemster Leverenz heard about the Mars project, he decided to use some of the techniques on Seattle-area wrecks. There were plenty to choose from. In Lake Washington alone, there are at least seven plane wrecks, a dozen coal cars that slid off a barge, and hundreds of boats. Over the decades, divers have discovered many of them, guided by sonar data from the National Oceanic and Atmospheric Administration.

Deep, dark sea

Like the Baltic, Lake Washington is dark and chilly. It’s also full of sediment. Stir up the muck on the bottom, and you might as well surface for the day. Your photos are just going to show cloudy water, cast greenish yellow by the light.

Conditions in Lake Crescent, about 100 miles northwest of Seattle, are very different from Lake Washington. Thanks to the clear water and ambient light, Kathryn Arant, another GUE diver, was able to quickly snap the 200 or so images needed for the photogrammetry of a 1927 Chevrolet lying on its side in 170 feet of water.

[iframe-embed url="https://sketchfab.com/models/805e79f2ab444e0a8574e3d384e217e0/embed?autostart=1&autospin=0.1" size="xlarge" height="500px"]
A 3D model of the Warren car in Washington’s Lake Crescent gives viewers a look at a recently solved mystery. Kees Beemster Leverenz

The car was first found in 2002, solving the mystery of what happened to a young couple, Russell and Blanch Warren, who went missing in 1929. Because of the winding, unpaved roads around Lake Crescent, it had long been assumed their car went into the water. Using Arant’s images and Agisoft Photoscan software, the team produced a model that shows the Warren car down to its speedometer and still-inflated tires.

The car was one of GUE Seattle’s first attempts at photogrammetry. It took Beemster Leverenz and his fellow divers a few tries to get the hang of the process. They started out using GoPros, protected with underwater housings. Quickly, they realized they needed better cameras and more light. They purchased 33,000-lumen light bars bright enough to dazzle anyone who looks at them as they switch on. Even at that intensity, it takes four of them to make a dent in the dimness more than 100 feet from the surface. “We’re able to turn what appears to be really bad visibility into so-so visibility,” said Beemster Leverenz.

Connect the dots

“I like to say that the easiest thing that you could ever document is a dome that doesn’t have any little bits that stick out, that has no wings or propellers to make things difficult,” said Beemster Leverenz. The Warren car was pretty close. Planes are harder. Divers need to balance getting all the details with not overwhelming the software. “It’s important to be frugal where you can with photos,” he said.

For one plane wreck, a PBM Mariner, the GUE team took about 5,500 photos. Only one of these planes is left intact (above sea level, anyway), at the Pima Air and Space Museum in Arizona. The flying boat was difficult to transport on land, so most were scrapped. One sank in Lake Washington in 1949. Navy divers tried to raise the plane in the 1990s but only succeeded in breaking off the tail. Most of it still sits about 70 feet underwater.

It’s also virtually in the Pima museum, thanks to GUE’s photogrammetry efforts. Working with Dr. Megan Lickliter-Mundon, an underwater aviation archaeologist, the team created a 3D model of the rare plane, which sits alongside the recovered tail.

Those party cups are ubiquitous, showing up as specks of red in some of GUE’s photogrammetry models.

Recreating wrecks like the PBM Mariner and another sunken plane, the PB4Y-2, requires a lot of photos, which in turn takes a lot of processing power. First, the software analyzes the photos and starts lining them up. It recognizes certain objects, such as a rudder or a wing flap, and starts mapping them out, using photos of the same object taken from different angles. The result is a point cloud, which Beemster Leverenz compares to a connect-the-dots puzzle. The shape is there; it’s just not filled in.

Next, the computer connects those dots into a mesh. “The mesh doesn’t actually have the color to it,” he said. “It’s really similar [to] putting together a plastic model before you painted it.” The white mesh looks like a plane, but it doesn’t have the details and definition needed to distinguish certain parts. The third step layers the details from the photos on top of the mesh, a sort of “coloring in” process.
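The point-cloud step rests on a simple geometric idea: the same feature spotted in two photos taken from different, known camera positions pins down a 3D point where the two viewing rays meet. Here is a minimal NumPy sketch of that linear (DLT) triangulation; all camera parameters below are invented for illustration, since real photogrammetry software estimates them from the photos themselves:

```python
import numpy as np

# Shared intrinsics for two hypothetical pinhole cameras (illustrative numbers).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])

def projection_matrix(K, R, t):
    """Build the 3x4 camera matrix P = K [R | t]."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def rot_y(theta):
    """Rotation about the y-axis, so camera 2 views the point from an angle."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

P1 = projection_matrix(K, np.eye(3), np.array([0.0, 0.0, 0.0]))
P2 = projection_matrix(K, rot_y(0.3), np.array([-1.0, 0.0, 0.2]))

def project(P, X):
    """Project a 3D point into pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, x1, P2, x2):
    """Linear triangulation: solve for the 3D point consistent with both views."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null-space vector = homogeneous 3D point
    X = Vt[-1]
    return X[:3] / X[3]

X_true = np.array([0.5, -0.2, 4.0])            # say, a point on a hull plank
x1, x2 = project(P1, X_true), project(P2, X_true)
X_est = triangulate(P1, x1, P2, x2)
print(np.allclose(X_est, X_true, atol=1e-6))   # the noiseless case recovers it exactly
```

Repeating this for thousands of matched features across thousands of photos is what yields the point cloud the software then meshes and textures.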

For GUE’s latest project, the PB4Y-2, Beemster Leverenz was able to recruit a non-diver to help. Patrick Goodwin works for Dice, maker of the Battlefield video game series. He and Beemster Leverenz have a mutual friend and got to talking about photogrammetry over voice chat while playing a game together. Dice uses photogrammetry to realistically bring real-world objects and places, like the Alps, into its games. Goodwin optimizes models to keep them manageable: if they’re overly detailed, they carry too much data to spin smoothly and let you see the wreck from every angle. The plane’s rivets, for example, don’t need to be built into the model’s geometry when they can be projected on top instead. It’s the difference between painting individual stripes and slapping on a decal.
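The “decal” trick is often done with a normal map: fine surface detail is baked into an image the renderer samples at shading time, so the mesh itself can stay low-polygon. A rough NumPy sketch of the idea, using an invented height map with a single bump standing in for a rivet:

```python
import numpy as np

# Hypothetical high-resolution height map of a surface patch: a flat
# panel with one Gaussian bump standing in for a rivet.
yy, xx = np.mgrid[0:64, 0:64]
h = 0.5 * np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 20.0)

# Finite-difference gradients give the surface slope at each pixel.
dy, dx = np.gradient(h)

# Pack the slopes into unit normal vectors. This per-pixel image (a
# "normal map") lets a perfectly flat, low-polygon mesh shade as if the
# rivet geometry were really there.
n = np.dstack([-dx, -dy, np.ones_like(h)])
n /= np.linalg.norm(n, axis=2, keepdims=True)

print(n.shape)                                       # (64, 64, 3)
print(np.allclose(np.linalg.norm(n, axis=2), 1.0))   # every normal is unit length
```

The geometric mesh stays coarse and quick to spin; the detail lives in the texture, which is cheap for the viewer to sample.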

Render of sunken World War II-era patrol bomber, the Consolidated PB4Y-2 Privateer.
Using photogrammetry, a technique for extracting 3D information from photographs, Beemster Leverenz and Dice developer Patrick Goodwin generated a high-quality 3D model of a sunken Consolidated PB4Y-2 Privateer. Kees Beemster Leverenz and Patrick Goodwin

In addition, Goodwin is helping render some of the environment around the wreck. “If you want to make a model of a blank white room, you can’t do it,” Beemster Leverenz said. The software needs contrast to create the model. The plane itself has that, but the ground it rests on doesn’t. “It’s just sort of a flat greenish, yellowish nothingness,” he said. But it’s necessary to provide context. Without it, “you end up with a model of an airplane that doesn’t look like it’s actually crashed into anything,” he added. Sometimes the contrast comes from unexpected places: a crinkled Target bag or a red solo cup. Those party cups are ubiquitous, showing up as specks of red in some of GUE’s photogrammetry models.

Though everyone survived the PB4Y-2 and PBM Mariner sinkings, the fact that man-made objects litter these aquatic floors is depressing, even if they are being reclaimed by marine life. There are ways to use photogrammetry to help nature as well, Beemster Leverenz said. The Marine Science and Technology Center in Des Moines, Washington, has considered creating an artificial reef in Puget Sound to replace waterlogged VW Beetles and other substitute marine environments. Photogrammetry could be a non-destructive way to measure the reef’s growth over time. Hopefully, it will stay free of solo cups.
