Seattle sits on an isthmus.
On the east side of the city lies freshwater Lake Washington, while on the west you’ll find the salty waters of the Puget Sound. Created when a glacier inched its way across the land thousands of years ago, Lake Washington is home to algae, zooplankton, and some PCB-contaminated fish. Thanks to its ocean access, the Sound is occasionally visited by orcas.
At the bottom of these two bodies of water, however, the landscape starts to change. Divers have found swords, tequila bottles, bags of garbage, and old laptops. There are more historically significant objects, too, like planes and shipwrecks.
Even for those with the gear and training to dive more than 100 feet down in frigid water, getting a sense of what these wrecks really look like can be a challenge. “The visibility is quite poor, so we’re not able to see very far,” Kees Beemster Leverenz told Digital Trends. “And on top of that, there isn’t almost any light that penetrates down past the first couple of dozen feet, maybe 70 feet or so.” Beemster Leverenz is a Microsoft software developer by day, and a diver by night and on many weekends. He’s part of Global Underwater Explorers (GUE), a nonprofit that educates divers and helps conserve aquatic environments. Using photogrammetry, he hopes to bring some of these sunken vessels to the surface in the form of 3D models.
In 2011, a team that included some GUE divers located the Mars in the Baltic Sea. Sunk during a battle in 1564, the Swedish warship could hold as many as 900 sailors. It’s massive and, thanks to the dark, cold Nordic waters, pretty well preserved. There’s no way to recover the 200-foot, three-masted ship, but researchers were excited to learn more about the famed wreck. Instead of sending a bunch of scientists 250 feet below, they devised a way to bring the ship to life with photogrammetry.
GUE needs four 33,000-lumen light bars to make even a dent in the dimness 100-plus feet from the surface.
By taking laser scans and thousands of photos of the planks, cannons, masts, and so on, Professor Johan Rönnby of Södertörn University and his team were able to capture the ship from every angle. Software then pieced the photos together into a 3D model that researchers can spin and zoom in on, letting them examine fine details while also getting a sense of how the ship looked when it was whole.
When Beemster Leverenz heard about the Mars project, he decided to use some of the techniques on Seattle-area wrecks. There were plenty to choose from. In Lake Washington alone, there are at least seven plane wrecks, a dozen coal cars that slid off a barge, and hundreds of boats. Over the decades, divers have discovered many of them, guided by the National Oceanic and Atmospheric Administration’s sonar data.
Deep, dark sea
Like the Baltic, Lake Washington is dark and chilly. It’s also full of sediment. Stir up the muck on the bottom, and you might as well surface for the day. Your photos are just going to show cloudy water, cast greenish yellow by the light.
Conditions in Lake Crescent, about 100 miles northwest of Seattle, are very different from Lake Washington. Thanks to the clear water and ambient light, Kathryn Arant, another GUE diver, was able to quickly snap the 200 or so images needed for the photogrammetry of a 1927 Chevrolet lying on its side in 170 feet of water.
The car was first found in 2002, solving the mystery of what happened to a young couple, Russell and Blanch Warren, who went missing in 1929. Because of the winding, unpaved roads around Lake Crescent, it had long been assumed their car went into the water. With Arant’s images and Agisoft Photoscan software, the result is a model that shows the Warren car down to the speedometer and its still-inflated tires.
The car was one of GUE Seattle’s first attempts at photogrammetry. It took Beemster Leverenz and his fellow divers a few tries to get the hang of the process. They started out using GoPros protected by underwater housings. They quickly realized they needed better cameras and more light, so they purchased 33,000-lumen light bars that will dazzle you if you glance at them as they’re switched on. Despite that intense brightness, it takes four of them to make even a dent in the dimness more than 100 feet from the surface. “We’re able to turn what appears to be really bad visibility into so-so visibility,” said Beemster Leverenz.
Connect the dots
“I like to say that the easiest thing that you could ever document is a dome that doesn’t have any little bits that stick out, that has no wings or propellers to make things difficult,” said Beemster Leverenz. The Warren car was pretty close. Planes are harder. Divers need to balance getting all the details with not overwhelming the software. “It’s important to be frugal where you can with photos,” he said.
For one plane wreck, a PBM Mariner, the GUE team took about 5,500 photos. Only one of these planes is left intact — above sea level, anyway — at the Pima Air and Space Museum in Arizona. The flying boat was difficult to transport on land, so most were scrapped. One sank in Lake Washington in 1949. Navy divers tried to raise the plane in the 1990s but only succeeded in breaking off the tail. Most of it still sits about 70 feet underwater.
The Mariner is also in the Pima museum virtually, thanks to GUE’s photogrammetry efforts. Working with Dr. Megan Lickliter-Mundon, an underwater aviation archaeologist, the divers created a 3D model of the rare plane, which now sits alongside the recovered tail.
Those party cups are ubiquitous, showing up as specks of red in some of GUE’s photogrammetry models.
Recreating wrecks like the PBM Mariner and another sunken plane, the PB4Y-2, requires a lot of photos, which in turn takes a lot of processing power. First, the software analyzes the photos and starts lining them up. It recognizes certain objects, such as a rudder or a wing flap, and starts mapping them out, using photos of the same object taken from different angles. The result is a point cloud, which Beemster Leverenz compares to connect-the-dots. The shape is there; it’s just not filled in.
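The photo-alignment step rests on a standard computer-vision idea: once the same feature is spotted in two photos taken from different positions, its 3D location can be triangulated. Below is a minimal sketch of that calculation using the textbook direct linear transform (DLT); the camera matrices and pixel coordinates are invented for illustration, and a commercial package like Agisoft's does vastly more (feature detection, matching, and bundle adjustment across thousands of images).

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its projections in two photos.

    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image coords.
    Direct linear transform: each view contributes the constraints
    u*P[2] - P[0] = 0 and v*P[2] - P[1] = 0; stack them and take the
    null space of the resulting 4x4 system via SVD.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]            # homogeneous solution (last right singular vector)
    return X[:3] / X[3]   # dehomogenize

# Two toy cameras: identical intrinsics, the second shifted one unit along x.
P1 = np.hstack([np.eye(3), [[0], [0], [0]]])
P2 = np.hstack([np.eye(3), [[-1], [0], [0]]])
point = np.array([1.0, 2.0, 10.0])        # "true" feature position
x1 = point[:2] / point[2]                 # its projection in photo 1
x2 = (point[:2] + [-1, 0]) / point[2]     # its projection in photo 2
print(triangulate(P1, P2, x1, x2))        # ~ [1. 2. 10.]
```

Run over every matched feature in every photo pair, this is what yields the cloud of dots the software then connects.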
Next, the computer connects those dots into a mesh. “The mesh doesn’t actually have the color to it,” he said. “It’s really similar [to] putting together a plastic model before you painted it.” The white mesh looks like a plane, but it doesn’t have the details and definition needed to distinguish certain parts. The third step is to layer the details from the photos on top of the mesh, a sort of “coloring in” process.
For GUE’s latest project, the PB4Y-2, Beemster Leverenz was able to recruit a non-diver to help. Patrick Goodwin works for Dice, which makes the Battlefield video game series. He and Beemster Leverenz have a mutual friend and happened to start discussing photogrammetry via voice chat while playing a video game together. Dice uses photogrammetry to realistically bring real-world objects and places, like the Alps, into games. Goodwin optimizes models to make them wieldy; if they’re overly detailed, they become too overloaded with data to spin smoothly and view from every angle. The plane’s rivets, for example, don’t need to be built into the model when they can be projected on top instead. It’s like the difference between painting individual stripes and slapping on a decal.
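The simplest way to picture this kind of optimization pass is vertex-clustering decimation: snap nearby vertices into grid cells, merge each cluster into one vertex, and throw away triangles that collapse. This is a toy sketch of that one idea, not Dice's actual tooling, and the mesh data here is invented for illustration.

```python
import numpy as np

def decimate(vertices, triangles, cell_size):
    """Vertex-clustering decimation: merge all vertices sharing a grid
    cell into their average, then drop triangles left degenerate."""
    cells, labels = {}, np.empty(len(vertices), dtype=int)
    for i, v in enumerate(vertices):
        key = tuple(np.floor(v / cell_size).astype(int))
        labels[i] = cells.setdefault(key, len(cells))
    # New vertex positions: the mean of every vertex mapped into each cell.
    merged = np.zeros((len(cells), 3))
    np.add.at(merged, labels, vertices)
    merged /= np.bincount(labels)[:, None]
    # Re-index triangles and discard any with repeated (collapsed) corners.
    tris = labels[np.asarray(triangles)]
    ok = (tris[:, 0] != tris[:, 1]) & (tris[:, 1] != tris[:, 2]) & (tris[:, 0] != tris[:, 2])
    return merged, tris[ok]

# Five vertices, two triangles; a coarse 0.5-unit grid merges the three
# vertices clustered near the origin, leaving one triangle of the two.
verts = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0],
                  [1, 0, 0], [1, 1, 0]], dtype=float)
tris = [[0, 1, 3], [0, 3, 4]]
new_verts, new_tris = decimate(verts, tris, cell_size=0.5)
print(len(new_verts), len(new_tris))  # 3 1
```

Production pipelines favor gentler, error-aware schemes (quadric decimation and the like), and, as with the rivets, bake the lost surface detail back on as textures rather than geometry.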
In addition, Goodwin is helping render some of the environment around the wreck. “If you want to make a model of a blank white room, you can’t do it,” Beemster Leverenz said. The software needs contrast to create the model. The plane itself has that, but the ground it rests on doesn’t. “It’s just sort of a flat greenish, yellowish nothingness,” he said. But the surroundings are necessary to provide context. Without them, “you end up with a model of an airplane that doesn’t look like it’s actually crashed into anything,” he added. Sometimes the contrast comes from unexpected places — a crinkled Target bag or a red solo cup. Those party cups are ubiquitous, showing up as specks of red in some of GUE’s photogrammetry models.
Though everyone survived both the PB4Y-2 and PBM Mariner sinkings, the fact that man-made objects litter these aquatic floors is depressing — even if they are being reclaimed by marine life. There are ways to use photogrammetry to help nature as well, Beemster Leverenz said. The Marine Science and Technology Center in Des Moines, Washington, has considered creating an artificial reef in Puget Sound — to replace waterlogged VW Beetles and other makeshift marine habitats. Photogrammetry could be a non-destructive way to measure the reef’s growth over time. Hopefully, it will stay free of solo cups.