
Watch your toes. Snapchat’s new lens turns the ground into hot lava


Kids who grew up pretending the floor was hot lava can now actually turn the ground into a molten mess using a smartphone and Snapchat’s newest augmented reality technology. Snap Inc. has launched ground segmentation World Lenses, a new tool that recognizes where the ground is in a photograph in order to douse it with water or, yes, turn it into hot lava.

The feature demonstrates just how far Snapchat’s augmented reality technology has come from simply placing a dancing hot dog into a scene. The ground segmentation technology uses machine learning to identify which parts of the image are the ground and which are not, a challenge that involved teaching a computer the geometry and semantics of the real world.
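
Snap has not published the details of its model, but the general shape of such a segmentation pass is easy to sketch: a network scores every pixel, and thresholding those scores yields a ground/not-ground mask. In the toy Python sketch below, run_segmentation_net is a hypothetical stand-in for the trained network, not Snap's actual method:

```python
import numpy as np

def run_segmentation_net(image):
    """Hypothetical stand-in for the trained network: returns a per-pixel
    'ground' probability map. A real model would be a convolutional network;
    this placeholder just assumes ground is likelier toward the bottom of
    the frame."""
    h, w, _ = image.shape
    rows = np.linspace(0.0, 1.0, h).reshape(h, 1)  # 0.0 at top, 1.0 at bottom
    return np.repeat(rows, w, axis=1)              # probabilities, shape (h, w)

def ground_mask(image, threshold=0.6):
    """Threshold the probability map into a boolean ground/not-ground mask."""
    return run_segmentation_net(image) > threshold

frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
mask = ground_mask(frame)
print(f"{mask.mean():.0%} of pixels classified as ground")
```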

Once the camera recognizes where the ground is, the AR lenses will turn it into hot lava, complete with steam and unscathed patches of ground to jump on. Or, the “floor is water” lens will flood the ground and add a reflection of everything else in the scene.
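
Once a mask exists, the effects themselves can be approximated with straightforward compositing: paint a texture into the masked pixels for lava, or blend a mirrored copy of the scene into them for water. The sketch below is purely illustrative and assumes a frame and ground mask like those above; it is not Snap's renderer:

```python
import numpy as np

def floor_is_lava(frame, mask):
    """Paint a flat red-orange fill (a stand-in for a lava texture) into the
    pixels the mask marks as ground."""
    out = frame.copy()
    lava = np.zeros_like(frame)
    lava[..., 0] = 230  # R
    lava[..., 1] = 80   # G
    out[mask] = lava[mask]
    return out

def floor_is_water(frame, mask, opacity=0.7):
    """Blend a vertically mirrored copy of the scene into the ground pixels,
    a crude version of the reflection the 'floor is water' lens shows."""
    out = frame.astype(np.float32)
    reflection = np.flipud(frame).astype(np.float32)
    out[mask] = opacity * reflection[mask] + (1 - opacity) * out[mask]
    return out.astype(np.uint8)

frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
mask = np.zeros((480, 640), dtype=bool)
mask[300:, :] = True  # pretend the lower band of the frame is ground
lava_frame = floor_is_lava(frame, mask)
water_frame = floor_is_water(frame, mask)
```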

The new World Lenses were built with an internal version of Lens Studio, the desktop platform that allows users to create their own lenses. Snap team members used that internal build to design the effects, and Snap says the ability to recognize the floor may eventually come to the widely available Lens Studio as well.

Snap calls the ground segmentation “a natural evolution and next step for us in understanding what the camera can see and helping our community learn more about the world around them.” Snapchat launched World Lenses in 2017, expanding beyond the selfie lenses that use facial recognition to apply the filters Snapchat is best known for. Ground segmentation builds on options like sky replacement filters, and other World Lenses that go well beyond that now-iconic dancing hot dog include the ability to transform famous landmarks using AR.

To find the ground segmentation lenses, update Snapchat, then head into the camera view with the rear-facing camera. (Double-tap the screen to switch cameras if you are in selfie mode.) Tap the camera screen to bring up the lens carousel and look for the new options called “floor is water” and “floor is lava.”
