
Snapchat now uses the iPhone X’s TrueDepth Camera to make better filters

Snapchat TrueDepth

Snapchat has updated its app to take advantage of the iPhone X’s advanced TrueDepth facial recognition sensors, delivering even more spectacular Lenses to users.

If you’re a Snapchat user, or if you frequent social media, you’re likely aware of Snapchat’s Lenses. A long-running feature of the app, these AR filters let you morph your face, overlay various masks, or wear an animal mask that tracks your facial movements. They’re not without their clunkiness, though: on most phones, the overlays break if you move out of frame or move too quickly. iPhone X owners, however, will find their Lenses work much better, thanks to Snapchat’s new integration with the iPhone X’s advanced facial tech.

Apple’s TrueDepth tech uses a series of sensors mounted in the front notch to paint 30,000 infrared dots onto your face, mapping its structure in 3D. It’s the tech that powers the iPhone X’s Face ID, and it’s now being leveraged by Snapchat to make filters that are far more realistic. Thanks to the additional data provided by TrueDepth, Snapchat’s filters can apparently compensate for ambient light, creating shadows and highlights where needed to follow the contours of your face and adapt to your surroundings.
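For developers curious what that integration looks like under the hood, here is a minimal sketch (not Snapchat’s actual code) of how an iOS app reads TrueDepth face data through Apple’s ARKit face-tracking API; the class and variable names are illustrative.

import ARKit
import UIKit

// A minimal sketch of ARKit face tracking on TrueDepth hardware.
class FaceFilterViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called each time ARKit updates the tracked face anchor.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // A per-frame 3D mesh of the face, which a filter can pin to its contours.
            let vertexCount = faceAnchor.geometry.vertices.count
            // Blend-shape coefficients describe expressions, e.g. how far the jaw is open.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("mesh vertices: \(vertexCount), jawOpen: \(jawOpen)")
        }
    }

    // ARKit also estimates scene lighting, which a filter could use to add
    // matching shadows and highlights, as described above.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let light = frame.lightEstimate {
            print("ambient intensity: \(light.ambientIntensity) lumens")
        }
    }
}

Note that what the API exposes is a face mesh and expression coefficients, not the raw Face ID data, which is relevant to the privacy point below.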

You might be worried about companies exploiting this data for their own marketing schemes, or using the data from your Lenses to bypass your Face ID security, but there’s no need to worry. While Apple gives developers access to certain TrueDepth data, its developer agreement limits them to the visual face map; they never see the mathematical representation used by Face ID. Apple also specifically bans developers from using this data for marketing purposes, selling it to other companies, or building marketing profiles of specific users from it.

Augmented reality, or AR, is getting bigger and bigger in mobile tech. We’ve seen a recent surge in AR apps thanks to the release of Google’s ARCore framework (here are our favorite AR apps), and we’re even seeing fun AR features built into new phones, like Apple’s Animoji or Samsung’s AR Emoji.

Mark Jansen