
Snapchat now uses the iPhone X’s TrueDepth Camera to make better filters


Snapchat has updated its app to take advantage of the advanced facial recognition TrueDepth sensors on the iPhone X, delivering users even more spectacular Lenses.


If you’re a Snapchat user, or you frequent social media, you’re likely aware of Snapchat’s Lenses. A long-running feature of the app, these AR filters let you morph your face, overlay various masks, or wear an animal face that tracks your movements. They can be clunky, though: for most users, the overlays break if you move out of frame or move too quickly. iPhone X owners will find their Lenses work much better, thanks to Snapchat’s new integration with the phone’s facial tech.

Apple’s TrueDepth tech uses a series of sensors housed in the front notch to project 30,000 infrared dots onto your face, mapping its structure in 3D. It’s the tech that powers the iPhone X’s Face ID, and Snapchat is now leveraging it to make filters that are far more realistic. Thanks to the additional data TrueDepth provides, Snapchat’s filters can apparently compensate for ambient light, creating shadows and highlights where needed to follow the contours of your face and adapt to your surroundings.
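
To give a sense of what this looks like for a developer, here is a minimal Swift sketch of how a third-party app can read TrueDepth face data through Apple’s ARKit framework. Snapchat hasn’t detailed its own implementation, so the class name and logging below are purely illustrative, but ARFaceTrackingConfiguration, ARFaceAnchor, and ARDirectionalLightEstimate are the ARKit types that expose the face mesh and the directional light estimate a filter could use.

```swift
import ARKit

// Illustrative sketch: read the TrueDepth face mesh and light estimate via ARKit.
// The class name and print statements are assumptions for demonstration only.
class FaceTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true // ask ARKit for scene lighting
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for faceAnchor in anchors.compactMap({ $0 as? ARFaceAnchor }) {
            // ARFaceGeometry exposes the 3D mesh of the user's face,
            // which an effect could use to wrap textures over facial contours.
            print("Face mesh updated with \(faceAnchor.geometry.vertices.count) vertices")
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // With face tracking, the light estimate is directional, so an effect
        // can place shadows and highlights consistent with ambient light.
        if let estimate = frame.lightEstimate as? ARDirectionalLightEstimate {
            print("Primary light direction: \(estimate.primaryLightDirection)")
        }
    }
}
```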

You might worry about companies exploiting this data for their own marketing schemes, or using the data from your Lenses to bypass your Face ID security, but there’s no need. While Apple gives developers access to certain parts of the TrueDepth system, the developer agreement limits them to the visually mapped facial data, not the mathematical representation used by Face ID. Apple also specifically bans developers from using this data for marketing purposes, selling it to other companies, or using it to build a marketing profile of specific users.

Augmented reality, or AR, is getting bigger and bigger in mobile tech. We’ve seen a recent surge in AR apps thanks to the release of Google’s ARCore framework (here are our favorite AR apps), and we’re even seeing fun AR features built into new phones, like Apple’s Animoji and Samsung’s AR Emoji.

Mark Jansen
Mobile Evergreen Editor