Apple secretly adds AR-powered FaceTime eye correction in iOS 13

While we knew iOS 13 was going to contain a lot of useful additions outside of the headline features like Dark Mode, we didn’t expect Apple to add AR-powered eye correction to FaceTime video calls. But that seems to be exactly what it’s done in the most recent update for the iOS 13 public beta.

Spotted by app designer Mike Rundle (and signal-boosted by surprised folks on Reddit’s Apple subreddit), the iOS 13 public beta now includes an option for “FaceTime Attention Correction.” According to the feature’s tooltip, turning it on will increase the accuracy of your eye contact with the camera during FaceTime video calls. What does that mean? AR black magic trickery, basically.

Haven’t tested this yet, but if Apple uses some dark magic to move my gaze to seem like I’m staring at the camera and not at the screen I will be flabbergasted. (New in beta 3!) pic.twitter.com/jzavLl1zts

— Mike Rundle (@flyosity) July 2, 2019

It all comes down to a minor but irritating flaw that FaceTime — and, admittedly, every other video-calling app — suffers from. If you’re looking at your screen to see the person you’re talking to, then you’re not looking at the camera. And if you’re not looking at the camera, it doesn’t seem as if you’re looking at the person you’re calling — which leads to a weird disconnect where no one in the call appears to be looking directly at anyone else.

Apple’s new setting changes that, subtly altering your video stream so it seems as if you’re looking directly at the person on the other end of the call. People were quick to try it out, and immediately noticed that the setting is fairly effective.

So how does it work? It’s a combination of Apple’s ARKit augmented reality software and the TrueDepth cameras built into the latest iPhones. FaceTime uses the TrueDepth camera to grab a depth map of your face — much like Face ID does — and then runs the data through ARKit, creating a slightly altered version of your eyes and nose with a new focus. Thanks to the processing power of the most recent iPhones, this happens in real time, making the process seamless. In a video, Dave Schukin shows how it’s done.

How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.

Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN

— Dave Schukin (@schukin) July 3, 2019
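
For the technically curious, here’s roughly what reading that kind of gaze data looks like. The minimal Swift sketch below uses ARKit’s public face-tracking API (ARFaceTrackingConfiguration, plus ARFaceAnchor’s lookAtPoint and per-eye transforms) to pull per-frame eye data from the TrueDepth camera. To be clear, this only illustrates the raw data a gaze-correction effect could build on; Apple’s actual FaceTime implementation isn’t public.

import ARKit

// Minimal sketch: reading TrueDepth gaze data via ARKit face tracking.
// This is not Apple's FaceTime code, just an illustration of the per-frame
// eye data a gaze-correction effect could build on.
class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Where the eyes are looking, in face-anchor coordinates.
            let gaze = face.lookAtPoint
            // Per-eye positions; a correction pass could re-aim these
            // toward the camera and warp the eye region to match.
            let leftEye = face.leftEyeTransform.columns.3
            let rightEye = face.rightEyeTransform.columns.3
            print("gaze: \(gaze) left: \(leftEye) right: \(rightEye)")
        }
    }
}

From there, a correction pass would re-target the eyes (and the nose, as the warped line in Schukin’s video shows) toward the camera and blend the altered region back into the video stream in real time.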

As ever, there’s a catch to this amazing new feature. It’s only available on the most recent batch of iPhones — so only iPhone XS and XS Max owners are currently able to experience it. Despite packing the same TrueDepth camera, the iPhone X misses out, presumably because it lacks the newer A12 processor. But with Apple being Apple, don’t be surprised if this rolls out to the iPhone X in the full release of iOS 13, or comes to it shortly afterward. It’s also unknown whether the feature will come to macOS and iPadOS, but we’d be surprised if it didn’t.
