While we knew iOS 13 was going to contain a lot of useful additions outside of the headline features like Dark Mode, we didn’t expect Apple to add AR-powered eye correction to FaceTime video calls. But that seems to be exactly what it’s done in the most recent update for the iOS 13 public beta.
According to app designer Mike Rundle (and signal-boosted by the surprised folks on Reddit’s Apple subreddit), the iOS 13 public beta now includes an option for “FaceTime Attention Correction.” According to the feature’s tooltip, turning this on will increase the accuracy of your eye contact with the camera during FaceTime video calls. What does that mean? AR black magic trickery, basically.
Haven’t tested this yet, but if Apple uses some dark magic to move my gaze to seem like I’m staring at the camera and not at the screen I will be flabbergasted. (New in beta 3!) pic.twitter.com/jzavLl1zts
— Mike Rundle (@flyosity) July 2, 2019
It all comes down to a minor but irritating flaw that FaceTime — and, admittedly, every other video-calling app — suffers from. If you’re looking at your screen to see the person you’re talking to, then you’re not looking at the camera. And if you’re not looking at the camera, then you don’t appear to be looking at the person you’re calling — which leads to a weird disconnect where no one in the call seems to be looking directly at anyone else.
Apple’s new setting changes that, making subtle alterations to your video stream so it appears you’re looking directly at the person on the other end of the call. People were quick to try it out, and many immediately noted that the setting is surprisingly effective.
So how does it work? It’s a combination of Apple’s ARKit augmented reality software and the TrueDepth camera built into the latest iPhones. FaceTime uses the TrueDepth camera to grab a depth map of your face — much like Face ID — and then runs the data through ARKit, creating a slightly altered version of your eyes and nose with a new focus. Thanks to the processing power of the most recent iPhones, this happens in real time, making the effect seamless. In a video, Dave Schukin shows how it’s done.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
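The pipeline described above — capture a depth-aware map of the face, locate the eyes, then warp that region toward a new focal point — can be sketched in miniature. The toy Python snippet below is purely illustrative (the function name `correct_gaze` and the pixel-rolling approach are our own simplification; Apple’s real implementation warps a 3D face mesh via ARKit, not a flat pixel box):

```python
import numpy as np

def correct_gaze(frame: np.ndarray, eye_box: tuple, shift: int) -> np.ndarray:
    """Naively 'redirect' a gaze by shifting the pixels inside an eye
    region by a small vertical offset -- a flat-image stand-in for the
    depth-driven mesh warp ARKit performs on the real face geometry.

    frame:   2D image array
    eye_box: (y0, y1, x0, x1) bounding box around the eyes
    shift:   number of pixels to move the region (toward the camera axis)
    """
    y0, y1, x0, x1 = eye_box
    out = frame.copy()
    # Shift only the eye region; everything outside the box is untouched,
    # which is why the warping is visible on a straight line drawn
    # across the eyes (as in Schukin's video) but nowhere else.
    out[y0:y1, x0:x1] = np.roll(frame[y0:y1, x0:x1], shift, axis=0)
    return out
```

In the real feature, the per-pixel offset would come from the TrueDepth depth map rather than a single constant, which is what produces the smooth bending of straight edges across the eyes and nose seen in the demo.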
As ever, there’s a catch to this amazing new feature: It’s only available on the most recent batch of iPhones, so only iPhone XS and XS Max owners can currently try it. Despite packing the same TrueDepth hardware, the iPhone X misses out. But with Apple being Apple, don’t be surprised if this rolls out to the iPhone X in the full release of iOS 13, or comes to it shortly afterward. At the moment, it’s also unknown whether the feature will come to macOS and iPadOS — but we’d be surprised if it didn’t.