Apple’s dual cameras brought features such as DSLR-like bokeh to smartphones, but the next camera tech the company is focusing on could make those features look old school in comparison. An anonymous source inside the company recently told Bloomberg that Apple is working on a 3D laser system that expands the camera’s capabilities — and increases the realism of augmented reality effects.
The tech is conceptually similar to the iPhone X’s front-facing sensor, which lets users unlock the phone with their face and add fake studio lighting to selfies, but applies the idea to the rear-facing camera. Rather than projecting a dot pattern, the rear-facing system would measure how long each laser pulse takes to bounce off a surface and return, building a depth map of the scene from those timings. By timing the round trip to every surface, the iPhone could create a much more detailed depth map than what two offset dual cameras can produce.
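The underlying math of time-of-flight sensing is simple: distance is the speed of light multiplied by half the round-trip time. As a rough illustration only (the function name and numbers here are hypothetical, not Apple’s implementation):

```python
# Hypothetical sketch of time-of-flight depth measurement: distance is
# derived from how long a laser pulse takes to travel out and bounce back.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Return the distance in meters to a surface, given the round-trip
    time of a laser pulse (out and back, hence the division by 2)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 13.34 nanoseconds corresponds to a
# surface about 2 meters away.
print(depth_from_round_trip(13.34e-9))  # ~2.0 meters
```

The nanosecond-scale timings involved are why such systems need dedicated laser hardware rather than a conventional image sensor alone.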
A better depth map could in turn be a big boon for augmented reality. With more data about where everything is in the scene, the future iPhone could create a more realistic placement of AR objects. If the smartphone understands the scene in three dimensions, the placement of virtual objects, and even movement, could be adjusted to fit within that specific space.
The unnamed source suggested the feature could arrive in 2019, but cautioned that it may not make it into a final product if the 3D laser system doesn’t perform well in testing. Since Apple doesn’t discuss unreleased products, the feature joins a growing list of iPhone rumors. The company is rumored to be releasing a more budget-friendly iPhone alongside a larger, pricier model in 2018. While initial reports suggested those 2018 models could bring the TrueDepth tech behind Face ID to the rear camera, analysts later said the company would instead keep that feature in the front-facing camera for now, a claim that supports the rear-facing system arriving in 2019.
Depth information is becoming increasingly available in high-end smartphones, but the tech behind the feature varies. Apple’s dual-camera iPhones (including the 7 Plus and the X) compare the views from two different lenses to add depth effects. The Google Pixel 2 instead compares the views from opposite sides of each pixel, a variation that piggybacks on its dual pixel autofocus system.
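The dual-lens approach estimates depth from parallax: a nearby object shifts more between the two views than a distant one. A minimal sketch of the standard stereo relationship, with illustrative numbers (the function and values are assumptions for demonstration, not any phone’s actual calibration):

```python
# Hypothetical sketch of stereo depth estimation from two offset lenses.
# Disparity is how far (in pixels) a point shifts between the two views;
# closer objects shift more, so depth is inversely proportional to it.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Approximate distance in meters to a point seen by both lenses.

    focal_length_px: lens focal length expressed in pixels
    baseline_m: physical separation between the two lenses in meters
    disparity_px: pixel shift of the point between the two views
    """
    return focal_length_px * baseline_m / disparity_px

# With a 1000-pixel focal length and lenses 1 cm apart, a 5-pixel shift
# places the point about 2 meters away.
print(depth_from_disparity(1000.0, 0.01, 5.0))  # 2.0 meters
```

The short baseline between two smartphone lenses is what limits this method’s precision, and it is why a time-of-flight laser system could deliver a noticeably more detailed depth map.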