
Apple may introduce augmented reality functionality into the iPhone’s camera

Report: Apple has up to 1,000 engineers working on AR product for the iPhone

Google isn’t the only company looking to cash in on the augmented reality craze. Business Insider reports that Apple plans to integrate augmented reality tech directly into the iPhone’s camera app.

The new feature would reportedly be built on computer vision: Apple’s work-in-progress camera app will be able to identify objects in the frame and recognize faces, according to the report. The technology will also be made available to developers in the form of a software development kit.


The most recent news about Apple’s augmented reality plans suggests the company may have as many as 1,000 engineers working on an AR-related product in Israel, a product that will make its way to the next iPhone, at least according to analyst Steven Milunovich and his team at UBS. The Business Insider report also highlights that Apple has recently made a number of AR-related purchases, including PrimeSense, a 3D sensing company in Tel Aviv, and RealFace, a facial recognition company also based in Tel Aviv.


The Cupertino, California-based company is far from the first to enter the AR arena. Google Goggles, the search giant’s object recognition app, recognizes landmarks, barcodes, books, and works of art, and parses the text of labels and signage using optical character recognition. And Amazon’s Flow app can decode QR codes, scan business cards, and recognize tens of millions of books, DVDs, and packaged products.

Apple’s system sounds nearly as ambitious: an app that can identify objects that users point the iPhone’s camera at in real time. It will rely on machine learning, a type of artificial intelligence that “learns” and improves over time, and a database of 3D objects that Apple will either license or build itself.
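For a sense of what such an SDK could look like to developers, here is a minimal sketch of on-device image classification using Apple’s Vision framework. To be clear, Vision is not confirmed as the SDK the report describes (it shipped later, and the VNClassifyImageRequest API used here requires iOS 13), so treat this as illustrative only:

```swift
import CoreGraphics
import Vision

// Hypothetical sketch of real-time object recognition, the kind of capability
// the reported camera SDK would expose. Not Apple's actual reported API.
func classifyObjects(in image: CGImage) {
    let request = VNClassifyImageRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNClassificationObservation] else { return }
        // Keep only labels the classifier is reasonably confident about.
        for observation in observations where observation.confidence > 0.3 {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request]) // Runs synchronously on the calling thread.
}
```

A real camera app would feed successive frames from an AVCaptureSession into a handler like this to get the “identify objects in real time” behavior the report describes.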

Beyond those basics, the project’s implications aren’t clear. Google’s Project Tango, an augmented reality platform, leverages dedicated sensors to measure the depth of its surroundings, but the iPhone lacks the hardware necessary for that sort of tracking. AppleInsider speculates that Apple’s brand of machine-learning-powered object tracking could be used for spatial mapping, and that facial recognition could be used to apply Snapchat-style filters to people’s faces.
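As a hypothetical illustration of the face-recognition half, the sketch below detects faces and their landmarks with Vision’s VNDetectFaceLandmarksRequest, the raw data a Snapchat-style filter needs before it can anchor graphics to a face. Again, this is a sketch of the general technique, not Apple’s reported system:

```swift
import CoreGraphics
import Vision

// Hypothetical sketch: find faces and their landmark points, the first step
// toward overlaying Snapchat-style graphics. Illustrative only.
func detectFaces(in image: CGImage) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is normalized to the image: (0,0) bottom-left, (1,1) top-right.
            print("Face at \(face.boundingBox)")
            // Landmark regions (eyes, nose, mouth...) supply the points a filter tracks.
            if let nose = face.landmarks?.nose {
                print("Nose outline: \(nose.normalizedPoints)")
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Since the coordinates come back normalized, a filter would convert them into view coordinates before drawing anything on screen.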

Spearheading the project is a team composed of employees from recent Apple acquisitions. The iPhone maker purchased PrimeSense, the Israeli company behind the motion-tracking hardware in Microsoft’s Kinect sensor, in 2013. It bought Metaio in May 2015, FaceShift later that year, and Flyby Media in January 2016, all of which specialized in virtual reality and AR technologies. And in September, it hired a senior optics manufacturing engineer specializing in heads-up displays, camera systems, and image sensors.

The project builds on another reportedly in development at Apple’s Cupertino headquarters: AR glasses. According to Business Insider, Apple is developing a compact pair of AR glasses that connects wirelessly with an iPhone and shows images and other information in the wearer’s field of vision.

If anything is certain, it’s that an updated camera app will debut far ahead of a headset. Bloomberg reports that Apple has begun talking with suppliers about the glasses and has ordered “small quantities” of displays for testing. The publication pegs 2018 as the product’s earliest possible release window.

Until then, Snapchat’s Spectacles will have to do.

Updated on 03-02-2017 by Christian de Looper: Added new Business Insider report saying that Apple had as many as 1,000 engineers working on AR.

Kyle Wiggers
Former Digital Trends Contributor
I’ve used the iPhone 16 Pro Max for 6 months. Here’s why I love it

I bought the Apple iPhone 16 Pro Max when it was announced and have used it every day since then, racking up six months of use, and yet I’ve written very little about it. It’s time to change that, explain why it is technically my only “permanent” phone, and why I think it’s superb.
How I use my iPhone

I have two SIM cards: my “main” SIM, which is attached to the phone number I use, and a second SIM I use mostly for data. They live in different phones. My main SIM is switched in and out of review Android phones all the time, while the data SIM lives in my Apple iPhone. Both are always with me, and since September 2024 I’ve used the Apple iPhone 16 Pro Max alongside whatever Android phone I’m reviewing.

Apple might serve up a massive front camera upgrade on the iPhone 17

Apple leak watchers are currently obsessed with the controversial iPhone 17 Pro design refresh, which could stir heated debate with its massive camera hump. A lot of chatter is also focused on the svelte iPhone 17 Air. Yet it seems there are a few other internal upgrades worth getting excited about.
According to analyst Jeff Pu, Apple will equip all four iPhone 17 series models with an upgraded 24-megapixel front camera. So far, Apple has stuck with a 12-megapixel selfie camera on its mainline iPhones. Moreover, the company has never deployed a 24-megapixel camera sensor, keeping its experiments limited to 12-megapixel and 48-megapixel units in the past few years.
The research note by Pu, which was seen by MacRumors and 9to5Mac, doesn’t go into detail about the specifications or features of the new 24-megapixel front camera on the iPhone 17 series. However, we can take an educated guess based on what Apple accomplished when it switched from 12-megapixel to 48-megapixel rear cameras.

A 24-megapixel sensor will most likely default to pixel binning, delivering pictures and videos at a lower resolution than the native pixel count. Pixel binning essentially combines the light data collected by adjacent pixels, creating what is colloquially known as a super-pixel.
The result is pictures that are more detailed, with more realistic color rendering, especially in low-light scenarios. Depending on how the pixels are combined, the final image is usually a lower-resolution shot, but one that’s more pleasing to look at.
For example, the iPhone 16 Pro’s 48-megapixel main camera does 4-in-1 pixel binning to produce 12-megapixel pictures, but you can still shoot full-resolution 48-megapixel images, too. There’s also an intermediate option to get the best of both worlds with 24-megapixel shots.
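To make the binning arithmetic concrete, here is a toy sketch of 4-in-1 (2x2) binning under the simplifying assumption of a grayscale sensor readout; real sensors combine light data within color-filter groups in hardware:

```swift
// Toy model of 4-in-1 (2x2) pixel binning. Each 2x2 block of sensor values
// is averaged into one "super-pixel", halving width and height, which is why
// a 48-megapixel readout yields a 12-megapixel image.
func bin2x2(_ sensor: [[Double]]) -> [[Double]] {
    let rows = sensor.count / 2
    let cols = (sensor.first?.count ?? 0) / 2
    var binned = Array(repeating: Array(repeating: 0.0, count: cols), count: rows)
    for r in 0..<rows {
        for c in 0..<cols {
            binned[r][c] = (sensor[2*r][2*c]     + sensor[2*r][2*c + 1]
                          + sensor[2*r + 1][2*c] + sensor[2*r + 1][2*c + 1]) / 4.0
        }
    }
    return binned
}

// Example: a 2x2 readout collapses to a single super-pixel.
// bin2x2([[1, 2], [3, 4]]) == [[2.5]]
```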
With a 24-megapixel selfie camera coming into the picture, iPhone 17 buyers can expect improved selfies and better-quality video calls. Moreover, since there are more pixels to collect light data, Apple might leverage it to offer more advanced camera features, too.

Apple’s rumored foldable could be the most expensive iPhone by far

If you're waiting on Apple's rumored foldable iPhone, start saving your pennies. And nickels, dimes, and quarters, too. Analyst Tim Long told Barclays the first foldable iPhone could start in the $2,300 range, which would make it nearly double the price of the current most expensive iPhone (the iPhone 16 Pro Max) and one of the priciest handsets on the market.

This announcement follows rumors that the foldable iPhone will enter mass production sometime in 2026 or 2027, and it lines up with what tipster Ming-Chi Kuo predicted for the price. Even so, the rumored handset has gained a lot of attention from iPhone fans, and the expected demand may be great enough that the higher price tag won’t hurt sales.
