
Apple is injecting AR technology into core iOS services, a new patent shows

Although Apple CEO Tim Cook hasn’t officially laid out the company’s etched-in-stone plans for virtual reality and augmented reality, he has hinted that the company may lean more toward the latter, as AR keeps users visually connected with others in the same room while VR experiences completely engulf each viewer. Support for these technologies begins at the operating system level, and a recent patent suggests that Apple may be working on building AR into its core iOS services.

Published by the United States Patent and Trademark Office on Tuesday, Apple’s new patent is simply called “Augmented reality maps.” It was submitted for approval in February 2010, and lists former Apple employee Jaron Waldman as the inventor. Waldman was responsible for developing the location-based services infrastructure for Maps, CoreLocation, and MapKit for iOS and OS X. He left Apple in 2013 to establish Curbside, a service for finding, buying, and retrieving products from nearby stores.


Based on the new patent, Apple’s Maps application won’t rely on a special headset to project augmented map information into the user’s field of view. Instead, the information will be displayed on a mobile device’s screen, and will be based on the user’s current geographical location, camera direction, and the tilt of the device. Essentially, users will point their iOS device in a specific direction, and a search result will display nearby points of interest in an overlay rendered over the camera’s real-time feed.
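
The patent describes behavior rather than implementation details, but the core of such an overlay is deciding which nearby points of interest fall inside the camera’s current field of view. Below is a minimal Swift sketch of that filtering step, assuming a hypothetical PointOfInterest type and standard CoreLocation values for the user’s position and compass heading; none of these names come from the patent itself.

```swift
import Foundation
import CoreLocation

// Hypothetical point of interest; the patent does not define a data model.
struct PointOfInterest {
    let name: String
    let coordinate: CLLocationCoordinate2D
}

// Bearing in degrees, clockwise from true north, from the user to a POI.
func bearing(from user: CLLocationCoordinate2D, to poi: CLLocationCoordinate2D) -> Double {
    let lat1 = user.latitude * .pi / 180
    let lat2 = poi.latitude * .pi / 180
    let dLon = (poi.longitude - user.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}

// Keep only the POIs that fall inside the camera's horizontal field of view,
// given the compass heading the device is currently pointed at.
func visiblePOIs(_ pois: [PointOfInterest],
                 user: CLLocationCoordinate2D,
                 heading: CLLocationDirection,
                 fieldOfView: Double = 60) -> [PointOfInterest] {
    pois.filter { poi in
        let b = bearing(from: user, to: poi.coordinate)
        // Smallest angular difference between the heading and the bearing.
        let diff = abs((b - heading + 540).truncatingRemainder(dividingBy: 360) - 180)
        return diff <= fieldOfView / 2
    }
}
```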

For instance, the owner of an iPhone may load up Maps and point the device in a specific direction when visiting Washington, D.C. In turn, Maps will render relevant information in a layer drawn over the camera’s real-time video feed displayed on the screen. This information will stay “attached” to those real-world points of interest no matter how the iPhone’s point of view shifts. If the user then selects a location on the screen, Maps will display directional information as the user moves toward that location.
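
Keeping a label “attached” to a real-world landmark as the phone pans comes down to mapping the angle between the compass heading and the landmark’s bearing onto a horizontal screen position. The sketch below illustrates that projection in Swift; the fixed 60-degree field of view and the function names are illustrative assumptions, and it pairs with a bearing calculation like the one sketched above.

```swift
import CoreLocation
import UIKit

// Map a POI's bearing to a horizontal screen coordinate, or return nil when
// the POI sits outside the camera's assumed field of view.
func screenX(forBearing poiBearing: Double,
             heading: CLLocationDirection,
             screenWidth: CGFloat,
             fieldOfView: Double = 60) -> CGFloat? {
    // Signed angular offset, normalized to the range -180...180 degrees.
    let offset = (poiBearing - heading + 540).truncatingRemainder(dividingBy: 360) - 180
    guard abs(offset) <= fieldOfView / 2 else { return nil }  // off-screen
    // Linearly map [-fov/2, +fov/2] onto [0, screenWidth].
    return CGFloat(offset / fieldOfView + 0.5) * screenWidth
}
```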

Apple AR Maps Patent

That said, the AR version of Maps relies on several components: a built-in camera, GPS connectivity, a digital compass to determine the camera’s direction, an accelerometer to determine the camera’s tilt orientation, and internet access for retrieving and servicing map information requests based on the area surrounding the current GPS coordinates. That information will include “descriptive” data about features such as places, roads, buildings, and other points of interest.
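
Most of those inputs map onto sensors iOS already exposes. The following Swift sketch shows one way to gather them, using CLLocationManager for the GPS position and compass heading and CMMotionManager for the device’s tilt; the class and property names are illustrative, not drawn from the patent.

```swift
import CoreLocation
import CoreMotion

// Illustrative reader for the inputs the patent lists: location, compass
// heading, and tilt.
final class ARMapSensorReader: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    private(set) var location: CLLocation?   // GPS position
    private(set) var heading: CLHeading?     // digital compass direction
    private(set) var pitch: Double = 0       // device tilt, in radians

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
        locationManager.startUpdatingHeading()

        motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            // Sensor fusion of the accelerometer and gyroscope yields the
            // camera's tilt as the device attitude's pitch.
            self?.pitch = motion?.attitude.pitch ?? 0
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        location = locations.last
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        heading = newHeading
    }
}
```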

Of course, we can’t forget the graphics processor that will generate the augmented information and the real-time video feed on the same screen:

“The graphics processor configured to: overlay, on the video image, map data representing each of the one or more identified points of interest and an illustrated portion of a route from the GPS location identified by the GPS device to a selected one of the identified points of interest, the route being determined based on the map data; and update the illustrated portion of the route as the GPS location of the video capture device changes with movement along the route; and the display configured to present the video image that has been overlaid.”
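
The claim’s second half, updating the drawn route as the GPS location changes, can be reduced to trimming the already-travelled portion of the route on each location update. Here is a minimal Swift sketch of that idea, treating the route as a hypothetical array of coordinates and using an assumed 25-meter threshold for “passed” waypoints; the patent does not specify either detail.

```swift
import CoreLocation

// Return the portion of the route the user still has to travel, dropping the
// leading waypoints that fall within the "passed" threshold of the current
// location.
func remainingRoute(_ route: [CLLocationCoordinate2D],
                    from current: CLLocation,
                    passedThreshold: CLLocationDistance = 25) -> [CLLocationCoordinate2D] {
    guard let nextIndex = route.firstIndex(where: { waypoint in
        let point = CLLocation(latitude: waypoint.latitude, longitude: waypoint.longitude)
        return current.distance(from: point) > passedThreshold
    }) else {
        return []  // every waypoint is within the threshold: destination reached
    }
    return Array(route[nextIndex...])
}
```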

If anything, the patent reveals that Apple has tooled around with AR technology for at least six years. Naturally, advances in mobile technology make the patent more relevant to today’s devices than to the hardware that powered the iPhone 4 so many years ago. It also shows that Apple is indeed building an AR foundation into iOS, and hopefully we’ll see all that work go live with an upcoming iOS release in the near future.

Kevin Parrish
Former Digital Trends Contributor