Apple is infusing core iOS services with AR technology, a new patent shows

Although Apple CEO Tim Cook hasn’t officially laid out the company’s etched-in-stone plans for virtual reality and augmented reality, he has hinted that the company may lean more towards the latter, as AR keeps users visually connected with others in the same room while VR experiences completely engulf each viewer. Support for these technologies begins at the operating system level, and a recent patent reveals that Apple could be currently working on implementing AR into its base iOS services.

Published by the United States Patent and Trademark Office on Tuesday, Apple’s new patent is simply called “Augmented reality maps.” It was submitted for approval in February 2010, and lists former Apple employee Jaron Waldman as the inventor. Waldman was responsible for developing the location-based services infrastructure for Maps, CoreLocation, and MapKit for iOS and OS X. He left Apple in 2013 to establish Curbside, a service for finding, buying, and retrieving products from nearby stores.

Based on the new patent, Apple’s Maps application won’t utilize a special headset to project augmented map information into the user’s field of view. Instead, the information will be displayed on a mobile device’s screen, based on the user’s current geographical location, camera direction, and the tilt of the device. Essentially, users will point their iOS device in a specific direction, and a search result will display nearby points of interest in an overlay rendered over the camera’s real-time feed.
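The patent itself doesn’t publish code, but the core of this overlay is straightforward geometry: compute the compass bearing from the user to each point of interest, compare it to the camera’s current heading, and place the label at the corresponding horizontal position on screen. Here’s a minimal sketch of that idea in Python; the function names, the 60-degree field of view, and the 1080-pixel width are our own illustrative assumptions, not anything specified in the patent.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, heading, fov=60, width=1080):
    """Horizontal pixel where a POI's label lands, given the camera's compass
    heading and horizontal field of view. Returns None when the POI is
    outside the current view."""
    delta = (poi_bearing - heading + 180) % 360 - 180  # signed offset, -180..180
    if abs(delta) > fov / 2:
        return None
    return width / 2 + (delta / fov) * width
```

Because `screen_x` is recomputed from the live compass heading on every frame, a label naturally stays “attached” to its real-world landmark as the phone pans, which matches the behavior the patent describes.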

For instance, the owner of an iPhone may load up Maps and point the device in a specific direction when visiting Washington, D.C. In turn, Maps will render relevant information in a layer drawn over the camera’s real-time video feed displayed on the screen. This information will stay “attached” to those real-world points of interest no matter how the iPhone’s point of view shifts. If the user then selects a location on the screen, Maps will display directional information as the user moves toward that location.

Apple AR Maps Patent

As the patent describes it, the AR version of Maps relies on several components: a built-in camera, GPS connectivity, a digital compass to determine the camera’s direction, an accelerometer to determine the camera’s tilt, and internet access for retrieving map information covering the area surrounding the current GPS coordinates. That information will include “descriptive” data about features such as places, roads, buildings, and other points of interest.
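The “area surrounding the current GPS coordinates” step amounts to a radius query against the map data. As a rough sketch of how a device might filter candidate points of interest before drawing anything, here is a small Python function using an equirectangular distance approximation (adequate at city scale); the data shape and the 500-meter default radius are our own assumptions for illustration.

```python
import math

def nearby_pois(pois, lat, lon, radius_m=500):
    """Return (distance, name) pairs for POIs within radius_m of the current
    GPS fix, nearest first. Uses an equirectangular approximation, which is
    accurate enough over the short distances an AR overlay cares about."""
    R = 6371000  # mean Earth radius, meters
    out = []
    for poi in pois:
        dlat = math.radians(poi["lat"] - lat)
        dlon = math.radians(poi["lon"] - lon) * math.cos(math.radians(lat))
        d = R * math.hypot(dlat, dlon)
        if d <= radius_m:
            out.append((d, poi["name"]))
    return sorted(out)
```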

Of course, we can’t forget the graphics processor that will generate the augmented information and the real-time video feed on the same screen:

“The graphics processor configured to: overlay, on the video image, map data representing each of the one or more identified points of interest and an illustrated portion of a route from the GPS location identified by the GPS device to a selected one of the identified points of interest, the route being determined based on the map data; and update the illustrated portion of the route as the GPS location of the video capture device changes with movement along the route; and the display configured to present the video image that has been overlaid.”
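The claim’s “update the illustrated portion of the route as the GPS location … changes” suggests the overlay only ever draws the part of the route still ahead of the user. One simple way to model that, sketched below in Python under our own assumptions (the patent doesn’t specify an algorithm), is to snap the current GPS fix to the nearest route point and discard everything before it.

```python
import math

def remaining_route(route, lat, lon):
    """Given a route as a list of (lat, lon) points and the current GPS fix,
    find the route point closest to the user and return the path from there
    onward, so the overlay only illustrates the portion still to be walked."""
    R = 6371000  # mean Earth radius, meters

    def dist(p):
        dlat = math.radians(p[0] - lat)
        dlon = math.radians(p[1] - lon) * math.cos(math.radians(lat))
        return R * math.hypot(dlat, dlon)

    nearest = min(range(len(route)), key=lambda i: dist(route[i]))
    return route[nearest:]
```

Calling this on every GPS update and handing the result to the renderer would reproduce the claimed behavior: the drawn route shrinks behind the user as they move along it.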

If anything, the patent reveals that Apple has tooled around with AR technology for at least six years. Naturally, the advancement of mobile technology makes the patent relevant to today’s devices rather than what powered the iPhone 4 so many years ago. It also shows that Apple is indeed building an AR foundation into iOS, and hopefully we’ll see all that work go live with an upcoming iOS release in the near future.