How Apple could bring augmented reality to the masses with ARKit and iOS 11

Some of the biggest tech companies – Apple, Google, and Microsoft, to name a few – seem sure that augmented reality (AR) is going to take smartphones to the next level. So far, it’s all been talk, but that could change before the end of this year. Apple is preparing to unleash ARKit, a brand-new framework for creating this kind of content.

AR has been a priority for Apple for some time. The company has spent a great deal of time and effort hiring staff and acquiring startups to ensure that it has all the talent it needs once the technology is mature enough for the masses. Now, we’re on the verge of Apple’s AR coming-out party: the release of iOS 11.

To understand the implications of ARKit, we spoke to a developer who’s worked on augmented reality tech for the better part of a decade, and has already spent some time putting Apple’s developer kit through its paces.

The ground floor of AR

Jan-Hein Pullens and his team produce AR content for clients in the home furnishing and real estate industries – and demand for their work may skyrocket as Apple attempts to bring the technology to the masses.

However, when Pullens and Pieter Aarts founded RoOomy back in 2009, the technological landscape was very different. It would still be three years before the Oculus Rift Kickstarter campaign even brought virtual reality into the public consciousness. Google Glass, one of the first high-profile AR projects to get underway, wouldn’t be successfully prototyped until mid-2011.

Pullens and Aarts were initially excited by the prospect of giving people a way to see how large pieces of furniture might look in their own home using AR. However, the hardware to run that kind of software simply wasn’t available to the public.

“Eight years ago, for example, there wasn’t an iPad,” said Pullens when he spoke to Digital Trends on the phone last month. “People had desktops, and phones.”

Desktop PCs aren’t ideal for AR content because you can’t move them around to see different angles. And back then, smartphones weren’t much better off. They simply didn’t have the horsepower (or the sensors) needed to present AR software.

Whether you’re trying to sell someone a luxury sofa, or a luxury apartment, it’s crucial that your virtual visualization plays to the strengths of the product. “It needs to be very realistic, otherwise it looks gimmicky and like a game,” he explained.

Today, there’s sufficient hardware and infrastructure for some phone owners to run high-quality AR content. Google Tango is the most well-established platform on the scene right now. Unfortunately, it’s only compatible with two smartphones – the Lenovo Phab 2 Pro and the Asus Zenfone AR. Do you know anyone who owns those phones? Neither do we. But with the iPhone entering the fray, suddenly a huge chunk of smartphone users will be AR capable.

Apple opens the gates

ARKit will be supported by iOS devices that use the Apple A9 or A10 processors – the 2017 iPad, the iPhone 6S, and onward. Admittedly, that does leave the millions of users with older hardware unable to access AR content built using the platform, but it absolutely dwarfs the userbase for Google Tango, Microsoft HoloLens, and every other AR platform.
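
For developers, that cutoff shows up as a simple runtime check. Here’s a minimal Swift sketch, using the ARKit class names as they ship in iOS 11 (the helper function itself is just illustrative):

```swift
import ARKit

// ARWorldTrackingConfiguration.isSupported returns false on devices without
// an A9-or-later chip, so an app can fall back to a non-AR experience
// instead of failing when the session starts.
func startARIfSupported(on sceneView: ARSCNView) {
    guard ARWorldTrackingConfiguration.isSupported else {
        print("ARKit world tracking is not supported on this device")
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)
}
```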

There is another complication. Some of the most sophisticated AR functionality requires specialized sensors, like a depth-sensing camera. It’s true that the iPhone 7 Plus has some depth-sensing capabilities, utilizing two lenses working in sync to measure relative distance. However, in the grander scheme of AR tech, it’s a relatively primitive solution.

In February 2017, there were rumblings that the next iPhone would implement an infrared sensor similar to the one used in Microsoft’s Kinect accessory for the Xbox 360, as reported by The Verge. This kind of sensor would provide much more detailed information on an object’s relative position to the device than the current dual-lens set-up. It’s also rumored that Apple will introduce some kind of component that serves this purpose as part of its 2017 iPhone refresh (read the latest iPhone 8 rumors), but there’s nothing official yet.

These new devices will be considered the baseline for AR developers moving forward, particularly because of the advantages associated with depth-sensing cameras. However, the combination of ARKit and current hardware is already bearing fruit. Pullens and his team spent some time with an early version of the development kit, and they like what they’ve seen.

“The first findings that we have with Apple ARKit are promising, they’re actually very promising,” said Pullens. He praised the way the platform copes with occlusion, and its capacity to prevent virtual objects from interfering with one another.

AR bedroom demonstration by RoOomy

For Pullens, the most impressive aspect of ARKit is its stability. Virtual objects can often ‘drift’ when they’re not properly aligned with their real-world surroundings, which can be a big problem for the type of visualizations that he and his team at RoOomy produce.

“What I mean by drifting, is for example, a chair in an AR view,” he said. “You would like to see that chair be very stable – you wouldn’t want it to drift or tremble. So, the first findings that we have with Apple are very promising, because it’s quite stable.”

A virtual leather chair isn’t much help if it insists on floating towards the ceiling, or wobbles like there’s a cat under the cushion.
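
To make the idea of a ‘stable’ chair concrete, here’s a rough Swift sketch of how an app might pin a piece of virtual furniture to the real world. The asset name and function are placeholders rather than anything from RoOomy’s software; the key point is that ARKit’s anchors are refined as tracking improves, which is what keeps the object from drifting.

```swift
import ARKit
import SceneKit

// Hit-test a tap against a detected plane, then register an ARAnchor at that
// spot. ARKit keeps refining the anchor's transform as its understanding of
// the room improves, so geometry attached to it stays put.
func placeChair(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    let results = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent)
    guard let hit = results.first else { return }
    sceneView.session.add(anchor: ARAnchor(transform: hit.worldTransform))
}

// In the ARSCNViewDelegate, attach the furniture model (a placeholder asset
// here) to the node ARKit creates for that anchor:
// func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
//     guard let chair = SCNScene(named: "chair.scn")?.rootNode else { return }
//     node.addChildNode(chair)
// }
```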

While Pullens had plenty of praise, he also pointed to areas where Apple could make improvements. He noted that the way ARKit currently renders light and shadow maps is serviceable, but added that he expects it to be even better once the platform is ready for release. He also suggested that its surface detection capabilities could be refined significantly with an improved depth-sensing camera – so it would be ideal if the rumors of an infrared camera on the iPhone 8 prove true.
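
For a sense of what that light rendering builds on, here’s a small sketch of the per-frame light estimation ARKit already exposes; an app can feed it into its own scene lights so virtual objects roughly match the brightness and warmth of the room. The function is illustrative, not code from RoOomy or Apple’s samples.

```swift
import ARKit
import SceneKit

// Each ARFrame carries an ambient light estimate (about 1000 lumens in
// neutral lighting). Feeding it into a SceneKit light keeps virtual objects
// from looking pasted on when the real room is dim or warm-toned.
func updateLighting(from frame: ARFrame, light: SCNLight) {
    guard let estimate = frame.lightEstimate else { return }
    light.intensity = estimate.ambientIntensity
    light.temperature = estimate.ambientColorTemperature
}
```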

ARKit makes everything easier

AR developers are excited about ARKit because it should open the technology to a much wider audience. Apple seems heavily invested in AR, so we can expect this kind of content to be a priority for the iPhone and iPad. This is an appealing proposition for the people creating AR experiences.

Yet a bigger audience isn’t the only benefit of Apple’s development kit. ARKit also aims to remove a lot of the busywork from creating software, allowing developers to focus on how they can use the functionality to provide new and engaging experiences.

“It helps developers like us to provide new features and make good use of AR technology,” said Pullens. “Otherwise, one has to build everything themselves.”

For example, every AR app needs surface detection so that a virtual object can sit on a table or the floor. Previously, developers might have spent months creating their own surface detection algorithms, or made do with so-so middleware provided by another company. With ARKit, they have access to a sophisticated solution that’s already tailored to iOS.
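
As a rough illustration of how little code that now takes, here’s a minimal Swift sketch of ARKit’s built-in plane detection; the class is ours for illustration, but the configuration and delegate callback are the public API.

```swift
import ARKit

// Enable horizontal plane detection and listen for the plane anchors ARKit
// reports as it finds table tops and floors.
final class PlaneWatcher: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        session.run(configuration)
    }

    // Called whenever ARKit detects a new surface.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Found a surface roughly \(plane.extent.x) x \(plane.extent.z) meters")
        }
    }
}
```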

“You get a lot of features already for free in this kit,” added Pullens, referring to functionality like occlusion and light and shadow maps. “This will give a big push to the development community, for new AR solutions to be out there.”

Early ARKit creations are already impressive

Developers who are interested in a sneak preview of ARKit can get one by downloading the beta version of Xcode 9, which includes the iOS 11 SDK. The kit is already leading to new ideas: AR and VR feed MixedRealityDesign has set up a website dubbed Made with ARKit, which curates a selection of the best projects around.

The slightly creepy ‘A robot dancing in my living room’ demonstrates the superior stability that Pullens spoke about. An android performs some fluid dance moves in front of a sofa, and despite the camera moving around, the scene looks incredibly natural. The shadow that the robot casts on the floor is particularly impressive.

‘Inter-dimensional Portal’ places a window to another world in the middle of a city street. The graphics used to render this virtual space aren’t very refined, but the overall effect is arresting, particularly once the user walks through the portal. It’s easy to see how this kind of idea might be used in a location-based game along the lines of Pokémon Go.

While these two examples are fun, ‘ARKit will change how we order food’ is much more practical. Rather than looking at flat images on a paper menu, an app produces 3D visualizations of choices right on the table. Another practical implementation is the ‘AR Measure App Demo,’ which offers up a virtual tape measure.
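
To give a sense of the moving parts behind a virtual tape measure, here’s a hedged sketch of one way it could be built on ARKit’s hit testing – the two-tap interaction and the function name are assumptions, not details taken from the demo.

```swift
import ARKit
import simd

// Hit-test two screen taps against ARKit's feature points and return the
// straight-line distance between the resulting world positions, in meters.
func distanceBetweenTaps(_ first: CGPoint, _ second: CGPoint,
                         in sceneView: ARSCNView) -> Float? {
    func worldPosition(of point: CGPoint) -> simd_float3? {
        guard let hit = sceneView.hitTest(point, types: .featurePoint).first else {
            return nil
        }
        let column = hit.worldTransform.columns.3
        return simd_float3(column.x, column.y, column.z)
    }
    guard let start = worldPosition(of: first),
          let end = worldPosition(of: second) else { return nil }
    return simd_distance(start, end)
}
```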

These projects have rough edges, ranging from awkward user interfaces to ugly assets. It’s important to remember that these are early concepts created using tools that have been available for a matter of weeks. Still, ARKit is providing a solid foundation that allows some intriguing ideas to come to fruition.

Apple, in its usual fashion, has taken time to get AR right instead of being first. The first wave of results suggests it’s a step beyond Google Tango and other peers. The next step will be putting advanced AR-friendly hardware into the hands of a broad range of users, and that looks set to happen when the next iPhone is announced.
