Apple calls the Vision Pro a ‘spatial computer.’ What is that?

Apple Vision Pro being worn by a person while using a keyboard.
Apple

When Apple announced the Vision Pro, it described it not as a headset, but as a “spatial computer.” We’ve seen similar devices before from Microsoft, Meta, and Magic Leap, but those companies favor the more familiar terms extended reality (XR), virtual reality (VR), and augmented reality (AR).

So, what is spatial computing, and why did Apple CEO Tim Cook call it “the beginning of a new era for computing”?

What is a spatial computer?

Apple Vision Pro performs a retina scan for authentication.
Apple

In a 2003 MIT graduate thesis, Simon Greenwold defined spatial computing as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” The thesis describes the fundamental objective of spatial computing as “the union of the real and computed.” Greenwold envisioned all sorts of devices with sensing and processing capabilities.

Twenty years later, the term has come to mean a head-mounted display that detects objects, surfaces, and walls in your surroundings. Cameras, microphones, and other sensors feed information to the processor, which analyzes it and presents useful results.

As a computer with awareness of its environment, it’s a step up from traditional towers and laptops, which can capture the outside world in some ways but still leave most of the analysis to us. Now we’re starting to get assistance with reality. That shift began with smartphones, which let us ask new kinds of questions: How far is that? How long will it take to get there? What kind of flower is that?

In the future, we’ll all be wearing spatial computers. They’re the next step after smart glasses, which will help ease the transition from smartphones. You’ll be able to instantly see directions, hear translations, and request more details about anything around you.

Imagine a super powerful version of Google Lens, a measuring app, a translation app, a recommendation guide, and a custom audiovisual tutor available anytime you ask for help throughout your day. Now go even further.

A future spatial computer could replace every screen and printer, most computers, and every tablet, phone, and watch. It will help you connect with others and put them in the room with you, even when they’re miles away. It will help with your work and personal tasks, greatly simplifying life. We’re not there yet; the Apple Vision Pro is just the beginning.

Spatial computer = reality computer

Apple Vision Pro can show an app and floating FaceTime windows.
Apple

As a spatial computer, the Vision Pro interacts with the real world. The device scans its surroundings with lidar and color cameras to augment your experience with virtual screens, surround sound, and even three-dimensional objects.

When you turn or move, the Vision Pro adjusts the displayed image accordingly, as if the computer-generated elements on the screen were present in your room. Of course, your iPhone can also handle AR, placing an Ikea shelf in the corner with ARKit or showing an iPad on your table.
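
The core idea behind that world-locked effect can be sketched in a few lines: a virtual object keeps a fixed position in the room, and each frame the renderer transforms that position by the inverse of the current head pose. This is a toy illustration, not Apple’s or ARKit’s actual implementation, and it simplifies the pose to a 2D position plus a yaw angle:

```python
import math

def world_to_view(point_world, head_pos, head_yaw):
    """Transform a world-fixed point into the headset's view space.

    point_world: (x, z) position of the virtual object in the room
    head_pos:    (x, z) position of the wearer's head
    head_yaw:    heading of the head, in radians
    """
    # Translate so the head is at the origin...
    dx = point_world[0] - head_pos[0]
    dz = point_world[1] - head_pos[1]
    # ...then rotate by the inverse of the head's yaw.
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (c * dx - s * dz, s * dx + c * dz)

# A virtual screen anchored 2 meters in front of the starting position:
anchor = (0.0, 2.0)

# Facing it head-on, it sits straight ahead in view space.
print(world_to_view(anchor, (0.0, 0.0), 0.0))  # → (0.0, 2.0)

# After turning 90 degrees, the same anchor moves to the side of the
# view, which is exactly what makes it feel fixed in the room.
print(world_to_view(anchor, (0.0, 0.0), math.pi / 2))
```

A real headset does this with a full 6-degree-of-freedom pose (3D position plus orientation) refreshed from the tracking sensors every frame, but the principle is the same: the scene stays still, and the camera moves through it.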

The Vision Pro goes further, filling your view with multiple browser screens, a giant TV screen, and friends or coworkers in a group chat. In some cases, the experience extends beyond the screen, wrapping an immersive, themed environment around you. Apple’s Vision Pro can operate within reality or completely transform it. That’s impressive, but it isn’t completely new.

Any VR headset with a passthrough view is a type of spatial computer that matches the displayed image to your movement. Meta, HTC, Pico, and others offer similar capabilities, though not as accurate as Apple’s.

For example, Meta’s Quest Pro can overlay 3D graphics on your room, display multiple virtual screens, then switch to total immersion to display a 360-degree video in 3D. It can identify where the floor is, but it lacks a depth sensor, so you have to mark furniture manually. That limits how well graphics can interact with your surroundings.

High-end AR headsets, like Microsoft HoloLens 2 and Magic Leap 2, include depth mapping hardware so virtual objects can interact with the environment. However, the small field of view in the see-through displays spoils immersion and intuitive interaction. The edges become a constant reminder that this isn’t real, just like looking at AR effects through a smartphone.

Apple Vision Pro is a beginning

Image used with permission by copyright holder

Apple’s Vision Pro could be the first device to get spatial computing right. However, it’s too expensive for most consumers, and the full extent of its capabilities is unclear. The Vision Pro probably isn’t the ultimate spatial computer. It’s the beginning of the AR future we’ve marveled at in science fiction movies for a couple of decades.

Apple’s Vision Pro is bulky, so it won’t be as convenient as the translucent computer interfaces in Minority Report or as powerful as Tony Stark’s Jarvis, which intuitively displays relevant data with minimal input. However, it’s revolutionary in many ways.

The Vision Pro knows where you are in space and where you’re looking, detects even the smallest finger gestures, and senses when people are nearby. Two powerful processors provide enough performance to unlock huge potential.

Apple barely scratched the surface of what’s possible when it announced the Vision Pro. As Meta learned, overhyping is a costly mistake in the VR industry. Apple made no mention of the metaverse or even VR gaming.

The Vision Pro will be more than just a wearable computer with FaceTime and immersive cinema. It’s only a matter of time before we learn that the Vision Pro is the basis for Apple’s version of an augmented and virtual layer over reality. That’s when the Vision Pro will come closer to the potential of the spatial computer of the future.

Alan Truly
Alan Truly is a Writer at Digital Trends, covering computers, laptops, hardware, software, and accessories that stand out as…