
Nvidia and Apple are collaborating on the Vision Pro in the most unlikely way

Nvidia revealing support for the Apple Vision Pro. Image: Nvidia

You don’t normally see tech titans like Nvidia and Apple pair up, but at this week’s Nvidia GTC 2024, the two companies announced that they are coming together around the Vision Pro. Nvidia is bringing its Omniverse Cloud platform to Apple’s headset, letting users view and interact with 3D designs directly through the Vision Pro.

The basis of this support is a set of Omniverse Cloud APIs that can stream Omniverse assets to Apple’s headset. Omniverse isn’t running on the Vision Pro itself. Instead, designers can stream scenes built in Omniverse with Universal Scene Description (OpenUSD) to the Vision Pro and interact with the 3D objects natively.
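For context, OpenUSD scenes are ordinary scene-description files that any USD runtime can load, which is what makes this kind of streaming pipeline possible. As a rough, local-only illustration using the open-source pxr Python bindings (the scene file name below is a placeholder, not an actual Nvidia or automaker asset), inspecting such a scene looks like this:

```python
# Illustrative sketch only: loads and inspects a local OpenUSD scene with the
# open-source pxr Python bindings. The file name is a made-up placeholder.
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open("car_configurator.usda")  # hypothetical scene file

# Walk every prim (object) in the scene and print its path and schema type,
# e.g. /Car/Body (Mesh) or /Lights/Key (DistantLight).
for prim in stage.Traverse():
    print(prim.GetPath(), prim.GetTypeName())

# Stage metadata a renderer or streaming service needs in order to display
# the scene with the right orientation and scale.
print("Up axis:", UsdGeom.GetStageUpAxis(stage))
print("Meters per unit:", UsdGeom.GetStageMetersPerUnit(stage))
```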


Nissan demonstrated this capability in a video. Through the Vision Pro, the user is able to swap out paint colors, adjust the trim, and even step inside the car thanks to the headset’s spatial awareness.


It’s sure to make a splash in the enterprise sector, but there are some consumer implications here. Nvidia is essentially showing that it can stream interactable 3D applications to the Vision Pro. This is enabled by Nvidia’s Graphics Delivery Network (GDN), which is already being used to stream 3D applications from the cloud. The fact that it can work on the Vision Pro is a big deal.

The linchpin for all of this is the set of Omniverse Cloud APIs. Also at GTC, Nvidia revealed five new Omniverse Cloud APIs centered around Universal Scene Description (OpenUSD) that can be used individually or collectively (a rough local sketch of the underlying OpenUSD operations follows the list):

  • USD Render: support for ray-traced renders of OpenUSD data
  • USD Write: support for modifying OpenUSD data
  • USD Query: support for interactive scenes
  • USD Notify: support for tracking USD changes
  • Omniverse Channel: allows users to connect tools and projects across scenes
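Nvidia didn’t publish client code for these endpoints alongside the announcement, but all five operate on OpenUSD data. As a loose, local-only analogy using the open-source pxr Python bindings rather than the Omniverse Cloud APIs themselves, here is roughly what the write-and-notify pattern behind “USD Write” and “USD Notify” looks like (the scene path, prim names, and color values are invented for illustration):

```python
# Local OpenUSD analogue of the "USD Write" and "USD Notify" ideas, using the
# open-source pxr bindings. This is NOT the Omniverse Cloud API; names and
# values are made up for illustration.
from pxr import Usd, UsdGeom, Gf, Tf

stage = Usd.Stage.CreateNew("swatch_demo.usda")  # hypothetical scene

# "USD Notify" analogue: register for change notices on the stage.
def on_changed(notice, sender):
    print("Resynced paths:", notice.GetResyncedPaths())
    print("Changed-info paths:", notice.GetChangedInfoOnlyPaths())

listener = Tf.Notice.Register(Usd.Notice.ObjectsChanged, on_changed, stage)

# "USD Write" analogue: author a mesh and set its display color, roughly what
# a paint-swatch change in a car configurator boils down to at the USD level.
body = UsdGeom.Mesh.Define(stage, "/Car/Body")
body.CreateDisplayColorAttr([Gf.Vec3f(0.8, 0.1, 0.1)])  # red paint

stage.GetRootLayer().Save()
listener.Revoke()
```

In the cloud setup Nvidia describes, edits and change notices like these would travel over its APIs, while the ray-traced result is streamed back to the headset over GDN.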

Right now, Omniverse Cloud on the Vision Pro is focused on enterprise applications, just as Apple’s headset itself is. Still, it lays a critical foundation for streaming interactive 3D applications to Apple’s headset in the future. As powerful as the Vision Pro is, it can’t handle workloads like ray tracing highly detailed 3D scenes on its own. Being able to stream those scenes at full quality could set up some exciting apps down the road.

Jacob Roach
Lead Reporter, PC Hardware