
Nvidia and Microsoft are solving a big problem with Copilot+

The Surface Laptop running local AI models.
Luke Larsen / Digital Trends
This story is part of our coverage of Computex, the world's biggest computing conference.

When Microsoft announced Copilot+ PCs a few weeks back, one question reigned supreme: Why can’t I just run these AI applications on my GPU? At Computex 2024, Nvidia finally provided an answer.

Nvidia and Microsoft are working together on an Application Programming Interface (API) that will allow developers to run their AI-accelerated apps on RTX graphics cards. This includes the various Small Language Models (SLMs) that are part of the Copilot runtime, which are used as the basis for features like Recall and Live Captions.


With the API, developers can allow apps to run locally on your GPU instead of the NPU. That opens the door not only to more powerful AI applications, since GPUs generally offer far more AI horsepower than NPUs, but also to running these features on PCs that don’t currently fall under the Copilot+ umbrella.
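
To make that concrete, here is a minimal sketch of what running a small language model locally on an RTX GPU can look like for a developer today, using the off-the-shelf Hugging Face Transformers library rather than the new Nvidia/Microsoft API (which isn’t publicly available); the model name is just an example, not one of the Copilot runtime’s SLMs.

```python
# Hedged sketch: run a small language model locally on an RTX GPU with
# Hugging Face Transformers. The model ID is a placeholder example and is
# unrelated to the Copilot runtime or Nvidia's upcoming API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # example small language model
device = "cuda" if torch.cuda.is_available() else "cpu"  # fall back if no GPU

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16
).to(device)

prompt = "Summarize what an NPU does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```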

It’s a great move. Copilot+ PCs currently require a Neural Processing Unit (NPU) capable of at least 40 Tera Operations Per Second (TOPS). At the moment, only the Snapdragon X Elite meets that requirement. GPUs, by contrast, have much higher AI processing capabilities, with even low-end models reaching 100 TOPS and higher-end options scaling far beyond that.

In addition to running on the GPU, the new API adds retrieval-augmented generation (RAG) capabilities to the Copilot runtime. RAG gives the AI model access to specific local information, allowing it to provide more helpful answers. We saw RAG on full display with Nvidia’s Chat with RTX earlier this year.
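
For a sense of what RAG means in practice, here is a minimal, generic sketch: retrieve the most relevant local snippet for a query and prepend it to the prompt before handing it to the model. This illustrates the idea only; it is not the Copilot runtime’s or Chat with RTX’s actual implementation, and the documents and query are invented for the example.

```python
# Hedged sketch of retrieval-augmented generation (RAG): find the local
# document most relevant to the query, then fold it into the prompt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

local_docs = [
    "Meeting notes: the Q3 launch slipped to October.",
    "Expense policy: hardware purchases over $500 need approval.",
    "The lab Wi-Fi password is rotated every month.",
]

def retrieve(query: str, docs: list[str]) -> str:
    """Return the document most similar to the query by TF-IDF cosine similarity."""
    matrix = TfidfVectorizer().fit_transform(docs + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1])[0]
    return docs[scores.argmax()]

query = "When is the Q3 launch?"
context = retrieve(query, local_docs)
prompt = f"Use this context to answer.\nContext: {context}\nQuestion: {query}"
# The augmented prompt would then be passed to the local language model.
print(prompt)
```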

Performance comparison with the RTX AI Toolkit.
Nvidia

Beyond the API, Nvidia announced the RTX AI Toolkit at Computex. This developer suite, arriving in June, combines various tools and SDKs that let developers tune AI models for specific applications. Nvidia says that by using the RTX AI Toolkit, developers can make models four times faster and three times smaller compared with open-source alternatives.
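
As a rough illustration of why tuned models end up smaller, here is a sketch using generic PyTorch dynamic int8 quantization on a toy model. It demonstrates the general principle of shrinking weights to lower precision and says nothing about the RTX AI Toolkit’s actual internals.

```python
# Hedged sketch: generic PyTorch dynamic int8 quantization on a toy model,
# showing how weight precision affects serialized model size. Not the
# RTX AI Toolkit.
import io
import torch
import torch.nn as nn

def serialized_size(model: nn.Module) -> int:
    """Return the size in bytes of the model's serialized state dict."""
    buffer = io.BytesIO()
    torch.save(model.state_dict(), buffer)
    return buffer.getbuffer().nbytes

fp32_model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 1024))
int8_model = torch.quantization.quantize_dynamic(
    fp32_model, {nn.Linear}, dtype=torch.qint8
)

print(f"fp32: {serialized_size(fp32_model) / 1e6:.1f} MB")
print(f"int8: {serialized_size(int8_model) / 1e6:.1f} MB")  # roughly 4x smaller weights
```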

We’re seeing a wave of tools that enable developers to build specific AI applications for end users. Some of that is already showing up in Copilot+ PCs, but I suspect we’ll see far more AI applications at this point next year. We have the hardware to run these apps, after all; now we just need the software.

Jacob Roach
Former Lead Reporter, PC Hardware