
Nvidia and Microsoft are solving a big problem with Copilot+

The Surface Laptop running local AI models.
Luke Larsen / Digital Trends
This story is part of our coverage of Computex, the world's biggest computing conference.

When Microsoft announced Copilot+ PCs a few weeks back, one question reigned supreme: Why can’t I just run these AI applications on my GPU? At Computex 2024, Nvidia finally provided an answer.

Nvidia and Microsoft are working together on an Application Programming Interface (API) that will allow developers to run their AI-accelerated apps on RTX graphics cards. This includes the various Small Language Models (SLMs) that are part of the Copilot runtime, which are used as the basis for features like Recall and Live Captions.


With the API, developers can allow apps to run locally on your GPU instead of the NPU. This opens the door not only to more powerful AI applications, since the AI capabilities of GPUs are generally higher than those of NPUs, but also to running these features on PCs that don’t currently fall under the Copilot+ umbrella.


It’s a great move. Copilot+ PCs currently require a Neural Processing Unit (NPU) capable of at least 40 Tera Operations Per Second (TOPS), and at the moment, only the Snapdragon X Elite satisfies that criterion. GPUs, by contrast, have much higher AI processing capabilities, with even low-end models reaching around 100 TOPS and higher-end options scaling well beyond that.

In addition to running on the GPU, the new API adds retrieval-augmented generation (RAG) capabilities to the Copilot runtime. RAG gives the AI model access to specific information locally, allowing it to provide more helpful solutions. We saw RAG on full display with Nvidia’s Chat with RTX earlier this year.
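To make the idea concrete, here is a minimal sketch of how RAG works in principle: relevant local documents are looked up and prepended to the prompt so the model can ground its answer in them. The run_local_model helper and the keyword-overlap retriever below are illustrative stand-ins of my own, not Nvidia's or Microsoft's actual API.

```python
# Minimal RAG sketch. Assumptions: run_local_model() is a hypothetical
# stand-in for a locally running SLM/LLM; real systems retrieve with vector
# embeddings rather than keyword overlap.

def run_local_model(prompt: str) -> str:
    # Placeholder for a local model call (e.g., a model served on an RTX GPU).
    return f"[model response grounded in a {len(prompt)}-character prompt]"

# A tiny local "knowledge base" the model would not otherwise know about.
documents = [
    "The quarterly report was finished on May 14 and shared with the team.",
    "The office Wi-Fi password changed last Tuesday.",
    "Chat with RTX indexes local files so answers can cite your own notes.",
]

def retrieve(query: str, docs: list[str], top_k: int = 1) -> list[str]:
    # Naive keyword-overlap scoring, just to show the retrieval step.
    words = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(words & set(d.lower().split())), reverse=True)
    return ranked[:top_k]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query, documents))
    # Retrieved local context is prepended so the model can use it directly.
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    return run_local_model(prompt)

print(answer("When was the quarterly report finished?"))
```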

Performance comparison with the RTX AI toolkit.
Nvidia

Outside of the API, Nvidia announced the RTX AI Toolkit at Computex. This developer suite, arriving in June, combines various tools and SDKs that allow developers to tune AI models for specific applications. Nvidia says that by using the RTX AI Toolkit, developers can make models four times faster and three times smaller compared to using open-source solutions.
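Nvidia hasn't detailed the toolkit's internals here, but "smaller and faster" is the kind of result you typically get from model-optimization techniques such as quantization. As a rough, generic illustration (using PyTorch's dynamic quantization, not the RTX AI Toolkit itself), this is what shrinking a model's weights from 32-bit floats to 8-bit integers looks like:

```python
# Generic illustration of quantization (PyTorch dynamic quantization), not the
# RTX AI Toolkit: Linear-layer weights are stored as 8-bit integers instead of
# 32-bit floats, which shrinks the model and can speed up inference.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
)

quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def param_megabytes(m: nn.Module) -> float:
    # Rough size estimate from trainable parameters only.
    return sum(p.numel() * p.element_size() for p in m.parameters()) / 1e6

print(f"fp32 model weights: ~{param_megabytes(model):.1f} MB")

# The quantized model is used exactly like the original at inference time.
with torch.no_grad():
    output = quantized(torch.randn(1, 1024))
print("quantized model output shape:", tuple(output.shape))
```

Dedicated toolchains like Nvidia's go further by pairing optimizations like this with GPU-specific kernels, but the basic trade-off, smaller weights in exchange for a little precision, is the same.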

We’re seeing a wave of tools that enable developers to build specific AI applications for end users. Some of that is already showing up in Copilot+ PCs, but I suspect we’ll see far more AI applications by this time next year. We have the hardware to run these apps, after all; now we just need the software.

Jacob Roach
Nvidia’s new GPUs are already running into problems

Nvidia's latest Blackwell GPUs are running into problems in the data center, reports The Information. According to the report, Nvidia's customers are worried about how well the AI accelerators will hold up, as overheating issues have delayed the deployment of server racks for AI training.

The Blackwell architecture is at the heart of both Nvidia's next-gen AI accelerators and its upcoming RTX 50-series graphics cards. In the data center, the architecture was previously delayed due to "design flaws," pushing back the deployment of the B100 and B200 GPUs. That's despite big orders from AI players like Meta, Microsoft, and Google.

Microsoft Copilot: how to use this powerful AI assistant

In the rapidly evolving landscape of artificial intelligence, Microsoft's Copilot AI assistant is a powerful tool designed to streamline and enhance your professional productivity. Whether you're new to AI or a seasoned pro, this guide will walk you through the essentials of Copilot, from understanding what it is and how to sign up, to mastering the art of effective prompts and creating stunning images.

Additionally, you'll learn how to manage your Copilot account to ensure a seamless and efficient user experience. Dive in to unlock the full potential of Microsoft's Copilot and transform the way you work.
What is Microsoft Copilot?
Copilot is Microsoft's flagship AI assistant, built on an advanced large language model. It's available on the web and through iOS and Android mobile apps, and it integrates with apps across the company's Microsoft 365 suite, including Word, Excel, PowerPoint, and Outlook. The AI launched in February 2023 as a replacement for the retired Cortana, Microsoft's previous digital assistant. It was initially branded as Bing Chat and offered as a built-in feature of Bing and the Edge browser, then officially rebranded as Copilot in September 2023 and integrated into Windows 11 through an update in December of that same year.

Microsoft is backtracking on its Copilot key

The Copilot key was a big part of Microsoft's initial push with AI PCs, but it didn't exactly receive a positive reception.

But now, in a Windows Insider blog post from earlier this week, Microsoft says users will be able to configure the Copilot key to open apps other than the Copilot AI assistant. This will first be made available to Insiders on the Release Preview channel of the 23H2 version of Windows 11. It was initially thought the change would roll out in Windows 11 Preview Build 22631.4387, but that's no longer the case.
