Nvidia won CES 2025, and the RTX 5090 has nothing to do with it


Great, here’s the entitled journalist telling me that the $2,000 graphics card won CES 2025. I’ve seen plenty of strong opinions about Nvidia’s CES announcements online, but even ignoring the bloated price of the new RTX 5090, Nvidia won this year’s show. And it kind of won by default. Between Intel’s barebones announcements and an overstuffed AMD presentation that ignored what might be AMD’s most important GPU launch ever, it’s not surprising that Team Green came out ahead.

But that’s despite the insane price of the RTX 5090, not because of it.

Nvidia introduced a new range of graphics cards and the impressive multi-frame generation of DLSS 4, but its announcements this year were much more significant than that. It all comes down to the ways Nvidia is leveraging AI to make PC games better, even if the fruits of that labor won’t ripen immediately.

There are developer-facing tools like Neural Materials and Neural Texture Compression, both of which Nvidia briefly touched on during its CES 2025 keynote. For me, however, the standout is neural shaders. They certainly aren’t as exciting as a new graphics card, at least on the surface, but neural shaders have massive implications for the future of PC games. Even without the RTX 5090, that announcement alone is significant enough for Nvidia to steal this year’s show.

Neural shaders aren’t just a buzzword, though I’d forgive you for thinking so given the force-feeding of AI we’ve all experienced over the past couple of years. Let’s start with the shader. If you aren’t familiar, shaders are essentially the programs that run on your GPU. Decades ago, shaders were fixed-function — they could only do one thing. In the early 2000s, Nvidia introduced programmable shaders with far greater capabilities. Now, we’re entering the era of neural shaders.

In short, neural shaders allow developers to add small neural networks to shader code. Then, when you’re playing a game, those neural networks can be deployed on the Tensor cores of your graphics card. That unlocks a boatload of computing horsepower that, up to this point, has had fairly minimal application in PC games — those cores were really just fired up for DLSS.
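To make the idea concrete, here's a toy sketch of my own (not Nvidia's API or format): a shader is ultimately just a function evaluated per pixel, and a neural shader swaps part of that hand-written function for a tiny trained network — the kind of small matrix math that Tensor cores accelerate. The material function and the network weights below are hypothetical.

```python
def classic_shader(n_dot_l: float, roughness: float) -> float:
    """Hand-written (hypothetical) material code: a diffuse term damped by roughness."""
    return max(0.0, n_dot_l) * (1.0 - 0.5 * roughness)

# Hypothetical weights, as if trained offline to approximate the function above.
W1 = [[0.9, -0.3], [0.2, 0.5]]
B1 = [0.0, 0.1]
W2 = [1.0, -0.4]
B2 = 0.05

def neural_shader(n_dot_l: float, roughness: float) -> float:
    """The same material expressed as a two-neuron network. On real hardware,
    this matrix-multiply-plus-activation pattern is what Tensor cores run fast."""
    hidden = [max(0.0, W1[i][0] * n_dot_l + W1[i][1] * roughness + B1[i])
              for i in range(2)]  # one small ReLU layer
    return W2[0] * hidden[0] + W2[1] * hidden[1] + B2
```

The point isn't that the network beats two lines of arithmetic — it's that once shader code can contain a network, the function it approximates can be far too complex to write by hand.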

Nvidia has announced three uses for neural shaders so far: the aforementioned Neural Materials and Neural Texture Compression, plus the Neural Radiance Cache. I’ll start with the last one because it’s the most interesting. The Neural Radiance Cache essentially allows AI to guess what an infinite number of light bounces in a scene would look like. Real-time path tracing can only handle so many light bounces; after a certain point, it becomes too demanding. The Neural Radiance Cache not only unlocks more realistic lighting with far more bounces but also improves performance, according to Nvidia, because it only needs to trace one or two light bounces. The rest are inferred by the neural network.
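Here's a numeric sketch of why that works — my simplification, not Nvidia's algorithm. If each bounce retains a fraction of the light (the albedo value below is assumed), total radiance is a series over bounces. Tracing a couple of bounces explicitly and letting a cache estimate the remaining tail recovers almost all of what a much deeper trace would compute; here the closed-form geometric tail stands in for what a trained network would predict.

```python
ALBEDO = 0.6    # assumed fraction of light surviving each bounce
EMITTED = 1.0   # assumed radiance contribution at the first hit

def full_path_trace(bounces: int) -> float:
    """Ground truth: sum attenuated contributions over many traced bounces."""
    return sum(EMITTED * ALBEDO**k for k in range(bounces))

def traced_plus_cache(traced_bounces: int = 2) -> float:
    """Trace only a few bounces; a 'cache' supplies the infinite tail.
    The closed-form geometric tail below stands in for a network's prediction."""
    traced = sum(EMITTED * ALBEDO**k for k in range(traced_bounces))
    tail = EMITTED * ALBEDO**traced_bounces / (1.0 - ALBEDO)
    return traced + tail
```

With these numbers, two traced bounces plus the cached tail lands essentially on top of a 1,000-bounce trace — the expensive part of the series is exactly the part being inferred.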

Similarly, Neural Materials compresses dense shader code that would normally be reserved for offline rendering, allowing what Nvidia calls “film-quality” assets to be rendered in real time. Neural Texture Compression applies AI to texture compression, which Nvidia says uses 7x less memory than traditional block-based compression without any loss in quality.
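To put that claim in perspective, here's some back-of-envelope arithmetic (the texture size and format are my assumptions for illustration): BC7, a common block-based format, stores a fixed 1 byte per texel, so a single 4096x4096 texture costs 16 MB of VRAM even after block compression.

```python
# 4096x4096 texels at BC7's 1 byte per texel, converted to megabytes.
block_compressed_mb = 4096 * 4096 * 1 / 2**20   # 16.0 MB
neural_mb = block_compressed_mb / 7              # ~2.3 MB at the claimed 7x
print(f"{block_compressed_mb:.1f} MB -> {neural_mb:.1f} MB")
```

Multiply that saving across the hundreds of textures a modern game streams, and it's easy to see why memory-starved cards would care.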

That’s just three applications of neural networks being deployed in PC games, and there are already big implications for how well games can run and how good they can look. It’s important to remember that this is the starting line, too — AMD, Intel, and Nvidia all have AI hardware on their GPUs now, and I suspect there will be quite a lot of development on what kinds of neural networks can go into a shader in the future.

Maybe cloth or physics simulations that normally run on the CPU could instead run through a neural network on Tensor cores. Or maybe you could expand the complexity of meshes by inferring triangles the GPU doesn’t need to account for explicitly. There are visible applications of AI, such as AI-driven non-player characters, but neural shaders open up a world of invisible AI that makes rendering more efficient and, therefore, more powerful.

It’s easy to get lost in the sauce of CES. If you were to believe every executive keynote, you would walk away with literally thousands of “ground-breaking” innovations that barely manage to move a patch of dirt. Neural shaders don’t fit into that category. There are already three very practical applications of neural shaders that Nvidia is introducing, and people much smarter than myself will likely dream up hundreds more.

I should be clear, though — that won’t come right away. We’re only seeing the very surface of what neural shaders could be capable of in the future, and even then, it’ll likely be multiple years and graphics card generations down the road before their impact is felt. But when looking at the landscape of announcements from AMD, Nvidia, and Intel, only one company introduced something that could really be worthy of that “ground-breaking” title, and that’s Nvidia.

Jacob Roach
Former Lead Reporter, PC Hardware