
Nvidia’s ‘infinite resolution’ patent could change gaming forever

In a patent filing released on Thursday, June 7, Nvidia describes a technology that could fundamentally alter the way games look, feel, and perform. Nvidia calls it “infinite resolution,” and it’s effectively a clever way of using vector graphics to replace static textures in games. Let’s dive into what that means and why it could be a big deal.

Today, most games use textures that are created for a set of fairly standard resolutions: 720p, 1080p, 1440p, 4K, and some in-between. These textures cover just about every surface in modern PC games. From character models to weapons to environments, every 3D model is effectively "wrapped" with a 2D texture. Nvidia's patent proposes changing how these textures are rendered.

Currently, developers package games with a series of these textures: one for each resolution the game supports, and one for each detail setting at each resolution. This requires a lot of storage space, and it means that today's games have a ceiling, a maximum resolution beyond which they can't look any sharper.
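To get a feel for why shipping one texture per resolution adds up, here's some rough back-of-the-envelope arithmetic. The figures are illustrative assumptions (a single uncompressed RGBA texture at each of the four resolutions named above), not measurements from any real game:

```python
# Rough storage arithmetic: one uncompressed RGBA texture shipped at several
# resolutions, as described above. Illustrative assumptions only -- real games
# use compressed formats and many textures per scene.

BYTES_PER_PIXEL = 4  # uncompressed RGBA, 8 bits per channel

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

total = 0
for name, (w, h) in resolutions.items():
    size = w * h * BYTES_PER_PIXEL
    total += size
    print(f"{name:>5}: {size / 2**20:6.1f} MiB")

print(f"Total for all four copies: {total / 2**20:.1f} MiB")
```

Even for this single texture, the four copies together approach 60 MiB uncompressed, and a modern game contains thousands of textures. A single resolution-independent description sidesteps that multiplication entirely.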


To see what we mean, try opening up Diablo 2 on a modern computer. Your resolution in that game is going to max out somewhere around 1024 x 768, far below what current monitors are capable of. As a result, it's not going to look its best. The game is going to stretch those old, undersized textures across your whole display, like when you zoom in really far on a small picture.

Nvidia’s solution would fix these issues. Instead of packaging games with a massive set of static textures, games built using Nvidia’s technology would include only a single set of texture information, not the actual textures themselves. Effectively, each in-game texture would be drawn in real time from instructions the developers include in the game. Your computer would use its processing and graphics rendering horsepower to do the heavy lifting here.
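The core idea above, storing a rule for a texture rather than its pixels, can be sketched in a few lines. This is a toy illustration, not Nvidia's actual method: the checkerboard function stands in for whatever texture instructions a developer would ship, and the point is that the same rule can be drawn into a pixel grid of any size on demand:

```python
# A minimal sketch of resolution-independent textures: store a *rule* for the
# texture and evaluate it at whatever resolution the display needs. The
# checkerboard function is a hypothetical stand-in for real texture instructions.

def checkerboard(u, v, squares=8):
    """Texture rule defined over [0,1]x[0,1] -- resolution-independent by design."""
    return 255 if (int(u * squares) + int(v * squares)) % 2 == 0 else 0

def rasterize(texture_fn, width, height):
    """Draw the texture rule into a pixel grid of any size, in real time."""
    return [
        [texture_fn((x + 0.5) / width, (y + 0.5) / height) for x in range(width)]
        for y in range(height)
    ]

# The same rule yields a crisp result at 16x16 or at 4096x4096 --
# only the rendering work grows, not the shipped data.
small = rasterize(checkerboard, 16, 16)
large = rasterize(checkerboard, 64, 64)
```

This is also why the GPU does the "heavy lifting": the cost moves from storage (pre-baked pixels on disk) to computation (evaluating the rule every frame or on load).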

Because your computer would be drawing each texture in real time, your games would be future-proofed to some extent. If a game like Diablo 2 were built using this technology, playing it on a massive 8K monitor would look just as tack-sharp and detailed as it would on an old 800 x 600 CRT monitor.

This technology isn't actually anything new; it's a novel application of an existing one: vector graphics. If you're unfamiliar, vector graphics are used for a variety of purposes, but most notably in graphic design. When a designer creates a logo with vector art, that logo can be blown up or shrunk down to any size without losing detail. Nvidia's patent filing simply applies these principles to textures in PC games.
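The difference between the two approaches can be shown concretely. A vector shape is stored as coordinates, so scaling it is exact at any factor; a bitmap upscale can only repeat or interpolate the pixels it already has, which is the blockiness you see when Diablo 2 fills a modern screen. This is a simplified sketch; real vector formats and upscalers are far richer:

```python
# Why vectors scale cleanly: a shape stored as coordinates can be multiplied
# to any size exactly, while a bitmap upscale just duplicates existing pixels.
# Simplified illustration -- not any engine's actual implementation.

def scale_path(path, factor):
    """Scale a vector path (a list of (x, y) points) -- exact at any factor."""
    return [(x * factor, y * factor) for x, y in path]

def upscale_bitmap(pixels, factor):
    """Nearest-neighbour upscale -- each source pixel becomes a blocky square."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in pixels
        for _ in range(factor)
    ]

triangle = [(0, 0), (10, 0), (5, 8)]
big_triangle = scale_path(triangle, 100)   # still just three exact points

bitmap = [[0, 255], [255, 0]]
blocky = upscale_bitmap(bitmap, 4)         # an 8x8 grid of duplicated pixels
```

Note that the scaled triangle carries no more data than the original, while the upscaled bitmap is 16 times larger yet contains no new detail.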

It’s unclear what potential speed bumps this technology might encounter, how it might bog down a typical gaming PC, or whether it would only be useful for certain types of games, but it’s an interesting concept and we’re excited to see where it could lead. To be clear, Nvidia has been working on this for quite a while, but this latest patent filing suggests the company could be close to bringing it to market.


Jayce Wagner
Former Digital Trends Contributor
A staff writer for the Computing section, Jayce covers a little bit of everything -- hardware, gaming, and occasionally VR.