Nvidia’s ‘infinite resolution’ patent could change gaming forever

In a patent filing released on Thursday, June 7, Nvidia describes a technology that could fundamentally alter the way games look, feel, and perform. Nvidia calls it “infinite resolution,” and it’s effectively a clever way of using vector graphics to replace static textures in games. Let’s dive into what that means and why it could be a big deal.

Today, most games use textures created for a set of fairly standard resolutions: 720p, 1080p, 1440p, 4K, and some in between. These textures cover just about every surface in modern PC games; from character models to weapons to environments, every 3D model is effectively “wrapped” in a 2D texture. Nvidia filed a patent to change how those textures are rendered.

Currently, developers package games with a series of these textures: one for each resolution the game runs at, and one for each detail setting at each resolution. This requires a lot of storage space, and it means that today’s games have a hard ceiling, a maximum resolution beyond which their textures can’t get any sharper.

To see what we mean, try opening up Diablo 2 on a modern computer. The game’s resolution maxes out somewhere around 1024 x 768, way below what current monitors are capable of, so it’s not going to look its best. The game stretches those old, undersized textures across your whole display, like when you zoom in really far on a small picture.

Nvidia’s solution would fix these issues. Instead of packaging games with a massive set of static textures, games built using Nvidia’s technology would include only a single set of texture information, not the actual textures themselves. Effectively, each in-game texture would be drawn in real time from instructions the developers include in the game. Your computer would use its processing and graphics rendering horsepower to do the heavy lifting here.
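
To make that concrete, here is a minimal sketch of the idea in Python. It is purely illustrative and assumes nothing about Nvidia’s actual filing: the checker_texture function stands in for whatever “instructions” a developer might ship, and render_texture stands in for the work your PC would do at draw time.

    def checker_texture(u, v, tiles=8):
        # Hypothetical "texture instructions": a rule the game ships instead of
        # a bitmap. It answers "what shade is this point?" for any (u, v)
        # coordinate between 0 and 1.
        return float((int(u * tiles) + int(v * tiles)) % 2)

    def render_texture(texture_fn, width, height):
        # Evaluate that rule at whatever resolution the renderer needs, at draw
        # time, instead of stretching a pre-baked image.
        return [[texture_fn((x + 0.5) / width, (y + 0.5) / height)
                 for x in range(width)]
                for y in range(height)]

    # The same shipped description produces a native-resolution result whether
    # the target is a small thumbnail or a 4K surface.
    small = render_texture(checker_texture, 64, 64)
    large = render_texture(checker_texture, 1024, 1024)

Because the texture exists as a rule rather than as pixels, there is no “native” size for it to be stretched beyond.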

Because your computer would be drawing each texture in real time, your games would be future-proofed to some extent. If a game like Diablo 2 were built using this technology, it would look just as tack-sharp and detailed on a massive 8K monitor as it does on an old 800 x 600 CRT.

This technology isn’t actually anything new; it’s a novel application of an existing one: vector graphics. If you’re unfamiliar, vector graphics are used for a variety of purposes, but most notably in graphic design. When a designer creates a logo or design as vector art, that logo or design can be blown up or shrunk down to any size without losing detail. Nvidia’s patent filing simply applies the same principle to textures in PC games.
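
As a rough illustration of that property, compare stretching a small raster image with re-sampling a shape that exists only as math, the way a vector logo does. Again, this is a hypothetical Python sketch with invented function names, not anything from the patent.

    def vector_circle(u, v):
        # A "vector" description: the circle is stored as a center and a radius,
        # not as pixels, so it can be sampled at any output size.
        return 1.0 if (u - 0.5) ** 2 + (v - 0.5) ** 2 <= 0.25 ** 2 else 0.0

    def sample_grid(shape_fn, size):
        # Rasterize a description onto a size-by-size grid of pixels.
        return [[shape_fn((x + 0.5) / size, (y + 0.5) / size)
                 for x in range(size)]
                for y in range(size)]

    def nearest_upscale(bitmap, size):
        # Stretching a raster image: each output pixel just re-reads one of the
        # original pixels, so edges turn blocky as the image grows.
        src = len(bitmap)
        return [[bitmap[y * src // size][x * src // size]
                 for x in range(size)]
                for y in range(size)]

    tiny = sample_grid(vector_circle, 16)     # a 16 x 16 "source" image
    blocky = nearest_upscale(tiny, 512)       # stretched raster: jagged edges
    crisp = sample_grid(vector_circle, 512)   # re-sampled vector: clean edge

The stretched raster can only repeat the 16 x 16 pixels it started with; the vector version is recomputed from its description, so it stays sharp at 512 pixels, 8K, or anything in between.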

It’s unclear what potential speed bumps this technology might encounter, how it might bog down a typical gaming PC, or whether it would only be useful for certain types of games, but it’s an interesting concept and we’re excited to see where it could lead. To be clear, Nvidia has been working on this for quite a while, but this latest patent filing suggests the company could be close to bringing it to market.

Jayce Wagner
Former Digital Trends Contributor
A staff writer for the Computing section, Jayce covers a little bit of everything -- hardware, gaming, and occasionally VR.