
Richly rendered ‘Asteroids’ demo showcases power of Nvidia’s RTX graphics

Nvidia has released a new demo to showcase some of the advanced graphics capabilities of the company’s Turing architecture found in its latest RTX-series graphics cards, such as the flagship GeForce RTX 2080 Ti. The public demo, called Asteroids, highlights Turing’s new mesh shading capability, which Nvidia claims will improve image quality and performance when a game renders a large number of complex objects in a scene.

With Turing, Nvidia introduced a new programmable geometric shading pipeline that transfers much of the heavy geometry workload from the CPU onto the GPU. The GPU then applies culling techniques to render objects (asteroids, in the case of this demo) with a high level of detail and image quality.
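To make the idea concrete, here is a minimal C++ sketch of the kind of per-object visibility test that, under this model, moves from the CPU into a GPU task shader. The structures and the frustum-plane check are hypothetical stand-ins for illustration only, not Nvidia’s actual shader code.

```cpp
#include <array>

// Hypothetical bounding sphere for one asteroid instance.
struct BoundingSphere {
    float x, y, z;   // center in world space
    float radius;
};

// A view-frustum plane in the form ax + by + cz + d = 0,
// with the normal pointing into the visible volume.
struct Plane {
    float a, b, c, d;
};

// Conceptual stand-in for the test a task shader would run per object:
// the object is culled if its bounding sphere lies entirely outside
// any one of the six frustum planes.
bool isVisible(const BoundingSphere& s, const std::array<Plane, 6>& frustum) {
    for (const Plane& p : frustum) {
        float dist = p.a * s.x + p.b * s.y + p.c * s.z + p.d;
        if (dist < -s.radius) {
            return false;  // completely outside this plane: cull the asteroid
        }
    }
    return true;  // potentially visible: hand it on to the mesh shader stage
}
```

Because this test runs on the GPU for every asteroid, the CPU never has to touch objects that end up off-screen, which is the workload transfer the new pipeline is designed around.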


“Turing introduces a new programmable geometric shading pipeline built on task and mesh shaders,” Nvidia graphics software engineer Manuel Kraemer wrote in a detailed blog post on the benefits of mesh shading on Turing. “These new shader types bring the advantages of the compute programming model to the graphics pipeline. Instead of processing a vertex or patch in each thread in the middle of fixed function pipeline, the new pipeline uses cooperative thread groups to generate compact meshes (meshlets) on the chip using application-defined rules.”
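As a rough illustration of what “compact meshes (meshlets)” means in practice, the sketch below groups an indexed triangle list into fixed-size meshlets on the CPU. The 64-vertex and 126-triangle limits follow the guidance Nvidia has published for Turing, but the Meshlet structure and the naive builder are hypothetical, not part of any real API.

```cpp
#include <cstdint>
#include <unordered_set>
#include <vector>

// Hypothetical meshlet: a small, self-contained bundle of vertices and
// triangles that a cooperative thread group can process as one unit.
struct Meshlet {
    std::vector<uint32_t> vertexIndices;    // unique vertices used by this meshlet
    std::vector<uint32_t> triangleIndices;  // vertex indices, three per triangle
};

// Naive CPU-side builder: walk the index buffer and start a new meshlet
// whenever the current one would exceed the per-meshlet limits
// (64 vertices / 126 triangles are the values Nvidia recommends for Turing).
std::vector<Meshlet> buildMeshlets(const std::vector<uint32_t>& indices,
                                   size_t maxVertices = 64,
                                   size_t maxTriangles = 126) {
    std::vector<Meshlet> meshlets;
    Meshlet current;
    std::unordered_set<uint32_t> used;

    for (size_t tri = 0; tri + 2 < indices.size(); tri += 3) {
        // Count how many new unique vertices this triangle would add.
        size_t newVerts = 0;
        for (size_t k = 0; k < 3; ++k)
            if (!used.count(indices[tri + k])) ++newVerts;

        bool full = used.size() + newVerts > maxVertices ||
                    current.triangleIndices.size() / 3 + 1 > maxTriangles;
        if (full && !current.triangleIndices.empty()) {
            meshlets.push_back(current);
            current = Meshlet{};
            used.clear();
        }

        for (size_t k = 0; k < 3; ++k) {
            uint32_t v = indices[tri + k];
            if (used.insert(v).second) current.vertexIndices.push_back(v);
            current.triangleIndices.push_back(v);
        }
    }
    if (!current.triangleIndices.empty()) meshlets.push_back(current);
    return meshlets;
}
```

In a real engine this pre-processing is typically done offline and optimized for vertex locality; the point of the sketch is only that a meshlet is a bounded, independently renderable chunk a thread group can own.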


In the demo, Nvidia shows that each asteroid contains 10 levels of detail. Objects are segmented into smaller meshlets, and Turing allows more geometry to be rendered in parallel while fetching less data overall. The task shader runs first, checking each asteroid and its position in the scene to determine which level of detail, or LOD, to use. The asteroid’s sub-parts, or meshlets, are then tested by the mesh shader, and the remaining triangles are culled by the GPU hardware. Before Turing, the GPU would have had to cull each triangle individually, which created congestion on both the CPU and the GPU.
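A simplified C++ sketch of the first decision in that chain follows: the task shader choosing one of an asteroid’s 10 LODs from its apparent size on screen. The coverage heuristic, thresholds, and function name are illustrative assumptions, not values taken from the demo.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative LOD pick: map an asteroid's apparent size on screen to one of
// its 10 detail levels (0 = most detailed, 9 = coarsest). Using radius over
// distance as "screen coverage" is a rough stand-in for whatever heuristic
// the real demo uses.
int selectLod(float distanceToCamera, float asteroidRadius, int lodCount = 10) {
    // Bigger and closer objects cover more of the screen.
    float coverage = asteroidRadius / std::max(distanceToCamera, 1e-3f);

    // Assume each halving of coverage drops one detail level.
    int lod = static_cast<int>(std::floor(-std::log2(std::max(coverage, 1e-6f))));
    return std::clamp(lod, 0, lodCount - 1);
}
```

Only the meshlets belonging to the chosen LOD are then passed to the mesh shader for finer-grained culling, with the fixed-function hardware discarding whatever individual triangles remain out of view.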

“By combining together efficient GPU culling and LOD techniques, we decrease the number of triangles drawn by several orders of magnitude, retaining only those necessary to maintain a very high level of image fidelity,” Kraemer wrote. “The real-time drawn triangle counters can be seen in the lower corner of the screen. Mesh shaders make it possible to implement extremely efficient solutions that can be targeted specifically to the content being rendered.”

In addition to using this technique to create rich scenes in a game, Nvidia said that the process could also be used in scientific computing.

“This approach greatly improves the programmability of the geometry processing pipeline, enabling the implementation of advanced culling techniques, level-of-detail, or even completely procedural topology generation,” Nvidia said.

Developers can download the Asteroids demo through Nvidia’s developer portal, and the company also posted a video showing how mesh shaders can improve rendering.

Chuong Nguyen