
Richly rendered ‘Asteroids’ demo showcases power of Nvidia’s RTX graphics

Nvidia has released a new demo to showcase some of the advanced graphics capabilities of the company's Turing architecture found on the latest RTX series graphics cards, like the flagship GeForce RTX 2080 Ti. The public demo, called Asteroids, demonstrates the new mesh shading capabilities, which Nvidia claims will improve image quality and performance when rendering a large number of complex objects in a scene.

With Turing, Nvidia introduced a new programmable geometric shading pipeline that shifts much of the heavy geometry workload from the CPU onto the GPU. The GPU then applies culling techniques to render objects (asteroids, in the case of this demo) with a high level of detail and image quality.
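To make that shift concrete, here is a minimal C++ sketch (illustrative only, not code from Nvidia or the demo) of the CPU-driven pattern this pipeline replaces: the CPU walks every object, culls it, picks a level of detail, and issues one draw call per asteroid. It is this per-object bookkeeping that task and mesh shaders move onto the GPU.

```cpp
// Hypothetical pre-Turing pattern: per-object culling and LoD selection on the CPU.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct Asteroid {
    Vec3  center;
    float radius;
};

float distanceBetween(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Distance-based LoD pick: 0 = finest, 9 = coarsest (the bucket size is made up).
int pickLod(float dist) {
    int lod = static_cast<int>(dist / 250.0f);
    return lod > 9 ? 9 : lod;
}

int main() {
    const Vec3 camera{0.0f, 0.0f, 0.0f};
    const float farPlane = 2000.0f;
    std::vector<Asteroid> field = {
        {{100.0f, 0.0f, 0.0f}, 10.0f},
        {{900.0f, 0.0f, 0.0f}, 10.0f},
        {{3000.0f, 0.0f, 0.0f}, 10.0f},  // too far: culled on the CPU
    };

    for (const Asteroid& a : field) {
        float dist = distanceBetween(a.center, camera);
        if (dist - a.radius > farPlane) continue;              // CPU-side cull
        std::printf("draw call at LoD %d\n", pickLod(dist));   // one draw per object
    }
    return 0;
}
```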


“Turing introduces a new programmable geometric shading pipeline built on task and mesh shaders,” Nvidia graphics software engineer Manuel Kraemer wrote in a detailed blog post on the benefits of mesh shading on Turing. “These new shader types bring the advantages of the compute programming model to the graphics pipeline. Instead of processing a vertex or patch in each thread in the middle of fixed function pipeline, the new pipeline uses cooperative thread groups to generate compact meshes (meshlets) on the chip using application-defined rules.”
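For readers unfamiliar with the term, a meshlet is simply a small, fixed-size bundle of vertices and triangles cut out of a larger mesh. The struct below is a hypothetical sketch of what such a record might hold, loosely modeled on common mesh-shading samples; the 64-vertex and 126-triangle limits and the cone-culling fields are illustrative assumptions, not values taken from Nvidia's demo.

```cpp
// Illustrative meshlet record: a small batch of geometry plus bounds that a
// task shader could use to accept or reject the whole batch at once.
#include <array>
#include <cstdint>
#include <cstdio>

struct MeshletBounds {
    float center[3];   // bounding-sphere center, object space
    float radius;      // bounding-sphere radius
    float coneAxis[3]; // average triangle normal, for backface cone culling
    float coneCutoff;  // cos(angle) of the normal cone
};

struct Meshlet {
    static constexpr uint32_t kMaxVertices  = 64;   // assumed limits, not the demo's
    static constexpr uint32_t kMaxTriangles = 126;

    std::array<uint32_t, kMaxVertices>     vertexIndices{};   // into the mesh's vertex buffer
    std::array<uint8_t, kMaxTriangles * 3> triangleIndices{}; // local indices into vertexIndices
    uint32_t vertexCount   = 0;
    uint32_t triangleCount = 0;
    MeshletBounds bounds{};
};

int main() {
    std::printf("meshlet record size: %zu bytes\n", sizeof(Meshlet));
    return 0;
}
```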

In the demo, Nvidia showed that each asteroid contains 10 levels of detail. Objects are segmented into smaller meshlets, and Turing allows these meshlets to be rendered in parallel, producing more geometry while fetching less data overall. With Turing, a task shader first checks each asteroid and its position in the scene to determine which level of detail, or LoD, to use. The sub-parts, or meshlets, are then tested by the mesh shader, and the remaining triangles are culled by the GPU hardware. Before Turing, the GPU had to cull each triangle individually, which created a bottleneck on both the CPU and the GPU.
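As a rough mental model of that two-stage flow, the following C++ sketch mimics it on the CPU under made-up distance thresholds; it is not the demo's shader code. A "task" step picks one of the asteroid's ten LoDs from its distance to the camera, and a "mesh" step then tests each meshlet's bounding sphere and drops those outside the view before any triangles would be emitted.

```cpp
// CPU-side analog of the task shader (LoD pick) and mesh shader (meshlet test).
#include <cmath>
#include <cstdio>
#include <vector>

struct Sphere { float x, y, z, r; };

struct MeshletLod {
    std::vector<Sphere> meshletBounds;   // one bounding sphere per meshlet
};

struct AsteroidMesh {
    Sphere bounds;                       // whole-object bounding sphere
    std::vector<MeshletLod> lods;        // 10 levels, 0 = finest
};

float distanceTo(const Sphere& s, float cx, float cy, float cz) {
    float dx = s.x - cx, dy = s.y - cy, dz = s.z - cz;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// "Task shader" analog: pick an LoD index from distance (the rule is invented).
size_t selectLod(const AsteroidMesh& m, float camX, float camY, float camZ) {
    float d = distanceTo(m.bounds, camX, camY, camZ);
    size_t lod = static_cast<size_t>(d / 250.0f);
    return lod < m.lods.size() ? lod : m.lods.size() - 1;
}

// "Mesh shader" analog: keep only meshlets whose bounds fall within a simple
// view distance; a real shader would also run frustum and cone tests.
size_t countVisibleMeshlets(const MeshletLod& lod, float camX, float camY,
                            float camZ, float viewDistance) {
    size_t visible = 0;
    for (const Sphere& s : lod.meshletBounds) {
        if (distanceTo(s, camX, camY, camZ) - s.r <= viewDistance) ++visible;
    }
    return visible;
}

int main() {
    AsteroidMesh asteroid;
    asteroid.bounds = {300.0f, 0.0f, 0.0f, 50.0f};
    asteroid.lods.resize(10);
    asteroid.lods[1].meshletBounds = {{290, 0, 0, 5}, {310, 0, 0, 5}, {300, 40, 0, 5}};

    size_t lod = selectLod(asteroid, 0.0f, 0.0f, 0.0f);
    size_t visible = countVisibleMeshlets(asteroid.lods[lod], 0.0f, 0.0f, 0.0f, 1000.0f);
    std::printf("LoD %zu, %zu visible meshlets\n", lod, visible);
    return 0;
}
```

In the real pipeline both steps run as GPU shader stages across cooperative thread groups, so the per-meshlet tests shown sequentially here happen in parallel across the whole asteroid field.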

“By combining together efficient GPU culling and LOD techniques, we decrease the number of triangles drawn by several orders of magnitude, retaining only those necessary to maintain a very high level of image fidelity,” Kraemer wrote. “The real-time drawn triangle counters can be seen in the lower corner of the screen. Mesh shaders make it possible to implement extremely efficient solutions that can be targeted specifically to the content being rendered.”

In addition to using this technique to create rich scenes in a game, Nvidia said that the process could also be used in scientific computing.

“This approach greatly improves the programmability of the geometry processing pipeline, enabling the implementation of advanced culling techniques, level-of-detail, or even completely procedural topology generation,” Nvidia said.

Developers can download the Asteroids demo through Nvidia’s developer portal, and the company has also posted a video showing how mesh shaders can improve rendering.

Chuong Nguyen
5 cheap graphics cards you should buy instead of the RTX 4060

If you're in the market for a budget GPU, the RTX 4060 is one of the best graphics cards you can buy. It's available for a reasonable price, offers solid 1080p performance, and comes with a suite of Nvidia-exclusive features. Still, it's not the right graphics card for everyone.

As you can read in our RTX 4060 review, Nvidia's value-focused GPU has a few minor issues. It's still a card to keep in mind if you're shopping for a budget graphics card, but we rounded up five alternatives that fill in the gaps that the RTX 4060 leaves.
AMD RX 7600

Read more
Nvidia could flip the script on the RTX 5090

We already know Nvidia is working on its RTX 50-series graphics cards, code-named Blackwell, but the rollout may not go as expected.

According to well-known hardware leaker kopite7kimi, Nvidia plans to launch the RTX 5080 before it launches the RTX 5090. That may not sound like a big deal, but it's a change of pace compared to what we saw in the last generation.

Read more
Nvidia just made GeForce Now so much better

Nvidia has just added adaptive refresh rates to GeForce Now, its cloud gaming service. The new tech, dubbed Cloud G-Sync, works first and foremost on PCs with Nvidia GPUs, but also on Macs. These include Macs with Apple Silicon, as well as older models with Intel CPUs and AMD GPUs. On the Windows side, Intel and AMD GPUs are not supported for now. Nvidia has also made one more change to GeForce Now that makes it a lot easier to try out: it introduced day passes.

Cloud G-Sync's variable refresh rate (VRR) feature will sync your monitor's refresh rate to match the frame rates you're hitting while gaming with GeForce Now. Nvidia's new cloud solution also uses Reflex to lower latency regardless of frame rates. Enabling VRR in GeForce Now should provide a major boost by reducing screen tearing and stuttering, improving the overall gaming experience on PCs and laptops that normally can't keep up with some titles. To pull this off, Nvidia uses its proprietary RTX 4080 SuperPODs.

Read more