Nvidia's 2016 conference shows the serious side of the GPU

If there was one thing missing from 2016’s GPU Technology Conference, it was gaming. While GeForce owners lament the lack of new Pascal-based GTX cards, the developer community is abuzz over the prospect of powerful new hardware and Nvidia’s expanded software offerings.

The face of computing is changing, and a lot of that change has to do with GPUs. Developers are finding new and exciting ways to leverage these chips’ immense and unique power, and the community is spilling over with ideas, innovations, and new tech.

Deep learning

Deep learning and cognitive computing were the hot topics at GTC this time around. They’ve loomed over the show for the past couple of years, but mostly as proofs of concept. This year, researchers were ready to show off the work these systems have been doing.

One of the most striking demos, from Facebook’s Artificial Intelligence Research lab, made sure everyone knows just how far cognitive computing has come. When fed data sets, its algorithms can turn them into useful insight, and even recreate the important pieces of information as they identify them.

But the crucial part of the equation is how to teach these systems what to look for. Early attempts focused on explicitly telling a machine what to find and what its goals were, but now the conversation revolves around how these machines can teach themselves.

An untrained deep learning system is simply set loose on a humongous amount of data, and researchers watch to see what it looks for. Whether that’s a swath of Renaissance paintings or videos of people engaged in a variety of sports and games, we’re starting to let the computer decide which factors are important, and that’s leading to greater accuracy and more versatile algorithms.

It’s also leading to systems that can complete more and more advanced tasks, and at rates we never could have imagined before we let the machines take over.
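
To make that idea concrete, here is a minimal, hypothetical sketch of unsupervised feature learning in PyTorch. The autoencoder below is never told what its data contains; it is only asked to reconstruct its input, so it has to decide on its own which features are worth keeping. Nothing here comes from Nvidia or Facebook; the shapes and hyperparameters are invented for illustration.

```python
# A minimal sketch of unsupervised learning: an autoencoder "set loose"
# on unlabeled data must discover which features matter in order to
# reconstruct its input. Sizes are illustrative, not from any vendor.
import torch
from torch import nn

autoencoder = nn.Sequential(
    nn.Linear(784, 64), nn.ReLU(),  # encoder: squeeze to 64 features
    nn.Linear(64, 784),             # decoder: rebuild the input
)
optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

data = torch.rand(256, 784)         # stand-in for unlabeled images

for epoch in range(10):
    reconstruction = autoencoder(data)
    loss = loss_fn(reconstruction, data)  # note: no labels anywhere
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Whatever survives the 784-to-64 squeeze is, by definition, what the network decided was important; researchers then inspect those learned features rather than dictating them up front.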

Self-driving cars

Strangely enough, Nvidia has positioned itself as a leading figure in the race to create self-driving cars. As you might imagine, GPUs are well suited to this kind of work, which requires juggling dozens of inputs at once, and doing so quickly. Nvidia’s hardware is perfect for heavily parallelized workloads, and the company is taking full advantage of its position in the marketplace, as well as the connections it’s built over the years, to create a platform for self-driving cars that will power the fully autonomous Roborace series.

Nvidia even showed off its own self-driving car concept, lovingly known as BB8. The demonstration video showed a car that, at first, struggled to even stay on the road. It ran over cones, drove into the dirt, and didn’t stop when it should’ve. With just a few months of training and learning, the car drove perfectly, able to switch between driving surfaces smoothly, and adjust to unusual situations, like roads with no central divider.
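
Nvidia didn’t detail the training code behind the demo, but the general recipe it described, learning to drive from recorded examples rather than hand-written rules, looks roughly like the hypothetical sketch below: a convolutional network maps camera frames to steering angles and is penalized whenever it disagrees with what a human driver did. The architecture and sizes are invented for illustration, not the BB8 car’s actual network.

```python
# Hypothetical sketch of steering by imitation: a CNN maps a camera
# frame straight to a steering angle, trained against a human driver's
# recorded behavior. The architecture is invented for illustration.
import torch
from torch import nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),      # fixed-size feature map
            nn.Flatten(),
        )
        self.head = nn.Linear(48 * 4 * 4, 1)   # one output: steering angle

    def forward(self, frames):
        return self.head(self.features(frames))

model = SteeringNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

frames = torch.rand(8, 3, 66, 200)   # stand-in dashboard camera frames
human_angles = torch.rand(8, 1)      # what the driver actually did

loss = nn.functional.mse_loss(model(frames), human_angles)
loss.backward()
optimizer.step()
```

The “few months of training” then amounts to feeding the network ever more driving footage, including the awkward cases like roads with no central divider, until its predictions stop diverging from a competent driver’s.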

At the heart of that car is Nvidia’s Drive PX 2. Specifically designed for self-driving cars, the platform supports up to 12 high-definition cameras, and leverages Nvidia’s GPU tech for instant responsiveness and sensor management.
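
Part of what makes that workload hard is simply keeping up with the sensors. Below is a hypothetical sketch of the juggling act: polling a dozen cameras and handing their frames to the GPU as one batch rather than one at a time. The camera indices and the process_on_gpu stub are placeholders, not Drive PX 2 APIs, and a real automotive rig uses dedicated capture hardware rather than OpenCV.

```python
# Hypothetical sketch of multi-camera ingestion: grab the latest frame
# from each camera, then process the whole batch in one GPU pass.
import cv2

def process_on_gpu(frames):
    """Stub standing in for a batched GPU inference pass."""
    pass

# Placeholder indices; a real rig uses dedicated capture hardware.
cameras = [cv2.VideoCapture(i) for i in range(12)]

while True:
    frames = []
    for cam in cameras:
        ok, frame = cam.read()      # latest frame, or ok=False on failure
        if ok:
            frames.append(frame)
    process_on_gpu(frames)          # one batched pass beats 12 serial ones
```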

Virtual reality for work and play

That’s not to say the GPU community is all work and no play. Virtual reality is one of the key areas where the gaming world finds its way back in, but gaming isn’t the sole focus of VR anymore. From architecture to car design, there are many ways to make virtual reality a tool instead of a toy.

Iray, in particular, is a massive step forward for VR, as it brings Nvidia’s highly detailed and accurate lighting into the virtual realm. Iray is already used for architectural projects and automotive designs, where light refraction doesn’t just have to look nice; it has to be pixel perfect.
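
To see why “pixel perfect” refraction is computationally expensive, consider one calculation a physically based renderer repeats millions of times per frame: bending a single ray at a single surface using Snell’s law. This is textbook optics, not Iray’s actual code.

```python
# Refracting one ray at one surface with Snell's law: n1*sin(i) = n2*sin(r).
# A physically based renderer repeats this (and much more) per ray bounce.
import math

def refract(cos_incident, n1, n2):
    """Cosine of the refracted angle, or None on total internal
    reflection. n1 and n2 are the refractive indices of the media."""
    sin_incident = math.sqrt(max(0.0, 1.0 - cos_incident ** 2))
    sin_refracted = (n1 / n2) * sin_incident
    if sin_refracted > 1.0:
        return None                 # total internal reflection
    return math.sqrt(1.0 - sin_refracted ** 2)

# A ray hitting glass (n = 1.5) from air (n = 1.0) at 45 degrees:
print(refract(math.cos(math.radians(45)), 1.0, 1.5))
```

Multiply that by every ray, every bounce, and every light source in a scene, and it’s clear why this work lands on GPUs.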

Nvidia’s work is also leading to new hardware in other areas. During a presentation on computational displays, Nvidia vice president of graphics research David Luebke showed off a variety of new display types that might make a good fit for future VR and AR displays.

(Video: Nvidia’s prototype 1,700Hz zero-latency display)

Among them were see-through displays with precision pinholes, LED projectors, and sets of micro-lenses. These new display technologies provide advanced optical features, like highly adjustable focal points, see-through capabilities, and sky-high refresh rates. One of the displays even allows developers to program eyeglass prescriptions into the lenses themselves.
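
The prescription trick is less magical than it sounds, at least in principle: an eyeglass prescription in diopters is just optical power, P = 1/f, so a display that can shift its virtual image plane can park the image where the wearer’s eye naturally focuses. Here’s a hypothetical back-of-the-envelope sketch using simple thin-lens arithmetic; it is not Nvidia’s actual method.

```python
# Hypothetical sketch: a prescription in diopters is optical power
# P = 1/f, so a -2.0 D myopic eye focuses sharply only out to
# 1 / 2.0 = 0.5 m. A display aware of the prescription could render
# its virtual image at that distance instead of at optical infinity.
def far_point_meters(prescription_diopters):
    """Farthest distance a myopic eye sees sharply, given its
    (negative) prescription in diopters."""
    return 1.0 / abs(prescription_diopters)

print(far_point_meters(-2.0))   # 0.5 -> place the virtual image 0.5 m away
```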

Not so far away

We’ve come a long way on these advanced computing topics in just a few years, but most researchers agree that we’ll move even faster in the next few. These innovations herald the dawn of the age of AI, and you can expect the software currently in development to enhance everything from online shopping to social media, and yes, even gaming.

And the hardware is starting to get there, too. Nvidia showed off high-end Tesla offerings that, while insanely expensive and power-hungry, are leaps and bounds more efficient than the last generation of hardware.

With all of this new tech, we’re no longer asking how we’ll reach these milestones. We’re starting to ask when.

Brad Bourque