Nvidia's 2016 conference shows the serious side of the GPU

If there was one thing missing from 2016’s GPU technology conference, it was gaming. While GeForce owners are lamenting the lack of new Pascal-based GTX cards, a developer community is abuzz at the prospects of new, powerful hardware, and Nvidia’s expanded software offerings.

The face of computing is changing, and a lot of it has to do with GPUs. Developers are finding new and exciting ways to leverage these chips’ immense and unique power, and it has the community spilling over with ideas, innovations, and new tech.

Deep learning

Deep learning and cognitive computing are definitely the hot topics at GTC this time around. Their presence has been looming for the past couple of years, but more as a proof of concept than anything else. This year, researchers were ready to show off the work these systems have been doing.


One of the most striking demos, from Facebook’s Artificial Intelligence Research lab, made sure everyone knows just how far cognitive computing has come. When fed data sets, the algorithms are able to turn them into useful insight, and even recreate the important pieces of information as they identify them.

But the crucial part of the equation comes in when we discuss how to teach these systems what to look for. Early attempts focused on telling a machine what to look for and what its goals are, but now the conversation revolves around how these machines can teach themselves.


An untrained deep learning system is simply set loose on a humongous amount of data, and then researchers watch to see what it looks for. Whether that’s a swath of renaissance paintings, or videos of people engaged in a variety of sports and games, we’re starting to let the computer decide which factors are important, and that’s leading to greater accuracy and more versatile algorithms.
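That “set loose on the data” approach is the core idea of unsupervised learning. As a toy illustration only (far simpler than the deep networks shown at GTC, and not any specific Nvidia or Facebook system), a k-means clusterer discovers groupings in unlabeled points without ever being told what the groups are:

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal k-means: the algorithm decides which groupings matter."""
    # Deterministic init: spread initial centers along the x-axis.
    order = np.argsort(points[:, 0])
    idx = np.linspace(0, len(points) - 1, k).astype(int)
    centers = points[order[idx]].copy()
    for _ in range(iters):
        # Assign every point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of the points assigned to it.
        for j in range(k):
            mask = labels == j
            if mask.any():
                centers[j] = points[mask].mean(axis=0)
    return centers, labels

# Two blobs of unlabeled 2D points -- the algorithm is never told there are two.
rng = np.random.default_rng(42)
data = np.vstack([
    rng.normal(0.0, 0.5, (50, 2)),
    rng.normal(5.0, 0.5, (50, 2)),
])
centers, labels = kmeans(data, k=2)
```

After a few iterations the two recovered centers sit near (0, 0) and (5, 5), even though no researcher specified what to look for; deep networks apply the same principle at vastly larger scale.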

It’s also leading to systems that can complete more and more advanced tasks, and at rates we never could have imagined before we let the machines take over.

Self-driving cars

Strangely, Nvidia has positioned itself as a leading figure in the race to create self-driving cars. As you might imagine, GPUs are well suited to this kind of work, which requires juggling dozens of elements, and doing so quickly. Nvidia’s hardware is perfect for heavily parallelized workloads, and the company is taking full advantage of its position in the marketplace, as well as the connections it’s built over the years, to create a platform for self-driving cars that will power the fully autonomous RoboRace.

Nvidia even showed off its own self-driving car concept, lovingly known as BB8. The demonstration video showed a car that, at first, struggled to even stay on the road. It ran over cones, drove into the dirt, and didn’t stop when it should’ve. With just a few months of training and learning, the car drove perfectly, able to switch between driving surfaces smoothly, and adjust to unusual situations, like roads with no central divider.

At the heart of that car is Nvidia’s Drive PX2 chip. Specifically designed for self-driving cars, the chip supports up to 12 high-definition cameras, and leverages Nvidia’s GPU tech for instant responsiveness and sensor management.

Virtual reality for work and play

That’s not to say the GPU community is all work and no play, and virtual reality is one of the key areas where the gaming world finds its way back in. Gaming isn’t the sole focus of VR anymore, though. From architecture to car design, there are many ways to make virtual reality a tool instead of a toy.

Iray, in particular, is a massive step forward for VR, as it brings Nvidia’s highly detailed, physically accurate lighting into the virtual realm. Iray is already used for architectural projects and automotive designs, where light refraction doesn’t just have to look nice; it has to be pixel perfect.

Nvidia’s work is also leading to new hardware in other areas. During a presentation on computational displays, Nvidia vice president of graphics research David Luebke showed off a variety of new display types that might make a good fit for future VR and AR displays.

NVIDIA Prototype 1,700Hz Zero Latency Display

Among them were see-through displays with precision pinholes, LED projectors, and sets of micro-lenses. These new display technologies provide advanced optical features, like highly adjustable focal points, see-through capabilities, and sky-high refresh rates. One of the displays even allows developers to program eyeglass prescriptions into the lenses themselves.

Not so far away

We’ve come a long way on these advanced computing topics in just a few years, but most researchers agree that we’re going to move a lot faster in the next few. These innovations herald the dawn of the age of AI, and you can expect software currently in development to enhance everything from online shopping to social media, and yes, even gaming.

And the hardware is starting to get there, too. Nvidia showed off high-end Tesla offerings that, while insanely expensive and power hungry, have made leaps and bounds in terms of efficiency over the last generation of hardware.

With all of this new tech, we’re moving away from asking how we’ll reach these milestones, and starting to ask when.

Brad Bourque
Former Digital Trends Contributor
Brad Bourque is a native Portlander, devout nerd, and craft beer enthusiast. He studied creative writing at Willamette…