
School’s in session — Nvidia’s driverless system learns by watching

Nvidia's GPU-based driverless system learns by watching human drivers. Image: Nvidia
How do you train a car to drive itself? Let it watch real drivers. Engineers from graphics processing unit (GPU) company Nvidia designed a system that learned how to drive after watching humans drive for a total of 72 hours, as reported by Network World. If the system's success is any indication, driverless cars are coming faster than most of us expected.

The details of how Nvidia trained two test cars are in a paper titled End to End Learning for Self-Driving Cars. The basics are as follows: Nvidia used three cameras and two Nvidia DRIVE PX computers to watch humans drive cars for 72 hours. The car training was conducted in several states, in varied weather conditions, day and night, and on a wide variety of road types and conditions. The cameras captured massive amounts of data in 3D, which was tracked and stored by the on-board GPUs. That data was then analyzed and broken into learning steps with Torch 7, a machine learning framework. The output? A trained system for driving cars without driver intervention.
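The article doesn't reproduce the training code, but the approach the paper describes is end-to-end imitation learning: a convolutional network takes a camera frame and is optimized so its predicted steering matches what the human driver actually did on that frame. Below is a minimal sketch of that idea in PyTorch (Nvidia used Torch 7); the layer sizes, the 66x200 input, and the dummy batch are illustrative assumptions, not Nvidia's actual network.

```python
# Illustrative end-to-end imitation learning: a small CNN maps one camera
# frame to a steering value and is trained to match the human driver's
# recorded steering. All sizes and data here are placeholder assumptions.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
        )
        # 64 channels x 3 x 20 spatial positions for a 66x200 input frame.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 3 * 20, 100), nn.ReLU(),
            nn.Linear(100, 1),  # predicted steering value
        )

    def forward(self, frame):
        return self.head(self.features(frame))

model = SteeringNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Placeholder batch standing in for logged camera frames and the steering
# the human driver applied on those frames.
frames = torch.randn(8, 3, 66, 200)
human_steering = torch.randn(8, 1)

for _ in range(10):  # a few illustrative training steps
    loss = loss_fn(model(frames), human_steering)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Scaled up to 72 hours of multi-camera footage, that single signal, how far the predicted steering is from the human's, is essentially all the supervision the network needs.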


In subsequent tests on test roads and public roads (including highways), the trained system achieved 98 to 100 percent autonomy. When a driver intervenes, the system continues to learn.
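That 98 to 100 percent figure is an autonomy metric: roughly, the share of driving time during which no human intervention was needed. A small sketch of how such a metric can be computed, with each intervention charged as a fixed few seconds of manual driving (the six-second penalty follows the paper; the example numbers are made up):

```python
def autonomy_percentage(interventions: int, elapsed_seconds: float,
                        penalty_seconds: float = 6.0) -> float:
    """Share of driving time the system handled on its own.

    Each human intervention is charged as `penalty_seconds` of manual
    driving against the total trip time.
    """
    manual_time = interventions * penalty_seconds
    return (1.0 - manual_time / elapsed_seconds) * 100.0

# Example: 2 interventions on a 10-minute (600 s) drive -> 98.0 percent.
print(autonomy_percentage(interventions=2, elapsed_seconds=600))
```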


Only one camera and one computer are needed in a car based on Nvidia's GPU system. When the time comes that you can buy a car with this system, Nvidia told Digital Trends, it will arrive ready to drive itself; you won't need to give it additional training runs. While the car can drive itself, when the camera detects something new (something it hasn't "seen" before), it alerts the driver to take over and then goes into learning mode. Whatever it learns is later transmitted to the cloud, and that information is incorporated into the next software update so the learning can benefit all cars using Nvidia's self-driving system. You can watch Nvidia's GPU-based car training and performance here.
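As a rough picture of that behavior, the in-car loop amounts to: drive autonomously, hand off and record the human's inputs when the scene looks unfamiliar, and sync the recorded episodes to the cloud later. The sketch below is hypothetical Python; names like looks_unfamiliar and upload_to_cloud are placeholders for illustration, not Nvidia APIs.

```python
# Hypothetical control loop for the handoff behavior described above.
from dataclasses import dataclass, field

@dataclass
class DriveLoop:
    logged_episodes: list = field(default_factory=list)

    def looks_unfamiliar(self, frame) -> bool:
        # Placeholder novelty check: the real system decides whether the
        # camera sees something it has not been trained on.
        return frame.get("novel", False)

    def step(self, frame):
        if self.looks_unfamiliar(frame):
            self.alert_driver()
            self.logged_episodes.append(self.record_human_driving(frame))
        else:
            self.drive_autonomously(frame)

    def alert_driver(self):
        print("Unfamiliar scene: please take over.")

    def record_human_driving(self, frame):
        # Capture steering, braking, and throttle while the human drives.
        return {"scene": frame, "controls": "logged human inputs"}

    def drive_autonomously(self, frame):
        print("Driving autonomously.")

    def sync(self):
        # Later, send logged episodes to the cloud so the next software
        # update can fold them into every car's model.
        upload_to_cloud(self.logged_episodes)
        self.logged_episodes.clear()

def upload_to_cloud(episodes):
    print(f"Uploading {len(episodes)} learning episode(s).")

loop = DriveLoop()
loop.step({"novel": False})
loop.step({"novel": True})
loop.sync()
```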

When speaking with Digital Trends, a spokesperson from Nvidia used the example of encountering a moose. The moose example works because chancing upon a moose in the road is rare in most parts of the U.S. and apparently never came up during Nvidia's training runs. So if I were driving a car with the Nvidia self-driving system and a moose was on or beside the road, the system would alert me to take over. The onboard system would watch what I did and record my steering, braking, and acceleration inputs. My reactions would be transmitted to the cloud. After the next software update, if another person encountered a moose, that car would know how to react based on my reactions.

This learn-by-watching method of training driverless vehicles is both more realistic and more comprehensive than chains of rules written by programmers to cover highly varied elements like road surfaces, lane markers, lights, and traffic conditions. The GPU-based system does the data gathering, and the machine learning system creates the rules.

Digital Trends was informed that Nvidia is now in conversations with more than 80 major OEMs (original equipment manufacturers) and research institutions. The one manufacturer Nvidia was able to speak about publicly is Volvo. Next year, Volvo's Drive Me project in Gothenburg, Sweden, will use 100 Volvo XC90s equipped with the Nvidia system to observe how the system works on "a defined set of roads" in the Swedish city.

Driverless cars are coming, and Nvidia's system looks to be on pace to speed up the transition.

Bruce Brown