
Nvidia’s $200 Jetson Orin Nano minicomputer is 80 times faster than the previous version

Nvidia announced the upcoming Jetson Orin Nano, a system-on-module (SOM) designed to power the next generation of entry-level AI and robotics, during its GTC 2022 keynote today.

Nvidia says this new version delivers an 80x increase in performance over the $99 Jetson Nano. The original version was released in 2019 and has been used as a bare-bones entry into the world of AI and robotics, particularly for hobbyists and STEM students. This new version looks to seriously up the power.

Nvidia's Jetson Orin Nano robotics processor.

A system-on-module (also referred to as a computer-on-module) is a single board that carries a microprocessor, memory, and input/outputs (I/Os), and it typically pairs with a carrier board. It’s not the same thing as a system-on-a-chip (SOC), mind you: an SOM is board-based and has room for extra components, and it can even include an SOC. In short, an SOM is a ready-to-use computing solution, but it’s not a full computer.


With the technicalities out of the way, let’s talk about Nvidia’s latest development, the Jetson Orin lineup, which arrives as six Orin-based production modules built to handle AI and robotics applications at an affordable price. Among them is the Nvidia Jetson Orin Nano.

Despite being the smallest-form-factor Jetson SOM, the Jetson Orin Nano can handle up to 40 trillion operations per second (TOPS) of AI workloads. Performance hits new heights with the AGX Orin, which serves up 275 TOPS to handle advanced autonomous machines.

Nvidia’s Jetson Orin comes with an Ampere-based GPU, an ARM-based CPU, and multimodal sensor support. It’s also fully compatible with Nvidia’s Orin NX modules, including full emulation support that will enable Nvidia’s customers to design around multiple Jetson modules. Other perks include support for multiple concurrent AI application pipelines, complete with fast inputs and outputs.

The Jetson Orin Nano modules will be available in two variants: one with 8GB of memory and up to 40 TOPS, and one with 4GB of memory and up to 20 TOPS. In terms of power consumption, the SOM needs next to nothing: the former requires between 7 watts and 15 watts, while the latter needs only 5 watts to 10 watts.

Nvidia Jetson Orin Nano system-on-module.

Nvidia foresees that the modules will be used by a wide variety of customers, from engineers working on edge AI applications to Robot Operating System (ROS) developers. The low price point, starting at just $199, should make this technology accessible to a much wider range of users. Nvidia cites Canon, John Deere, Microsoft Azure, and more as early adopters of Jetson Orin Nano.

“With an orders-of-magnitude increase in performance for millions of edge AI and ROS developers today, Jetson Orin is the ideal platform for virtually every kind of robotics deployment imaginable,” said Deepu Talla, vice president of Nvidia’s embedded and edge computing division.

Nvidia claims that the Jetson Orin will offer an 80x increase in performance over the previous generation of Jetson SOMs. That’s a massive step up at a reasonable price. The modules will be available starting in January 2023.

Monica J. White