
Nvidia’s latest A.I. results prove that ARM is ready for the data center

Nvidia just published its latest MLPerf benchmark results, and they have some big implications for the future of computing. In addition to maintaining its lead over other A.I. hardware — a lead Nvidia has held for the last three rounds of results — the company showcased the power of ARM-based systems in the data center, with results nearly matching those of traditional x86 systems.

In the six tests MLPerf includes, ARM-based systems came within a few percentage points of x86 systems, with both using Nvidia A100 A.I. graphics cards. In one of the tests, the ARM-based system actually beat its x86 counterpart, showing how far the instruction set has come for A.I. deployments in the data center.

MLPerf results with Arm processors.

“The latest inference results demonstrate the readiness of ARM-based systems powered by ARM-based CPUs and Nvidia GPUs for tackling a broad array of A.I. workloads in the data center,” David Lecomber, senior director of HPC at Arm, said. Nvidia only tested the ARM-based systems in the data center, not with edge or other MLCommons benchmarks.

MLPerf is a series of benchmarks for A.I. that are designed, contributed to, and validated by industry leaders. Although Nvidia has led the charge in many ways with MLPerf, the leadership of the MLCommons consortium is made up of executives from Intel, the Video Electronics Standards Association, and Arm, to name a few.

The latest benchmarks pertain to MLCommons’ inference tests for the data center and edge devices. A.I. inference is the stage where a trained model produces results; it comes after the training phase, where the model is still learning, and MLCommons maintains separate benchmarks for that phase as well. Nvidia’s Triton software, which handles inference serving, is in use at companies like American Express for fraud detection and Pinterest for image segmentation.
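In practice, inference serving means a client application sending data to an endpoint and getting predictions back. As a rough sketch of what that looks like with Triton, the Python example below uses the tritonclient package against a server assumed to be running on localhost:8000; the model name "fraud_detector" and the tensor names "INPUT0" and "OUTPUT0" are hypothetical stand-ins, not details from Nvidia's results.

import numpy as np
import tritonclient.http as httpclient

# Connect to a Triton Inference Server assumed to be listening on localhost:8000.
client = httpclient.InferenceServerClient(url="localhost:8000")

# Build one input tensor; the shape and dtype must match the deployed model's config.
batch = np.random.rand(1, 64).astype(np.float32)
infer_input = httpclient.InferInput("INPUT0", list(batch.shape), "FP32")
infer_input.set_data_from_numpy(batch)

# Request a named output and send the inference call.
requested_output = httpclient.InferRequestedOutput("OUTPUT0")
result = client.infer(model_name="fraud_detector",
                      inputs=[infer_input],
                      outputs=[requested_output])

print(result.as_numpy("OUTPUT0"))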

Nvidia also highlighted its Multi-Instance GPU (MIG) feature when speaking with the press. MIG allows a single A100 or A30 graphics card to be partitioned from one A.I. processing unit into several independent A.I. accelerators. The A100 can be split into seven separate accelerators, while the A30 can be split into four.

By splitting up the GPU, Nvidia is able to run the entire MLPerf suite at the same time with only a small loss in performance. Nvidia says it measured 95% of per-accelerator performance when running all of the tests at once, compared to a baseline reading, allowing a single GPU to serve multiple A.I. workloads simultaneously.
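To see how a partitioned card presents itself to software, the sketch below enumerates MIG instances with the nvidia-ml-py (pynvml) bindings. It assumes pynvml is installed and an MIG-enabled A100 or A30 is present; exact return types can vary between pynvml versions.

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first physical GPU in the system

# Current and pending MIG mode for the card (enabled means the GPU is partitioned).
current_mode, pending_mode = pynvml.nvmlDeviceGetMigMode(gpu)
print("MIG enabled:", current_mode == pynvml.NVML_DEVICE_MIG_ENABLE)

# Walk the MIG instances carved out of the physical GPU, skipping empty slots.
for i in range(pynvml.nvmlDeviceGetMaxMigDeviceCount(gpu)):
    try:
        mig = pynvml.nvmlDeviceGetMigDeviceHandleByIndex(gpu, i)
    except pynvml.NVMLError:
        continue
    print("MIG instance", i, pynvml.nvmlDeviceGetName(mig))

pynvml.nvmlShutdown()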
