
Nvidia Workbench lets anyone train an AI model

Nvidia CEO showing the RTX 4060 Ti at Computex 2023. Credit: Nvidia

Nvidia has just announced the AI Workbench, which promises to make creating generative AI a lot easier and more manageable. The workspace will let developers build and deploy such models across various Nvidia AI platforms, including PCs and workstations. Are we about to be flooded with even more AI content? Perhaps not, but it certainly sounds like the AI Workbench will make the whole process significantly more approachable.

In the announcement, Nvidia notes that there are hundreds of thousands of pretrained models currently available; however, customizing them takes time and effort. This is where the Workbench comes in, simplifying the process. Developers will be able to customize and run generative AI with minimal effort, drawing on whichever enterprise-grade models they need. The Workbench tool supports various frameworks, libraries, and SDKs from Nvidia’s own AI platform, as well as open-source repositories like GitHub and Hugging Face.
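Nvidia hasn’t published Workbench code alongside the announcement, but the kind of workflow it aims to streamline looks roughly like the standard Hugging Face fine-tuning loop sketched below. The model name, dataset, and training settings here are illustrative placeholders, not anything Nvidia has specified.

```python
# Illustrative sketch of the sort of Hugging Face fine-tuning workflow
# AI Workbench is meant to wrap -- model, dataset, and settings are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # stand-in for any pretrained model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# A small public dataset stands in for whatever domain data a team would bring.
data = load_dataset("imdb", split="train[:1%]")
data = data.map(
    lambda batch: tokenizer(batch["text"], truncation=True,
                            padding="max_length", max_length=256),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="workbench-demo",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=data,
)
trainer.train()  # runs on a local RTX GPU if one is available, otherwise CPU
```

Per Nvidia’s pitch, the value of Workbench is packaging steps like these, along with their environments, data, and model versions, so they can move between a local RTX machine and data center or cloud resources without being rebuilt by hand.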

Once customized, the models can be shared across multiple platforms with ease. Devs running a PC or workstation with an Nvidia RTX graphics card will be able to work with these generative models on their local systems, then scale up to data center and cloud computing resources when necessary.

“Nvidia AI Workbench provides a simplified path for cross-organizational teams to create the AI-based applications that are increasingly becoming essential in modern business,” said Manuvir Das, Nvidia’s vice president of enterprise computing.

Nvidia has also announced the fourth iteration of its Nvidia AI Enterprise software platform, which is aimed at offering the tools required to adopt and customize generative AI. This breaks down into multiple tools, including Nvidia NeMo, which is a cloud-native framework that lets users build and deploy large language models (LLMs) like ChatGPT or Google Bard.

A MacBook Pro on a desk with ChatGPT's website showing on its display. Credit: Hatice Baran / Unsplash

Nvidia is tapping into the AI market at just the right time, not just with the Workbench but also with tools like Nvidia ACE for games. With generative AI models like ChatGPT being all the rage right now, it’s safe to assume that many developers will be interested in Nvidia’s one-stop-shop solution. Whether that’s a good thing for the rest of us remains to be seen, as some people use generative AI for questionable purposes.

Let’s not forget that AI can get pretty unhinged all on its own, like in the early days of Bing Chat, and the more people who start creating and training these various models, the more instances of problematic or crazy behavior we’re going to see out in the wild. But assuming everything goes well, Nvidia’s AI Workbench could certainly simplify the process of deploying new generative AI for a lot of companies.
