
Microsoft explains how thousands of Nvidia GPUs built ChatGPT

ChatGPT rose to viral fame over the past six months, but it didn’t come out of nowhere. According to a blog post published by Microsoft on Monday, OpenAI, the company behind ChatGPT, reached out to Microsoft to build AI infrastructure on thousands of Nvidia GPUs more than five years ago.

OpenAI and Microsoft’s partnership has attracted a lot of attention recently, especially after Microsoft made a $10 billion investment in the research group behind tools like ChatGPT and DALL-E 2. However, the partnership started long ago, according to Microsoft. Since then, Bloomberg reports, Microsoft has spent “several hundred million dollars” developing the infrastructure to support ChatGPT and projects like Bing Chat.

Hopper H100 graphics card.

Much of that money went to Nvidia, which is now at the forefront of the computing hardware required to train AI models. Instead of the gaming GPUs you’d find on a list of the best graphics cards, Microsoft went after Nvidia’s enterprise-grade GPUs like the A100 and H100.

It’s not just as simple as getting graphics cards together and training a language model, though. As Nidhi Chappell, Microsoft head of product for Azure, explains: “This is not something that you just buy a whole bunch of GPUs, hook them together, and they’ll start working together. There is a lot of system-level optimization to get the best performance, and that comes with a lot of experience over many generations.”

With the infrastructure in place, Microsoft is now opening up its hardware to others. The company announced on Monday in a separate blog post that it would offer Nvidia H100 systems “on-demand in sizes ranging from eight to thousands of Nvidia H100 GPUs,” delivered through Microsoft’s Azure network.

The popularity of ChatGPT has been a boon for Nvidia, which has invested in AI through hardware and software for several years. AMD, Nvidia’s main competitor in gaming graphics cards, has been attempting to make headway into the space with accelerators like the Instinct MI300.

According to Greg Brockman, president and co-founder of OpenAI, training ChatGPT wouldn’t have been possible without the horsepower provided by Microsoft: “Co-designing supercomputers with Azure has been crucial for scaling our demanding AI training needs, making our research and alignment work on systems like ChatGPT possible.”

Nvidia is expected to reveal more about future AI products during its GPU Technology Conference (GTC), with the keynote presentation kicking things off on March 21. Microsoft is expanding its AI road map later this week, with a presentation focused on the future of AI in the workplace scheduled for March 16.

Jacob Roach
Former Digital Trends Contributor
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…