The popularity of ChatGPT may give Nvidia an unexpected boost

The constant buzz around OpenAI’s ChatGPT refuses to wane. With Microsoft now using the same technology to power its brand-new Bing Chat, it’s safe to say that ChatGPT may continue this upward trend for quite some time. That’s good news for OpenAI and Microsoft, but they’re not the only companies that stand to benefit.

According to a new report, the sales of Nvidia’s data center graphics cards may be about to skyrocket. With the commercialization of ChatGPT, OpenAI might need as many as 10,000 new GPUs to support the growing model — and Nvidia appears to be the most likely supplier.

Nvidia's A100 data center GPU.

Research firm TrendForce shared several new estimates today, and the most notable one concerns the future of ChatGPT. According to TrendForce, the GPT model that powers ChatGPT will soon need a sizeable increase in hardware in order to keep scaling up development.

“The number of training parameters used in the development of this autoregressive language model rose from around 120 million in 2018 to almost 180 billion in 2020,” said TrendForce in its report. Although it didn’t share any 2023 estimates, it’s safe to assume that these numbers will only continue to rise as much as technology and budget allow.

The firm claims that the GPT model needed a whopping 20,000 graphics cards to process training data in 2020. As it continues expanding, that number is expected to rise to above 30,000. This could be great news for Nvidia.

These calculations are based on the assumption that OpenAI would be using Nvidia’s A100 GPUs to power the language model. These ultrapowerful graphics cards are really pricey — in the ballpark of $10,000 to $15,000 each. They’re also not Nvidia’s top data center cards right now, so it’s possible that OpenAI would go for the newer H100 cards instead, which are supposed to deliver up to three times the performance of the A100. These GPUs come with a steep price increase, with one card costing around $30,000 or more.
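Putting the article's rough figures together gives a sense of the money involved. The sketch below is a back-of-envelope estimate only: the 10,000-GPU count and the per-card prices are the report's ballpark numbers, not confirmed pricing or an actual order.

```python
# Back-of-envelope estimate using the figures quoted above.
# All counts and prices are rough estimates from the report, not official data.

def gpu_order_cost(num_gpus: int, unit_price_usd: int) -> int:
    """Total hardware cost for an order of identical GPUs."""
    return num_gpus * unit_price_usd

NUM_GPUS = 10_000  # the report's estimate of OpenAI's possible order

# A100s at the quoted $10,000–$15,000 range:
a100_low = gpu_order_cost(NUM_GPUS, 10_000)
a100_high = gpu_order_cost(NUM_GPUS, 15_000)

# The same order filled with H100s at roughly $30,000 each:
h100_cost = gpu_order_cost(NUM_GPUS, 30_000)

print(f"A100 order: ${a100_low:,} to ${a100_high:,}")
print(f"H100 order: about ${h100_cost:,}")
```

Even at the low end, that works out to around $100 million in A100s alone, and roughly triple that if the order were filled with H100s instead — which is why a deal of this size would be significant even for Nvidia.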

The data center GPU market doesn’t only consist of Nvidia — Intel and AMD also sell AI accelerators. However, Nvidia is often seen as the go-to solution for AI-related tasks, so it’s possible that it might be able to score a lucrative deal if and when OpenAI decides to scale up.

Should gamers be worried if Nvidia does, indeed, end up supplying a whopping 10,000 GPUs to power up ChatGPT? It depends. The graphics cards required by OpenAI have nothing to do with Nvidia’s best GPUs for gamers, so we’re safe there. However, if Nvidia ends up shifting some production to data center GPUs, we could see a limited supply of consumer graphics cards down the line. Realistically, the impact may not be that bad — even if the 10,000-GPU prediction checks out, Nvidia won’t need to deliver them all right away.
