Nvidia’s new liquid-cooled GPUs are heading to data centers

Nvidia is taking some notes from the enthusiast PC building crowd in an effort to reduce the carbon footprint of data centers. The company announced two new liquid-cooled GPUs during its Computex 2022 keynote, but they won’t be making their way into your next gaming PC.

Instead, the H100 (announced at GTC earlier this year) and A100 GPUs will ship as part of HGX server racks toward the end of the year. Liquid cooling isn't new in the world of supercomputers, but mainstream data center servers haven't traditionally had access to this efficient cooling method (not without trying to jury-rig a gaming GPU into a server, that is).

Nvidia A100 liquid-cooled data center GPU.

In addition to HGX server racks, Nvidia will offer the liquid-cooled versions of the H100 and A100 as slot-in PCIe cards. The A100 is coming in the second half of 2022, and the H100 is coming in early 2023. Nvidia says “at least a dozen” system builders will have these GPUs available by the end of the year, including options from Asus, ASRock, and Gigabyte.

Data centers account for around 1% of the world’s total electricity usage, and nearly half of that electricity is spent solely on cooling everything in the data center. As opposed to traditional air cooling, Nvidia says its new liquid-cooled cards can reduce power consumption by around 30% while reducing rack space by 66%.

Instead of an all-in-one system like you’d find on a liquid-cooled gaming GPU, the A100 and H100 use a direct liquid connection to the processing unit itself. Everything but the feed lines is hidden in the GPU enclosure, which itself only takes up one PCIe slot (as opposed to two for the air-cooled versions).

Data centers use power usage effectiveness (PUE) to gauge energy efficiency — essentially the ratio of the total power a data center draws to the power actually used by its computing equipment. With an air-cooled data center, Equinix measured a PUE of about 1.6. Liquid cooling with Nvidia's new GPUs brought that down to 1.15, which is remarkably close to the ideal 1.0 PUE that data centers aim for.
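The PUE ratio is simple to compute. As a quick illustration of the figures above — the wattages below are hypothetical, chosen only to reproduce the 1.6 and 1.15 ratios from the article:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by
    IT equipment power. A PUE of 1.0 would mean every watt drawn
    goes to computing; cooling and other overhead push it higher."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: an air-cooled facility drawing 160 kW total
# to run 100 kW of computing equipment, versus a liquid-cooled one
# drawing 115 kW for the same compute load.
air_cooled = pue(160.0, 100.0)      # 1.6
liquid_cooled = pue(115.0, 100.0)   # 1.15
```

Lowering PUE from 1.6 to 1.15 means overhead (mostly cooling) drops from 60% of the compute load to 15% — which is where the bulk of the claimed energy savings comes from.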

Energy usage for Nvidia liquid-cooled data center GPUs.

In addition to better energy efficiency, Nvidia says liquid cooling helps conserve water. The company says data centers evaporate millions of gallons of water each year to keep air-cooled systems operating. Liquid cooling allows that water to recirculate, turning "a waste into an asset," according to Zac Smith, head of edge infrastructure at Equinix.

Although these cards won't show up in the massive data centers run by Google, Microsoft, and Amazon — which are likely using liquid cooling already — that doesn't mean they won't have an impact. Banks, medical institutions, and data center providers like Equinix comprise a large portion of the data centers operating today, and they could all benefit from liquid-cooled GPUs.

Nvidia says this is just the start of a journey to carbon-neutral data centers, as well. In a press release, Nvidia senior product marketing manager Joe Delaere wrote that the company plans “to support liquid cooling in our high-performance data center GPUs and our Nvidia HGX platforms for the foreseeable future.”

Jacob Roach
Senior Staff Writer, Computing