
Support for dual GPUs could be making an unexpected comeback

Intel seems to be bringing back something that Nvidia and AMD had long given up on: the ability, and the incentive, to use dual graphics cards in a single system.

Multi-GPU setups were once a big deal, but the latest generation of graphics cards abandoned the idea for a variety of reasons. However, Intel has allegedly confirmed that you’ll be able to use multiple Intel Arc GPUs at once. Will that help Intel capture some of Nvidia’s and AMD’s customer base?

Update: Intel reached out to us with a short clarification, saying: “Intel showed a Blender Cycles rendering demo at SIGGRAPH with Intel Arc graphics. Multi-GPU rendering support for Intel Arc and Intel Arc Pro graphics cards through oneAPI is supported starting in Blender 3.3. Intel Arc graphics does not support multi-GPU for gaming.”
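
For readers curious about what that rendering support looks like in practice, Blender 3.3 exposes Cycles’ oneAPI backend through its Python preferences API. The snippet below is a minimal sketch, assuming Blender 3.3 or newer with Intel’s GPU drivers installed, of how one might enable every detected oneAPI device for a render; it is an illustration, not Intel’s or Blender’s official setup procedure.

```python
# Minimal sketch: enable all oneAPI (Intel Arc) devices for Cycles rendering.
# Assumes Blender 3.3+ with Intel GPU drivers installed; run it from Blender's
# scripting workspace or via `blender --background --python enable_oneapi.py`.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "ONEAPI"  # select the oneAPI compute backend

# Refresh the device list, then enable every oneAPI GPU that was found.
prefs.get_devices()
for device in prefs.devices:
    device.use = (device.type == "ONEAPI")

# Tell the current scene to render on the GPU(s) rather than the CPU.
bpy.context.scene.cycles.device = "GPU"
```

With two Arc cards installed and enabled this way, Cycles should split rendering work across both of them, which matches the multi-GPU rendering demo Intel describes above.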

The original article follows below.

Intel Arc A750M Limited Edition graphics card sits on a desk. (Image: Intel)

It has been a good few years since some of us yearned for a dual-GPU setup built around Nvidia’s latest and greatest. Stacking Titan GPUs was something many gamers longed for, but realistically, most of us couldn’t afford it, and the performance gains weren’t worth it for the average player.

While the technology persists in the high-performance computing (HPC) segment and among professionals, consumers now stick to a single graphics card. Intel seems eager to shake things up in that regard.

According to TweakTown, which cites an Intel representative, the company is readying its oneAPI software to support multiple GPUs. In fact, Intel allegedly planned to show off a dual-Arc system during SIGGRAPH 2022 but was unable to do so. Why? TweakTown claims that Intel couldn’t find a chassis big enough to fit two GPUs in time for the event. That checks out, seeing as Intel seemingly only had a small-form NUC chassis on hand, equipped with a single Arc A770 Limited Edition GPU.

Two Intel Arc GPUs running side by side. (Image: Linus Tech Tips)

Multi-GPU support is an interesting addition to Intel Arc. At this point, it’s hard to deny that it will be tricky for Intel to compete with AMD and Nvidia. Sure, the lineup can trade blows with some Team Green and Team Red GPUs, but with next-generation cards from both companies arriving in the next couple of months, Intel risks falling behind.

Using dual Intel Arc GPUs instead of a single Nvidia or AMD card could prove viable, and if the cards are priced low enough, it might even be a decent option. On the other hand, between the extra power consumption, the need for a roomy case, and the thermal concerns, there were plenty of good reasons why AMD and Nvidia stopped pushing dual-GPU setups. Intel might reveal more about the tech shortly, so perhaps then we will learn about its exact plans.
