More details are emerging through leaks about Intel’s premium Xe gaming graphics, and it looks like Nvidia’s GeForce and AMD’s Radeon are about to get some serious competition. In a detailed leak from YouTube channel Moore’s Law is Dead, we now have images of the GPU’s engineering board, as well as some specifications behind Intel’s heavyweight gaming GPU.
High-end gamers will be pleased to know that if Moore’s Law is Dead’s information pans out, Intel’s Gen 12-based Xe HPG DG2 discrete graphics card will support ray-tracing, making it competitive against Nvidia’s GeForce RTX 2000 and 3000 series as well as AMD’s current Radeon RX 6000 series graphics.
The card is also expected to support Intel’s take on DLSS — something that AMD did not provide with the launch of Radeon RX 6000 — so you should get some pretty great performance with graphics upscaling. Nvidia calls its technique Deep Learning Super Sampling, and Intel’s version could be called XeSS, or Xe Super Sampling.
The Xe HPG DG2 discrete graphics will be a family of graphics, so potentially, Intel can build out a family of gaming-specific GPUs from entry-level to premium, similar to what its rivals have done. The premium graphics will be based on a 6nm node, likely manufactured by TSMC, that will be available in both desktop and laptop designs. For reference, Nvidia’s GeForce RTX 3080 was built on Samsung’s 8N design.
The top-of-the-line model is expected to ship with 512 execution units, or EUs, and 4,096 cores. It will have a 256-bit bus and up to 16GB of GDDR6 RAM, though an 8GB GDDR6 variant could also be possible. The card is expected to have a clock speed of 2.2GHz, though it’s unclear if this will be the base speed or the boost speed. The TDP is now listed at approximately 275W, up from the initial expectation of 225-250W from earlier leaks. Intel could potentially push that to 300W, according to Wccftech, if it wants to pursue performance and faster clock speeds.
And according to the leak, this means that the DG2 GPU will be able to have performance similar to rival Nvidia’s flagship GeForce RTX 3080 and AMD’s Radeon RX 6800 XT. Early engineering benchmarks using the 3DMark TimeSpy utility show that performance of this card varies a bit, and results were between that of Nvidia’s GeForce RTX 2080 and the ultra-premium GeForce RTX 3090.
In addition to the premium model with 512 EUs, Intel is also expected to launch configurations with 384 EUs and 3,072 shading units, 256 EUs and 2,048 shading units, 192 EUs and 1,536 shading units, 128 EUs and 1,024 shading units, and 96 EUs and 768 shading units. The lower-end models will have 4GB of GDDR6 RAM.
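The configurations above follow a simple pattern: each Gen 12 Xe execution unit contains eight shading units, so the shader count for every tier is just EUs times eight. A quick sketch of that arithmetic (the EU counts are the leaked figures; the eight-shaders-per-EU ratio is Intel’s standard Gen 12 layout):

```python
# Sanity-check the leaked Xe HPG DG2 tiers: each Gen 12 EU
# contains 8 shading units, so shaders = EUs * 8 per tier.
EU_COUNTS = [512, 384, 256, 192, 128, 96]

for eus in EU_COUNTS:
    shaders = eus * 8
    print(f"{eus:>3} EUs -> {shaders:,} shading units")
# 512 EUs -> 4,096 shading units
# ...
#  96 EUs ->   768 shading units
```

This is also why the lowest rumored tier lands at 768 shading units: 96 × 8 = 768.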
Moore’s Law is Dead predicts that Intel will deliver good driver support for its GPUs, and the company can either push back the launch to ensure full driver support or adopt AMD’s strategy and launch the card first with some missing software features that will be filled in later through software updates. At this point, the card is still in its early stages, and even a late 2021 launch seems optimistic at best.
Right now, the engineering sample seems very primitive with a green PCB and plastic shroud — features that are expected to change on the final release. The card is shown with a dual-slot design and cooling is done via dual fans and an aluminum fin heatsink. Intel could even use a polarizing design in the DG2’s final design, according to the channel, and use a white color for the card.
The mid-range model could cost around $200-$300, making it very competitive, but more premium configurations are expected to cost more. It’s unclear if the card will still be affected by the global semiconductor shortage that’s plaguing Intel’s GPU rivals AMD and Nvidia when it launches.
In addition to the DG2, Intel is also hard at work on a DG3 version that is expected in 2023.