
Intel may already be conceding its fight against Nvidia

Two Intel Arc graphics cards on a pink background. Jacob Roach / Digital Trends

Nvidia continues to own the top-of-the-line GPU space, and the competition just hasn’t been able to, well, compete. The announcement of the impressive-sounding RTX 40 Super cards only cements that lead further.


As a result, AMD is said to be giving up on the high-end graphics card market with its next-gen GPUs. And now, a new rumor suggests that Intel might be doing the same with Arc Battlemage, its much-anticipated next-generation graphics cards, which are supposed to launch later this year. While this is bad news, it’s not surprising at all.

Arc Battlemage leaks

First, let’s talk about what’s new. Intel kept quiet about Arc Battlemage during CES 2024, but Intel fellow Tom Petersen later revealed in an interview that the project is alive and well. The cards might even come out this year, although given Intel’s track record of missing GPU deadlines, 2025 seems like a safer bet. But what kind of performance can we expect out of these new graphics cards? This is where YouTuber RedGamingTech weighs in.

RedGamingTech posted a big update on Intel Arc Battlemage specs in his latest video, and it doesn’t sound particularly good for high-end gaming enthusiasts. According to the YouTuber, the specifications of the flagship chip may be significantly different from his previous predictions. Worse still, it might never even be released.

Initially, RedGamingTech suggested that the top Battlemage GPU would feature 56 Xe cores and a frequency of up to 3GHz. That’s still the case, but rumor has it that there’s been a big shake-up in the memory bus and cache configuration. Instead of the previously rumored 256-bit bus and 116MB of L2 cache, the YouTuber now says we can expect a 192-bit bus, 8MB of L2 cache, and a whopping 512MB of Adamantine cache.

Adamantine cache is still largely a mystery at this stage, although an Intel patent that PCGamer reported on tells us more about it. It’s essentially a Level 4 (L4) cache, comparable to AMD’s Infinity Cache: a large pool of cache that cuts down on trips to VRAM, easing the pressure on a narrower memory bus.

That sounds pretty good, right? With 56 Xe cores, the card would be a huge upgrade over the Arc A770 and its 32 cores. However, despite that massive L4 cache, these specs already hint at a less-than-high-end flagship for Intel. With a 192-bit bus, Intel would probably stop at around 12GB of VRAM, unless it gets adventurous with a doubled-up clamshell layout, as AMD did with the RX 7600 XT and Nvidia with the 16GB RTX 4060 Ti. (Let’s hope that it won’t.)
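
If you’re wondering where the 12GB figure comes from, it’s straightforward bus-width arithmetic: each GDDR6 chip occupies a 32-bit channel, so a 192-bit bus fits six chips, and at the common 2GB per chip, that works out to 12GB (or double that in a clamshell layout with two chips per channel). Here’s a minimal Python sketch of that math; the 2GB-per-chip GDDR6 assumption is ours, since Intel hasn’t confirmed the memory configuration:

```python
# Rough VRAM arithmetic for a GDDR6 card (illustrative assumptions, not Intel's spec).
def vram_options_gb(bus_width_bits: int, gb_per_chip: int = 2) -> tuple[int, int]:
    """Return (standard, clamshell) capacities in GB for a given memory bus."""
    chips = bus_width_bits // 32        # one GDDR6 chip per 32-bit channel
    standard = chips * gb_per_chip      # one chip per channel
    return standard, standard * 2       # clamshell puts two chips on each channel

print(vram_options_gb(192))  # (12, 24) -- the rumored Battlemage flagship
print(vram_options_gb(256))  # (16, 32) -- the earlier 256-bit rumor
print(vram_options_gb(128))  # (8, 16)  -- how the RTX 4060 Ti gets a 16GB variant
```

The same arithmetic hints at why a narrower bus appeals to Intel: fewer memory channels mean a smaller die and fewer chips on the board, with the big Adamantine cache presumably there to claw back some of the lost bandwidth.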

Whether or not this GPU is even real, RedGamingTech suspects that Intel may choose not to release it at all due to unsatisfactory profit margins. Instead, Intel might focus on a GPU with 40 Xe cores, a 192-bit memory bus, 18MB of L2 cache, and no Adamantine cache at all.

Is it time for Nvidia to celebrate?

Nvidia GeForce RTX 4090 GPU. Jacob Roach / Digital Trends

AMD is reportedly bowing out of the high-end GPU race in this next generation. Now, Intel is said to be doing the same. Where does that leave Nvidia? Right at the very top, with complete control of the enthusiast GPU market and nothing to worry about in that regard.

It’s a dream for Nvidia, but it’s not so great for us, the end users. Giving Nvidia the ability to drive up prices as much as it wishes brought us the RTX 40-series, where the prices and the performance often just don’t add up. With zero competition at the high end, the RTX 5090 might turn out to be a terrifying monstrosity with an eye-watering price tag. After all, why wouldn’t it be? It’s not like AMD or Intel is doing anything to keep Nvidia in check.

On the other hand, even if Intel chooses to focus on the mainstream segment, things won’t change too much. AMD is Nvidia’s main competitor, and even now, with a couple of horses in this race, it still can’t match Nvidia’s flagship RTX 4090, or even the surprisingly impressive new RTX 40 Super cards. Intel, now one generation behind (and soon to be two), wouldn’t be able to beat Nvidia’s future flagship either.

For the mainstream market, meaning the vast majority of GPUs sold, it’s good news if AMD and Intel stick around and give Nvidia some heat. Those prices might end up less inflated as a result. Meanwhile, high-end gaming will be pricier than ever, but unfortunately, Intel likely couldn’t have stopped Nvidia there anyway, with or without the card it might never release.

Monica J. White