Intel’s Arc graphics cards have quietly become excellent

Intel’s Arc A770 and A750 were decent at launch, but over the past few months, they’ve started to look like some of the best graphics cards you can buy if you’re on a budget. Disappointing generational improvements from AMD and Nvidia, combined with high prices, have made it hard to find a decent GPU around $200 to $300 — and Intel’s GPUs have silently filled that gap.

They don’t deliver flagship performance, and in some cases, they’re just straight-up worse than the competition at the same price. But Intel has clearly been improving the Arc A770 and A750, and although small driver improvements don’t always make a splash, they’re starting to add up.

Silently improving

[Image: The backs of the Arc A770 and Arc A750 graphics cards. Jacob Roach / Digital Trends]

If you keep up with the world of graphics cards, you’ve probably heard about Intel doubling performance with a driver update earlier this year. That was only for DirectX 9 games, however, which were all but broken at launch. Intel doubled their performance, sure, but that only brought the A770 and A750 up to par.

Since then, Intel has focused more attention on recent DirectX 11 and DirectX 12 games, and it’s clear some titles are receiving specific optimization. When I retested Cyberpunk 2077, for example, the Arc A750 went from 56 frames per second (fps) at 1080p to 76 fps — nearly a 36% increase. Similarly, the Arc A770 has gone from 59 fps to 83 fps since launch.
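As a quick sanity check on those gains, the uplift percentages can be computed directly from the before-and-after averages quoted above (an illustrative sketch using only figures from this article):

```python
# Percentage uplift between a launch-driver average and a retested average.
def uplift(before_fps: float, after_fps: float) -> float:
    """Return the percentage improvement from before_fps to after_fps."""
    return (after_fps / before_fps - 1) * 100

# Cyberpunk 2077 at 1080p, figures quoted in this article
print(f"Arc A750: {uplift(56, 76):.1f}%")  # ~35.7%, i.e. "nearly 36%"
print(f"Arc A770: {uplift(59, 83):.1f}%")  # ~40.7%
```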

Previously, Cyberpunk 2077 was one of the worst showcases of the A750 and A770, falling short of even Nvidia’s RTX 3060. Now, it’s one of the best, with the A750 outperforming even the RTX 3060 Ti at 1080p and 1440p.

It’s not just Cyberpunk 2077. The A750 went from 86 fps in Horizon Zero Dawn at 1080p to 95 fps, while the A770 climbed from 98 fps to 106 fps. Again, the cards are now competing with GPUs like the RTX 3060 Ti and RX 6600 XT after taking a clear back seat.

[Image: The Arc A770 graphics card running in a PC. Jacob Roach / Digital Trends]

Those are impressive improvements, but they’re targeted. In a Vulkan-based game like Red Dead Redemption 2, Intel’s slew of driver updates didn’t move the needle at all. And in Assassin’s Creed Valhalla, there are some minor improvements, but they only account for a few frames.

What’s becoming clear, however, is that Intel’s claim that the Arc A750 and A770 have more fuel in the tank is holding up. Targeted optimization through drivers for specific games has brought some stark improvements. They aren’t universal, but if Intel keeps up its driver pace, the A750 and A770 could be a force to be reckoned with.

Competitive on price

[Image: Nvidia RTX 3060 Ti Founders Edition on a pink background. Jacob Roach / Digital Trends]

All of this comes in the context of price, though. For as impressive as Intel’s driver improvements are, the A750 and A770 are competing with last-gen GPUs from AMD and Nvidia in performance. Even the RX 7600 can blow past the A770 in most games, and it’s nearly $100 cheaper. Intel is starting to become competitive, though.

Take the Arc A750 at $250. It’s not a great option now that the RX 7600 is here at $270 and the last-gen RX 6600 XT is selling for around the same price. However, it’s been marked down to only $200 a handful of times recently, and at that price, it’s a steal.

That becomes apparent when you look at ray tracing. In Hogwarts Legacy, the RX 6700 XT (around $350) squarely beats the A770 and A750 at 1080p (by about 18% and 37%, respectively). Flip on ray tracing, though, and suddenly even the A750 is matching AMD while the A770 claims a lead of 30%. The A770 is even competitive with the RTX 4060 Ti here.

[Image: Front of the AMD RX 7600. Used with permission by copyright holder]

Similarly, AMD’s crop of GPUs at this price point can’t even maintain 30 fps in Cyberpunk 2077’s ray tracing mode at 1080p without the assistance of upscaling (you’ll need to jump up to a $500 RX 6800 to hit that mark), while the A750 and A770 are comfortably above 30 fps. Ray tracing performance that’s competitive with Nvidia was a strength of Intel’s Arc GPUs at launch, and even AMD’s new RX 7600 doesn’t change that.

There are other considerations here as well, specifically VRAM. The A770 is outfitted with 16GB of VRAM, while options like the RTX 3060 Ti make do with 8GB. In Horizon Zero Dawn, the A770 is a few frames slower than the RTX 3060 Ti. In more recent VRAM-limited games like Resident Evil 4 and The Last of Us Part 1, however, the A770 beats the RTX 3060 Ti by close to 10%.

[Image: Intel Arc A770 GPU installed in a test bench. Used with permission by copyright holder]

Intel still has a long road ahead, but it has made a ton of progress very quickly. At list price, the A750 and A770 are competitive, even if they aren’t the best option for everyone. On sale, it’s hard to justify anything from AMD or Nvidia at the same price. If you can find the A750 at $200, it’s suddenly competing with a GPU like the RTX 3050 on price while offering around a 30% boost in performance.
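The value math on that sale price is simple to lay out (a rough sketch; the prices and the roughly 30% performance figure are the ones quoted in this article, not fresh measurements, and treating the RTX 3050 as selling near the same $200 mark follows the article's framing rather than a surveyed street price):

```python
# Discount from the A750's $250 list price to its $200 sale price.
list_price, sale_price = 250, 200
discount_pct = (1 - sale_price / list_price) * 100
print(f"Sale discount: {discount_pct:.0f}%")  # 20%

# Normalized performance per dollar (RTX 3050 = 1.0 performance).
rtx3050_value = 1.00 / 200  # assumed similar ~$200 street price
a750_value = 1.30 / 200     # ~30% faster per the article, at $200 on sale
print(f"A750 value advantage: {a750_value / rtx3050_value:.2f}x")  # 1.30x
```

At equal prices, the performance-per-dollar advantage collapses to the raw performance advantage, which is why the sale price matters so much here.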

My biggest hope is that Intel sticks with it. Over the past few months, the A770 and A750 have proven that Team Blue has a fighting chance in the world of GPUs. They’re a solid swing out of the gate, but the future of Arc really hinges on what the upcoming Battlemage, Celestial, and Druid generations can offer.

Jacob Roach
Senior Staff Writer, Computing