
Here’s why Intel’s A380 GPU could really be a hidden gem

The Intel Arc A380, the only Arc Alchemist graphics card that’s currently available, was just tested in various games after being overclocked.

The performance gains from the overclock suggest that the GPU has the potential to be much better than some previous benchmarks implied.

Testing the Intel A380 graphics card in games and professional software.

Intel’s Arc A380 has already appeared in a number of benchmarks and tests, including Intel’s own, which redeemed it slightly after a round of bad news. This time around, the GPU was put through its paces by Pro Hi-Tech, a YouTuber who specializes in overclocking, and that’s exactly what he did with the Arc A380: he boosted the card to unlock some of the hidden power it seems to possess. The results suggest the Arc A380 could be a lot better than initially thought.

In order to overclock the GPU, the YouTuber had to take a different approach than usual, because well-known clock and voltage tools such as MSI Afterburner don’t support Intel Arc just yet. As such, he didn’t alter the GPU’s core clocks; instead, he used Intel’s own graphics utility to tweak the card’s voltage. Pro Hi-Tech adjusted the GPU Performance Boost setting to 55% and the voltage offset to +0.255mV. Before moving on to testing the boosted GPU in games, the YouTuber also enabled Resizable BAR.

These modifications raised the clock speed of the Intel Arc A380 by up to an additional 150MHz, a relatively small boost of around 6%. However, power usage went up considerably, from around 35 watts to, at times, more than 55 watts. That’s an increase of up to 57%, but it still leaves headroom: Intel lists the card’s official TDP at 75 watts.
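For anyone who wants to sanity-check those percentages, here is a minimal Python sketch that reproduces the arithmetic from the figures quoted above. The implied pre-overclock clock is only a rough inference from the article’s own 150MHz and 6% numbers, not a published spec.

```python
# Rough sanity check of the percentages quoted above, using only the
# figures mentioned in the article. The implied baseline clock is an
# inference from those figures, not a number Intel or Pro Hi-Tech quotes.

clock_gain_mhz = 150          # extra clock speed after the tweak
clock_gain_pct = 6            # described as roughly a 6% boost
implied_base_mhz = clock_gain_mhz / (clock_gain_pct / 100)
print(f"Implied pre-overclock clock: ~{implied_base_mhz:.0f} MHz")

power_before_w = 35           # approximate stock power draw
power_after_w = 55            # peak draw observed after the tweak
power_increase_pct = (power_after_w / power_before_w - 1) * 100
print(f"Power increase: ~{power_increase_pct:.0f}%")   # ~57%

tdp_w = 75                    # Intel's official TDP for the card
print(f"Headroom left under the TDP: ~{tdp_w - power_after_w} W")
```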

This brings us to the results of the testing. To put the card’s performance in context, the YouTuber compared the results to those of a regular Arc A380 with no overclock, as well as to Nvidia’s GeForce GTX 1650, a card often named as a direct competitor to this entry-level GPU.

Intel Arc A380 benchmarks. Image: Pro Hi-Tech

Pro Hi-Tech benchmarked the Intel Arc A380 in Cyberpunk 2077, God of War, Doom Eternal, Rainbow Six Siege, Watch Dogs Legion, and World of Tanks. Every game showed a performance increase, which is not surprising in itself, but the gains are large enough to bring the Arc GPU roughly on par with the GTX 1650.

In Cyberpunk 2077, the boosted Arc A380 actually managed to beat Nvidia’s card, reaching 51 frames per second (fps) versus the GTX 1650’s 42 fps. Other games showed a massive jump, with Doom Eternal going from 64 fps to 102 fps. On average, the stock Intel Arc A380 scored 55.1 fps across the six titles, the overclocked version hit 75.6 fps, and the GTX 1650 won by a negligible margin at 75.9 fps. The results were first spotted by Tom’s Hardware.
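Here is a similar back-of-the-envelope sketch, again in Python, showing how the headline percentages follow from the averages and per-game numbers quoted above; no benchmark data beyond what the article lists is assumed.

```python
# Quick arithmetic on the averages quoted above (fps values come
# straight from the article; no other per-game data is assumed).

stock_avg = 55.1        # stock Arc A380, average across the six titles
oc_avg = 75.6           # overclocked Arc A380
gtx1650_avg = 75.9      # Nvidia GeForce GTX 1650

uplift_pct = (oc_avg / stock_avg - 1) * 100
gap_to_1650_pct = (gtx1650_avg / oc_avg - 1) * 100
print(f"Overclock uplift over stock: ~{uplift_pct:.0f}%")       # ~37%
print(f"GTX 1650 lead over the OC A380: ~{gap_to_1650_pct:.1f}%")  # ~0.4%

# Individual titles mentioned in the article
doom_gain_pct = (102 / 64 - 1) * 100     # Doom Eternal: 64 -> 102 fps
cp2077_lead_pct = (51 / 42 - 1) * 100    # Cyberpunk 2077: 51 vs 42 fps
print(f"Doom Eternal gain from the overclock: ~{doom_gain_pct:.0f}%")      # ~59%
print(f"Cyberpunk 2077 lead over the GTX 1650: ~{cp2077_lead_pct:.0f}%")   # ~21%
```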

These benchmark results show that there might be more to Intel Arc than meets the eye. However, it’s now up to Intel to bring out that potential and improve the GPU’s performance without requiring users to overclock it themselves. Hopefully, early benchmark data like this will prove useful and help Intel optimize the Arc A380.
