
Intel and AMD are mounting an attack against Nvidia. But will it work?

Nvidia has dominated the world of PC graphics for decades, and it hasn’t slowed the pace either. Year after year, it launches graphics cards for both desktops and laptops that outpace the competition and scoop up the lion’s share of GPU sales.

But in 2022, amid a frustrating GPU shortage and a shake-up in the world of CPUs, there’s never been a better opportunity for a healthy dose of competition. And at CES 2022, both Intel and AMD seemed ready to pounce.

Intel and AMD arm up


Both Intel and AMD came to CES this year with a sense of determination around graphics.

First, AMD.


It’s a company that already has a substantial investment in PC graphics. Its Radeon platform has been the only legitimate challenger to Nvidia for years, but it’s only in the past couple of years that it has become a worthy rival. The RDNA architecture introduced in 2019 brought AMD’s graphics back into contention, and the Radeon RX 6000 GPUs followed that up with the kind of performance that could finally compete with the best of Nvidia.

And at CES 2022, amid a press conference littered with announcements, AMD extended its graphics lineup with eight new mobile GPUs. Yes, you read that right. Eight new GPUs! They stack up neatly against Nvidia’s offerings, even introducing a new line for thin-and-light gaming laptops, which was previously a hole in AMD’s lineup. Even its vastly improved integrated graphics, courtesy of the new Ryzen 6000 chips, are taking shots at Nvidia’s entry-level laptop cards, such as the MX550.

Intel, meanwhile, has been building up to the launch of its Arc Graphics for more than a year now. While the company didn’t formally announce its first line of graphics cards, Intel did say that Arc Graphics would appear in over 50 PC designs, including the Alienware x17 and an unspecified Lenovo Yoga 2-in-1.

Intel announces Intel Arc dGPU for Alienware x17 laptop.

That sounds like a great start for Intel Arc. Clearly, the company is leveraging its strength in processors to get its GPUs into laptop makers’ designs. That’s something AMD has struggled to do, with only a handful of systems that use all AMD parts.

With Intel and AMD’s efforts combined, laptop and PC manufacturers have plenty of options to get away from Nvidia’s near-monopoly on PC graphics. But there’s an uphill battle ahead.

The problem

More than 160 new GeForce laptops expected this year.

Nvidia has something more than just powerful graphics cards. It has a platform. On the application side of things, there’s the immense strength of the CUDA software ecosystem. No matter how you slice it, Nvidia’s graphics cards have a significant upper hand in creative application performance over AMD. Right now, if you’re working in an application like Blender, Nvidia’s GPUs will give you better performance.

On the gaming side, things are more balanced. Except when it comes to Nvidia RTX. Between DLSS and ray tracing, Nvidia has its suite of must-have software features that often gives it an edge over both Intel and AMD. Nvidia has been building support behind these offerings for years, and at this point, that multiyear advantage over Intel and AMD is going to be very hard to overcome.


DLSS started out very shaky, but in the past year or so, it’s become a crucial element of modern PC gaming. That’s especially true if you don’t have the absolute top-of-the-line graphics card.

In many cases, for example, AMD’s graphics cards are every bit as powerful as Nvidia’s, but they lack the ray tracing performance and upscaling features that Nvidia’s offer. When the two are priced so closely, especially in this market, the argument for buying AMD over Nvidia is hard to make in terms of value.

Intel has a better chance at challenging Nvidia in this department with its background in software and tight engineering partnerships with key applications. XeSS is promising as an alternative to DLSS, too — but adoption of a new technology isn’t going to happen overnight.

Luke Larsen
Luke Larsen is the Senior editor of computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.