
Why is Intel into GPUs now? It was about to get stomped


Computer geeks woke this morning to shocking news.

Raja Koduri, former head of AMD’s Radeon Technologies Group, has joined Intel as Chief Architect. He will lead his new employer’s efforts to build discrete graphics hardware for “a broad range of computing segments.”

This will feel like a stab in the heart for AMD’s fans. Raja was loved for his confident yet easy-going demeanor, and he’d become the unofficial face of the company’s underdog image. His resignation was bad enough — to have him join Intel a day later was the worst possible outcome.

It’d be easy to overstate such drama, but in this case the fuss is warranted. Intel has never been competitive in graphics hardware, and this hire is the company’s strongest attempt yet.


Why now?

The timing of this move may seem strange, if only because it’s been ages since Intel was serious about graphics. Its last major push began around the debut of the modern Core processor line. For a time, Intel HD graphics seemed to make decent progress — at least enough to be usable. That didn’t last long. Today, Intel’s integrated graphics are well behind entry-level hardware from AMD and Nvidia.


Intel’s “latest” graphics, in the eighth-generation processors, underscore this. Though called Intel UHD 620 — which sounds like an upgrade over the previous Intel HD 620 — the hardware makes no strides over its predecessor. In 3DMark Fire Strike, a common benchmark, the UHD 620 is lucky to exceed a score of 900, while the Nvidia GTX 1050 scores around 5,400. That’s a big gap, and if you’re saddled with a laptop that lacks graphics from AMD or Nvidia, you’ve already noticed it.
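To put that gap in perspective, here’s a quick back-of-the-envelope calculation using the approximate Fire Strike scores cited above (a minimal illustration; exact figures vary by system, drivers, and test conditions):

```python
# Back-of-the-envelope comparison of the Fire Strike scores cited above.
# These are the article's approximate figures, not lab-controlled results.
uhd_620_score = 900    # Intel UHD 620 (integrated graphics)
gtx_1050_score = 5400  # Nvidia GTX 1050 (entry-level discrete GPU)

ratio = gtx_1050_score / uhd_620_score
print(f"The GTX 1050 outscores the UHD 620 by roughly {ratio:.0f}x")  # -> roughly 6x
```

In other words, even Nvidia’s entry-level discrete card delivers roughly six times the benchmark performance of Intel’s best mainstream integrated graphics.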

This isn’t just about laptops, though. Intel’s press release suggests the company wants to build graphics hardware for a variety of systems, and there are plenty of reasons why it might want to. Gamers have proven a loyal group with deep pockets, buying expensive graphics cards even as PC sales have slipped year after year. And then there’s the enterprise world, where Nvidia is currently cleaning up with high-end solutions that power data centers, self-driving cars, and research projects.

The real threat is Apple and Qualcomm

I’m sure Intel would love to see gamers, data centers, and universities buying its own high-end graphics at $500 a pop. That, however, is only part of the goal. Intel’s decision is influenced more by an impending war over the heart and soul of computers.

Since the mid-90s, virtually all home PCs have been sold with Windows running on Intel processors. The term “Wintel” has fallen out of fashion, but that hasn’t changed the reality. Intel Inside has been synonymous with the PC for 25 years, particularly among laptops and 2-in-1s. Most people don’t even think about it — and end up with a “Wintel” machine by default.

That dominance is no longer guaranteed. Apple and Qualcomm have made major strides in computing performance over the past decade. Though Intel still has the technical edge, everyday PC use isn’t demanding enough to make it obvious. What is obvious, though, is how badly these competitors thrash Intel’s graphics. While iPads and smartphones can display rich 3D graphics, Intel’s hardware struggles to run new games even at their lowest settings.

2018 will be the year this threat becomes real. Qualcomm and Microsoft have partnered to produce Windows 10 laptops that are compatible with all current Windows software, and the first products look set to appear at CES 2018, if not earlier. Shoppers buying a laptop next year might leave the store with an inexpensive, LTE-capable 2-in-1 powered by Qualcomm. It’s not hard to imagine how a cheap, thin, long-lasting, always-connected computer could damage Intel.

Intel is aware of this. In fact, it has already threatened to sue Qualcomm and Microsoft over the issue, claiming the x86 emulation used to accomplish it infringes on Intel’s patents.

Apple, meanwhile, has the iPad Pro. While not as direct an alternative to Intel-powered laptops, Apple clearly wants people to buy an iPad instead of a traditional PC, and its hardware is now powerful enough to make that a convincing choice. The iPad Pro is still missing a few pieces of the puzzle — the keyboard isn’t great, for example — but a few generations might iron out those issues. Think about it: If you could buy an iPad with a good keyboard for $600 to $800, why wouldn’t you? It would be more versatile and portable than any Intel-powered 2-in-1.

And — of course — it will deliver eye-candy Intel HD can’t hope to match.

Good news, or bad news?

That’s why Intel needs better graphics hardware. Without excellent graphics, Intel can’t present a package as complete as its competitors can. Companies like Dell and Samsung want a single chip that can do everything, and pairing Intel hardware with a separate graphics chip isn’t ideal. The Intel-AMD partnership, announced just a few days ago, is just a band-aid over a wound that needs deeper attention.

Raja Koduri can mend that wound, but it will take time. You shouldn’t expect to see Intel-branded graphics cards on store shelves next year. I’d guess we won’t see real progress until the latter half of 2019, and that could easily slip into 2020, or even later.


The timeline is important, because Intel’s rivals move quickly. As mentioned, you’ll see Qualcomm-powered laptops in stores next year. By late 2019, Apple will have blitzed through two iPad hardware cycles, and Qualcomm may be ready to introduce its third generation of laptop hardware.

It’s impossible to say whether Intel’s new effort will be competitive, but one thing is certain: a new war over the PC’s future has begun. All the big names in tech will be involved, and the outcome will be visible in the next computer you buy.

Matthew S. Smith
Matthew S. Smith is the former Lead Editor, Reviews at Digital Trends.