
Could Intel’s upcoming event reveal more about its new discrete GPU?

[Intel graphics card teaser, SIGGRAPH 2018: "We will set our graphics free"]

Back at the Siggraph 2018 show, Intel teased a new dedicated graphics card, and another event is coming soon. DigiTimes is reporting that Intel is hosting a new conference in December, timed right before CES 2019.

We reached out to Intel for clarification, and the company confirmed that the event is an architecture day. Intel did not otherwise confirm discrete graphics for this event, but DigiTimes writes that the conference is all about progress on Intel’s latest technologies. That supposedly includes discussions of Arctic Sound, the code name for Intel’s discrete GPU, as well as Intel’s other business units.

From what was revealed at Siggraph 2018, Intel’s new dedicated graphics card will be coming in 2020 to “set graphics free.” It would be Intel’s first dedicated graphics card in more than 20 years, the last being the i740 GPU in 1998. It would also place Intel alongside Nvidia and AMD, which have dominated the graphics card and gaming market in the years since.

Intel is better known for its integrated graphics, which aren’t known for their power. Recently, the company has taken steps to address that deficiency. It previously tapped former AMD Radeon graphics chip designer Raja Koduri as its new chief architect and senior vice president of the new Core and Visual Computing Group. Though still unverified, DigiTimes also reports that Intel has opened a new GPU R&D center in Canada and has plans for another in India. If true, one can assume that is where these new dedicated graphics cards are likely under development. Those are all just rumors at this point, but certainly interesting ones.

The sizzle reel and fancy silhouette of the upcoming dedicated graphics card shown at Siggraph 2018 suggest that Intel has put a lot of effort into designing and marketing its products. Still, given that this upcoming conference is being billed as an architecture day, its scope could be limited, and technical details on specific chips may not be revealed. It could also simply be a way to create media buzz or to share how Intel’s microarchitecture is progressing as a whole.

Arif Bacchus
Arif Bacchus is a native New Yorker and a fan of all things technology. Arif works as a freelance writer at Digital Trends…