
Intel’s discrete GPU could offer photorealistic graphics, teaser video hints


The graphics card market might currently be dominated by AMD and Nvidia, but as the rumored 2020 release date for its own modern discrete GPU approaches, Intel is continuing to hype its upcoming offerings to gamers. In its latest PR move, the company released a Twitter video on January 29 that, among other things, teases photorealistic graphics as one of its future goals.

Don’t just take any job. Join a movement. https://t.co/JH0TqASGtE #JobsatIntel #IntelCareer pic.twitter.com/y8hBPdRF88

— Intel Graphics (@IntelGraphics) January 29, 2019

Though the video is more of a recruitment tool and doesn’t specifically mention any product, or the Arctic Sound GPU by name, it provides an internal peek at what is being worked on at Intel. It also features Raja Koduri, head of Intel’s Core and Visual Computing Group, who explains that Intel is starting from zero with its discrete graphics project, freeing itself from its existing integrated graphics solutions. Toward the end of the video, however, Koduri teases photorealism as a goal for Intel’s graphics and explains how that fresh start could take a GPU to an entirely new level for consumers.

“I want a future where we can have those photorealistic immersive worlds. I want a future where I can have a conversation with somebody three thousand miles away as if they are standing right in front of me. I want to have games with virtual worlds that are as large as this entire universe,” said Koduri.

It remains to be seen how, and to what extent, Intel can achieve photorealistic and truly immersive worlds, and building a new GPU to support that goal takes considerable internal effort. Koduri hints at this in the video, saying, “This is what the best engineers want to do, and we have access to all of the right Lego blocks in this company.”

New Intel CEO Bob Swan also makes an appearance in the video. He touches on Intel’s history of leading the computing market and where the company could take the sector in the future. “We don’t just look at a simple product, we look at the evolution of computing architectures across multiple fronts of which graphics will play a key component of that as we continue to play a leadership role in driving high-performance computing into the future,” said Swan.
