
Intel Arc graphics use AV1 to improve Twitch streams

Intel has just announced that it will support AV1 video coding technology in the new Intel Arc GPUs.

The tech will offer hardware-accelerated encoding that may have a huge impact on video streaming quality, making it potentially attractive to streamers and viewers alike.

Intel's Arc AV1 demo featured two Elden Ring streams for comparison purposes. (Image: Intel)

AV1 stands for AOMedia Video 1, a royalty-free video coding format designed primarily to improve the quality and efficiency of video streamed over the internet. Today, Intel announced that it will adopt the format on its Arc GPUs, potentially giving a huge boost in video quality to streamed content.

When the Intel Arc Alchemist discrete graphics cards launch, AV1 will be Intel's video encoding standard, and it will shape how content looks when streamed live. Because Intel will be the first GPU maker to offer hardware AV1 encoding, the feature could make its cards far more appealing to streamers than they would have been otherwise. Of course, that depends on whether the technology performs as well in practice as it does in Intel's preview.

Intel promises up to 8K in both AV1 decoding and encoding: decoding maxes out at 8K and 60 frames per second (fps) with 12-bit HDR, while encoding goes up to 8K with 10-bit HDR. Intel calls this the industry's first full AV1 hardware acceleration and claims it will prove up to 50 times faster than software encoding.


Intel showed off a video of two separate streams of Elden Ring to demonstrate the power of AV1. Game footage was captured via XSplit Gamecaster in 1080p at 5Mbps. The first stream used the H.264 Advanced Video Coding (AVC) standard, while the second relied on Intel's AV1.
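For a rough sense of how such a bitrate-matched comparison can be reproduced, here is a minimal sketch that drives FFmpeg's software encoders from Python. This is an illustration under stated assumptions, not Intel's demo pipeline: the input file name is hypothetical, and libx264 and libaom-av1 run on the CPU, whereas Arc encodes AV1 in dedicated hardware (exposed as av1_qsv in recent FFmpeg builds).

```python
import subprocess

SOURCE = "gameplay_1080p.mp4"  # hypothetical capture; Intel's demo used XSplit Gamecaster
BITRATE = "5M"                 # matches the 5 Mbps used in the demo

# Baseline: software H.264 (AVC) encode at a fixed 5 Mbps average bitrate.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264", "-b:v", BITRATE, "-maxrate", BITRATE, "-bufsize", "10M",
    "avc_5mbps.mp4",
], check=True)

# Comparison: software AV1 encode (libaom) at the same average bitrate.
# -cpu-used trades speed for quality; -row-mt enables row-based multithreading.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libaom-av1", "-b:v", BITRATE, "-cpu-used", "6", "-row-mt", "1",
    "av1_5mbps.mp4",
], check=True)
```

Played side by side, the AV1 file typically retains visibly more detail at the same bitrate, which is the effect Intel's footage highlights, only without the speed advantage of hardware encoding.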

Although the difference in image quality may seem small at first glance, pausing reveals just how much more detailed the stream is when AV1 is in use. Environmental details such as rocks, grass, and ground clutter all retain their own shape and texture. The stream on the left, while showing nearly the same scene from the game, is nowhere near as detailed and looks blurry in comparison.

The video goes on to show improvements in both the background and the foreground, with crisp graphics in every frame of the AV1-encoded stream. Even individual blades of grass look much more pronounced in AV1, despite both streams consuming the same bandwidth and running at 1080p. The difference is clear, and it suggests the technology has a lot of potential when paired with Intel's discrete GPUs.

Intel Arc graphics cards mark a huge milestone for the company: its entrance into the discrete GPU market. First found in laptops, they will be available in desktop versions later this year.
