
Intel will leverage its chip-making expertise for quantum research

(Image: 8th Gen Intel Core launch)
Intel has detailed plans to forge its own path toward a chip that can facilitate quantum computing. The company will apparently eschew the strategies implemented by other organizations working in this space, instead attempting to adapt the silicon transistors commonly used in traditional computers to the task.

This represents a significant departure from other groups looking to advance the current state of quantum computing. At present, the superconducting qubit approach seems to be the frontrunner in terms of popularity, while an implementation based around trapped ions has also demonstrated promising results.

Quantum computing diverges from traditional computing because qubits aren’t confined to the “on” and “off” states that restrict a standard bit; they can occupy a superposition of both at once. Intel’s silicon qubits would represent data via electrons trapped inside modified versions of the transistors used in the company’s commercial chips, according to a report from Technology Review.
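For readers unfamiliar with the distinction, the toy state-vector sketch below (plain textbook qubit math, not a model of Intel’s spin-qubit hardware) shows how a single qubit can carry weight on both 0 and 1 at once, whereas a classical bit holds exactly one of those values:

```python
import numpy as np

# A classical bit is either 0 or 1. A single qubit's state is a normalized
# two-component complex vector alpha*|0> + beta*|1>, where |alpha|^2 and
# |beta|^2 are the probabilities of reading out 0 or 1.

ket0 = np.array([1, 0], dtype=complex)  # |0>
ket1 = np.array([0, 1], dtype=complex)  # |1>

# An equal superposition of |0> and |1>, the kind of state a Hadamard gate produces.
psi = (ket0 + ket1) / np.sqrt(2)

probabilities = np.abs(psi) ** 2
print(probabilities.round(2))  # [0.5 0.5] -- equal odds of measuring 0 or 1
```

Whether the physical carrier is a superconducting circuit, a trapped ion, or an electron confined in a modified transistor, the same mathematics applies; the engineering question is which hardware holds that state reliably and scales to many qubits.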

Intel hopes that its silicon qubits will be more reliable than the superconducting qubits being used elsewhere. Moreover, because the company is building its silicon qubits on standard chip wafers, the approach should help accelerate the research and development process.

Intel’s advantage over its competitors — assuming that its silicon qubits can stand up to the competition — is that the company is already very familiar with the process of manufacturing chips on an industrial scale. “The hope is that if we make the best transistors, then with a few material and design changes we can make the best qubits,” said its director of quantum hardware, Jim Clarke.

We’ll see how Intel’s project progresses compared to the many other groups currently working on quantum computing hardware. At present, the company is pursuing its silicon-based implementation, but it’s keeping its options open — research into superconducting qubits is also being conducted in-house.
