
Intel explores ‘spin qubits’ as the next wave in quantum computing

Programming on a silicon quantum chip

Quantum computing holds great promise for advancing fields such as artificial intelligence and climate forecasting. So far, though, the technology is in its infancy, with a great deal of research but few real-life applications. Every major technology company is working to advance quantum computing, and Intel, as one of the leaders, hopes to use “spin qubits” to help usher the technology into the mainstream.


In its most basic form, a quantum bit (qubit) is the quantum analog of the binary bit used in traditional computing. In standard computing, bits are always in one of two states, zero or one. In quantum computing, information is encoded in a quantum property of a physical system, such as the polarization of a photon or the spin of an electron, and a qubit can actually be in multiple states simultaneously. Without digging too much into the details, this phenomenon theoretically allows a quantum computer to perform huge numbers of calculations in parallel and to run much faster than traditional computers at certain tasks.


While most of the industry, including Intel, is working on one specific type of qubit, known as the superconducting qubit, Intel is also investigating an alternative structure known as the “spin qubit.” Superconducting qubits are based on superconducting electronic circuits, as the name implies, while spin qubits work in silicon and, according to Intel, overcome some of the barriers that have been holding quantum computing back.

This alternative approach takes advantage of the spin of single electrons on a silicon device, which is controlled through microwave pulses. When an electron spins up, it represents a binary value of 1; when it spins down, it represents a binary value of 0. Because these electrons can also exist in a “superposition” state, where they essentially act as if they are both up and down at the same time, they allow for parallel processing that can churn through more data than a traditional computer.
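The up/down encoding and superposition described above can be illustrated with a toy simulation. This is purely a sketch for intuition, not Intel’s control software: the two-amplitude state model and the `pulse` and `measure` helpers are illustrative names, and a real spin qubit is manipulated with physical microwave pulses, not function calls.

```python
import math
import random

def pulse(state, theta):
    """Rotate the qubit's state by angle theta (loosely analogous to a microwave pulse)."""
    a, b = state
    return (math.cos(theta / 2) * a - math.sin(theta / 2) * b,
            math.sin(theta / 2) * a + math.cos(theta / 2) * b)

def measure(state, rng=random):
    """Collapse the superposition: return 0 (spin down) or 1 (spin up)."""
    a, _ = state
    return 0 if rng.random() < a * a else 1

# Start in a definite spin-down state, i.e. binary 0.
state = (1.0, 0.0)

# A quarter-turn pulse puts the electron in an equal superposition of 0 and 1.
state = pulse(state, math.pi / 2)
p0, p1 = state[0] ** 2, state[1] ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # each outcome is equally likely
```

Until it is measured, the qubit genuinely carries both amplitudes at once; measurement forces it into a single 0 or 1, with probabilities given by the squared amplitudes.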

Spin qubits hold a number of advantages over the superconducting qubit technology that drives most contemporary quantum computing research. Qubits are fragile things, easily disrupted by noise or even unintended observation, and the nature of superconducting qubits means they require larger physical structures and must be maintained at extremely cold temperatures.

Because they’re based on silicon, though, spin qubits are physically smaller and can be expected to hold their quantum state for longer. They can also operate at much higher temperatures, so they don’t require the same level of complexity in system design. And, of course, Intel has tremendous experience in designing and manufacturing silicon devices.

Like all of quantum computing, spin qubit technology is in its nascent stages. If Intel can work out the kinks, however, spin qubits could help bring quantum computing to actual commercial applications much sooner than currently anticipated. Already, the company plans to use its existing fabrication facilities to create “many wafers per week” of spin qubit test chips, and should begin production over the next several months.

Mark Coppock
Mark Coppock is a Freelance Writer at Digital Trends covering primarily laptop and other computing technologies. He has…