
Why the world’s fastest quantum computer … really isn’t a quantum computer

[Image: The D-Wave 2X processor at USC. Credit: USC Viterbi School of Engineering]

In July 2016, Lockheed Martin made improvements to the Quantum Computation Center housed at the USC Information Sciences Institute, increasing its qubit capacity to 1,098.

It’s the latest step in a process that’s been going on for six and a half years, as the international security and aerospace company attempts to carve out a space in the fast-moving field of quantum computing.

In twenty years, this technology could have a huge impact on everything from academic research projects to online cybersecurity — but Lockheed Martin’s usage demonstrates that some of the benefits of quantum computing could arrive much sooner, even if a fully functional quantum computer is still a ways off.

Quantum Leap

The first quantum system that Lockheed Martin bought from D-Wave was a 128-qubit processor codenamed “Rainier,” which otherwise went by the name D-Wave One. It was later upgraded to the 512-qubit “Vesuvius,” which was itself recently upgraded to the 1,152-qubit D-Wave 2X.

“It is commercially available — you can go buy one — but it’s really a research and development, experimental type of system,” said Greg Tallant, head of Lockheed Martin’s Quantum Computation Center (QCC). “It’s not ready for production in the sense of, you could buy one and use it in the way that you use computers now.”


Before making a purchase, Lockheed Martin representatives visited D-Wave at its headquarters in Vancouver multiple times. The company determined that the hardware had “potential and promise,” and so the decision was made to purchase one of their systems. The next step was to forge a partnership with the University of Southern California, which resulted in the Quantum Computation Center, a facility that is part of the institution’s Viterbi School of Engineering.

USC’s part of the bargain allows the university to carry out research using the hardware, largely focused on where D-Wave’s system sits relative to other quantum computing approaches, and on benchmarking the machine. Meanwhile, Lockheed Martin can take that information and investigate which applications would benefit its interests.

Validation and verification were the inspiration for the program, but it’s now expanding into other areas. Machine learning is a top priority, but there’s also space for the system to be applied in the field of planning and scheduling — perhaps not the most glamorous use of cutting-edge technology, but certainly a productive application.

“As the number of variables in the problem grows, the number of possibilities you have to consider grows exponentially,” Tallant explained. “The problem that’s used at an academic level to describe this kind of usage is the travelling salesman problem.”

[Image: The D-Wave 2X. Credit: D-Wave]

The travelling salesman problem hinges on a list of cities that a hypothetical product-pusher has to visit, and the distances between them. The solution is the shortest possible route that visits every city exactly once before returning to the point of origin. It can be solved using today’s computers, of course, but quantum hardware could potentially offer drastic speed-ups, especially as the number of cities grows.
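To get a feel for why the possibilities explode, here’s a minimal, purely classical brute-force sketch in Python. The city names and distances are invented for illustration, and this is ordinary exhaustive search rather than anything the D-Wave hardware runs; checking every route means examining (n - 1)! orderings, which is already 362,880 routes for just ten cities.

```python
# Brute-force travelling salesman: try every possible ordering of the cities.
# Toy illustration only -- the distances and city names are made up, and this
# is a classical exhaustive search, not how a quantum annealer attacks the problem.
from itertools import permutations

# Hypothetical symmetric distance table between four cities.
distances = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}

def dist(a, b):
    """Look up the distance between two cities, in either order."""
    return distances.get((a, b)) or distances[(b, a)]

def shortest_tour(cities):
    """Return the cheapest round trip that starts and ends at the first city."""
    start, rest = cities[0], cities[1:]
    best_route, best_length = None, float("inf")
    for order in permutations(rest):  # (n - 1)! orderings to check
        route = (start, *order, start)
        length = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
        if length < best_length:
            best_route, best_length = route, length
    return best_route, best_length

print(shortest_tour(["A", "B", "C", "D"]))  # (('A', 'B', 'D', 'C', 'A'), 80)
```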

Tallant and his team haven’t yet been able to demonstrate that their D-Wave hardware will be able to offer an advantage over classical computers in this application. However, progress is being made, and further advances are expected thanks to the powerful 1,152-qubit processor that the company purchased in July.

Leader of the Pack?

“There’s a little bit of a complication,” noted Tallant. “Their current chip is 1,152 qubits, but when you install these systems, they have to go through a calibration process. That process causes some of the qubits to essentially fail calibration, and not be usable in a computation sense.”

In effect, it’s a similar situation to using your 32GB iPhone for the first time, and finding that you don’t actually have 32GB of storage to work with — although, obviously, what’s going on under the hood is a little different.

“Our 1,152-qubit system that we have now came in at 1,098 when we calibrated it,” said Tallant. A host of other engineering improvements have been made to the system, but it’s the jump in qubit count that will hopefully offer tangible improvements in the size of the problems the rig can handle.


“You can think of the number of qubits as close to the problem size that you can treat,” Tallant added. “So, when you have a system that’s only got 512 qubits, you’re limited in the size of the problem to — in the best case — something that has 512 variables. In practicality, it’s more like in the order of 200 variables.”

These figures might cause you to raise an eyebrow if you’ve been keeping up with the field of quantum computing. In May, IBM proudly announced that its five-qubit quantum computer was going to be made accessible to academics and enthusiasts via the web-based IBM Quantum Experience platform.

Why would IBM be showing off its five-qubit system if D-Wave was already selling quantum computers with hundreds of times that number of qubits?

It’s simple — the hardware being used by Lockheed Martin is not really a quantum computer.

Different Strokes

The systems at the heart of both IBM’s and Lockheed Martin’s quantum programs utilize superconducting qubits, a promising implementation that researchers hope will eventually spawn a large-scale universal quantum computer. But it’s not ready yet.

IBM’s system doesn’t qualify as a large-scale universal quantum computer, because it only uses five qubits. In terms of its architecture and its approach, it’s a universal system, but the low qubit count means it can’t perform full quantum error correction.

Meanwhile, the D-Wave system being used by Lockheed Martin also doesn’t qualify. It’s technically a quantum annealer, rather than a quantum computer. Put simply, it can only tackle a very limited set of problems.


“The D-Wave system is not a general purpose computer. It solves a particular problem that’s referred to as the Ising Spin Glass,” explained Tallant. He refers to it as an “optimization solver,” used for problems where resources like time and fuel need to be allocated as efficiently as possible, which means sifting through a vast number of possible configurations.
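To make the “Ising Spin Glass” a little more concrete, here’s a toy sketch that finds the lowest-energy arrangement of four spins by brute force. The biases and couplings are invented for illustration; a real annealer is programmed with these numbers and then physically settles toward a low-energy state, rather than enumerating every possibility the way this classical snippet does.

```python
# Toy Ising spin-glass problem: each spin is +1 or -1, and the "energy" of a
# configuration is E = sum_i h_i * s_i + sum_(i,j) J_ij * s_i * s_j.
# The biases h and couplings J below are invented for illustration.
from itertools import product

h = {0: 1.0, 1: -0.5, 2: 0.0, 3: 0.25}          # per-spin biases
J = {(0, 1): -1.0, (1, 2): 0.5, (2, 3): -0.75}  # pairwise couplings

def energy(spins):
    """Ising energy of one assignment of +1/-1 values to the spins."""
    field_term = sum(hi * spins[i] for i, hi in h.items())
    pair_term = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return field_term + pair_term

# Exhaustive search over all 2**4 = 16 spin configurations.
best = min(product([-1, 1], repeat=len(h)), key=energy)
print(best, energy(best))
```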

“You can think of quantum annealing like this,” said Tallant. “We’re going to program a machine with a problem that we know the answer to, and we’re also going to tell it the problem that we [don’t]. Then we’re going to do this evolution in time, where we mix the two together. As long as we follow all the fundamental rules of physics, when we get done in the end, we’re going to have the answer to our problem.”

“In a sense, it’s a way to take something that you know how to do, and evolve it into something that you don’t know how to do — you still get the answers, even not knowing how to do it.”
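Classical simulated annealing captures some of that “start from something easy and settle into the hard problem” intuition, although the physics is different: the D-Wave hardware relies on quantum effects rather than the thermal shaking mimicked below. As a loose, software-only analogue applied to the toy Ising problem above, with arbitrary temperature and step choices, it might look like this:

```python
# Purely classical analogue: simulated annealing on the toy Ising problem above.
# This is NOT what the D-Wave hardware does; it is a thermal, software-only
# stand-in for the "start loose, gradually settle" idea. The temperatures and
# step count below are arbitrary choices.
import math
import random

def simulated_annealing(h, J, steps=5000, t_start=5.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    spins = {i: rng.choice([-1, 1]) for i in h}  # random starting configuration

    def energy(s):
        return (sum(hi * s[i] for i, hi in h.items())
                + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items()))

    current = energy(spins)
    for step in range(steps):
        # Temperature falls from t_start to t_end: early on almost any move is
        # accepted, later only moves that lower the energy tend to survive.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.choice(list(h))   # propose flipping one spin
        spins[i] = -spins[i]
        candidate = energy(spins)
        worse = candidate > current
        if worse and rng.random() >= math.exp((current - candidate) / t):
            spins[i] = -spins[i]  # reject the flip and keep the old state
        else:
            current = candidate
    return spins, current

print(simulated_annealing(h={0: 1.0, 1: -0.5, 2: 0.0, 3: 0.25},
                          J={(0, 1): -1.0, (1, 2): 0.5, (2, 3): -0.75}))
```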

Fit For Purpose

D-Wave’s qubit counts are well ahead of the curve compared to other outfits working towards a quantum computer. However, it’s difficult to make a direct comparison between the company’s hardware and the machines being built by researchers pursuing a universal system.

“It is most definitely not a universal quantum computer, it is an annealer,” said Tallant. “The D-Wave system is not universal; we don’t have all the couplings that you would need to implement that.”

That’s not to say that Lockheed Martin won’t move beyond an annealer once the technology has developed sufficiently. “We would love to have a universal system,” Tallant added.

Lockheed Martin’s usage of a quantum annealer is evidence of the enormous potential of quantum computing to transform numerous fields in the coming years. A universal system is the end goal for researchers working in the field, but despite the compromises necessary to deliver results in the here and now, we’re already seeing hardware with industrial applications.

It’s easy to mythologize the quantum computer as a game-changing discovery that’s still years away. A universal system will indeed have a universal impact when it comes to pass — but by that time, quantum technology like D-Wave’s annealer will already have been put to good use across a number of different fields.

Updated on 08-26-2016 by Brad Jones: Clarified comments on IBM’s universal quantum computer.
