
Why the world’s fastest quantum computer … really isn’t a quantum computer

USC Viterbi School of Engineering
In July 2016, Lockheed Martin made improvements to the Quantum Computation Center housed at the USC Information Sciences Institute, increasing its qubit capacity to 1,098.

It’s the latest step in a process that’s been going on for six and a half years, as the international security and aerospace company attempts to carve out a space in the fast-moving field of quantum computing.

In twenty years, this technology could have a huge impact on everything from academic research projects to online cybersecurity — but Lockheed Martin’s usage demonstrates that some of the benefits of quantum computing could be even closer to fruition, even if a fully functional quantum computer is a ways off.

Quantum Leap

The first quantum system that Lockheed Martin bought from D-Wave was a 128-qubit processor codenamed “Rainier,” which otherwise went by the name D-Wave One. It was later upgraded to the 512-qubit “Vesuvius,” which was itself recently upgraded to the 1,152-qubit D-Wave 2X.

“It is commercially available — you can go buy one — but it’s really a research and development, experimental type of system,” said Greg Tallant, head of Lockheed Martin’s Quantum Computation Center (QCC). “It’s not ready for production in the sense of, you could buy one and use it in the way that you use computers now.”


Before making a purchase, Lockheed Martin representatives visited D-Wave at its headquarters in Vancouver multiple times. The company determined that the hardware had “potential and promise,” and so the decision was made to purchase one of their systems. The next step was to forge a partnership with the University of Southern California, which resulted in the Quantum Computation Center, a facility that is part of the institution’s Viterbi School of Engineering.

Under its side of the bargain, USC carries out research using the hardware, largely focused on where D-Wave’s system sits relative to other quantum computers, and on benchmarking the machine. Lockheed Martin, meanwhile, can take that information and investigate which applications would best serve its interests.

Validation and verification was the inspiration for the program, but now it’s expanding into other areas. Machine learning is a top priority, but there’s also space for the system to be applied in the field of planning and scheduling — perhaps not the most glamorous use of cutting-edge technology, but certainly a productive application.

“As the number of variables in the problem grows, the number of possibilities you have to consider grows exponentially,” Tallant explained. “The problem that’s used at an academic level to describe this kind of usage is the travelling salesman problem.”


The travelling salesman problem hinges on a list of cities that a hypothetical product-pusher has to visit, and the distances between them. The solution is the shortest possible route between these locations that visits every city only once before returning to the point of origin. It can be solved using today’s computers, of course, but quantum hardware could potentially offer drastic speed-ups, especially as the number of cities grows.
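As a rough classical illustration of why the problem blows up (the cities and distances below are made up, and this is not how one would actually program the D-Wave system), a brute-force search has to check every possible ordering of the cities:

```python
# Toy brute-force traveling salesman solver. With n cities there are (n-1)!
# round-trip orderings to check, which is why the search explodes as n grows.
from itertools import permutations

# Hypothetical symmetric distances between four cities.
distances = {
    ("A", "B"): 4, ("A", "C"): 7, ("A", "D"): 3,
    ("B", "C"): 2, ("B", "D"): 5, ("C", "D"): 6,
}

def dist(a, b):
    return distances[(a, b)] if (a, b) in distances else distances[(b, a)]

def shortest_tour(cities, start="A"):
    others = [c for c in cities if c != start]
    best_route, best_length = None, float("inf")
    for order in permutations(others):      # (n-1)! candidate tours
        route = (start, *order, start)      # visit each city once, return home
        length = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
        if length < best_length:
            best_route, best_length = route, length
    return best_route, best_length

print(shortest_tour(["A", "B", "C", "D"]))
```

Four cities means only six tours to check; a few dozen cities already puts an exhaustive search out of reach for classical machines, which is the opening an annealer hopes to exploit.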

Tallant and his team haven’t yet been able to demonstrate that their D-Wave hardware will be able to offer an advantage over classical computers in this application. However, progress is being made, and further advances are expected thanks to the powerful 1,152-qubit processor that the company purchased in July.

Leader of the Pack?

“There’s a little bit of a complication,” noted Tallant. “Their current chip is 1,152 qubits, but when you install these systems, they have to go through a calibration process. That process causes some of the qubits to essentially fail calibration, and not be usable in a computation sense.”

In effect, it’s a similar situation to using your 32GB iPhone for the first time, and finding that you don’t actually have 32GB of storage to work with — although, obviously, what’s going on under the hood is a little different.

“Our 1,152-qubit system that we have now came in at 1,098 when we calibrated it,” said Tallant. A host of other engineering improvements have been made to the system, but an upgrade to the hardware’s qubit count will hopefully offer tangible improvements in terms of the problems that the rig can handle.

“It’s a way to take something that you know how to do, and evolve it into something that you don’t know how to do.”

“You can think of the number of qubits as close to the problem size that you can treat,” Tallant added. “So, when you have a system that’s only got 512 qubits, you’re limited in the size of the problem to — in the best case — something that has 512 variables. In practicality, it’s more like in the order of 200 variables.”

These figures might cause you to raise an eyebrow if you’ve been keeping up with the field of quantum computing. In May, IBM proudly announced that its five-qubit quantum computer was going to be made accessible to academics and enthusiasts via the web-based IBM Quantum Experience platform.

Why would IBM be showing off its five-qubit system if D-Wave was already selling quantum computers with more than 200 times as many qubits?

It’s simple — the hardware being used by Lockheed Martin is not really a quantum computer.

Different Strokes

The systems at the heart of IBM’s and Lockheed Martin’s quantum programs both use superconducting qubits, a promising implementation that researchers hope will eventually yield a large-scale universal quantum computer. But that technology isn’t ready yet.

IBM’s system doesn’t qualify as a large-scale universal quantum computer, because it only uses five qubits. In terms of its architecture and its approach, it’s a universal system, but the low qubit count means it can’t perform full quantum error correction.

Meanwhile, the D-Wave system being used by Lockheed Martin also doesn’t qualify. It’s technically a quantum annealer, rather than a quantum computer. Put simply, it can only tackle a very limited set of problems.


“The D-Wave system is not a general purpose computer. It solves a particular problem that’s referred to as the Ising Spin Glass,” explained Tallant. He refers to it as an “optimization solver,” which is used for problems where resources like time and fuel can be used most efficiently when a powerful computer studies a wide range of different possible configurations.
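For a sense of what that means, here is a minimal classical sketch of an Ising spin-glass instance (the biases and couplings are arbitrary toy values, not anything from Lockheed Martin’s work): each variable is a spin that can be +1 or -1, and the goal is the assignment of spins that minimizes the total energy.

```python
# Toy Ising spin-glass instance: minimize
#   E(s) = sum_i h[i] * s[i]  +  sum_{i<j} J[i, j] * s[i] * s[j]
# over spins s[i] in {-1, +1}. The values of h and J below are made up.
from itertools import product

h = [0.5, -1.0, 0.2]                            # per-spin biases
J = {(0, 1): -0.8, (1, 2): 0.6, (0, 2): 0.3}    # pairwise couplings

def energy(spins):
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# Exhaustive search over all 2**n spin assignments -- exactly the kind of
# exponential blow-up an annealer is meant to sidestep for larger n.
best = min(product((-1, 1), repeat=len(h)), key=energy)
print(best, energy(best))
```

Each spin corresponds roughly to one qubit, which is why Tallant talks about the qubit count as an upper bound on the number of variables a problem can have.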

“You can think of quantum annealing like this,” said Tallant. “We’re going to program a machine with a problem that we know the answer to, and we’re also going to tell it the problem that we [don’t]. Then we’re going to do this evolution in time, where we mix the two together. As long as we follow all the fundamental rules of physics, when we get done in the end, we’re going to have the answer to our problem.”

“In a sense, it’s a way to take something that you know how to do, and evolve it into something that you don’t know how to do — you still get the answers, even not knowing how to do it.”
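Tallant’s description lines up with the textbook picture of quantum annealing, in which the machine is prepared in the easy-to-find ground state of a starting Hamiltonian and slowly steered toward the Hamiltonian that encodes the hard problem. In the usual notation (a generic sketch, not the article’s own formulation):

```latex
% Standard quantum-annealing interpolation: H_init is the "problem we know the
% answer to", H_prob encodes the optimization instance (e.g. an Ising energy),
% and s is swept slowly from 0 to 1 over the course of the computation.
H(s) = (1 - s)\, H_{\mathrm{init}} + s\, H_{\mathrm{prob}}, \qquad s : 0 \to 1
```

If the sweep is slow enough, the system ends up in, or near, the ground state of the problem Hamiltonian, and reading out the qubits gives the answer to the optimization problem, “even not knowing how to do it.”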

Fit For Purpose

D-Wave’s qubit counts are well ahead of the curve compared to other outfits working towards a quantum computer. However, it’s difficult to draw a direct comparison between the company’s hardware and the universal systems other researchers are attempting to construct.

“It is most definitely not a universal quantum computer, it is an annealer,” said Tallant. “The D-Wave system is not universal; we don’t have all the couplings that you would need to implement that.”

That’s not to say that Lockheed Martin won’t move beyond an annealer once the technology has developed sufficiently. “We would love to have a universal system,” Tallant added.

Lockheed Martin’s usage of a quantum annealer is evidence of the enormous potential of quantum computing to transform numerous fields in the coming years. A universal system is the end goal for researchers working in the field, but even with the compromises necessary to deliver results in the here and now, we’re seeing hardware with industrial applications.

It’s easy to mythologize the quantum computer as a game-changing discovery that’s still years away. A universal system will indeed have a universal impact when it comes to pass — but by that time, quantum technology like D-Wave’s annealer will already be in use across a number of different fields.

Updated on 08-26-2016 by Brad Jones: Clarified comments on IBM’s universal quantum computer.
