Quantum computing’s potential is still far off, but quantum supremacy shows we’re on the right track

In this edition of the Radar column, we explore Google’s quantum supremacy milestone.

By Mike Loukides
November 1, 2019

One of the most exciting topics we’ve been following is the development of quantum computing. We recently learned about a major breakthrough: Google says it has achieved “quantum supremacy” with a 53-qubit computer.

I will steer clear of a lengthy explanation of quantum computing, which I’m not really competent to give. Quantum supremacy itself is a simple concept: it means performing a computation that no classical computer could complete in any practical amount of time.

It’s very important to understand exactly what this means. Google performed a computation in a few minutes (3 minutes, 20 seconds to be precise) that would have taken more than 10,000 years on the most powerful computers we currently have. But that’s a speedup for one specific computation, and that computation has no practical value. (This explanation of the computation is the best I’ve seen.) Google has verified that the result is correct—a statistical distribution that is subtly different from a Gaussian distribution.
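
To give a feel for what kind of computation that is, here is a minimal sketch of my own in Python with NumPy (not Google’s circuit, and tiny by comparison): it models a random circuit on 10 qubits as a single Haar-random unitary and looks at the spread of the output bitstring probabilities. The point is only that a random circuit’s outputs are far from uniform, and reproducing that spread classically is what becomes intractable at 53 qubits.

```python
import numpy as np

# Toy sketch (mine, not Google's experiment): model a random circuit on a
# handful of qubits as one Haar-random unitary and inspect the spread of the
# output bitstring probabilities. We use 10 qubits so it runs in about a
# second on a laptop.
rng = np.random.default_rng(0)
n_qubits = 10
dim = 2 ** n_qubits

# Haar-random unitary via QR decomposition of a complex Gaussian matrix.
z = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
q, r = np.linalg.qr(z)
u = q * (np.diagonal(r) / np.abs(np.diagonal(r)))   # phase fix for Haar measure

# Apply the circuit to |00...0>; the first column of u is the output state.
probs = np.abs(u[:, 0]) ** 2        # probability of each of the 1,024 bitstrings

# If the output were structureless, every bitstring would have probability
# 1/dim (rescaled: exactly 1). A random circuit spreads them out widely.
scaled = dim * probs
print(f"mean {scaled.mean():.3f}, std {scaled.std():.3f}, max {scaled.max():.3f}")
# Typical output: mean 1.000, std close to 1, max around 7 -- far from flat.
```

A laptop handles 10 qubits instantly, of course; the state vector doubles with every added qubit, and that exponential growth is the source of the classical simulation cost at 53 qubits.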

This is a major breakthrough, despite some controversy (though it’s worth pointing out that researchers John Preskill—who coined the term “quantum supremacy”—and Scott Aaronson accept Google’s claim of quantum supremacy).

It’s important to consider what this achievement does not mean. It does not mean that cryptography is broken, or that we can achieve general artificial intelligence, or anything of the sort. Remember, this result is about one specific computation with no practical value; it’s meaningless, except perhaps as a random number generator that obeys an odd distribution. To break current cryptographic techniques, we’ll need quantum computers with thousands of qubits. And qubits don’t stack up as easily as bytes in a memory chip.

One fundamental problem with quantum computers is that the probability they’ll return an incorrect answer is always non-zero. To do meaningful computation on a quantum computer, we’ll need to develop quantum error correction. Error correction is well understood for classical computers; error correction for quantum computers isn’t. One error-corrected qubit (a “logical” qubit) may require more than a thousand physical qubits. So breaking cryptography, which may require thousands of logical qubits, will require millions of physical qubits. Quantum computers of that scale are still a long way off.
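
To put rough numbers on that argument, here’s a back-of-the-envelope sketch; both figures are assumptions taken from the paragraph above, not engineering estimates.

```python
# Back-of-the-envelope sketch of the overhead argument above; both numbers
# are rough assumptions from the text, not engineering estimates.
logical_qubits_needed = 4_000   # "thousands" of error-corrected (logical) qubits
physical_per_logical = 1_000    # "more than a thousand" physical qubits each

print(f"{logical_qubits_needed * physical_per_logical:,} physical qubits")
# 4,000,000 physical qubits -- millions, versus 53 in Google's machine
```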

Quantum supremacy, now and in some imagined future, also doesn’t mean that digital computers become obsolete. Most of what we do on our computers—fancy graphics, email, databases, building websites, data analysis, digital signal processing—can’t be done with quantum computing. Certainly not now, and possibly never. Quantum computing is useful to speed up a relatively small number of very difficult computational problems that can’t be solved on classical computers. I suspect that quantum computers won’t be computers as such (certainly not laptops, unless you can manage a laptop that runs at temperatures close to absolute zero); they’ll be more like GPUs, specialized attachments that run certain kinds of computations.

I also suspect that, for quantum computers, Thomas J. Watson’s notorious (and perhaps apocryphal) prediction that the total market for computers would be five might be close to the truth. But unlike Watson, I can tell you where those quantum computers will be: they will live in the cloud. Google, IBM, Amazon, and Microsoft will each have one; a few more will be scattered around at intelligence agencies and other organizations with three-letter names. The total market might end up being a few dozen—but because of the cloud, that will be all we need. Don’t expect a quantum computer on your desktop. It’s possible that some breakthrough in physics will make quantum computing units as common as GPUs—but that breakthrough isn’t even close to the horizon.

So, after all that cold water, why is Google’s achievement important? It’s important because it is what it says it is: a computation that would have taken more than 10,000 years on the fastest modern supercomputer has been done in a few minutes. It doesn’t matter that the computation is meaningless, and it doesn’t matter that scaling up to meaningful problems, like breaking cryptography, is likely to take another 10 to 20 years. Google has proven that it is possible to build a quantum computer that can perform computations of a complexity that isn’t feasible for traditional computers. That’s a huge step forward; it proves that we’re on the right track.

Even though the computation Google has performed doesn’t have any applications, I wouldn’t be surprised if we can find useful computations that can be done on our current quantum computers, with 50 to 100 qubits. Random number generation is itself an important problem; quantum computers can be testbeds for researching quantum mechanics, and there are quantum algorithms for determining whether a message has been read by a third party. (And while these applications depend on the quantum nature of qubits, they don’t require “quantum supremacy” as such.)

Utility is all a matter of perspective. I was introduced to programming in 1972, on computers that were incredibly small by modern standards—but they were still useful. And the first IBM mainframes of the 1950s were small even by the standards of 1972, but they did useful work. Scaling up took, literally, 60 years, but we did important work along the way. It’s easy to dismiss a 53-qubit quantum machine from the perspective of a laptop with 64 GB of RAM and a terabyte disk, but that’s like looking through the wrong end of a telescope and complaining about how small everything is. That’s not how the industry progresses.
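
One of the applications mentioned above, telling whether a message has been read in transit, is easy to sketch. The toy simulation below is my own construction (a simplified intercept-and-resend scenario in the style of the BB84 protocol, which the column doesn’t name): a sender encodes random bits in randomly chosen bases, and an eavesdropper who measures along the way disturbs roughly a quarter of the positions where sender and receiver later agree on a basis, which is enough to expose her.

```python
import numpy as np

# Toy BB84-style sketch (my construction, not a quoted algorithm): Alice sends
# random bits encoded in random bases, Bob measures in random bases, and the
# two compare the positions where their bases matched. Without an eavesdropper
# those positions agree perfectly; an intercept-and-resend eavesdropper
# introduces roughly 25% errors and is detected.
rng = np.random.default_rng(1)

def run(n_bits: int, eavesdrop: bool) -> float:
    alice_bits = rng.integers(0, 2, n_bits)
    alice_bases = rng.integers(0, 2, n_bits)     # 0 = Z basis, 1 = X basis

    bits_in_flight = alice_bits.copy()
    bases_in_flight = alice_bases.copy()

    if eavesdrop:
        eve_bases = rng.integers(0, 2, n_bits)
        # Where Eve guesses the wrong basis, her result is random, and she
        # resends the qubit prepared in *her* basis.
        wrong = eve_bases != alice_bases
        bits_in_flight = np.where(wrong, rng.integers(0, 2, n_bits), alice_bits)
        bases_in_flight = eve_bases

    bob_bases = rng.integers(0, 2, n_bits)
    # Where Bob's basis matches the incoming qubit's basis, he reads its bit;
    # otherwise his outcome is random.
    match_incoming = bob_bases == bases_in_flight
    bob_bits = np.where(match_incoming, bits_in_flight, rng.integers(0, 2, n_bits))

    # Alice and Bob keep only positions where *their* bases agree, then compare.
    keep = alice_bases == bob_bases
    return float(np.mean(alice_bits[keep] != bob_bits[keep]))

print("error rate without eavesdropper:", run(20_000, eavesdrop=False))  # ~0.0
print("error rate with eavesdropper:   ", run(20_000, eavesdrop=True))   # ~0.25
```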

Scaling quantum computing isn’t trivial. But the most important problem, getting these things to work in the first place, has been solved. Yes, some small quantum computers have been around for a while; IBM has quantum computers available in the cloud, including a 5-qubit machine that anyone can try for free. But a real machine that can achieve a huge speedup on an actual calculation—nobody knew, until now, that we could build such a machine and make it work.
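
If you want to try one of those cloud machines yourself, the program is short. Here is a minimal sketch assuming IBM’s Qiskit SDK circa 2019 (the column doesn’t name a toolkit, so treat this as one illustrative option): it prepares a two-qubit entangled state on a local simulator, and the same circuit can be submitted to one of IBM’s real backends with a free account on IBM’s quantum cloud service.

```python
# Minimal sketch, assuming IBM's Qiskit SDK as of 2019; the column doesn't
# name a toolkit, so treat this as one illustrative option.
from qiskit import QuantumCircuit, execute, Aer

# Build a two-qubit Bell-state circuit: Hadamard, then CNOT, then measure.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Run it 1,024 times on a local simulator; swapping in a real IBM backend
# requires a (free-tier) account on IBM's quantum cloud service.
backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)   # roughly {'00': ~512, '11': ~512} -- the entangled outcomes
```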

That is very big news. — Mike Loukides


Data points: Recent research and analysis

Our analysis of speaker proposals from the 2019 edition of the O’Reilly Velocity Conference in Berlin turned up several interesting findings related to infrastructure and operations:

  • Cloud native is preeminent. The language, practices, and tools of cloud native architecture are prominent in Velocity Berlin proposals. From the term “cloud native” itself (No. 25 in the tally of the highest weighted proposal terms) to foundational cloud native technologies such as “Kubernetes” (No. 2), cloud native is coming on strong.
  • Security is a source of some concern. The term “security” not only cracked the top 5 terms, it surged to No. 3, just behind Kubernetes. This suggests that even as cloud native gains momentum, there’s a degree of uncertainty—and perhaps also uneasiness—about how to secure the new paradigm.
  • Performance is still paramount. Architects, engineers, and developers are using new tools, metrics, and even new concepts to observe, manage, and optimize performance. This is as much a shift in language—with the terms “observability” rising and “monitoring” falling—as in technology.
  • Site reliability engineering (SRE) is growing. Terms associated with SRE continue to ascend the rankings. SRE is a very different way of thinking about software development. Our analysis suggests SRE-like terms, concepts, and practices are beginning to catch on.
  • Europe and the United States are different regions—and it isn’t just the metric system. For example, “observability” is a thing in Europe, but seems to be a slightly bigger thing in the US. It’s one of several terms that tend to be more popular on one side of the pond than on the other. Another term, oddly enough, is “cloud native,” which is more popular in the EU than the US.

Check out “What’s driving cloud native and distributed systems in 2019” for full results from our analysis of Velocity Berlin ’19 proposals.


Upcoming events

O’Reilly conferences combine expert insights from industry leaders with hands-on guidance about today’s most important technology topics.

We hope you’ll join us at our upcoming events:

O’Reilly Software Architecture Conference in Berlin, November 4-7, 2019

O’Reilly Velocity Conference in Berlin, November 4-7, 2019

O’Reilly Software Architecture Conference in New York, February 23-26, 2020

Post topics: Innovation & Disruption, Radar Column
Post tags: Commentary