[Cryptography] quantum computers & crypto

Rodney Van Meter rdv at sfc.wide.ad.jp
Fri Oct 22 04:51:55 EDT 2021


I’m not a regular subscriber to the list, but a colleague asked me to take a look at the discussion on quantum computers that went around last month. My apologies for bringing up something that has gone quiet, but I thought folks might be interested anyway. A bit about me and my work at the bottom; ignore the bio bits if you prefer, but disclaimers apply. I’ll also include a dump of URLs, mostly but not entirely to my stuff, which you can read or ignore as you see fit.

I enjoyed the thread with John, Henry, Phillip and a couple of others. Most of what was said I think is pretty accurate.

A few points (my apologies for being windy here):

As noted, quantum computers are still embryonic, but they are improving at a pretty good clip. That still leaves us a very long way from, in the NSA's terminology, a CRQC (cryptanalytically relevant quantum computer). We're not even really on a Moore's Law curve in system capacity, measured in qubits, though there are claims that qubit fidelity/lifetime is doubling better than once a year. Fidelity is essentially the probability that the system is doing what it is supposed to do. There are a whole bunch of different numbers that describe the overall system; some are specified as lifetimes, some as gate or memory fidelities. Fidelities are hovering around 99% now for the superconducting systems known as transmons, the technology used by IBM, Google and Rigetti; claims of up to three nines (99.9%) have been made. These systems used to be limited primarily by insanely low memory lifetimes, but are now dominated by the two-qubit gate fidelity.
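To make "fidelity" concrete: if each gate succeeds with probability F, a crude back-of-the-envelope estimate of whole-circuit success is F raised to the gate count (this ignores correlated errors and idling noise, and the gate counts here are illustrative, not from any real machine):

```python
def circuit_success_probability(gate_fidelity: float, num_gates: int) -> float:
    """Crude estimate assuming independent gate errors:
    whole-circuit success = product of per-gate fidelities."""
    return gate_fidelity ** num_gates

# At 99% fidelity, even a modest 100-gate circuit usually fails:
print(circuit_success_probability(0.99, 100))   # ~0.366
# At "three nines" (99.9%), the same circuit succeeds most of the time:
print(circuit_success_probability(0.999, 100))  # ~0.905
```

This is why the jump from two nines to three nines matters far more than the raw qubit count.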

IBM created a more comprehensive measure of system quality that they call Quantum Volume (sometimes abbreviated QV). The idea is good, but I think the actual definition is a mess. To prevent cheating by having only one qubit you can keep for a long time, or many qubits you can run for just an instant, they focused on "square circuits": being able to do a depth-$n$ circuit on $n$ qubits and get the answer out with some reasonable probability. So far, so good, but then they decided to exponentiate this number (QV = 2^n) to try to express the power of quantum computing, and now numbers are occasionally quoted in the millions and nobody really knows what you mean when you say, "We doubled quantum volume." If they stick to their guns on the definition, the numbers will be astronomical in a few years and we'll wind up talking about "log of QV", which brings us back to something reasonable: the space-time product of algorithms on systems that work.
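The definitional quirk can be spelled out in a couple of lines. This is my reading of the public definition, not IBM code: if the largest square circuit a machine runs reliably is n qubits by n layers, then QV = 2^n, so "doubling quantum volume" just means n grew by one.

```python
import math

def quantum_volume(n: int) -> int:
    """IBM-style Quantum Volume for the largest reliable
    n-qubit, depth-n 'square circuit'."""
    return 2 ** n

def log_qv(qv: int) -> int:
    """The 'log of QV' suggested above: recovers the square-circuit size n."""
    return int(math.log2(qv))

# Doubling QV (64 -> 128) is just one more qubit/layer (6 -> 7):
print(quantum_volume(6), quantum_volume(7))  # 64 128
print(log_qv(64), log_qv(128))               # 6 7
```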

Today’s systems are NISQ, noisy intermediate-scale quantum systems. IBM claims they can entangle all of the qubits on a single chip.

But if we are really at three nines, we are “below threshold”, the level at which quantum error correction (QEC) starts to be more than an idea. Because extracting error syndromes requires operating on the qubits, QEC is a lot harder than classical channel ECC, where the syndrome extraction and correction process is so close to error-free that it can be ignored. QEC is also hard because of entanglement and error propagation inside systems, but let’s leave that aside. Being close to threshold means needing a code with a high code distance, so we really need a fourth nine. I would say that is now in sight within the next few years, and so the next challenge will be scalable hardware and control systems.
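To see why being merely near threshold forces a high code distance: a standard rule of thumb for surface-code-style QEC is that the logical error rate goes roughly as A * (p/p_th)^((d+1)/2), with p the physical error rate, p_th the threshold, and d the code distance. The constants below (A = 0.1, p_th = 1%) are illustrative assumptions, not measurements:

```python
def logical_error_rate(p: float, p_th: float = 0.01,
                       d: int = 3, A: float = 0.1) -> float:
    """Rule-of-thumb logical error rate for a distance-d surface code."""
    return A * (p / p_th) ** ((d + 1) // 2)

# At p = 0.5% (half the assumed threshold), each step of two in
# distance only halves the logical error rate, so d must be huge:
for d in (3, 7, 11):
    print(d, logical_error_rate(0.005, d=d))

# At p = 0.1% (the "fourth nine"), each step buys a factor of ten:
for d in (3, 7, 11):
    print(d, logical_error_rate(0.001, d=d))
```

The ratio p/p_th is the whole game: hovering just under threshold means weak error suppression per unit of distance, hence enormous codes; a fourth nine makes modest distances useful.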

IonQ, which uses ion traps instead of superconducting chips, claims to have successfully demonstrated QEC.

The wildcard is PsiQuantum, a still-stealthy Valley startup that is trying to build optical QCs. It's really hard, but if they succeed, scalability and QEC/fault tolerance come quickly in their architecture.

There are lots of other hardware efforts; probably every university-based experimentalist is considering taking their work out of the lab, but they are also the ones who know how hard building the systems is. (Even if they are sometimes dismissive of engineering efforts or have a “my problems are the hardest” attitude.)

There are also a hundred or more quantum software startups, working on low-level control systems, compilers, or algorithms.

Do I think NSA is ahead of industry and academia? No, I do not. At least, not in hardware; they may very well have in hand theoretical advances in computational complexity, specific algorithms, or just effective implementation plans. But hardware requires a lot of lore, not just reading papers in Nature; if there had been a major brain drain into Meade, we would have seen it. I wouldn't say it's impossible for them to have a lot of stuff working in-house, but I don't think they're likely to be substantially ahead.

I’m fully confident that scalable, fault-tolerant, gate-based quantum computers capable of the full range of quantum algorithms will be coming. Exactly when is an interesting question. My thoughts on how it might play out:

Jon Dowling (RIP), who wrote a couple of books on quantum computing and quantum networking, believed that PQC as an effort is ultimately futile, that eventually all one-way functions and complexity-based crypto would fall to QCs. Personally, I doubt that.

So, if I were in NSA or a major vendor, I would definitely be working on PQC, including research, standardization, implementation and deployment as rapidly as is feasible. This is especially important given the possibility of capture-now, decrypt-later agents being out there.

I read the NSA brief that John pointed out. I found it interesting that NSA is choosing to recommend RSA-3072. If we succeed in building QCs, that's only about 27 times as hard as factoring a 1024-bit number, since Shor's algorithm scales roughly with the cube of the key length: (3072/1024)^3 = 27. In crypto terms, 27x harder is a very small increment in difficulty.
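That factor can be checked in one line. The cubic exponent is the commonly cited rough scaling for Shor's gate count; it's an assumption baked into this sketch, not something from the NSA brief:

```python
def shor_relative_cost(n_bits: int, baseline_bits: int = 1024,
                       exponent: float = 3.0) -> float:
    """Relative Shor's-algorithm cost, assuming cost ~ n^exponent
    in the bit length n of the modulus."""
    return (n_bits / baseline_bits) ** exponent

print(shor_relative_cost(2048))  # 8.0
print(shor_relative_cost(3072))  # 27.0
```

Contrast that with classical factoring, where going from 1024 to 3072 bits buys many orders of magnitude of security; against a CRQC, the same move buys a factor of about 27.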

Just my opinions.  You can stop reading here if you like.


Okay, for those of you still with me: I hope you don’t think I’m arguing from authority, but FYI some background on me and my work, so you can understand how I’ve developed these opinions.

I worked in classical systems — architecture, OS, networking, and especially network-attached storage — for 17 years. I did work that helped lead to iSCSI, and I led the OS team for an IPsec gateway for a while, which means I’ve heard many crypto terms but can’t handle any of the math and definitely shouldn’t be trusted to implement it. In 2003, I switched to working on quantum computing, which has been my main research field since. I am the author of _Quantum Networking_, the first book on the topic. I’m Vice Center Chair of the Keio Quantum Computing Center and co-chair of the IRTF’s Quantum Internet Research Group (QIRG); obviously, I speak for neither Keio nor QIRG/IRTF/ISOC in any of this.

Way too many links:

If you want to see a chart that shows the importance of constant factors in execution time for Shor’s algorithm, see

Not long ago, an anti-quantum hype piece appeared. You might find something interesting in my response.

If you are interested in research on systems research in quantum computing, two years ago I wrote 186 tweets on the topic:

If you want about a 20-hour intro to QC, including Shor’s algorithm, check out our MOOC, which is currently free:

If you prefer paper, Bob Sutor’s book is a great intro, with the first half being the basic math (probability, linear algebra) needed. I highly recommend it to my own students:

For a good book on theory but little on hardware, Scott Aaronson’s is readable but rigorous:

My own book:

Finally, despite having just said you shouldn’t trust me with crypto, I am in fact working on a brief (30-40 pages?) intro to the topic aimed at quantum physicists and engineers. Work on it stalled about a year ago as other priorities reared their ugly heads. Feedback and even coauthors welcome.
