[Cryptography] quantum computers & crypto

Rodney Van Meter rdv at sfc.wide.ad.jp
Thu Nov 4 19:25:12 EDT 2021

Rodney Van Meter
rdv at sfc.wide.ad.jp
Professor, Faculty of Environment and Information Studies, Keio University, Japan

> On Nov 4, 2021, at 7:12, Jerry Leichter <leichter at lrw.com> wrote:
>> Another way to state the same thing is that a symmetric cipher with key of N bits is N/2-bits quantum-safe. This is because they're only vulnerable to Grover's algorithm, which is the quantum search algorithm.

I think it’s fair to say that that’s a general bound, but don’t forget the really important constant factors. For a very, very long time, quantum gates will be many orders of magnitude more expensive than classical gates, even (perhaps especially) once we have fully fault-tolerant systems.
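To put rough numbers on that, here is a back-of-envelope sketch in Python. The quadratic Grover speedup turns a 2^k classical search into roughly 2^(k/2) quantum iterations; the logical-gate overhead figure below is an assumption purely for illustration, not a measured value.

```python
import math

def grover_iterations(key_bits: int) -> int:
    """Order-of-magnitude Grover iteration count for a key_bits-bit key:
    sqrt(2**key_bits) = 2**(key_bits // 2)."""
    return 2 ** (key_bits // 2)

# AES-128: ~2**64 quantum iterations; AES-256: ~2**128.
aes128 = grover_iterations(128)
aes256 = grover_iterations(256)

# Illustrative constant factor (an assumption, not a measurement): if each
# fault-tolerant logical gate costs ~10**6 times a classical gate, the
# classical-equivalent work for breaking AES-128 rises by ~log2(10**6),
# i.e. roughly 20 bits on top of the nominal 64.
LOGICAL_GATE_OVERHEAD = 10 ** 6
effective_bits = 64 + math.log2(LOGICAL_GATE_OVERHEAD)
```

The point is only that the exponent is the headline, but the constant factors move the effective security level by a nontrivial number of bits.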

>> So yeah, if you (e.g.) use AES-256, it's got quantum security of 128 bits.
> BTW, to the limited degree I understand this, it's a *really* worst-case estimate.  To actually get the speedup, you need an oracle that can tell you if you've "guessed right."  To be useful, the oracle has to itself be implemented as a quantum computation - i.e., on a mixed state:  If you first measure, then use a classical implementation of the oracle (i.e., you first measure, then try out the key you get using a classical implementation of AES), you've lost your advantage.
> Exactly what searches allow you to construct such an oracle is a very non-trivial question.  I don't think we know if it's possible for AES.  If it is possible, it could well require a huge number of quantum gates - i.e., the naive approach would be to essentially replicate a classical implementation using quantum gates, but that's so far beyond anything we can imagine doing today as to not be worth worrying about.  Of course there might be a better approach; I don't believe we know.

How do you determine classically whether you have succeeded in finding the key in a brute-force search? I’m naive, but I’ve always assumed a simple bit-level or byte-level entropy test would do it. If it’s ASCII data, for example, the top bit of every byte will be 0, and for letters the top two bits will be 01 far more often than any other combination, so even a simple bit-level frequency test should show a significant bias. That should be pretty easy to calculate. Highly compressed data like streamed MPEG should still show a certain amount of structure even though the payload itself has high entropy, shouldn’t it? Certainly a tougher test than for straight ASCII.
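A minimal sketch of that bit-level test, assuming the plaintext is ASCII (all names and the threshold here are illustrative, not from any library): ASCII bytes always have the top bit clear, while a wrong-key "decryption" should look like uniform random bytes, with the top bit set about half the time.

```python
import os

def top_bit_zero_fraction(data: bytes) -> float:
    """Fraction of bytes whose most significant bit is 0.
    ~1.0 for ASCII text, ~0.5 for uniform random bytes."""
    return sum(1 for b in data if b < 0x80) / len(data)

def looks_like_ascii(data: bytes, threshold: float = 0.99) -> bool:
    """Crude success test for a candidate decryption (assumed threshold)."""
    return top_bit_zero_fraction(data) >= threshold

plaintext = b"The quick brown fox jumps over the lazy dog. " * 10
random_bytes = os.urandom(len(plaintext))  # stand-in for a wrong-key guess
```

For compressed payloads the bias is far weaker, so something finer-grained, e.g. a chi-squared test against known container-format structure, would be needed in place of this single-bit check.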

Without endorsement, here are three papers on the topic. I have read the Grassl one, but I don’t fully understand how the numbers get to be so ginormous. It has been on my list of things to study more deeply for quite a while. The other two I have not read. I don’t know what any of them do about a test for success of the break.


