# [Cryptography] quantum computers & crypto

Jerry Leichter leichter at lrw.com
Sun Nov 7 12:32:01 EST 2021

>> The fix for this is of course well known:  Padding.  If you know the entropy of the information carried by the message lengths, you could even compute the minimum amount of equivocation the padding has to introduce to deny an attacker any advantage.
> It depends what your goal is. If the goal is to defeat traffic analysis, padding might help -- with caveats stated previously. If the goal is to make life harder for a potential crypto attacker trying to quickly check whether a particular decryption attempt revealed some unknown plaintext, then padding does not help. The attacker will use the presence of padding as a cue that, yes, it worked. That's very easy to check if the application used the most popular form of padding, a tail of zeroes.
Padding should be random - both in length and content - for exactly this reason.  (It might seem that padding should be produced from a source whose statistics are matched, as closely as possible, to the actual data being transferred, but in the context we're considering, that's wrong:  An attacker trying to confirm the right guessed key against a padding block will see data confirming his guess, while if the padding is random, it will be *dis*confirmed, leading him in the wrong direction.)
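A minimal sketch of that idea in Python (hypothetical helper names, not from the post; note that the trailing length byte is itself a small piece of structure, so a real design would authenticate the ciphertext before unpadding):

```python
import os

def pad_random(msg: bytes, max_pad: int = 255) -> bytes:
    # Random *length*: 1..max_pad total bytes of padding.
    n = 1 + os.urandom(1)[0] % max_pad
    # Random *content*: os.urandom fill, so a trial decryption under a
    # guessed key sees nothing that looks like structured plaintext.
    return msg + os.urandom(n - 1) + bytes([n])

def unpad_random(padded: bytes) -> bytes:
    # The final byte records the total padding length.
    return padded[:-padded[-1]]
```

The point of the random fill is exactly the disconfirmation effect described above: an attacker checking a guessed key against the padding region sees random bytes either way.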

Coming back to compression:  Compression raises the entropy *per bit* of the plaintext.  An ideal compression algorithm produces a string of bits indistinguishable from random to anyone who doesn't know the compression algorithm.  Of course, practical compression algorithms emit some non-random data just to keep the algorithm going - data that in some cases (headers are a typical example) provides a nice bit of known plaintext for an attacker to work on.
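zlib makes the header point concrete: its two-byte stream header depends only on the compressor's settings, not on the data, so under encryption it is a fixed chunk of known plaintext. A small illustration (not from the post):

```python
import os
import zlib

# Compress two completely unrelated inputs with the same settings.
c1 = zlib.compress(b"The quick brown fox jumps over the lazy dog.")
c2 = zlib.compress(os.urandom(64))

# The CMF/FLG header bytes are identical regardless of input:
# known plaintext at a known position for an attacker.
print(c1[:2] == c2[:2])
```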

But ... compression algorithms move some of the correlation that was present in the original data into the *length* of the compressed data.  And this is where things get interesting.  It took us 20 or more years to realize we needed to formalize what it means for a cryptosystem to leak no semantic information *about a collection of bits,* and to build systems to that level of security.  Given such an encryption algorithm, used for data at rest, compression is fine.  But in a more general setting - which data in motion often is - things are not so simple.  Compressing *might* increase the data leaked via lengths.
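The CRIME attack on TLS compression made this length leak concrete: if an attacker can mix a guess into the same compression context as a secret, the compressed (and hence ciphertext) length reveals whether the guess matched. A toy sketch of the effect, with zlib standing in for the record compressor and all names hypothetical:

```python
import zlib

# A secret the attacker wants to confirm (hypothetical value).
SECRET = b"secret=7f3a9c4e8b2d1f6a0c5e9b3d7a1f4c8e"

def record_length(attacker_guess: bytes) -> int:
    # Encryption hides the bytes of the compressed record,
    # but not its length.
    return len(zlib.compress(attacker_guess + b"\n" + SECRET))

# A correct guess is deduplicated against the secret by the
# compressor, so the record comes out measurably shorter.
right = record_length(SECRET)
wrong = record_length(b"secret=1b9e7d3c5a8f2e6b4d0c7a9f3e1b5d8c")
```

Here `right < wrong`: the correct guess compresses away almost entirely, while the wrong one must be emitted as fresh literals.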

-- Jerry
