# [Cryptography] Entropy is forever ...

Jerry Leichter leichter at lrw.com
Sun Apr 19 07:27:14 EDT 2015

On Apr 19, 2015, at 3:18 AM, John Denker <jsd at av8n.com> wrote:
> ...  The entropy is a property of
> the ensemble, not a property of the string.  In fact it is
> an ensemble average, the expectation value of the surprisal.
> It has to be based on measurements, not on philosophy, not
> on arbitrary hypotheses.
And the problem is that we are almost never in a position to measure that expectation value when designing a cryptographic system.  So we end up making assumptions.  In fact, we almost always end up making the same assumption:  the plaintext is drawn from a uniform distribution.  There are other areas of engineering where we make different assumptions about input distributions.  (There's no point even trying to compress a data stream under the assumption that it's drawn from a uniform distribution, for example.)  But crypto discussions usually start either there or at the opposite extreme: that the distribution is actually just a single point.
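The parenthetical about compression is easy to check empirically.  A quick sketch in Python (the 64 KiB size is an arbitrary choice for illustration): feed a general-purpose compressor input drawn from a (near-)uniform distribution, and all it can do is add framing overhead, because there is no redundancy to exploit.

```python
import os
import zlib

# 64 KiB drawn (approximately) from a uniform distribution.
data = os.urandom(65536)

# Even at maximum effort, deflate can only fall back to "stored"
# blocks for incompressible input, so the output is slightly
# *longer* than the input (block headers plus zlib framing).
compressed = zlib.compress(data, level=9)

print(len(data), len(compressed))
```

Run it a few times: the compressed length never dips below the original.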

Interestingly, the uniform distribution case is the easiest one for cryptography - which is exactly why making the opposite assumption is good.  This is where work on semantic security comes from, dealing with questions like:  If, in fact, I never send anything but the encryption of the word "yes", how can I prevent an opponent from learning that from the encrypted stream?  Or even that I'm always sending the encryption of the *same* word?
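The standard answer to that question is to randomize the encryption, so that encrypting the same word twice produces independent-looking ciphertexts.  Here is a toy sketch in Python of the idea only - the HMAC-counter keystream and 16-byte nonce are illustrative choices of mine, not a vetted cipher:

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy keystream: run HMAC-SHA256 in counter mode over the nonce.
    out = b""
    counter = 0
    while len(out) < n:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, msg: bytes) -> bytes:
    # Fresh per-message randomness is what makes two encryptions
    # of the same plaintext look unrelated.
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(msg))
    return nonce + bytes(m ^ k for m, k in zip(msg, ks))

def decrypt(key: bytes, ct: bytes) -> bytes:
    nonce, body = ct[:16], ct[16:]
    ks = keystream(key, nonce, len(body))
    return bytes(c ^ k for c, k in zip(body, ks))

key = secrets.token_bytes(32)
c1 = encrypt(key, b"yes")
c2 = encrypt(key, b"yes")
# Same key, same plaintext, yet c1 != c2.
```

Note that randomization only hides *which* encryptions carry the same word; hiding "yes" versus "no" entirely would additionally require padding every message to a fixed length.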

...and then we come full circle to the recent discussion of whether the result of an encryption must be random, or "indistinguishable from random", relative to whatever model of "indistinguishability" you want to use, and what you want to apply it to.  For example, a stream cipher reveals the byte length of the cleartext in the byte length of the ciphertext.  If the set of possible inputs is "yes" or "no", a stream cipher based on a true random source - the canonical (indeed, the only) example of a provably random encryption - reveals everything about the input in the output, even though every bit in the output is completely random.
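That last point is worth seeing concretely.  A minimal sketch, assuming Python and using a one-time pad as the "provably random" stream cipher: every ciphertext bit is uniformly random, yet the length alone identifies the message.

```python
import secrets

def otp_encrypt(msg: bytes) -> bytes:
    # One-time pad: XOR with truly fresh random bytes.
    # Every bit of the result is uniformly random.
    pad = secrets.token_bytes(len(msg))
    return bytes(m ^ p for m, p in zip(msg, pad))

# If the only possible messages are b"yes" and b"no",
# an eavesdropper needs nothing but len(ct):
ct = otp_encrypt(b"yes")
guess = b"yes" if len(ct) == 3 else b"no"
```

Here `guess` is always right, with no key material recovered at all.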
-- Jerry
