# [Cryptography] Entropy is forever ...

John Denker jsd at av8n.com
Fri Apr 17 14:59:26 EDT 2015

On 04/17/2015 10:26 AM, Thierry Moreau wrote:

> Is there a lower entropy in "Smith-725" than in "gg1jXWXh" as a
> password string? This question makes no sense as the entropy
> assessment applies to the message source.

Agreed, the question makes no sense.

> To summarize, the entropy assessment is a characterization of the
> data source being used as a secret true random source.

Agreed.  It is a fool's errand to assess the randomness
of a "random number".  For starters, there is no such
thing as a "random number".
-- If it's a number, it's not random.
-- If it's random, it's not a number.
-- You can have a random /distribution/ over numbers,
but then the randomness is in the distribution, not
in any particular number that may have been drawn
from such a distribution.

This is a well-known issue.
http://www.nsftools.com/misc/DilbertRandom.gif
http://imgs.xkcd.com/comics/random_number.png

Anybody with any sense would assess the /mechanism/
whereby the allegedly random distribution is generated.
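To make the point concrete, here is a toy Python sketch (the seed value and the character set are arbitrary illustrations, not anything from the thread): an eight-character string that "looks random" but whose generating mechanism has essentially zero entropy, because the seed is fixed and public. Assessing the sample tells you nothing; assessing the mechanism tells you everything.

```python
import random
import string

# A sample alone says nothing about its source: this 8-character string
# "looks random" but comes from a fully deterministic mechanism -- a
# PRNG with a hard-coded, public seed.  (Illustrative; seed is arbitrary.)
rng = random.Random(12345)  # zero entropy: the seed is known
sample = ''.join(rng.choice(string.ascii_letters + string.digits)
                 for _ in range(8))
print(sample)  # the same "random-looking" string on every run
```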

> My answer is that each of the 4 RSA key pairs is independently
> backed by 2000 bits of entropy assurance.

No, because they are not /independent/.  (Sometimes
it may be computationally infeasible to exploit the
dependence, but that's a separate question.)

Obviously, entropy is computationally indistinguishable
from entropy ... but the converse does not hold:  Just
because something is computationally indistinguishable
from entropy doesn't mean it /is/ entropic.
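A toy sketch of that distinction (the 16-bit seed and the hash-a-counter construction are my illustrative assumptions, not anything from the thread): the output below is 2000+ bits long and looks random, yet the ensemble it is drawn from has at most 16 bits of entropy, since everything is determined by the seed.

```python
import hashlib

# Expand a 16-bit seed into 2000+ bits by hashing a counter (a toy PRG).
# The output stream "looks random", but the ensemble has at most 16 bits
# of entropy -- nowhere near the 2000 bits its length might suggest.
seed = (42).to_bytes(2, 'big')   # at most 16 bits of entropy
stream = b''
counter = 0
while len(stream) * 8 < 2000:
    stream += hashlib.sha256(seed + counter.to_bytes(4, 'big')).digest()
    counter += 1
print(len(stream) * 8)  # 2048 output bits, backed by <= 16 bits of entropy
```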

Suppose I prepare a one-time pad consisting of a sample
of 2000 random bits.  I print it once on red paper, and
print the same thing on blue paper.  I give you one of
them.  That gives you 2000 bits of information you wouldn't
otherwise have.  Then if I give you the other one, that
gives you zero additional information.  The amusing thing
is that it doesn't matter which one I give you first,
blue or red ... the first one is informative and the
second one is not.  Either one separately has 2000 bits
of information ... but the information is not additive.
In the language of chemistry: entropy is not an extensive
quantity.
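The red/blue arithmetic can be checked directly by computing Shannon entropies over the ensembles, shrunk here from 2000 bits to 8 so the distributions are small enough to enumerate:

```python
from math import log2

n = 8  # shrink the 2000-bit pad to 8 bits so we can enumerate the ensemble

def H(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Ensemble for one sheet: the pad is uniform over all 2**n bit strings.
red = {x: 1 / 2**n for x in range(2**n)}

# Joint ensemble for (red, blue): the sheets are identical copies, so
# only the diagonal pairs (x, x) carry any probability.
both = {(x, x): 1 / 2**n for x in range(2**n)}

print(H(red), H(both))  # 8.0 8.0 -- the second sheet adds zero bits
```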

This is another manifestation of the principle set forth
above:  Entropy is a property of the ensemble, i.e. a
property of the distribution ... not a property of any
particular sample that may have been drawn from such
a distribution.

> This mathematical formalism is difficult to apply to actual
> arrangements useful for cryptography, notably because the probability
> distribution is not reflected in any message.

That's not true.  That's not even the right way to
frame the issue.

The fact is, the formalism is a tool.
-- Some questions are easy to answer using entropy.
-- For that matter, some questions are easy to answer
without using entropy.
-- Some questions are hard to answer, whether you
use entropy or not.
-- In addition, there are provably infinitely many
ways of abusing the tool, to get all sorts of
wrong answers.  You can't make anything foolproof,
because fools are so ingenious.

> The information theory is silent about the secrecy requirement
> essential for cryptographic applications.

Also not true.  Information theory can be used to
provide some very strong secrecy guarantees.  This
is often sufficient but not necessary, insofar as
there may be more convenient ways of meeting the
secrecy requirements ... but information theory is
by no means silent on the issue.
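The classic example of such a guarantee is the perfect secrecy of the one-time pad: with a uniform key, the mutual information between message and ciphertext is exactly zero. A small Python check (3-bit messages and a uniform message prior are simplifying assumptions):

```python
from math import log2
from collections import Counter
from itertools import product

n = 3  # 3-bit messages and keys, small enough to enumerate
msgs = range(2**n)
keys = range(2**n)

# Joint distribution of (message, ciphertext) with uniform message
# and uniform, independent key; encryption is XOR with the key.
joint = Counter()
for m, k in product(msgs, keys):
    joint[(m, m ^ k)] += 1 / (2**n * 2**n)

def H(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of message and ciphertext.
pm, pc = Counter(), Counter()
for (m, c), p in joint.items():
    pm[m] += p
    pc[c] += p

# I(M; C) = H(M) + H(C) - H(M, C)
mutual_info = H(pm) + H(pc) - H(joint)
print(round(mutual_info, 12))  # 0.0 -- the ciphertext reveals nothing
```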

> Maybe the term entropy is used, more or less by consensus, with a
> definition departing from the information theory.

Any tool can be abused.  There is however a community
of people who know what they are doing, who use the
terminology and (!) the underlying concepts correctly.

> puts into question the Linux "entropy pool" depletion on usage

Linux /dev/random has lots of problems, but underestimating
the entropy -- by overestimating the effect of depletion --
is not one of them.

> Entropy is forever ... until a data leak occurs.

A leak changes the ensemble.  Entropy is a property of
the ensemble.
https://www.av8n.com/physics/thermo/entropy.html
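A toy Python illustration of how a leak changes the ensemble (the 8-bit secret and the parity-bit leak are made-up sizes for illustration): conditioning on the leaked bit halves the ensemble, so one bit of entropy is gone.

```python
from math import log2
from collections import Counter

# An 8-bit secret, uniform over 256 values: 8 bits of entropy.
# Suppose its parity bit leaks.  The ensemble conditioned on the
# leak is uniform over 128 values: 7 bits remain.
def H(dist):
    """Shannon entropy in bits of a Counter of equally weighted outcomes."""
    total = sum(dist.values())
    return -sum((c / total) * log2(c / total) for c in dist.values() if c)

secrets = range(256)
before = Counter(secrets)                            # the full ensemble
leak = lambda x: bin(x).count('1') % 2               # the leaked parity bit
after = Counter(x for x in secrets if leak(x) == 0)  # ensemble given the leak

print(H(before), H(after))  # 8.0 7.0
```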
