# [Cryptography] entropy +- /dev/random

John Denker jsd at av8n.com
Mon Jan 26 14:47:25 EST 2015

```
On 01/26/2015 06:07 AM, ianG wrote:
> This whole entropy accounting idea seems to be an implausible base
> for computing,

1) The following are both true, and both irrelevant:
-- Entropy is not a "base" for computing.
-- Computing is not a "base" for entropy; as von
Neumann famously pointed out, anyone who tries to
produce random digits by deterministic arithmetic
is "in a state of sin".

> as a practical matter, in our field of endeavour, isn't required. The
> notion of cryptographic security is sufficient.

2) Entropy *is* required for practical security, for
the following reason: any PRNG must be initialized.
If you
initialize it from some other PRNG, you have just
reduced it to the problem previously *not* solved.
The Cat in the Hat Comes Back.  To avoid infinite
regress, at some point in the chain you need Little
Cat Z to provide some actual entropy.

The /density/ of entropy in the output of the PRNG
might be exceedingly low, but the entropy must not
be zero.

==============

3) As an almost-separate issue, the fact that
/dev/random doesn't know what it's talking about
in terms of entropy does not change fact (2) above.

It is axiomatic that no matter what you are doing,
you can always do it wrong.  This however does not
mean that it is impossible to do things right.  Any
tool can be abused.  You are not however obliged
to abuse it.

> The notion that removing a bit also removes a bit of entropy appears
> to be a "bound" argument.  You can't "remove a bit of entropy."

Nonsense.

> It's clearly not entropy once you touch it,

"Touching" is not the right concept.  The relevant
operations are properly called /copying/ and /erasing/.

The second law of thermodynamics -- the law about
entropy -- imposes strict limits on the erasure
process.  If you have N copies of a bit, you can
uncompute all but one of them reversibly, but to
erase the last copy you have to get rid of the
entropy somehow, and that's going to cost you.

These are serious limitations, imposed by the laws
of physics, whether you like it or not, whether you
understand it or not.  If you build a mathematical
theory of computing that does not include these
phenomena, it does not correctly model the real
world.

If you copy a bit, it could be argued that one or
the other of them has entropy, but not both.  This
is sometimes (albeit not always) a useful way of
thinking about it.  This can be quantified in terms
of information gain and divergence.

There is an extensive literature on this subject,
featuring contributions from von Neumann, Shannon,
Kullback, Leibler, Landauer, Bennett, and others.
```
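The infinite-regress argument about seeding can be sketched concretely. A minimal Python illustration, using only the standard library (`os.urandom` draws from the kernel's entropy pool, e.g. /dev/urandom on Linux); the seed values here are made up for illustration:

```python
import os
import random

# Wrong: seeding a PRNG from another PRNG just pushes the
# problem back one step -- it is deterministic all the way down.
weak = random.Random(12345)             # fixed, guessable seed
derived = random.Random(weak.random())  # no new entropy added

# Right: at the bottom of the chain, "Little Cat Z" must supply
# actual entropy from outside the deterministic computation.
seed = int.from_bytes(os.urandom(32), "big")  # kernel entropy pool
rng = random.Random(seed)

print(rng.random())  # unpredictable across runs, given a real seed
```

For actual cryptographic use one would draw keys directly from `os.urandom` or the `secrets` module rather than feeding a non-cryptographic PRNG, but the structure of the regress is the same.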
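The point about entropy density being exceedingly low but nonzero is simple arithmetic. A sketch with illustrative numbers (256 bits of seed entropy, 8 GiB of output; neither figure is from the original message):

```python
seed_entropy_bits = 256               # entropy that actually went in
output_bits = 8 * 1024**3 * 8         # 8 GiB of output, expressed in bits

# The stream's entropy density: tiny, but strictly greater than zero.
density = seed_entropy_bits / output_bits
print(f"entropy density: {density:.2e} bits per output bit")
```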
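The thermodynamic cost of erasure can be made quantitative. Landauer's bound says erasing one bit dissipates at least kT ln 2 of energy; a small check at room temperature (the constants are standard physics, not from the original message):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # room temperature, kelvin

# Landauer's principle: erasing one bit costs at least kT * ln(2).
E_min = k_B * T * math.log(2)
print(f"minimum erasure cost per bit: {E_min:.3e} J")  # ~2.87e-21 J
```

The number is minuscule per bit, which is why the limit is invisible in everyday engineering; the point is that it is strictly nonzero, so a theory of computing that ignores erasure costs does not model the real world.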
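The "information gain and divergence" mentioned at the end can be computed directly: the Kullback-Leibler divergence D(P||Q) measures, in bits, how much is learned when a belief Q is updated to P. A self-contained sketch (the distributions are made up for illustration):

```python
import math

def kl_divergence_bits(p, q):
    """Kullback-Leibler divergence D(P || Q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Prior belief about a bit: a fair coin.  After inspecting a copy,
# belief sharpens toward one outcome; the KL divergence quantifies
# the information gained by that update.
prior = [0.5, 0.5]
posterior = [0.99, 0.01]

gain = kl_divergence_bits(posterior, prior)
print(f"information gain: {gain:.3f} bits")  # ~0.919 bits
```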