[Cryptography] squish*infinity = squish

John Denker jsd at av8n.com
Tue Dec 2 11:15:24 EST 2014

On 12/01/2014 09:59 PM, lists at notatla.org.uk wrote:

> I suspect the joke has been missed.

There's nothing like a good joke.

> https://www.schneier.com/blog/archives/2008/05/random_number_b.html
> http://www.links.org/?p=327

That is nothing like a good joke.


We are talking about some code that uses uninitialized memory
to seed the OpenSSL PRNG.  This is a problem, for reasons having
nothing to do with valgrind and/or purify.  OpenSSL could take 
that code out or leave that code in, and the real problem would
remain: The PRNG is not being properly seeded.

This can be understood as follows:  Every word contains three
types of data:
  -- determinism:  guaranteed predictable
  -- entropy:      guaranteed unpredictable
  -- squish:       not guaranteed predictable *or* unpredictable

If any given word contains zero real entropy, a million such
words still contain zero entropy.
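To make "zero entropy" quantitative: the right measure for
seeding is min-entropy, i.e. -log2 of the adversary's best-guess
probability.  A sketch (the distributions below are illustrative
models of the adversary's view, not measurements of any real
memory):

```python
import math

def min_entropy_bits(dist):
    """Min-entropy of {outcome: probability}: -log2(max probability).
    This is the adversary's-eye measure: bits of guaranteed surprise."""
    return -math.log2(max(dist.values()))

# Worst case for squish: the adversary knows (or arranged) the
# contents, so the distribution is a point mass -- zero bits.
one_word = {0xDEADBEEF: 1.0}

# A million such words: the joint distribution is still a point
# mass, so the whole pool is still worth zero bits.
million_words = {(0xDEADBEEF,) * 1_000_000: 1.0}

# A fair coin, for contrast, is worth exactly one bit.
coin = {0: 0.5, 1: 0.5}
```

Min-entropy of a concatenation can never exceed the sum over the
parts, and zero plus zero plus zero is still zero.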

If you think uninitialized memory contains some amount of real 
entropy, give us a nonzero lower bound on the amount.  There
are reliable ways of providing entropy to a newly instantiated
PRNG, and uninitialized memory is not one of them.
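For contrast, one reliable way to seed a newly instantiated PRNG
on a modern system is to ask the kernel's CSPRNG, which is itself
seeded from hardware events.  A minimal sketch in Python,
standing in for whatever language your PRNG lives in:

```python
import os

# os.urandom() reads the kernel CSPRNG (getrandom(2) on modern
# Linux, /dev/urandom or the platform equivalent elsewhere).
# Unlike squish, it comes with a defensible nonzero lower bound
# on the entropy behind it.
seed = os.urandom(32)          # 256 bits of seed material

assert len(seed) == 32
assert os.urandom(32) != seed  # two draws agreeing is ~2^-256
```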

I get tired of saying this.  Any self-respecting cryptographer
should know this already.  It has been in the literature since
1967 if not longer:  Knuth gives an example where combining a
bunch of crappy RNGs does *not* result in a good RNG.
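Knuth's cautionary tale can be echoed with a toy of my own (the
generators and constants below are illustrative, not Knuth's):
three "different" weak generators, all ultimately seeded from one
16-bit squishy value, and an attack that simply enumerates that
value:

```python
M = 2**32

def lcg(seed, a, c):
    """A weak linear congruential generator."""
    x = seed
    while True:
        x = (a * x + c) % M
        yield x

def combined(seed):
    """XOR of three LCGs, all derived from one 16-bit seed.
    Looks busy; carries at most 16 bits of entropy."""
    g1 = lcg(seed,          1664525,  1013904223)
    g2 = lcg(seed ^ 0xBEEF, 22695477, 1)
    g3 = lcg(seed + 7,      69069,    12345)
    while True:
        yield next(g1) ^ next(g2) ^ next(g3)

def attack(outputs):
    """Recover the seed by brute force over the 16-bit seed space."""
    for guess in range(2**16):
        g = combined(guess)
        if [next(g) for _ in outputs] == outputs:
            return guess
    return None

victim = combined(0x1234)
observed = [next(victim) for _ in range(4)]
assert attack(observed) == 0x1234
```

Three generators, arbitrarily many output words, and the pool
still carries at most 16 bits: exactly the entropy of the squish
it started from.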

I don't know how to say it any more clearly:  Just because
you can't guarantee it's predictable doesn't mean it is
guaranteed unpredictable.

Note that the OpenSSL code in question is still there;  search
the source for "really bad randomness".

True story:  Once I had the proverbial pointy-haired boss
tell me to stop fussing with the RNG and just use the program
counter at the last interrupt, because that was obviously not
predictable.  Now it turns out that this was an event-driven
system, so the PC at interrupt was almost always a constant,
i.e. the address of the null job.
 -- Chance of PHB figuring this out:            0%
 -- Chance of adversaries figuring this out:  100%

They say a word to the wise suffices.  Evidently that is not
applicable here, so I'll say it a few more times:
Combining a large number of guaranteed sources will improve
the /rate/ of entropy delivery.  Combining a large number of 
probably-good sources may improve the reliability, provided
the probabilities are guaranteed uncorrelated ... which they
usually aren't.  One guaranteed source is worth more than an 
arbitrary number of bogus sources.  Squish is not an acceptable 
seed for the PRNG, and feeding in more and more squish will not 
solve the problem.
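The correlation trap in miniature (a toy model, not any real
noise source): two "sources" that each wiggle convincingly, but
are both functions of the same hidden counter, so combining them
yields a constant:

```python
# Two "independent" sources, both derived from one hidden counter t.
def source_a(t):
    return t & 0xFFFF              # low 16 bits of a shared counter

def source_b(t):
    return (t ^ 0xFFFF) & 0xFFFF   # the same bits, complemented

# XOR-combining them gives 0xFFFF for every t: each source varies,
# yet the combined pool contains exactly one value -- zero entropy.
pool = {source_a(t) ^ source_b(t) for t in range(100_000)}
print(pool)   # {65535}
```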


Tell me again why we need to have an underhanded crypto
contest?  It seems to me there is enough broken code out
there already;  why do we need to write more?  Why not
take whatever resources were going to be used to judge
the contest, and apply them to reviewing the code that
is already out there?  The top prize goes for catching
the /oldest/ serious bug, since that one was obviously
the hardest to catch.
