[Cryptography] The ultimate random source

John Denker jsd at av8n.com
Fri Feb 21 05:03:24 EST 2014


On Thu, 20 Feb 2014, Bill Frantz wrote:

>>  I poured them into a 10" x 15" baking sheet

That's very much more sensible than packing them into a beaker,
which was the original proposal.

>> Now my statistics is a bit rusty
....
>>  6**600 overflows my HP15 calculator

Even if that were the right number, the only thing that would
matter would be the logarithm thereof.  For independent events,
the entropy would be 600*log2(6), which is about 1551 bits.

This does not overflow even the humblest scientific calculator.
In fact, you can calculate it using a slide rule.
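
Indeed, with nothing fancier than base-10 logs:

  log2(6) = log10(6)/log10(2) ~ 0.7782/0.3010 ~ 2.585
  600 * 2.585 ~ 1551 bits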

On 02/20/2014 11:13 PM, Dave Horsfall wrote:

> Your statistics is spot-on. 

Not really.

> My trusty HP-42S says 7.7758921822E466, which
> is pretty big :-)  With Unix "bc", I get:
> 
> 6^600
> 77758921822273638463652243033738663459096026047499749793055195367088\
> 51240595825119152763630559279561914762582119686987497940622017964689\
> 94358639583677597277966221496081386040428524624036837389354858526340\
> 57709256771299230380098822466080142001159030156662377138169091661053\
> 37802730615914058869840104453151780612057444464745298260875996761919\
> 24470372419816391019296687161848319000324789980278116093440553336474\
> 88533711459486991110405287751122235682242271887405273317376

That's not the right thing to be calculating.  It's off by a
factor of several million (2^22, as we shall see).  Even if it were
the right question, it would be pointless to grind out the 467-digit
answer.

Now log2(6^600) would not be a bad guess, and it serves as an upper 
bound on this component of the entropy, but as a matter of principle 
it's not the right thing.  It assumes that the occupation of each
site is independent, which surely isn't the case.  Look at it this 
way:  Since there are exactly 100 M&Ms of each color, the color of 
the final M&M is completely determined by the state of the previous 
599 M&Ms, so it contributes exactly zero entropy.

A better estimate of the multiplicity is 600! / 100!^6 and upon
taking the base-2 logarithm we find the entropy is approximately
1529 bits ... about 22 bits less than the original guess.
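
For anyone who wants to check the arithmetic, here is a minimal
Python sketch (using lgamma so the 467-digit integer never needs
to be written out):

  import math

  n, k = 600, 6     # 600 M&Ms, 6 colors, 100 of each

  # Upper bound, pretending each site is independent: 600*log2(6)
  upper = n * math.log2(k)

  # log2 of the multiplicity 600! / (100!)^6
  better = (math.lgamma(n + 1)
            - k * math.lgamma(n // k + 1)) / math.log(2)

  print(upper, better, upper - better)   # ~1551.0, ~1529.0, ~22.0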

So this particularly obvious type of correlation makes only a 1.4% 
correction to the entropy.  That seems small in relative terms, but
on the other hand, if you sold this thing as a stream cipher and it
put out 1529 random bits followed by 22 completely predictable bits,
your customers would have you tarred and feathered.

  On the third hand, if all you needed was 150 bits for seeding a
  CSPRNG, then 1529 is plenty, even if there are a few nonidealities.

On the fourth hand, there will be other types of correlation as well, 
depending on as-yet unspecified details of the mixing process.  You
need to get a handle on this, to demonstrate that the fundamental
front-end physics really is giving you some entropy.
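
One crude way to get started, assuming you can read the colors out
as a sequence:  compare the raw color frequencies against the
statistics of adjacent pairs.  A minimal Python sketch (plug-in
estimates, biased for short sequences, and blind to anything beyond
nearest-neighbor effects ... a sanity check, not a certified lower
bound):

  import math
  from collections import Counter

  def plugin_entropy(counts):
      """Plug-in (empirical) entropy in bits, from a Counter."""
      total = sum(counts.values())
      return -sum((c / total) * math.log2(c / total)
                  for c in counts.values())

  def neighbor_check(seq):
      """Estimate (H(X), H(next|X)) from one color sequence."""
      h1 = plugin_entropy(Counter(seq))
      h2 = plugin_entropy(Counter(zip(seq, seq[1:])))  # adjacent pairs
      return h1, h2 - h1   # chain rule: H(Y|X) = H(X,Y) - H(X)

For a well-mixed tray, both numbers should sit near log2(6) ~ 2.585;
a conditional entropy well below that flags nearest-neighbor clumping
left over from the mixing process.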

To summarize the main points:
  a) Cryptography requires attention to detail.
  b) There are a lot of things that can make the entropy less than 
   you guessed it was.
  c) The security of an RNG depends on having a reliable lower bound 
   on the entropy.  Guesses, estimates, and upper bounds do not get
   the job done.
  d) Entropy is super-important, but it does not tell you everything
   you need to know.  It does not by itself solve all the world's
   problems.  Much depends on how the entropy is used.


