Extracting uniform randomness from noisy source

John Kelsey kelsey.j at ix.netcom.com
Mon Aug 12 13:54:53 EDT 2002


At 10:56 AM 8/12/02 +0100, Paul Crowley wrote:
...
>Here's the game.  Our attacker selects an algorithm MUNGE which takes
>an unbounded stream of random bits as input and generates random
>strings as output.  We then select a key K and reveal it to the
>attacker.  We take a secret unbounded stream, feed it to MUNGE, and
>hash the output, using the key K.  The attacker then makes as many
>guesses about the output as they want until they get it exactly right.

Hmmm.  It looks to me like there are two big assumptions in this model that
may get in the way of using it for real-world applications:  

a.  You have assumed your entropy input is in the form of random bit
strings.  (By "unbounded," do you mean of arbitrary length, or of infinite
length?).  In real-world systems, the input strings are not generally going
to be random in this sense.  (Did you mean something else by "random" here?)

b.  You have assumed the attacker never sees your output, but instead only
gets to try to guess it and see if he's right.  In real-world systems,
you're likely to see some outputs directly.  (In the world of
computationally secure algorithms, hashing your output before using it puts
the attacker in just the position of your model--he can verify correct
guesses, but nothing else.  In the world of computationally-unbounded
attackers, you can't necessarily depend on this property from your hash
function anymore, though you can do simple things to get it.)  

>The hash function is (H,t,p)-secure for entropy collection if for any
>choice of attacker such that the entropy of the output of MUNGE is H
>or greater and the total work of the attacker (including MUNGE) is
>bounded by t, then the probability of a correct guess is p.

Why is the entropy of the output (did you mean the input?) relevant here?  If
the attacker's probability of a correct guess is less than 2^{-128},
why would I care about the precise entropy?  

In fact, reasoning about input entropy when the input strings aren't chosen
uniformly at random can be pretty tricky.  Not only is it possible in
principle to choose a set of input strings with large entropy which just
happens to always give outputs of zero for your function (the reason for
using the key, I guess), it's also possible to have inputs with large
entropy which will guarantee
an unacceptably high p.  The obvious example is an input distribution over
160-bit strings that gives 0 half the time, but the other half of the time
gives a string whose high-order bit is always one, and whose remaining 159
bits are uniformly distributed random bits.  This has entropy of 

	 (1/2)(1) + (2^{159})(2^{-160})(160)  =  0.5 + 80  =  80.5 bits, 

but it guarantees p >= 1/2 for any MUNGE function, independent of key.  
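For what it's worth, a few lines of Python confirm the arithmetic (the variable names are mine): the source emits 0 with probability 1/2, and otherwise emits one of 2^159 strings, each with probability 2^{-160}.

```python
import math

h_zero = 0.5 * math.log2(1 / 0.5)        # the 0 outcome: (1/2) * 1 bit
h_rest = (2 ** 159) * (2 ** -160) * 160  # 2^159 outcomes, each -log2(p) = 160
entropy = h_zero + h_rest
print(entropy)  # 80.5 bits of Shannon entropy

# Yet guessing "the source emitted 0" is right half the time, so any
# extractor built on this source has p >= 1/2, whatever the key.
```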

>  __  Paul Crowley

--John Kelsey, kelsey.j at ix.netcom.com


---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at wasabisystems.com


