building a true RNG

Sampo Syreeni decoy at iki.fi
Sun Jul 28 05:21:56 EDT 2002


On 2002-07-27, David Wagner uttered to Sampo Syreeni:

>Just look at the first n bits of output; they are going to be a
>deterministic function of the first n' bits of input, and we can simply
>let f be the function mapping the first n' bits of input to the first n
>bits of output.

It's quite possible I don't get it in the finite case, but I don't see why
you can do this either. With statistical adaptation the n on the output
side will change based on the input stream, and |Y| cannot be both
constant and finite.

An example: suppose we use a simple first-order statistical model. If our
input is an 8-bit sample value from a noise source, we build a 256-bin
histogram. When we see an input value x, we look up its probability p(x)
in the model, and discard every 1/(p(x)-1/256)'th sample with value x. When
this happens, the sample is just eaten and nothing appears in the output;
otherwise we copy. You *can* model this as a function from sequences to
sequences, but certain input sequences (like repeated zeroes) will make
the algorithm wait arbitrarily long before outputting anything. Hence, no
fixed, finite n' on the input side suffices. (This might "lose entropy" in our
earlier, fuzzy sense, but it's near what I'm thinking. Once we discard the
right samples, higher order models would work just as well.)
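For concreteness, here is a minimal Python sketch of the rejection idea
above. It is my own rendering, not the exact discard rule quoted: I keep
each sample x with probability min(1, (1/256)/p(x)), where p(x) comes from
a running, Laplace-smoothed histogram.

```python
import random

def flatten(samples, bins=256, seed=0):
    """Thin an input stream toward a uniform distribution by rejection.

    A first-order model: a running histogram estimates p(x), and each
    sample x is kept with probability min(1, (1/bins)/p(x)), so
    over-represented values are discarded more often.  This is a sketch
    of the scheme described in the text, not the exact quoted rule.
    """
    rng = random.Random(seed)
    counts = [1] * bins      # Laplace smoothing: every bin starts at 1
    total = bins
    out = []
    for x in samples:
        counts[x] += 1       # update the model with the current sample
        total += 1
        p = counts[x] / total              # empirical probability of x
        accept = min(1.0, (1 / bins) / p)  # thin values with p > 1/bins
        if rng.random() < accept:
            out.append(x)
    return out
```

Feeding it a long run of zeroes illustrates the point above: nearly
everything is discarded, so the output can lag arbitrarily far behind
the input.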
-- 
Sampo Syreeni, aka decoy - mailto:decoy at iki.fi, tel:+358-50-5756111
student/math+cs/helsinki university, http://www.iki.fi/~decoy/front
openpgp: 050985C2/025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2


---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at wasabisystems.com


