passphrases with more than 160 bits of entropy
Perry E. Metzger
perry at piermont.com
Wed Mar 22 17:05:29 EST 2006
Victor Duchovni <Victor.Duchovni at MorganStanley.com> writes:
> Actually calculating the entropy for real-world functions and generators
> may be intractable...
It is, in fact, generally intractable.
1) Kolmogorov-Chaitin entropy is just plain intractable -- finding the
smallest possible Turing machine to generate a sequence is not
computable.
2) Shannon entropy requires precise knowledge of the probability of
   every symbol, and in any real-world situation that, too, is
   impossible.
Usually, the best you can do is produce (bad) bounds, and sometimes
not even that.
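To make the "bounds only" point concrete, here is a minimal sketch (my
illustration, not anything from the thread): a plug-in Shannon estimator
that pretends observed frequencies are the true distribution, and a
compressor standing in for the uncomputable shortest program. Both
function names and the test sequence are assumptions for illustration.

```python
# Illustration only: two entropy "estimates", each just a (bad) bound.
import math
import zlib
from collections import Counter

def plugin_shannon_entropy(data: bytes) -> float:
    """Plug-in (maximum-likelihood) Shannon entropy estimate, bits/symbol.

    Treats observed byte frequencies as the true distribution -- exactly
    the "precise knowledge" we never actually have -- and is blind to
    inter-symbol correlations.
    """
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_upper_bound(data: bytes) -> float:
    """Crude upper bound via zlib, bits/symbol.

    A compressor is a computable stand-in for the (uncomputable)
    shortest program, so it can only ever give an upper bound on
    Kolmogorov-Chaitin complexity.
    """
    return 8 * len(zlib.compress(data, 9)) / len(data)

# A plain counter sequence fools the frequency estimator completely
# (every byte value equally likely), yet compresses almost to nothing.
# Neither number is the sequence's true entropy, which is near zero.
counter_bytes = bytes(i % 256 for i in range(4096))
print(plugin_shannon_entropy(counter_bytes))   # 8.0 bits/symbol: fooled
print(compression_upper_bound(counter_bytes))  # far below 8
```

The gap between the two numbers on a trivially structured input is the
whole problem in miniature: any practical estimator measures a model of
the source, not the source itself.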
One thing that can be done, of course, is that you can prove, under
certain assumptions, that it would take an intractable amount of
computation to distinguish a particular PRNG from a true random
sequence with probability non-negligibly better than 50% (a coin flip
gets you 50% for free). However, this is very much
NOT the same as showing that the PRNG sequence contains an endless
stream of entropy -- in fact, the unicity distance very clearly shows
you how much "real" entropy you have, and it usually isn't
much. Merely because "too much" computation would be needed does not
mean that you've created entropy -- you've just made it hard for the
opponent to get at your PRNG seed.
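The unicity-distance point can be sketched numerically (my numbers, not
the post's; the English-redundancy figure is the standard textbook
estimate, and the function name is mine):

```python
# Hedged worked example of Shannon's unicity distance U = H(K) / D,
# where H(K) is key/seed entropy in bits and D is plaintext redundancy
# in bits per character.
import math

def unicity_distance(key_entropy_bits: float,
                     redundancy_bits_per_char: float) -> float:
    """Characters of ciphertext after which the key is, in principle,
    uniquely determined -- computation permitting."""
    return key_entropy_bits / redundancy_bits_per_char

# Assumed: English carries roughly 1.5 bits/char of real entropy out of
# log2(26) ~ 4.7, leaving ~3.2 bits/char of redundancy.
D_ENGLISH = math.log2(26) - 1.5

# A 160-bit seed: after about 50 characters of redundant plaintext the
# output carries no further "real" entropy -- only the computational
# cost of recovering the seed protects you.
print(round(unicity_distance(160, D_ENGLISH)))
```

In other words, however long the output stream, the "real" entropy is
capped by the seed, and the unicity distance tells you how quickly an
unbounded opponent could pin it down.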
"Anyone who considers arithmetical methods of producing random digits
is, of course, in a state of sin." -- John von Neumann
Perry
---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com