Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

Aram Perez aramperez at mac.com
Wed Mar 22 18:29:07 EST 2006

On Mar 22, 2006, at 2:05 PM, Perry E. Metzger wrote:

> Victor Duchovni <Victor.Duchovni at MorganStanley.com> writes:
>> Actually calculating the entropy for real-world functions and  
>> generators
>> may be intractable...
> It is, in fact, generally intractable.
> 1) Kolmogorov-Chaitin entropy is just plain intractable -- finding the
>    smallest possible Turing machine to generate a sequence is not
>    computable.
> 2) Shannon entropy requires a precise knowledge of the probability of
>    all symbols, and in any real world situation that, too, is
>    impossible.

I'm neither a cryptographer nor a mathematician, so I stand duly
corrected/chastised ;-)

So, if you folks care to educate me, I have several questions related
to entropy and information security (apologies to any physicists):

* How do you measure entropy? I was under the (false) impression that
Shannon gave a formula that measures the entropy of a message (or
information stream).
* Can you measure the entropy of a random oracle? Or is that what both
Victor and Perry are saying is intractable?
* Are there "units of entropy"?
* What is the relationship between randomness and entropy?
* (Apologies to the original poster) When he requested "passphrases
with more than 160 bits of entropy", what exactly was he requesting?
* Does processing an 8-character password with a process similar to
PKCS#5 increase the entropy of the password?
* Can you add or increase entropy?
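For the first question, the formula I had in mind is Shannon's
H = -sum_i p_i * log2(p_i), which gives entropy in bits when the log is
taken base 2. A minimal sketch (estimating the symbol probabilities from
observed frequencies, which runs straight into the "precise knowledge of
the probability of all symbols" problem Perry describes):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Estimate Shannon entropy in bits per symbol from observed
    symbol frequencies. This is only an estimate: for a real-world
    source the true symbol probabilities are unknown."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Four equally likely symbols -> 2 bits per symbol.
print(shannon_entropy("abcd" * 8))  # 2.0
```

Note that this measures the entropy of a *source model* inferred from
one sample, not of the message itself; a fixed string has no inherent
Shannon entropy without a probability distribution behind it.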

Thanks in advance,
Aram Perez

The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com
