passphrases with more than 160 bits of entropy

Aram Perez aramperez at mac.com
Wed Mar 22 12:24:04 EST 2006


On Mar 22, 2006, at 9:04 AM, Perry E. Metzger wrote:

>
> Aram Perez <aramperez at mac.com> writes:
>>> Entropy is a highly discussed unit of measure.
>>
>> And very often confused.
>
> Apparently.
>
>> While you do want maximum entropy, maximum entropy is not
>> sufficient. The sequence of consecutive numbers 0-255 has maximum
>> entropy but no randomness (although there is a finite probability
>> that an RNG will produce that sequence).
>
> One person might claim that the sequence of numbers 0 to 255 has 256
> bytes of entropy.

One could claim that, but Shannon would not.

> Another person will note "the sequence of numbers 0-255" completely
> describes that sequence and is only 30 bytes long.

I'm not sure I see how you get 30 bytes.

> Indeed, more
> compact ways yet of describing that sequence probably
> exist. Therefore, we know that the sequence 0-255 does not, in fact,
> have "maximum entropy" in the sense that the entropy of the sequence
> is far lower than 256 bytes and probably far lower than even 30 bytes.
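
Fair enough on the description-length point. To make it concrete (my
own sketch, not Perry's): the text of a tiny program that regenerates
the whole sequence is itself only a couple of dozen bytes, so in the
Kolmogorov-Chaitin sense the sequence's descriptive complexity is
nowhere near 256 bytes. For example, in Python:

    # A complete description of the 256-byte sequence 0, 1, ..., 255
    # can be far shorter than the sequence itself; this one-liner is
    # the whole "program" in the descriptive-complexity sense.
    seq = bytes(range(256))          # 256 bytes of output
    program = "bytes(range(256))"    # the describing text: 17 chars
    assert len(seq) == 256 and len(program) < 30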

Let me rephrase. Create a sequence of 256 consecutive bytes, with the
first byte having the value 0, the second byte the value 1, ..., and
the last byte the value 255. If you measure the entropy (in Shannon's
sense) of that sequence of 256 bytes, you get maximum entropy.
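
By "measure the entropy" I mean the usual frequency-count estimate:
treat the 256 bytes as samples, tally how often each byte value
occurs, and compute H = -sum(p_i * log2(p_i)). A quick sketch of that
calculation in Python (standard library only):

    import math
    from collections import Counter

    def frequency_entropy(data):
        """Empirical Shannon entropy in bits per byte, estimated
        from the observed frequency of each byte value in data."""
        total = len(data)
        counts = Counter(data)
        return -sum((n / total) * math.log2(n / total)
                    for n in counts.values())

    # The sequence under discussion: 0, 1, 2, ..., 255, one byte each.
    seq = bytes(range(256))

    # Every value occurs exactly once, so the empirical distribution
    # is uniform over 256 symbols and the estimate is log2(256) = 8
    # bits per byte, the maximum possible for byte-valued data, even
    # though the sequence itself is completely predictable.
    print(frequency_entropy(seq))   # 8.0

That estimator looks only at symbol frequencies, which is exactly why
it reports the maximum 8 bits per byte here despite the sequence
having no randomness at all; that is the distinction I was trying to
draw.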

> Entropy is indeed often confusing. Perhaps that is because both the
> Shannon and the Kolmogorov-Chaitin definitions do not provide a good
> way of determining the lower bound of the entropy of a datum, and
> indeed no such method can exist.

No argument from me.

Regards,
Aram Perez
