Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

John Denker jsd at av8n.com
Wed Mar 22 22:09:44 EST 2006


Aram Perez wrote:

> * How do you measure entropy? I was under the (false) impression that  
> Shannon gave a formula that measured the entropy of a message (or  
> information stream).

Entropy is defined in terms of probability.  It is a measure of
how much you don't know about the situation.  If by "message"
you mean a particular message that you are looking at, it has
zero entropy.  It is what it is;  no probability required.

If by "stream" you mean a somewhat abstract notion of an ensemble
of messages or symbols that you can only imperfectly predict, then
it has some nonzero entropy.
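
To make that concrete, here is a minimal Python sketch of the Shannon
entropy of an ensemble, H = - sum_i p_i log2(p_i).  The distribution
below is made up purely for illustration:

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over the ensemble."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # An ensemble of four messages, not equally likely (hypothetical numbers):
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits
    # A particular message is an ensemble with one outcome of probability 1:
    print(shannon_entropy([1.0]))                       # zero entropy (prints -0.0)

Note that the formula applies to the ensemble, never to one message
in isolation.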

> * Can you measure the entropy of a random oracle? Or is that what  both 
> Victor and Perry are saying is intractable?

That is a tricky question for a couple of reasons.
  a) It will never be completely answerable because the question hinges
   on the word "random", which means different things to different people.
   Thoughtful experts use the word in multiple inconsistent ways.
  b) It also depends on just what you mean by "measure".  Often it
   is possible to _ascertain_ the entropy of a source ... but direct
   empirical measurements of the output are usually not the best way.

> * Are there "units of entropy"?

Yes.  It is naturally _dimensionless_, but dimensionless quantities
often have nontrivial units.  Commonly used units for entropy
include
  ++ bits
  ++ joules per kelvin.  One J/K equals 1.04×10^23 bits.
For more on this, see
   http://www.av8n.com/physics/thermo-laws.htm#sec-s-units
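
As a sanity check on that conversion factor: one bit is k_B ln(2)
joules per kelvin, where k_B is the Boltzmann constant.  A two-line
Python check:

    import math

    K_B = 1.380649e-23                    # Boltzmann constant, J/K
    print(1.0 / (K_B * math.log(2)))      # ~1.045e+23 bits per J/K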

> * What is the relationship between randomness and entropy?

They are neither exactly the same nor completely unrelated.
A pseudorandom sequence may be "random enough" for many
applications, yet it has asymptotically zero entropy density.
A sequence of _odd_ bytes (low-order bit always set) is obviously
not entirely random, yet it may have considerable entropy density.
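
Both halves of that can be illustrated with a short Python sketch;
the 32-bit seed size is an arbitrary choice for the example:

    import os
    import random

    # Pseudorandom: a megabyte of plausible-looking output, but the whole
    # stream is a function of the seed, so total entropy is at most 32 bits.
    prng = random.Random(os.urandom(4))
    stream = bytes(prng.getrandbits(8) for _ in range(10**6))

    # Odd bytes: the low bit is always 1, so the stream is visibly
    # non-random, yet each byte can still carry up to 7 bits of entropy.
    odd = bytes(b | 1 for b in os.urandom(10**6))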

> * (Apologies to the original poster) When the original poster  requested 
> "passphrases with more than 160 bits of entropy", what was he requesting?

When you apply a mathematical function to an ensemble of inputs, it
is common to find that the ensemble of outputs has less entropy than
the ensemble of inputs. A simple pigeonhole argument suffices to show
that a function whose output is represented in 160 bits cannot possibly
represent more than 160 bits of entropy per output.  So if you want
the ensemble of outputs to have more than 160 bits of entropy, it is
necessary to do something fancier than a single instance of SHA-1.
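
One way to do something fancier, sketched in Python: widen the output
by running the hash in counter mode, in the spirit of the MGF1
construction from PKCS#1.  This is one illustrative construction
among many, and the function name is made up:

    import hashlib

    def wide_digest(data: bytes, out_bits: int = 320) -> bytes:
        """Each 160-bit block is SHA-1 over the input plus a counter.
        The output is wide enough that it _can_ preserve more than
        160 bits of input entropy; a single SHA-1 call cannot."""
        blocks = []
        for i in range((out_bits + 159) // 160):
            blocks.append(hashlib.sha1(data + i.to_bytes(4, "big")).digest())
        return b"".join(blocks)[:out_bits // 8]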

> * Does processing an 8 character password with a process similar to  
> PKCS#5 increase the entropy of the password?

No.  PKCS#5-style key derivation is a deterministic function of the
password and a public salt, so the ensemble of derived keys can have
at most as much entropy as the ensemble of passwords.  Key stretching
slows down guessing; it does not add entropy.
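
A short demonstration of the determinism, using PBKDF2 (the PKCS#5
v2 KDF) from the Python standard library; the password and salt
values are placeholders:

    import hashlib

    pw, salt = b"hunter2", b"fixed-salt"
    key1 = hashlib.pbkdf2_hmac("sha256", pw, salt, 100_000)
    key2 = hashlib.pbkdf2_hmac("sha256", pw, salt, 100_000)
    assert key1 == key2   # same inputs -> same key: the 256-bit output
                          # carries at most the password's entropy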

> * Can you add or increase entropy?

Shuffling a deck of cards increases the entropy of the deck.
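
For scale, a well-shuffled 52-card deck carries log2(52!) bits, which
a one-line Python check puts at about 225.6 bits:

    import math

    print(math.log2(math.factorial(52)))   # ~225.58 bits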

For more on this, see
   http://www.av8n.com/physics/thermo-laws.htm#sec-entropy
