Private Key Generation from Passwords/phrases

John Denker jsd at av8n.com
Thu Jan 18 16:55:48 EST 2007


On 01/18/2007 03:13 PM, David Wagner wrote:
 > In article <45AFB67A.40300 at av8n.com> you write:
 >> The /definition/ of entropy is
 >>
 >>               sum_i  P_i log(1/P_i)             [1]
 >>
 >> where the sum runs over all symbols (i) in the probability
 >> distribution, i.e. over all symbols in the ensemble.
 >>
 >> Equation [1] is the gold standard.  It is always correct.  Any
 >> other expression for entropy is:
 >>  a) equivalent to [1]
 >>  b) a corollary, valid under some less-than-general conditions, or
 >>  c) wrong.
 >
 > I disagree.  In the context of Physics, Shannon entropy may well be
 > the end-all and be-all of entropy measures, but in the context of
 > Cryptography, the situation is a little different.  In Cryptography,
 > there are multiple notions of entropy, and they're each useful in
 > different situations.

There is only one technical definition of entropy, and
only one technical definition of force, and only one
technical definition of power.  In parallel,
there are nontechnical and metaphorical uses of the same
words, as in "military force" and "political power".

We would be better off maintaining just the one technical
definition of entropy, namely S = sum_i P_i log(1/P_i).
If you want to talk about something else, call it something
else ... or at least make it clear that you are using the
term in a nontechnical or metaphorical sense.
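
To make equation [1] concrete, here is a small sketch in Python
(just my own illustration of [1], nothing more):

    from math import log2

    def entropy(dist):
        # Equation [1]: sum over all symbols of P_i * log2(1/P_i), in bits.
        return sum(p * log2(1.0 / p) for p in dist if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))   # loaded coin: about 0.469 bits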

 > For this particular application, I would suspect that Pliam's workfactor
 > or Massey's "guessing entropy" could well be more accurate.  See, e.g.,
 > the following for a short summary and for references where you can learn
 > more:
 >   http://www.cs.berkeley.edu/~daw/my-posts/entropy-measures

Pliam rightly points out the distinction between the "typical"
case and the "average" case.  Entropy is an /average/ property,
as I emphasized in my note this morning.
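
Just to illustrate that these are different quantities, here is a
rough sketch of the "guessing" measure Wagner cites (my reading of
it; see his summary for the precise definitions and references):
the expected number of guesses when the attacker tries symbols in
decreasing order of probability.

    def guessing_work(dist):
        # Expected number of guesses, trying the most probable symbols first.
        ordered = sorted(dist, reverse=True)
        return sum((i + 1) * p for i, p in enumerate(ordered))

    print(guessing_work([0.25, 0.25, 0.25, 0.25]))   # uniform: 2.5 guesses
    print(guessing_work([0.5, 0.25, 0.25]))          # skewed:  1.75 guesses

That is an average too, but a different average from [1].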

 > Shannon entropy is often a reasonable first approximation -- it's
 > usually good enough for practical purposes.  But it's just an approximation,

An approximation to what?
It's not an approximation to the entropy.

 > and in some cases it can be misleading.

As a general rule, /anything/ can be abused.

 > Example: Suppose I choose a random 256-bit AES key according to the
 > following distribution.  With probability 1/2, I use the all-zeros key.
 > Otherwise, I choose a random 256-bit key.

OK.....

 > The Shannon entropy of this
 > distribution is approximately 129 bits.

Yes.
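
For the record, the arithmetic behind that figure, as a sketch
(ignoring the negligible chance that the uniform draw lands on the
all-zeros key):

    from math import log2

    p_zero  = 0.5                 # all-zeros key, probability 1/2
    n_other = 2**256              # otherwise: uniform over 2^256 keys
    p_other = 0.5 / n_other       # each such key, probability 2^-257

    H = p_zero * log2(1 / p_zero) + n_other * p_other * log2(1 / p_other)
    print(H)   # 0.5*1 + 0.5*257 = 129.0 bits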

 > However, it's a lousy way to
 > choose a key, because 50% of the time an adversary can break your
 > crypto immediately.  In other words, just because your crypto key has
 > 129 bits of Shannon entropy doesn't mean that exhaustive keysearch will
 > require at least 2^129 trial decryptions.  This is one (contrived)
 > example where the Shannon entropy can be misleading.

I never said the entropy was equal to the resistance to guessing.  The
crypto FAQ might have said that at one time ... but I am not responsible
for what other people say.  I am only responsible for what I say.

If you want to talk about work factor, call it work factor.
If you want to talk about entropy, call it entropy.
