Entropy Definition (was Re: passphrases with more than 160 bits of entropy)
jsd at av8n.com
Thu Mar 23 19:05:42 EST 2006
>>With some slight fiddling to get the normalization right, 1/2
>>raised to the power of (program length) defines a probability
>>measure. This may not be "the" probability you want, but it
>>is "a" probability, and you can plug it into the entropy definition.
John Kelsey wrote:
> No, this isn't right for the algorithmic information theory meaning at all.
OK, in a moment we will have gone through four plies of no-it-isn't
yes-it-is no-it-isn't yes-it-is. Let's get serious. The axiomatic
definition of a measure is
 -- a mapping from sets to non-negative numbers
 -- countably additive, i.e. additive on the countable union of disjoint sets
And a probability measure has the further property of being
 -- normalized, so that the whole space has measure 1 (in particular, bounded above)
I have checked that -- with due attention to trivial details --
.5 ^ (program length) satisfies this definition. If you wish to
renew the assertion that there is no such probability measure, please
explain which of the axiomatic requirements is not met. Please be specific.
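The "slight fiddling to get the normalization right" is the Kraft inequality: for any prefix-free set of programs, the weights 1/2 raised to the (program length) sum to at most 1, so dividing by the total gives a genuine probability distribution. A minimal sketch -- the codeword set below is a made-up example, not anything from this thread:

```python
# Check the Kraft inequality for a prefix-free codeword set, then
# normalize 1/2**length into a probability distribution.
# The codewords are illustrative, not from the original post.

codewords = ["0", "10", "110", "1110", "1111"]  # prefix-free by construction

def is_prefix_free(codes):
    """True if no codeword is a proper prefix of another."""
    return not any(a != b and b.startswith(a) for a in codes for b in codes)

assert is_prefix_free(codewords)

weights = {c: 2.0 ** -len(c) for c in codewords}
kraft_sum = sum(weights.values())           # Kraft: sum <= 1 for prefix-free codes
probs = {c: w / kraft_sum for c, w in weights.items()}

print(kraft_sum)                # 1.0 for this complete code
print(sum(probs.values()))      # 1.0: a bona fide probability measure
```

For a complete code the Kraft sum is exactly 1 and no renormalization is needed; for an incomplete one, dividing by the sum supplies the missing normalization.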
> For that measure, we can intelligently discuss the entropy of a
> specific random string, without reference to a probability model.
That's like saying we can talk about three-dimensional force, velocity,
and acceleration "without reference" to vectors.
Measure theory is the tried-and-true formalism for dealing with random
strings. It would be spectacularly contrary to ignore the formalism,
and just plain wrong to say the formalism is inapplicable.
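Concretely, once such a measure is in hand, the entropy of the ensemble follows from the usual formula H = sum of p * log2(1/p); what the formalism does not license is an entropy for one string in isolation. A sketch using hypothetical program lengths (my own illustrative numbers, not from the thread):

```python
import math

# Shannon entropy of the distribution induced by (hypothetical) program
# lengths via p = 2**-length, normalized by the Kraft sum.

lengths = [1, 2, 3, 4, 4]                 # a complete prefix-free length profile
weights = [2.0 ** -n for n in lengths]
total = sum(weights)                      # Kraft sum; 1.0 for this profile
probs = [w / total for w in weights]

entropy = sum(p * math.log2(1.0 / p) for p in probs)
print(entropy)   # 1.875 bits for this ensemble
```

The entropy is a functional of the whole distribution, which is exactly the point: it is defined for the measure, not for any single outcome.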
> Indeed, what's the probability distribution of the sequence of bits
> defined by Chaitin's Omega?
Huh? Omega is so obviously a probability that the word "probability" usually
appears in its very definition; see e.g. any standard statement of Chaitin's constant.
I suppose a masochistic nitpicker could demand to see a proof that this word
is justified, but I'm not going to bother, for reasons given above.
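For concreteness: Omega is defined over a prefix-free universal machine as the sum of 2**-|p| over all halting programs p, i.e. the probability that a randomly generated program halts. A toy lower bound, where the set of "halting programs" is invented purely for illustration:

```python
# Toy lower bound on a Chaitin-style Omega: sum 2**-len(p) over an
# assumed (invented) prefix-free set of halting programs. For a real
# universal machine the full sum is uncomputable; any finite
# enumeration of halting programs gives a lower bound in (0, 1).

halting_programs = ["00", "010", "0110", "10"]   # hypothetical, prefix-free

omega_lower_bound = sum(2.0 ** -len(p) for p in halting_programs)
print(omega_lower_bound)   # 0.6875 -- strictly between 0 and 1, like any probability
```

The Kraft inequality guarantees the sum never exceeds 1, and since not every program halts it stays strictly below 1 for a universal machine: Omega sits in (0, 1) exactly as a probability should.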
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com