Entropy Definition (was Re: passphrases with more than 160 bits of entropy)

Hal Finney hal at finney.org
Thu Mar 23 23:06:05 EST 2006


This is getting pretty far afield from cryptography, but it is a topic
I find very interesting, so I can't resist jumping in.

John Denker writes:
> OK, in a moment we will have gone through four plies of no-it-isn't
> yes-it-is no-it-isn't yes-it-is.  Let's get serious.  The axiomatic
> definition of a measure is
>    -- a mapping from sets to numbers
>    -- positive
>    -- additive on the countable union of disjoint sets
>
> And a probability measure has the further property of being
>    -- bounded above
>
> I have checked that -- with due attention to trivial details --
> .5 ^ (program length) satisfies this definition.  If you wish to
> renew the assertion that there is no such probability measure, please
> explain which of the axiomatic requirements is not met.  Please be
> specific.

This is true; in fact, it is sometimes called the universal distribution
or universal measure.  In more detail, it is a distribution over all
finite-length strings.  The measure for a particular string X is defined
as the sum of 1/2^L_i over all programs that output X, where L_i is the
length of each such program.
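
To make this concrete, here is a toy sketch in Python.  A real UTM
enumerates all possible programs; the four-program "machine" below is
invented purely for illustration, as is every name in it:

    # Hypothetical toy machine: a prefix-free set of program bitstrings
    # mapped to the strings they output.  (Prefix-freeness is what keeps
    # the total measure at or below 1.)
    TOY_MACHINE = {
        "0":    "A",
        "10":   "B",
        "1100": "A",   # a second, longer program that also outputs "A"
        "1101": "C",
    }

    def universal_measure(x, machine=TOY_MACHINE):
        """Sum 1/2^L_i over every program of length L_i that outputs x."""
        return sum(2.0 ** -len(prog)
                   for prog, out in machine.items() if out == x)

    print(universal_measure("A"))   # 2^-1 + 2^-4 = 0.5625
    print(universal_measure("B"))   # 2^-2        = 0.25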

Often the algorithmic complexity of a string is defined as the length
of the shortest program to output the string.  The universal measure is
based on the same idea, but takes into consideration that there may be
multiple programs that output the same string.  Each program of length L_i
contributes 1/2^L_i to the string's measure.  If there is only one short
program and all others are much longer, then the probability measure
is essentially 1/2^C where C is the length of this shortest program,
i.e. the algorithmic complexity.
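
A quick numerical check of that dominance (the lengths 10, 25 and 30
are made up for the example):

    # Shortest program of length C = 10, next candidates much longer:
    m = 2.0**-10 + 2.0**-25 + 2.0**-30
    print(m)         # 0.00097659...
    print(2.0**-10)  # 0.0009765625 -- the shortest program dominates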

The universal measure for a string can also be thought of as the
probability that, given a fixed Universal Turing Machine (UTM), a randomly
chosen input program will output that string.
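
Under the toy machine above, that interpretation can be checked by
sampling: feed random bits to the machine until they spell out a
program (well-defined because the program set is prefix-free), and the
empirical output frequencies should approach the measures computed
earlier.  Again, everything here is an invented illustration:

    import random

    TOY_MACHINE = {"0": "A", "10": "B", "1100": "A", "1101": "C"}

    def run_random_program(machine=TOY_MACHINE, max_len=8):
        """Draw random bits until they form a program; return its output."""
        bits = ""
        while len(bits) < max_len:
            bits += random.choice("01")
            if bits in machine:
                return machine[bits]
        return None   # the random bits never formed a valid program

    samples = [run_random_program() for _ in range(100_000)]
    print(samples.count("A") / len(samples))   # should approach 0.5625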

So this is clearly a probability distribution, as John Denker says
(glossing over a technicality about program lengths: for the sum over
programs to stay at or below 1, the set of valid programs must be
prefix-free).  However, to go from this to a notion of entropy is more
questionable.

There are countably infinitely many finite strings, and all of them
have non-zero probability under this distribution.  This means that for
most strings the probability must be very low, asymptotically
approaching zero.  In fact you can show that for most strings of length
n, the measure is approximately 1/2^n; this is equivalent to saying
that most strings are effectively random and have no substantially
shorter program that outputs them.
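
The counting argument behind that claim is simple enough to spell out
numerically (n and k below are arbitrary example values):

    # There are 2**n strings of length n, but fewer than 2**(n-k)
    # programs shorter than n-k bits, so at most a 2**-k fraction of
    # length-n strings can be compressed by k or more bits.
    n, k = 64, 10
    num_strings = 2**n
    num_short_programs = 2**(n - k) - 1   # programs of length < n-k
    print(num_short_programs / num_strings)   # < 2**-10, about 0.000977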

Shannon entropy is defined over a probability distribution.  That is,
given a probability distribution we can find its Shannon entropy by
summing -p_i * log2(p_i) over all members of the distribution.  If we
approximate the universal distribution by 1/2^n for strings of length
n, the 2^n strings of each length n together contribute about
2^n * n * 2^-n = n bits, and summing n over all lengths clearly
diverges.  So there is not really a meaningful Shannon entropy for this
probability distribution, or in other words the entropy is infinite.
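
A few partial sums make the divergence visible (a sketch under the same
rough 1/2^n approximation as above):

    import math

    def partial_entropy(max_len):
        """Sum -p*log2(p) over all strings up to max_len, with p = 2**-n."""
        total = 0.0
        for n in range(1, max_len + 1):
            p = 2.0 ** -n
            total += (2 ** n) * (-p * math.log2(p))  # 2**n strings of length n
        return total

    for N in (10, 100, 1000):
        print(N, partial_entropy(N))   # 55, 5050, 500500 -- grows like N^2/2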

John Kelsey asked:
> > Indeed, what's the probability distribution of the sequence of bits
> > defined by Chaitin's Omega?  

This probability distribution is defined only over finite strings, and
Omega is an infinite bit sequence, so it is not within the universe of
this distribution.  It should also be noted that it is impossible for
an n bit program to output more than n bits of Omega (plus or minus an
additive constant specific to the UTM).  Hence even if we consider
successive finite approximations to Omega of ever-increasing length,
their measures would tend asymptotically to zero.

Hal Finney



