Entropy of other languages

Travis H. travis+ml-cryptography at subspacefield.org
Wed Feb 7 20:19:55 EST 2007


On Wed, Feb 07, 2007 at 05:42:49AM -0800, Sandy Harris wrote:
> He starts from information theory and an assumption that
> there needs to be some constant upper bound on the
> receiver's per-symbol processing time. From there, with
> nothing else, he gets to a proof that the optimal frequency
> distribution of symbols is always some member of a
> parameterized set of curves.

Do you remember how he got from the "upper bound on processing time"
to anything other than a completely uniform distribution of symbols?

Seems to me a flat distribution minimizes the upper bound on
information content per symbol for a given average rate: with a
uniform distribution every symbol carries exactly log2(n) bits, while
any skew makes the rarer symbols carry more than the average!

-- 
Good code works.  Great code can't fail. -><-
<URL:http://www.subspacefield.org/~travis/>
For a good time on my UBE blacklist, email john at subspacefield.org.