[Cryptography] A question on Shannon's entropy

Ray Dillinger bear at sonic.net
Thu Mar 3 11:50:30 EST 2016



On 03/03/2016 12:02 AM, mok-kong shen wrote:

> But in order to measure the randomness of a source, one has to
> take samples. In practice these samples are necessarily strings of
> finite size. Now suppose the samples from one source look quite
> random, while the samples from the other source happen to be
> (approximately) the sorted sequences of the first. In that case I
> surmise that there is still some problem.

If in a sample of n things you find a strictly increasing order,
then the generator from which the sample is drawn is not
producing independent outputs - each output is constrained by the
previous one.  Because Shannon entropy is defined only for
sources whose outputs are independent, this source cannot be
said to have an entropy measure at all.

That is not the same as saying the measure is low, or high, or
that it takes any particular value - the measure simply does not
exist.  It would be like talking about the mass of the color
green or the aroma of an array.
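For illustration, here is a small Python sketch of the scenario in the
quoted message (the name plugin_entropy is mine, not something from the
thread): a sorted sample has exactly the same symbol frequencies as the
original, so a naive frequency-based entropy estimate cannot distinguish
the two, even though the sorted one is obviously dependent.

```python
import math
import random
from collections import Counter

def plugin_entropy(xs):
    # Naive per-symbol ("plug-in") Shannon entropy estimate in bits,
    # computed from symbol frequencies alone.
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

random.seed(1)
sample = [random.randrange(256) for _ in range(10000)]
sorted_sample = sorted(sample)

# Both sequences have identical symbol frequencies, so the
# frequency-based estimate cannot tell them apart...
assert abs(plugin_entropy(sample) - plugin_entropy(sorted_sample)) < 1e-9

# ...even though the sorted sequence is plainly dependent:
# each output constrains all the ones that follow.
assert all(a <= b for a, b in zip(sorted_sample, sorted_sample[1:]))
```

Any independence test (here, just checking for monotone order) has to
look at the sequence structure, which the plug-in estimate throws away.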

				Bear


