[Cryptography] Entropy is forever ...

Bill Cox waywardgeek at gmail.com
Fri Apr 24 18:59:51 EDT 2015


On Tue, Apr 21, 2015 at 9:54 PM, <dj at deadhat.com> wrote:

> I've never seen the term surprisal used in any crypto math paper. I don't
> remember seeing it at all before this email chain, but I have far from
> perfect memory.
>

I use surprisal as described in this document:

    http://www.av8n.com/turbid/paper/turbid.htm

It says:

"Following Solomonoff (reference 33 and reference 34) and Chaitin
(reference 35) we quantify the surprisal of a string of symbols as follows:
    Let z be a string of symbols. The elements of z are denoted z_k and are
drawn from some alphabet Z. The number of symbols in the alphabet is
denoted #Z.
    Let P_C(z) be the probability that computer programs, when run on a
given computer C, will print z and then halt. The surprisal is defined to
be the negative logarithm of this probability, denoted $_C(z) := − log P_C(z).
In this paper we choose to use base-2 logarithms, so that surprisal is
measured in bits. Surprisal is also known as the surprise value or
equivalently the unexpectedness. It is intimately related to entropy, as
discussed below."
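
In other words, if an attacker's best model assigns probability p to a
string, that string carries -log2(p) bits of surprisal.  A minimal sketch
of just that relationship (my own toy Python, not code from the paper):

    import math

    def surprisal_bits(p):
        """Surprisal of an outcome with probability p, in bits."""
        if not 0.0 < p <= 1.0:
            raise ValueError("probability must be in (0, 1]")
        return -math.log2(p)

    # A fair coin flip carries 1 bit; a 1-in-2^128 event carries 128 bits.
    print(surprisal_bits(0.5))          # 1.0
    print(surprisal_bits(2.0 ** -128))  # 128.0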

It's the best paper I've found on making practical use of thermal noise to
build a TRNG.  The specific case of using a 24-bit A/D converter, where we
know the low bits are actually random, seems a bit hackish to me, but the
general concepts it presents about random number generation are excellent.
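
For what it's worth, the low-bits idea as I read it looks roughly like the
sketch below (my own illustration in Python, not code from Turbid; the
fake sample source and the choice of k bits per sample are placeholders,
and a real design has to justify k against the measured analog noise
floor rather than assume it):

    import os

    def low_bits(sample, k):
        """Keep the k least-significant bits of one 24-bit A/D sample."""
        return sample & ((1 << k) - 1)

    def harvest(samples, k):
        """Pack the low k bits of each sample into one integer."""
        out = 0
        for s in samples:
            out = (out << k) | low_bits(s, k)
        return out

    # Stand-in for real A/D samples, just to make the sketch runnable.
    fake_samples = [int.from_bytes(os.urandom(3), "big") for _ in range(64)]
    print(hex(harvest(fake_samples, 2)))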


> All the entropy extractors I have implemented have been based on proofs
> showing the algorithm converts one or more sources of some min-entropy to
> a slower source of data with higher min-entropy. All the CSPRNGs I've
> implemented have been based on proofs that a seed with defined min-entropy
> fed into a given algorithm will yield an output with computational bounds on
> the work required by the adversary to determine the state of the PRNG.
>

Sounds good to me :-)
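
For anyone following along, the pattern dj describes looks roughly like
this sketch (my own illustration, not dj's code; SHA-256 as the
conditioner and HMAC-SHA-256 in counter mode as the toy DRBG are
stand-ins for whatever constructions the actual proofs assume):

    import hashlib, hmac

    def condition(raw_samples: bytes) -> bytes:
        """Condense raw noise-source output with some min-entropy per
        byte into a 256-bit seed (slower, but near full entropy)."""
        return hashlib.sha256(raw_samples).digest()

    def expand(seed: bytes, nbytes: int) -> bytes:
        """Toy CSPRNG: HMAC-SHA-256 over a counter, keyed by the seed.
        A production design should use a vetted DRBG instead."""
        out = bytearray()
        counter = 0
        while len(out) < nbytes:
            out += hmac.new(seed, counter.to_bytes(8, "big"),
                            hashlib.sha256).digest()
            counter += 1
        return bytes(out[:nbytes])

    seed = condition(b"...raw noise-source samples go here...")
    print(expand(seed, 32).hex())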


> I googled the term and it seems to be used in thermodynamics circles. The
> math definition doesn't seem to provide any cause to abandon Rényi entropy
> as the normal metric.
>

I think Turbid is the main source of this term as it's used to describe
true random number generators.  When I talk about how many "bits of
entropy" my "Infinite Noise Multiplier" device outputs, I mean bits of
surprisal, as defined in the Turbid paper.  Surprisal maps directly to an
attacker's probability of guessing the state, which is exactly what you
want for security proofs.
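
Concretely, H bits of surprisal means the attacker's single best guess at
the state succeeds with probability at most 2^-H.  A trivial sketch (my
numbers, not a claim about any particular device):

    def best_guess_probability(surprisal_bits):
        """Upper bound on an attacker's single-guess success probability."""
        return 2.0 ** -surprisal_bits

    print(best_guess_probability(128))  # about 2.9e-39
    print(best_guess_probability(256))  # about 8.6e-78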

Bill