# [Cryptography] Entropy is forever ...

Wed Apr 22 00:54:33 EDT 2015

> On 04/21/15 20:36, John Denker wrote:
>> On 04/21/2015 07:10 AM, Thierry Moreau wrote:
>>
>>> Take a 256 bits PRNG seed and a 256+8=264 bits random message source
>>> with a uniform distribution for the 2^264 messages except that a
>>> specific message occurs 16 times more often. The minimum entropy
>>> measure would thus be 260 (based on the message sample with the
>>> highest probability).
>>
>> The quantity calculated there is called the /surprisal/.
>> It is not properly called the entropy, or the min-entropy,
>> or anything like that.
>>
>
> http://en.wikipedia.org/wiki/Min_entropy
>
> This concept is used in NIST documents for a conservative estimate of
> randomness from a hardware random source.

It's also used widely in cryptographic mathematics, because it has
properties that make it convenient for making proofs.
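The 260-bit figure in the quoted example follows directly from the definition H_min = -log2(p_max). A minimal sketch, assuming the one special message is weighted 16:1 against the other 2^264 - 1 equiprobable messages:

```python
import math

# Quoted example: 2**264 messages, uniform except that one specific
# message occurs 16 times more often than each of the others.
n = 2**264
total_weight = (n - 1) + 16          # weight 1 for each ordinary message, 16 for the special one
p_max = 16 / total_weight            # probability of the most likely message

h_min = -math.log2(p_max)            # min-entropy in bits
# log2(2**264 + 15) - log2(16) is 264 - 4 = 260 bits, to within float precision.
```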

I've never seen the term surprisal used in any crypto math paper. I don't
remember seeing it at all before this email chain, though my memory is
far from perfect.

All the entropy extractors I have implemented have been based on proofs
showing the algorithm converts one or more sources of some min-entropy to
a slower source of data with higher min-entropy. All the CSPRNGs I've
implemented have been based on proofs that a seed with defined min entropy
fed into a given algorithm will yield an output with computation bounds on
the work required by the adversary to determine the state of the PRNG.
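To make the extractor shape concrete: the sketch below is a toy hash-based conditioner, not any particular proof-backed construction. It only illustrates the interface described above (many raw bytes of partial min-entropy in, fewer bytes of higher min-entropy per bit out); the 1024-byte buffer size and the assumption that it carries at least 256 bits of min-entropy are made up for illustration.

```python
import hashlib
import os

def condition(raw_samples: bytes) -> bytes:
    """Toy conditioner: compress raw samples assumed to hold some
    min-entropy into a shorter output with higher min-entropy per bit.
    Real extractors come with proofs; SHA-256 is a common heuristic
    stand-in used here purely as an example."""
    return hashlib.sha256(raw_samples).digest()

# Stand-in for a slow noise source: 1024 raw bytes assumed (for this
# sketch) to carry >= 256 bits of min-entropy in total.
raw = os.urandom(1024)
seed = condition(raw)   # 32 bytes, suitable as a CSPRNG seed
```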

I googled the term and it seems to be used mostly in thermodynamics
circles. The mathematical definition doesn't seem to provide any reason
to abandon Rényi entropy as the normal metric.
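Min-entropy is itself the order-infinity member of the Rényi family, which a quick numerical check makes visible. The three-outcome distribution below is an arbitrary example:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in bits."""
    return math.log2(sum(p**alpha for p in probs)) / (1 - alpha)

def min_entropy(probs):
    """H_inf = -log2(max p): the alpha -> infinity limit of the family."""
    return -math.log2(max(probs))

probs = [0.5, 0.25, 0.25]                          # an arbitrary biased source
shannon = -sum(p * math.log2(p) for p in probs)    # alpha -> 1 limit: 1.5 bits
h_min = min_entropy(probs)                         # -log2(0.5) = 1.0 bit
h_100 = renyi_entropy(probs, 100)                  # already close to h_min
```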

Sadly, NIST is far from consistent in its use of the term entropy, and it
has its own rather creative definition of 'full entropy', better described
as 'good enough for NIST'. I've submitted a lot of comments to NIST about