[Cryptography] Is Ron right on randomness

dj at deadhat.com dj at deadhat.com
Sat Nov 26 10:42:04 EST 2016

>> No, the above #2 is not accurate. It does matter how good your entropy
>> source is. The leftover hash lemma gives you the expression for the
>> amount of entropy you can extract from entropy sources - but doesn't
>> tell you how, and for the real constructions the answer is worse.
>> Subsequent papers give bounds for certain specific extractors. This can
>> be summarized as the 0.5 limit: if your input data has less than 0.5
>> bits of entropy per bit of data, your extractor is swimming upstream
>> slower than the stream is moving downstream.
> Reference please?  Because this would be news to me (and, I think, a lot
> of other people as well).

This is extractor theory, in which I'm by no means an expert, but I did
just come back from the COST-IACR School on Randomness in Cryptography -
a week-long set of lectures, mostly on the current state of extractor
theory, from the professors who developed it - so I'm feeling pretty
clued in right now. That will pass.

I've posted these before on this list, but the previous discussions
suggest they are not widely read. In particular:

Start at the start: you can't encrypt even one bit in the presence of a
weak source. So you're going to have to do something else.
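That "something else" is seeded extraction. As a toy illustration of the leftover-hash-lemma setting mentioned in the quote above - not the construction from any particular paper - here is a pairwise-independent hash used as a seeded extractor. The prime, parameter names, and entropy figures are all illustrative assumptions:

```python
import secrets

P = (1 << 521) - 1  # Mersenne prime; field for the toy hash family

def pairwise_hash(x, a, b, m):
    """h_{a,b}(x) = ((a*x + b) mod P), truncated to m bits.
    For x < P this family is pairwise independent."""
    return ((a * x + b) % P) & ((1 << m) - 1)

def extract(sample, min_entropy_bits, eps_log2=40):
    """Leftover hash lemma sketch: from a sample claimed to carry k bits
    of min-entropy, a pairwise-independent hash with a fresh uniform
    seed yields m = k - 2*log2(1/eps) bits that are eps-close to
    uniform. The caller's min-entropy claim is an assumption, not
    something the code can verify."""
    m = min_entropy_bits - 2 * eps_log2
    assert m > 0, "not enough claimed min-entropy for the target distance"
    x = int.from_bytes(sample, "big") % P  # sample must fit below P
    a = secrets.randbelow(P - 1) + 1       # seed, part 1 (nonzero)
    b = secrets.randbelow(P)               # seed, part 2
    return pairwise_hash(x, a, b, m), m

out, m = extract(secrets.token_bytes(16), min_entropy_bits=120)
```

Note the seed (a, b) must be uniform and independent of the sample - that is exactly the catch the thread is about: seeded extractors assume you already have some good randomness.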

[b] But you can still do crypto:

The utility of your extracted random numbers depends on what you plan to
do with them:

Bounds for CBC-MAC and HMAC. AKA the 0.5 Limit.
But those bounds are stated in terms of computational bounds on the
adversary, not min entropy:
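One familiar shape this takes in practice is the extract step of HKDF-style key derivation: HMAC keyed with a public random salt, applied to the raw source material. This is a minimal sketch of that usage, not the analysis from the bounds papers; the salt length and variable names are my own assumptions:

```python
import hashlib
import hmac
import os

def hmac_extract(salt: bytes, source_material: bytes) -> bytes:
    """Extraction step, HKDF-extract style: HMAC-SHA256 keyed with a
    public random salt, applied to raw noise-source output. Per the
    bounds discussed above, the output is only computationally close
    to uniform, and the guarantee erodes as the per-bit min-entropy
    of the input drops toward the 0.5 limit."""
    return hmac.new(salt, source_material, hashlib.sha256).digest()

salt = os.urandom(32)          # public, fresh seed randomness
raw = os.urandom(64)           # stand-in for noise-source samples
key = hmac_extract(salt, raw)  # 32 bytes of extracted key material
```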

Last week's randomness school had a strong focus on developing chain rules
for conditional entropy under various entropy models (min, Shannon,
guessing, collision, HILL, metric, metric*), but since the authors of
these papers were lecturing, I think I actually understood it. There are a
lot of papers to read, but this survey paper is a good place to start:
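To make the distinction between some of those entropy models concrete, here is a small sketch computing three of them for an explicit distribution. The example distribution is mine, chosen to show the ordering:

```python
import math

def shannon_entropy(p):
    """H_1: average-case surprise, -sum q*log2(q)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def collision_entropy(p):
    """H_2: -log2 of the collision probability sum q^2."""
    return -math.log2(sum(q * q for q in p))

def min_entropy(p):
    """H_inf: -log2 of the most likely outcome; this is what
    extractor statements are charged against."""
    return -math.log2(max(p))

dist = [0.9, 0.1]  # a heavily biased bit
# Always: min-entropy <= collision entropy <= Shannon entropy,
# so min-entropy is the most pessimistic of the three.
```

The point of the ordering is that a source can look healthy on Shannon entropy while its min-entropy - the quantity the extractor bounds actually use - is far lower.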

Multiple input extractors help you get past the 0.5 limit:
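The classic toy example of a multi-source extractor is the inner product over GF(2), which extracts a bit from two independent weak sources; this sketch is illustrative only, and the bit-vector representation is my own choice:

```python
def inner_product_extract(x_bits, y_bits):
    """Two-source extractor sketch: inner product over GF(2).
    Given two *independent* weak sources with enough combined
    min-entropy, the output bit is close to uniform - even when
    each source individually sits below the single-source 0.5
    bits-per-bit regime. Independence of the sources is the
    assumption doing all the work here."""
    assert len(x_bits) == len(y_bits)
    acc = 0
    for xb, yb in zip(x_bits, y_bits):
        acc ^= xb & yb  # multiply in GF(2), then XOR-accumulate
    return acc
```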

Extractors have different bounds when you consider the entropy against
computationally bounded adversaries (that's us!), with the general case
being HILL entropy.

Is there a chain rule from which you can derive the entropy? It's an open
question, but this paper has counterexamples under some simple
assumptions, which is a no vote:
There were other cryptographers present who were not so sure.

There is a lot more, but these are good starting points.
