[Cryptography] Is Ron right on randomness

Jerry Leichter leichter at lrw.com
Wed Nov 23 17:37:22 EST 2016


>> 1. You need two things: an entropy source, and a whitener. No entropy
>> source is perfect, so you need a whitener no matter what. You don't have to
>> do anything fancy in your whitener. Any cryptographically secure hash
>> function (like SHA512) will do.
>> 
>> 2. Since you need a whitener no matter what, it doesn't really matter how
>> good your entropy source is, except insofar as it might take a long time to
>> collect enough entropy from a very poor source. All that matters is that you
>> have an accurate lower bound for how much entropy your source actually
>> provides, and this is the case no matter how good (or bad) your source
>> actually is. As long as you feed >N bits of entropy into your whitener, you can
>> safely extract N bits of true randomness out of it.
>> 
>> 3. You don't need more than a few hundred bits of randomness. 128 bits is
>> enough, 256 is a comfortable margin, 512 is serious overkill. Seed a
>> cryptographically secure PRNG with a few hundred bits of entropy and you
>> can safely extract gigabytes of key material out of it.
> 
> (I omitted #4)
> 
> Is the above accurate?  Is it a reasonable design point to use for OpenSSL's next CSPRNG?
It's tough to say, because the description is a bit odd.  I actually agree with the last point (though it's one that is often contentious):  A properly seeded DRNG is every bit as secure as, and much easier to *show* secure than, some fancy mixed pool of uncertain "random" sources.  For example, if you're going to encrypt with AES anyway, you're already assuming AES has properties that would make AES in counter mode with a random key a secure generator.
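To make the "seeded DRNG" shape concrete, here is a minimal sketch.  The post talks about AES in counter mode; this sketch substitutes SHA-256 in counter mode purely so it runs on the Python standard library alone - the structure (secret seed, incrementing counter, keystream of output blocks) is the same.  The class name and interface are made up for illustration.

```python
import hashlib

class HashCounterDRNG:
    """Sketch of a counter-mode DRNG: hash(seed || counter) per block.

    SHA-256 stands in here for the AES-CTR construction the post
    describes; this is an illustration, not a vetted implementation.
    """

    def __init__(self, seed: bytes):
        # The whole design rests on this seed really carrying
        # ~256 bits of entropy, as point 3 in the post argues.
        assert len(seed) >= 32, "seed should carry ~256 bits of entropy"
        self._seed = seed
        self._counter = 0

    def read(self, n: int) -> bytes:
        out = bytearray()
        while len(out) < n:
            block = hashlib.sha256(
                self._seed + self._counter.to_bytes(16, "big")
            ).digest()
            out += block
            self._counter += 1
        return bytes(out[:n])
```

Once seeded, you can pull arbitrarily many output bytes; the security argument reduces entirely to the seed's entropy and the one primitive's strength.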

So ... you only need enough bits to feed your DRNG.  But ... where do they come from?  If you have a way to get (say) 256 bits with 256 bits of entropy ... you're good to go.  But if you instead have 10 sources each of which gives you 256 bits which you are only willing to assume have 26 bits of entropy each ... what next?  Whitening is kind of beside the point.  If there are correlations among your sources, it won't help - though it will *look like* it does.  (In fact, *nothing* will help:  Adding up the 10 sources of 26 bits each is only correct when they aren't correlated; if they might be, and you just don't know, the only safe thing to say is that you have ... the same 26 bits you started with.)
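A toy sketch of that correlation point, with a hypothetical worst case in which the "second source" is just a deterministic function of the first:  hashing the pool makes the output look random, but an attacker who can enumerate the first source's states can still enumerate every possible seed.

```python
import hashlib

def seed_from_sources(a: bytes) -> bytes:
    # Hypothetical worst case: "source B" is perfectly correlated
    # with source A (here, a deterministic function of it), so it
    # contributes zero additional entropy.
    b = a[::-1]
    # The hash whitens the pool -- the output *looks* random --
    # but it is still a function of A alone.
    return hashlib.sha256(a + b).digest()

# An attacker enumerating 1000 candidate states of A enumerates
# every seed those states can produce; whitening doesn't widen
# the set of possible outputs beyond A's own state count.
seeds = {seed_from_sources(i.to_bytes(4, "big")) for i in range(1000)}
assert len(seeds) <= 1000  # no more outcomes than A has states
```

The same argument scales up: 2**26 states of A means at most 2**26 possible seeds, no matter how many correlated "sources" get mixed in.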

*If* your sources are "known to be uncorrelated" (in whatever way you "know" that they actually give you 26 bits of unpredictability ... but let's not go there), then any function taking the 2560 bits into 256 bits that takes all the bits into account is a reasonable start.  But to avoid getting trapped by what "taking the bits into account" means, you're probably best off using some cryptographic primitive.  A cryptographic hash function will certainly do the trick.  Of course, if you want to keep your security assumptions to a minimum - and, as before, you're going to use AES-256 anyway - a secure MAC with a 256-bit output based on AES-256 is the way to go.
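As a sketch of that condensing step:  the post's preference is a 256-bit MAC based on AES-256 (e.g. a CMAC), but to keep this runnable on the standard library alone, HMAC-SHA-256 stands in as the MAC here.  The fixed key and the function name are illustrative assumptions, not part of the post.

```python
import hashlib
import hmac

def condense(sources: list[bytes], key: bytes = b"\x00" * 32) -> bytes:
    """Condense several low-entropy inputs into one 256-bit seed.

    HMAC-SHA-256 stands in for the AES-256-based MAC the post
    suggests.  Each input is length-prefixed before being fed in,
    so concatenation boundaries are unambiguous and every bit of
    every source is "taken into account".
    """
    mac = hmac.new(key, digestmod=hashlib.sha256)
    for s in sources:
        mac.update(len(s).to_bytes(4, "big"))
        mac.update(s)
    return mac.digest()
```

With ten 256-bit samples assumed (and that's the load-bearing assumption) to carry at least 26 bits of independent entropy each, the 256-bit output is then the seed for the DRNG.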

BTW, notice the direction of my argument, which pushes toward making everything rely on *one* primitive.  This sounds counterintuitive, but think about it:  A system is *no stronger than its weakest link*, so relying on multiple primitives for those links cannot possibly strengthen it - it can only weaken it.

I suppose you can describe this process as whitening, but it's a bit of a stretch.

                                                        -- Jerry
