# [Cryptography] RNG design principles

Ron Garret ron at flownet.com
Wed Nov 23 22:54:13 EST 2016

On Nov 23, 2016, at 4:19 PM, John Denker <jsd at av8n.com> wrote:

>
>>> 1. You need two things: an entropy source, and a whitener. No entropy
>>> source is perfect, so you need a whitener no matter what.
>
> As always, I don't want to argue about terminology.  If the terminology
> isn't helping, we should choose different terminology.  That's a problem
> here, because in this context the word "entropy" has been thrown around
> so carelessly that it's a struggle to figure out what anybody means by it.

A fair point.  The definition I use is: entropy is information (i.e. components of the state of a system) whose values your attacker cannot guess more efficiently than by conducting an exhaustive search.  I don’t want to quibble over terminology either so I’m happy to use a different word, but I thought this was common usage.
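(This definition corresponds to what the literature calls min-entropy; formalizing my gloss above, not anything from the thread:)

```latex
H_\infty(X) = -\log_2 \max_x \Pr[X = x]
```

i.e. an attacker's single best guess at $X$ succeeds with probability at most $2^{-H_\infty(X)}$, which is exactly the "no better than exhaustive search" condition.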

>>> and this is the case no matter how good (or bad) your source
>>> actually is. As long as you feed >N bits of entropy into your whitener, you can
>>> safely extract N bits of true randomness out of it.
>
> Actually, not quite.  Whiteners aren't perfect either.  The laws of
> statistics don't permit them to be.  To do things properly, you need
> to feed slightly more than N bits in, to get N bits out.

Yes.  That’s why I said “>N” and not “N”.
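To make the ">N in, N out" rule concrete, here is a minimal whitener sketch using SHA-256 as the extractor (my illustration, not anything from the thread; the function name and parameters are made up):

```python
import hashlib

def whiten(raw_samples: bytes, out_bits: int = 256) -> bytes:
    """Condense raw, possibly biased samples into out_bits of near-uniform output.

    Assumes the caller has fed in strictly more than out_bits of
    min-entropy; a hash cannot create entropy, only concentrate it.
    """
    if out_bits > 256:
        raise ValueError("a single SHA-256 digest yields at most 256 bits")
    digest = hashlib.sha256(raw_samples).digest()
    return digest[: out_bits // 8]

# e.g. 4096 bytes of raw, biased samples -> 32 whitened bytes
```

A real design would use a keyed extractor or a vetted conditioning function, but the shape is the same: many low-quality bits in, few high-quality bits out.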

> In any case, if we are going to write down a short list of pithy
> commandments, and use them as the basis for future design work, we
> ought to get the details right.

Sure, and I don’t doubt that my original draft could be improved.  But if you are going to critique my draft, you should critique what I actually wrote rather than attacking a straw man.

>>> 3. You don't need more than a few hundred bits of randomness. 128 bits is
>>> enough, 256 is a comfortable margin, 512 is serious overkill. Seed a
>>> cryptographically secure PRNG with a few hundred bits of entropy and you
>>> can safely extract gigabytes of key material out of it.
>
> That's more-or-less kinda usually true, although once again I would
> not promote it to the status of an axiom or commandment.
>
> Again it touches back to item (1), insofar as if you are using SHA512,
> then 512 bits of input is definitely not overkill.

This is debatable, but probably not worth the effort to debate, because entropy is easy enough to obtain that there is no reason not to err on the side of caution.  Use 1024 bits of entropy if you want; it doesn’t matter.  But don’t pay $10,000 for specialized hardware that generates megabits of entropy per second, or that uses quantum effects to produce entropy.  It’s simply not necessary.  Ever.
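As a sketch of the "seed once, extract gigabytes" claim, here is a toy counter-mode generator built from SHA-256 (a simplified stand-in for a real DRBG such as those in NIST SP 800-90A; the class name and structure are my own, illustrative only):

```python
import hashlib

class HashCtrPRNG:
    """Toy CSPRNG: each 32-byte block is SHA-256(seed || counter).

    Seeded once with >= 256 bits; illustrative only -- use a
    vetted DRBG in production code.
    """
    def __init__(self, seed: bytes):
        if len(seed) < 32:
            raise ValueError("seed with at least 256 bits of entropy")
        self._seed = seed
        self._counter = 0

    def read(self, n: int) -> bytes:
        out = bytearray()
        while len(out) < n:
            block = hashlib.sha256(
                self._seed + self._counter.to_bytes(16, "big")
            ).digest()
            out.extend(block)
            self._counter += 1
        return bytes(out[:n])
```

Typical use would be `rng = HashCtrPRNG(os.urandom(32))`, after which `rng.read()` can be called indefinitely without touching the entropy source again.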

> More importantly, at some point, you have to address the re-seeding issue.

Why?  It is not hard to build a CSPRNG with a cycle time longer than the age of the universe.  The only reason to re-seed is if your system has been compromised.  Re-seeding is a policy issue, not a technical one.
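Back-of-the-envelope arithmetic for the cycle-time claim (assuming a generator like the usual counter-mode constructions, with a 128-bit counter emitting 32-byte blocks, consumed at one gigabyte per second):

```python
# 2**128 blocks of 32 bytes each, drained at 1 GB/s
blocks = 2 ** 128
bytes_total = blocks * 32
seconds = bytes_total / 1e9               # at 1 GB/s
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.3e} years")               # ~3.4e23 years; the universe is ~1.4e10
```

Even at a billion times that consumption rate the counter would not wrap before the heat death of everything, which is the sense in which re-seeding is a policy choice rather than a mathematical necessity.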

> I'll let them speak for themselves, but there are people on this list
> who would vehemently object to item (3).  They have spent years building
> PRNGs that constantly re-seed themselves, to the point where the PRNG
> becomes in effect a denial-of-service attack on the HRNG.

Just because people have done this doesn’t mean it’s necessarily the right thing to do.

> *) What to do with a newly-booted (or worse, newly-created) system that
> needs a RNG right away but has not had time to accumulate the adamance
> called for in item (1).
>
> *) How to calibrate that this-or-that piece of hardware, so as to obtain
> the required lower bound on the amount of unpredictability it produces.

These are fair points, and they probably ought to be broken out into separate threads.  But I’ll open the bidding by asking:

1.  What is an example of a system that needs an RNG “right away” (and what does “right away” mean)?

2.  What is an example of a system where you would have any doubt whatsoever that a very conservative lower bound was sufficient?

rg
