[Cryptography] Proper Entropy Source

John Denker jsd at av8n.com
Wed Jan 22 19:27:36 EST 2020


On 1/22/20 3:47 PM, Bill Frantz wrote:

> In this area, I'm with John von Neumann, "Anyone who attempts to 
> generate random numbers by deterministic means is, of course, living 
> in a state of sin."

Yes.

> Fortunately, for most cryptography, we don't need randomness, we
> need unguessability.

OK, but beware that the word "randomness" means different
things to different people.  Some people /define/ randomness
to be synonymous with unguessability in this context.

> But, as John points out, that is squish.

I disagree.  That's not what I mean by the word.  Squish
exists even along the guessability axis:
 -- some things are provably deterministic, and therefore
  100% trivially guessable (see the sketch below).
 -- some things are provably unguessable, and therefore
  random enough for ordinary crypto purposes, even in
  quite demanding situations.
 -- some things are neither provably guessable nor
  provably unguessable.  I call this squish.
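
To make the first bullet concrete, here's a tiny Python
sketch (purely illustrative, not from anyone's production
code): a deterministic PRNG seeded with a known value is
100% guessable by anyone who knows the seed.

    import random

    # Two generators seeded identically produce identical
    # streams.  Determinism means the output is fully
    # guessable given the seed.
    a = random.Random(42)
    b = random.Random(42)
    assert ([a.getrandbits(32) for _ in range(5)]
            == [b.getrandbits(32) for _ in range(5)])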

I wrote:

>> Also keep in mind the fundamental equation: squish + squish =
>> squish.
>> 
>> In other words: Combining multiple squishy sources does *not* 
>> produce randomness.  It might make it harder for you to predict, 
>> but again, that is not the criterion.

> And harder to predict is something we want.

No!  Harder for the good guys to predict is not the criterion.
It does us no good at all.  What matters is whether the
/adversary/ can guess it.
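
A toy Python sketch of the point (mine, with made-up byte
values): if the adversary can observe or model both "squishy"
inputs, combining them -- by XOR, say -- yields an output the
adversary can compute just as easily.

    import operator

    # Two "squishy" sources the adversary can observe or model.
    source_a = bytes([0x12, 0x34, 0x56, 0x78])
    source_b = bytes([0x9a, 0xbc, 0xde, 0xf0])

    combined = bytes(map(operator.xor, source_a, source_b))

    # Knowing both inputs, the adversary reproduces the
    # output exactly; combining bought us nothing.
    assert bytes(map(operator.xor, source_a, source_b)) == combined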

> If I can combine 100 sources that each give me 1 bit of
> unguessability, I have a good start on having a cryptographically
> useful unguessable number. 

That's entirely correct, provided (!) we are talking about
real provable unguessability, *not* squish.
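
In practice that combining is usually done with a hash as
the extractor.  A minimal Python sketch (my example; SHA-256
and os.urandom are stand-ins for the sources and extractor,
not anything Bill specified):

    import hashlib
    import os

    # Stand-ins for 100 independent sources, each assumed to
    # contribute one provably unguessable bit per sample.
    samples = [os.urandom(1) for _ in range(100)]

    # Hash the concatenation.  If the inputs jointly carry
    # ~100 unguessable bits, the digest is a cryptographically
    # useful unguessable value.
    seed = hashlib.sha256(b"".join(samples)).digest()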

> High precision timing of packet arrival may be useful here.

I'm skeptical of that.  It depends on how high the resolution
is.  Packet arrival times are something that the adversary
might well be able to observe -- or maybe even manipulate.

> Ideally, you have a hardware generator with a theoretical reason to
> think it is random.

Yes, that is ideal.  And that is sufficiently cheap and easy
that I cannot imagine why people keep hatching grossly nonideal
schemes.
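
For what it's worth, on Linux a hardware generator typically
shows up as /dev/hwrng (whether your box has one depends on
the hardware and driver).  A minimal read, as a sketch:

    # Read 32 raw bytes from the kernel's hardware-RNG
    # interface.  Usually requires root.  In practice you'd
    # let the kernel mix this into its pool rather than use
    # raw device output directly.
    with open("/dev/hwrng", "rb") as dev:
        raw = dev.read(32)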

> If you don't trust the organization that made your hardware source,
> combine it with some squish to make the opponent's job harder.

I disagree for multiple reasons:

1) If the adversary has pwned the platform, all your data are
belong to them.  They won't mess with the RNG, because there
are a dozen easier lines of attack.

2) From an engineering point of view, squish is worthless.
The goal is to build a reliable system.  Squish is by
definition not reliable.
 -- If the source was random enough, i.e. unguessable enough,
  without the squish, adding squish doesn't help.
 -- If the source was *not* random enough, i.e. guessable,
  without the squish, adding squish doesn't help.

As the proverb says: 
	Do not confuse the presence of one thing
	with the absence of another.
It does not matter how much squish the source produces;  what
matters is how much randomness aka unguessability it produces.
If it produces randomness at a moderately low rate, that's
usually OK, because you can accumulate it over time.
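
Here is a minimal sketch of that accumulate-over-time idea
(my construction, assuming a hash-based pool along the lines
of Fortuna-style designs):

    import hashlib

    class Pool:
        """Accumulate low-rate randomness over time."""

        def __init__(self):
            self._state = b"\x00" * 32

        def add(self, sample: bytes) -> None:
            # Fold each new sample into the running state;
            # the state only gets harder to guess.
            self._state = hashlib.sha256(self._state + sample).digest()

        def output(self) -> bytes:
            # Derive output without exposing the raw state.
            return hashlib.sha256(b"out" + self._state).digest()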

3) Using multiple unreliable sources is almost never an
acceptable substitute for one reliable source.  It "might"
make sense if you could prove that the failures were
anti-correlated, but I've never seen such a proof.  Instead what
happens is that the bad guys defeat each of the unreliable
sources, then eat your lunch.

