[Cryptography] randomness for libraries, e.g. OpenSSL

John Denker jsd at av8n.com
Sun Nov 27 13:51:54 EST 2016


On 11/26/2016 07:38 AM, Salz, Rich asked:

> Meanwhile, back in the real world...  What should OpenSSL do, given
> the wide number of platforms and huge uninformed community that
> depends on it?

That's a good question.  I do not have a comparably good answer,
but there may be some partial answers.

It has been suggested that one should:
	Use what the platform provides
	(rather than rolling your own).
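
For concreteness, here is a minimal sketch of what "use what the
platform provides" can look like.  This assumes a reasonably recent
Linux/glibc (2.25+) or a BSD where getentropy(3) is available; other
platforms have close analogues, and the names here are illustrative,
not a recommendation of any particular API:

    /* Sketch: ask the kernel for key material instead of rolling
     * our own generator.  getentropy() fails only for requests
     * larger than 256 bytes or on a badly broken system. */
    #include <sys/random.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        unsigned char key[32];            /* e.g. a 256-bit key */

        if (getentropy(key, sizeof key) != 0) {
            perror("getentropy");         /* don't limp along on failure */
            return EXIT_FAILURE;
        }

        for (size_t i = 0; i < sizeof key; i++)
            printf("%02x", key[i]);
        putchar('\n');
        return EXIT_SUCCESS;
    }

The point is not this particular call; the point is that the
per-app (and per-library) code shrinks to almost nothing *if* the
platform's primitive can be trusted.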

As a corollary:  We need to inveigle the OS providers and
hardware providers into solving the problem.

Let's discuss the pros and cons of that:

  1) Pro:  It's cheaper.  Having everybody solve the problem
   on a per-app (or even per-library) basis leads to lots of
   duplication of effort.

  2) Pro:  It's more complete.  There are lots of random things
   the operating system has to do on its own, so an OpenSSL-only
   solution will not solve the whole problem.

  3) Pro:  It's safer in the long run.  Having everybody roll
   their own increases the chance that somebody will make a
   mistake.

  4) Pro: It is unreasonable to ask software authors to solve
   a problem that simply cannot be solved by software alone.
   As some guy named John von Neumann said in 1949:
	Anyone who considers arithmetical methods of producing
	random digits is, of course, in a state of sin.  For,
	as has been pointed out several times, there is no such
	thing as a random number – there are only methods to
	produce random numbers, and a strict arithmetic procedure
	of course is not such a method.

  5) Con:  We have a situation where:
    a) the users want the app to "just work"
    b) the app authors want the library to "just work"
    c) the library authors want the OS to "just work"
    d) the OS authors want the hardware to provide what is needed
    e) but it often doesn't.
   So, part of the problem is that within the "huge uninformed
   community" the users are particularly uninformed about whom
   to blame.  They don't think about security at all until it
   fails, and then they blame the proximate party, hence the
   proliferation of per-app schemes.  (I call them schemes,
   not solutions, for a reason.)

   The "huge uninformed community" thinks the RNG problem is
   about 100 times easier than it really is.

  6) We don't even have a good specification for what the platform
   "should" provide.  Among other things:
     -- /dev/random offers no guarantee of availability (it can
        block indefinitely, even for a request of a few bytes)
     -- /dev/urandom offers no guarantee of quality (it will answer
        even before it has ever been properly seeded)

   This is a terrible situation, but one can almost understand how
   it developed.  Given the state of the art until recently, if
   there had been any specified standards, the system would have
   been unable to meet them.  The growing demand for security in
   general and randomness in particular has outpaced the supply.

   By way of example, here is something that might go into such
   a specification:  There should be *one device* ... or if for
   back-compatibility there are two, they should behave the same.
   The device should guarantee 100% availability and 100% high
   quality, high enough for all practical purposes.

   Let's be clear:  a proper RNG device should never block, *and*
   there should never be any temptation -- or even possibility --
   of using the device (or the corresponding intra-kernel function
   call) before the RNG is well and truly initialized.  (See the
   sketch just after this list.)
 
      Of course a PRNG can always be attacked, but we can make
      it just as hard as a dozen other things in the system.  If
      the attacker can read the PRNG state, he can also read
      your passphrases and certificates directly, which tells
      us we simply must prevent such attacks, at a point well
      upstream of the PRNG.
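
To make the blocking-versus-initialization point concrete, here is a
small sketch, assuming a Linux kernel new enough to have getrandom(2).
By default getrandom() reads from the same pool as /dev/urandom but
blocks until that pool has been seeded once; with GRND_NONBLOCK it
reports EAGAIN instead, so a caller can at least detect "not yet
initialized" rather than silently consuming weak output the way an
early-boot read of /dev/urandom can:

    /* Sketch: detect "kernel RNG not yet seeded" via getrandom(2).
     * Without flags the call would block until the pool is
     * initialized; GRND_NONBLOCK turns that wait into EAGAIN. */
    #include <sys/random.h>
    #include <errno.h>
    #include <stdio.h>

    static int rng_ready(void)
    {
        unsigned char probe;

        if (getrandom(&probe, sizeof probe, GRND_NONBLOCK) == 1)
            return 1;                     /* pool initialized */
        if (errno == EAGAIN)
            return 0;                     /* not yet seeded */
        return -1;                        /* unexpected failure */
    }

    int main(void)
    {
        switch (rng_ready()) {
        case 1:  puts("RNG initialized; urandom-quality reads are fine"); break;
        case 0:  puts("RNG not yet seeded; plain getrandom() would block"); break;
        default: perror("getrandom");
        }
        return 0;
    }

That is still only a partial answer:  it is per-process detection
after the fact, not the single always-available, always-good device
argued for above.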

==============================

Bottom line:  There are big problems here that need to be fixed.
However, I don't think that fixing them within OpenSSL is the
right way to go.

