[Cryptography] randomness for libraries, e.g. OpenSSL

ianG iang at iang.org
Mon Nov 28 12:20:25 EST 2016


On 27/11/2016 13:51, John Denker wrote:
> On 11/26/2016 07:38 AM, Salz, Rich asked:
>
>> Meanwhile, back in the real world...  What should OpenSSL, given
>> the wide number of platforms and huge uninformed community that
>> depends on it, do?
>
> That's a good question.  I do not have a comparably good answer,
> but there may be some partial answers.
>
> It has been suggested that one should:
> 	Use what the platform provides
> 	(rather than rolling your own).

Which suggests a further question - what is a platform, and what is not 
a platform?  The short answer is that OpenSSL is not a platform.  So we 
don't need to dig any further in this thread.  (In other threads we 
might see a sense of the CPU-as-platform versus the OS-as-platform, but 
that doesn't affect us here.)


> As a corollary:  We need to inveigle the OS providers and
> hardware providers to solve the problem.


That.  Rather than spending energy on solving the problem in code, which 
is a really hard objective because of many factors, some of which are 
listed below, put the energy into Inveiglement.

Browbeat the ones without /dev/urandom to add it, or an equivalent. 
Browbeat the ones with /dev/urandom to surface their threat model and 
their security model and improve their best efforts.
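To make "use what the platform provides" concrete, here is a minimal sketch (mine, not from the thread) in Python, whose `os.urandom` is exactly such a thin wrapper over the platform source - getrandom(2) on modern Linux, /dev/urandom on other Unix systems, the system CSPRNG on Windows:

```python
import os

def platform_random_bytes(n: int) -> bytes:
    """Return n bytes from the platform's CSPRNG.

    os.urandom() defers entirely to the OS: getrandom(2) on modern
    Linux, /dev/urandom on older Unix systems, and the system CSPRNG
    on Windows.  The library never rolls its own generator.
    """
    return os.urandom(n)

# e.g. a 256-bit key
key = platform_random_bytes(32)
```

The point of the sketch is what is absent: no seeding, no mixing, no entropy estimation in the library - all of that is the platform's job.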


> Let's discuss the pros and cons of that:
>
>   1) Pro:  It's cheaper.  Having everybody solve the problem
>    on a per-app (or even per-library) basis leads to lots of
>    duplication of effort.
>
>   2) Pro:  It's more complete.  There are lots of random things
>    the operating system has to do on its own, so an OpenSSL-only
>    solution will not solve the whole problem.
>
>   3) Pro:  It's safer in the long run.  Having everybody roll
>    their own increases the chance that somebody will make a
>    mistake.
>
>   4) Pro: It is unreasonable to ask software authors to solve
>    a problem that simply cannot be solved by software alone.
>    As some guy named John von Neumann said in 1949:
> 	Anyone who considers arithmetical methods of producing
> 	random digits is, of course, in a state of sin.  For,
> 	as has been pointed out several times, there is no such
> 	thing as a random number – there are only methods to
> 	produce random numbers, and a strict arithmetic procedure
> 	of course is not such a method.


So, to do a proper RNG it seems that we have to have access to 
hardware.  E.g., Denker's unplugged microphone port is one such.  But 
in order to gain blessing against John von Neumann's curse, one has to 
enter the dirty world of hardware.  A library that has side-effects 
like accessing random pieces of hardware or relying on CPU opcodes is 
an unclean beast.  Such decisions should be reserved for the OS or the 
application.


>   5) Con:  We have a situation where:
>     a) the users want the app to "just work"
>     b) the app authors want the library to "just work"
>     c) the library authors want the OS to "just work"
>     d) the OS authors want the hardware to provide what is needed
>     e) but it often doesn't.
>    So, part of the problem is that within the "huge uninformed
>    community" the users are particularly uninformed about whom
>    to blame.  They don't think about security at all until it
>    fails, and then they blame the proximate party, hence the
>    proliferation of per-app schemes.  (I call them schemes,
>    not solutions, for a reason.)
>
>    The "huge uninformed community" thinks the RNG problem is
>    about 100 times easier than it really is.


I'd say this is Context more than Con.  Because of e) the library isn't 
in a position to solve the "just work" problem.

>   6) We don't even have a good specification for what the platform
>    "should" provide.  Among other things:
>      -- /dev/random offers no guarantees of availability
>      -- /dev/urandom offers no guarantees of quality
>
>    This is a terrible situation, but one can almost understand how
>    it developed.  Given the state of the art until recently, if
>    there had been any specified standards, the system would have
>    been unable to meet them.  The growing demand for security in
>    general and randomness in particular has outpaced the supply.
>
>    By way of example, here is something that might go into such
>    a specification:  There should be *one device* ... or if for
>    back-compatibility there are two, they should behave the same.
>    The device should guarantee 100% availability and 100% high
>    quality, high enough for all practical purposes.
>
>    Let's be clear:  a proper RNG device should never block, *and*
>    there should never be any temptation -- or even possibility --
>    of using the device (or the corresponding intra-kernel function
>    call) before the RNG is well and truly initialized.
>
>       Of course a PRNG can always be attacked, but we can make
>       it just as hard as a dozen other things in the system.  If
>       the attacker can read the PRNG state, he can also read
>       your passphrases and certificates directly, which tells
>       us we simply must prevent such attacks, at a point well
>       upstream of the PRNG.

I think,

a) the random v. urandom war is over.  Equality won, they are both the 
same.  FreeBSD got it right: the practically zero users who understand 
and need the difference should do some navel gazing, and if that 
doesn't help, all of them are capable of solving it elsewhere.  The 
rest should use urandom.
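As a small illustration of that equality (my sketch, assuming a Unix system where /dev/urandom exists): reading the device directly and going through the portable wrapper draw from the same kernel source, and neither blocks once the pool is seeded.

```python
import os

def urandom_direct(n: int) -> bytes:
    # Read straight from the non-blocking device; on modern Linux and
    # FreeBSD this is the same pool that backs /dev/random.
    with open("/dev/urandom", "rb") as f:
        return f.read(n)

a = urandom_direct(16)  # the device, directly
b = os.urandom(16)      # the portable wrapper over the same source
```

Both paths are equally suitable for keys; the only users who should care about the distinction are the vanishingly few with a threat model that actually distinguishes the two devices.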

b) the offer of no guarantee of quality is actually commensurate with 
the guarantee of security and the guarantee of reliability - nothing is 
entirely secure, and nothing is entirely reliable.

By adopting a "best efforts" approach, the efforts get better.  By 
trying to measure the asymptotic limits of security, reliability and 
entropy ... *and* deliver these to the outside world in a single 
digestible number, we get distracted by the numerology and forget the 
goal.  The world is more Pareto - we want a better result here without 
compromising the better results in other places - than numerological.

c) the guarantee of availability will trump the desire for quality in 
every real world competition.  It's up to our best efforts to ensure 
that quality trends upwards, not downwards.


> ==============================
>
> Bottom line:  There are big problems here that need to be fixed.
> However, I don't think that fixing them within OpenSSL is the
> right way to go.



Agreed!



iang

