[Cryptography] [RNG] on RNGs, VM state, rollback, etc.

ianG iang at iang.org
Sun Oct 20 05:28:32 EDT 2013


On 20/10/13 06:36 AM, Viktor Dukhovni wrote:
> On Sat, Oct 19, 2013 at 05:22:55PM -0400, Jeffrey I. Schiller wrote:
>
>> That is a value judgment, one where you let security be more important
>> than anything else. That is a mistake.
>>
>> There are plenty of applications where it is better to have things
>> work than to have them not work in the name of security. Consider an
>> embedded controller running a critical resource (like your heart
>> pacemaker). It is better to have it fail by using poor entropy than to
>> fail completely and leave you dead.
>
> Indeed.
>
> Some years back a few brave users of Postfix attempted to use the
> GnuTLS OpenSSL API compatibility layer to run Postfix over GnuTLS.
>
> It was found that the GnuTLS library would call exit() if it did
> not find an entropy source on startup, instead of returning an error
> to the application.  This approach was deemed safer by GnuTLS.  When
> this was discovered, Postfix dropped support for the GnuTLS OpenSSL
> emulation.  From the TLS_README file:
>
>      NOTE: Do not use Gnu TLS.  It will spontaneously terminate a
>      Postfix daemon process with exit status code 2, instead of
>      allowing Postfix to 1) report the error to the maillog file,
>      and to 2) provide plaintext service where this is appropriate.
>
> I don't know whether this has changed since, but I concur that the
> security/availability tradeoff is not always clear-cut.


Good example.  I'm going to get off the fence and say that the RNG 
should never block.

The security/availability tradeoff is *always* a business decision.  In 
the above example, it is a decision of Postfix.

The problem for the tool above, TLS, is that it doesn't know the 
business, nor the business decisions.  So it tries to make its product 
as universal as it can.  Which means, in the eyes of the tool-maker, it 
should be as secure as it can be.

But not more.  The trap is sprung when in order to make the product that 
little bit more secure, and defeat the bogeyman threats, an impractical 
requirement is loaded onto the business.

Examples abound:

   * In the above, GnuTLS exits.  Bad.
   * With RNGs, they block.  Not good, because programmers can't deal 
with it, and a blocked startup sequence can deadlock, and so on... (I 
recall that FreeBSD 'fixed' this by making /dev/random never block, 
equivalent to urandom.  Highly controversial at the time.)
   * With SSL, to solve the bogeyman MITM, they imposed certs and turned 
off opportunistic encryption -- and consequently split the security 
for the web, and made it always vulnerable to an easy downgrade attack. 
  A billion or so in damages for that 'perfect' decision.
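The RNG bullet above can be sketched as a read that refuses to hang a startup sequence: try the blocking pool briefly, then fall back to the non-blocking pool.  A hedged illustration only -- read_entropy_nonblocking is a hypothetical helper, the timeout is arbitrary, and pool semantics differ across OSes (on FreeBSD, as noted, the two devices are the same):

```python
import os
import select

def read_entropy_nonblocking(n, timeout=0.5):
    """Return n random bytes without ever blocking indefinitely."""
    try:
        # Open the blocking pool in non-blocking mode.
        fd = os.open("/dev/random", os.O_RDONLY | os.O_NONBLOCK)
    except OSError:
        return os.urandom(n)  # no such device; use the kernel CSPRNG
    try:
        # Wait at most `timeout` seconds for the pool to be readable.
        ready, _, _ = select.select([fd], [], [], timeout)
        if ready:
            data = os.read(fd, n)
            if len(data) == n:
                return data
    except OSError:
        pass  # e.g. EAGAIN from an unseeded pool
    finally:
        os.close(fd)
    # Deadline passed or short read: fall back rather than deadlock.
    return os.urandom(n)
```

Whether that fallback is acceptable is exactly the business decision argued for below; the sketch only shows that the tool can avoid imposing the blocking behaviour on the caller.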

A product should be as secure as it can be *at no cost to the user*, but 
no more.  Beyond that, the user has to make the decisions, and if the 
user really cares, the user can read the doco and understand the 
limitations.

If they don't do their research, and they need to, then they are 
negligent.  It is not up to the tool to predict how negligent or needy 
the caller is.  There is no such thing as perfect.



iang

