[Cryptography] RSA is dead.

ianG iang at iang.org
Tue Jan 21 04:06:22 EST 2014


On 21/01/14 01:10 AM, Phillip Hallam-Baker wrote:
> On Mon, Jan 20, 2014 at 3:15 PM, Jonathan Hunt <j at me.net.nz> wrote:
> 
>> On Mon, Jan 20, 2014 at 11:39 AM, Jerry Leichter <leichter at lrw.com> wrote:
>>> On Jan 20, 2014, at 12:49 PM, John Kelsey <crypto.jmk at gmail.com> wrote:
>> This is one reason I find all the whining about the NSA/RSA business a
>> bit of revisionist history.


(I think that's precisely what it is:  "Historical revisionism, the
critical re-examination of presumed historical facts and existing
historiography" or, the rewriting of history to interpret it better in
the light of new logic and/or new facts.  Always controversial, often
painful, sometimes pejorative, but never wrong.)


>> You can't look at what RSA did in the light of
>> what we know today.  You have to look at it based on what was known or
>> reasonably strongly suspected at the time.  Certainly at the time DUAL EC
>> DRBG was added to the NIST standards, and RSA added it to BSAFE, NSA was
>> accepted in the role of "helper".  The demonstration that it *could* have a
>> trap door didn't show it *did* have a trap door - and after all NSA was
>> fulfilling its role of helping to improve the security of American
>> communications, no?  (Well, that *was and is*  one of its legally-defined
>> roles, and that was the one we all saw, repeatedly, in public.)


Such a nice story.  It is a textbook spook approach: the story has to be
reasonable, plausible, benign, and even insulting to criticise.


>> Here is the presentation from 2007
>> http://rump2007.cr.yp.to/15-shumow.pdf
>> demonstrating that whoever chooses the constants is able to
>> break DUAL EC. Note, not speculating, but demonstrating a working
>> attack (using their own chosen constants). "In every experiment 32
>> bytes of output was sufficient to uniquely identify the internal state
>> of the PRNG."
>>
>> So the only unknown after 2007 was, does someone have the secrets from
>> the NIST specified constants? This is MUCH worse than some theoretical
>> weakness that may or may not turn out to be important. This is a
>> practical break.
>>
>> No competent crypto company could be recommending DUAL EC after 2007.
>> No speculation about whether they should or shouldn't have trusted NSA
>> is needed. After 2007, DUAL EC was a known, badly broken PRNG,
>> demonstrated in a public presentation to respected cryptographers. To
>> continue to leave it as the default for the next 5 years is a total
>> failure at their core business.
>>
> 
> They were a little subtler.
> 
> The NIST standard permits the use of user defined curves. They didn't trust
> the Fort Meade folk either. The scheme is secure if you choose your own
> curves but most people don't.


Defaults are seen as a way to improve security, never a way to reduce
security.  Yet, by definition, if there is a way to improve, there must
be a way to reduce.  What's that about?

Assuming 99% of the users are not competent at this level, they cannot
be relied upon to choose the setting with increased security.  Offering
a choice of two options is at least as likely to lead to decreased
security as increased security, and those odds sink into ridicule as
the number of options increases...

So the only secure choice for the 99% is to never touch the default,
*and* to rely entirely on the designer having chosen one mode only, a
secure one.  As the default.

Yet the DUAL_EC system is, apparently and allegedly, only secure if you
change the defaults.

Fort Meade knew this.  This attack on our cognitive dissonance can be
eliminated entirely by removing the default.  Defaults are a security
sin.  Oh, my, that was easy.


> In fact the use of a deterministic RNG with that type of trapdoor is
> arguably a best practice. It provides a way to audit the operation of a
> manufactured device.
> 
> The behavior of the device is transparent and deterministic if the backdoor
> constants are known and pseudo random and non predictable otherwise.
> 
> The device itself has no way to tell if it is being fed trapdoor constants
> or not and thus no way to tell if is being audited or not.


I certainly agree with the logic.  The comment I saw from some NSA
official indicated that anything other than this approach was another
sin.  I wish I could recall where that was (John recently called me on
my misinterpretation, so I'm stuck!).

But this obviously places a lot of power in the tester's hands, without
a clear way to mitigate it.
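To make the trapdoor mechanics Phill describes concrete, here is a toy
sketch.  It transplants the structure from the elliptic-curve group into
the multiplicative group mod p so the arithmetic stays readable; every
constant is invented for illustration, and real Dual EC uses the NIST
P-256 curve with truncated x-coordinates rather than full outputs.

```python
# Toy sketch of the Dual EC trapdoor structure.  All constants are
# invented; the group here is multiplicative mod p, not an elliptic
# curve, and outputs are not truncated -- the point is only to show
# how knowledge of e (where P = e*Q) exposes the internal state.

p = 2**61 - 1            # group modulus (a Mersenne prime)
Q = 5                    # public element Q
e = 123456789            # the designer's secret trapdoor value
P = pow(Q, e, p)         # public element P = e*Q (in additive notation)

def dual_ec_step(state):
    """One generator step: state advances via P, output comes via Q."""
    new_state = pow(P, state, p)     # s_{i+1} = "x(s_i * P)"
    output = pow(Q, new_state, p)    # r_{i+1} = "x(s_{i+1} * Q)"
    return new_state, output

# The honest user draws two outputs.
state = 42
state, r1 = dual_ec_step(state)
state, r2 = dual_ec_step(state)

# The trapdoor holder sees only r1 = Q^{s1}.  Raising it to e gives
# r1^e = Q^{s1*e} = P^{s1} = s2, the generator's *next* state ...
recovered_state = pow(r1, e, p)

# ... from which every future output is predictable.
predicted_r2 = pow(Q, recovered_state, p)
assert predicted_r2 == r2
```

The defence Phill mentions maps directly onto this sketch: generating
your own constants amounts to picking a fresh e that nobody else knows,
which leaves the published relation between P and Q useless to the
original designer.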

"Works for us!"  but is this an *exportable* best practice?




iang

