[Cryptography] [FORGED] Re: ratcheting DH strengths over time

Ray Dillinger bear at sonic.net
Sun Nov 22 17:47:50 EST 2015



On 11/21/2015 03:38 AM, ianG wrote:

> One can mix and match responses: for example, use ECDHE with a
> server-specific elliptic curve. But it's better to use ECDHE with a
> larger shared curve. Compared to a server-specific curve, a larger
> shared curve saves bandwidth; in most situations ends up saving CPU
> time; and has larger benefits against all known attacks.

How difficult is it to create and validate a set of curve
parameters that are secure, if we are using straight,
unoptimized bignum math to do our ECC rather than tuned
and tweaked code that is hardcoded for moderately faster
computation on some particular curve?
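For what it's worth, the validation side is not the hard part; the
expensive step is computing the group order in the first place. Here's
a rough sketch of the basic sanity checks on proposed short-Weierstrass
parameters, using nothing but plain bignum arithmetic. The check list
and names are mine, not from any standard, and it's far from exhaustive
(see the SafeCurves criteria for the fuller set):

```python
# Sketch: sanity checks for short-Weierstrass parameters
# y^2 = x^3 + a*x + b over GF(p), given a claimed subgroup order n
# and cofactor h.  Illustrative, not exhaustive.

def is_prime(n, bases=(2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)):
    """Miller-Rabin with fixed bases; never rejects a true prime."""
    if n < 2:
        return False
    for q in bases:
        if n % q == 0:
            return n == q
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def validate_curve(p, a, b, gx, gy, n, h):
    assert is_prime(p), "field size must be prime"
    assert (4 * a**3 + 27 * b**2) % p != 0, "singular curve"
    assert (gy * gy - (gx**3 + a * gx + b)) % p == 0, "base point off curve"
    assert is_prime(n), "subgroup order must be prime"
    assert n != p, "anomalous curve"
    assert h <= 8, "cofactor suspiciously large"
    # MOV/FR: the embedding degree must not be small, or the DLP
    # transfers to a finite field where index calculus applies.
    for k in range(1, 101):
        assert pow(p, k, n) != 1, "small embedding degree"
    # A fuller validator would also check n*G = identity, which
    # needs point arithmetic omitted here.
    return True

# NIST P-256 as a known-good example.
P = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
A = P - 3
B = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b
GX = 0x6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296
GY = 0x4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5
N = 0xffffffff00000000ffffffffffffffffbce6faada7179e84f3b9cac2fc632551
```

None of that needs curve-specific tuned code; it's all generic modular
arithmetic.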

Because I think it's silly for very many people to all be
using the same curve for long periods of time.

As I see it, many people using the same curve all the
time makes that curve into a target.  You may be able to
make a tuned software implementation 30% faster, but
that's a huge net LOSS in security.

If we take djb's premise at face value, and frankly I think
he's right, our attacker model should be the cost PER KEY
of breaking any key in a very large batch.  If everybody's
using the same curve, the same computation searches for
everybody's keys.
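To put a rough number on that: the known batch techniques (Kuhn-Struik
style batched Pollard rho) solve k discrete logs in ONE group of order
n in roughly sqrt(2*n*k) group operations total, so the per-key cost
falls like sqrt(n/k), while distinct groups force a fresh rho run of
about 0.886*sqrt(n) per key no matter how many keys there are. A
back-of-envelope model (my function names, constants approximate):

```python
import math

def per_key_shared(n, k):
    """Per-key cost when all k keys live in one group of order n:
    batch rho solves them in ~sqrt(2*n*k) ops total (constant is
    approximate), so the per-key cost is sqrt(2*n*k)/k."""
    return math.sqrt(2 * n * k) / k

def per_key_distinct(n, k):
    """Per-key cost with a distinct group per key: one independent
    Pollard rho run of ~sqrt(pi*n/4) ops, independent of k."""
    return math.sqrt(math.pi * n / 4)

n = 2**256  # ballpark order of a 256-bit group
```

The attacker's per-key advantage from a shared curve grows like
sqrt(k), which is exactly the "cost PER KEY of a very large batch"
point above.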

A tuned-code implementation of a single curve that is 30%
faster is a net loss in security, because it gives an
opponent a stationary target (which doesn't change for
DECADES) worth a billion times the value of any one
user's traffic.  It gives an opponent the means and
justification to have already done extensive precomputation
and compiled large precomputed tables, and it justifies for
your opponent the expense of creating hardware
implementations that are 4,000% faster than your tuned
software implementation and buying truckloads of them.

It is far better to force the opponent onto general-purpose
hardware and software, giving them no opportunity or
justification for calculating and storing the huge datasets
required to speed up breaking keys on a single curve.

So why aren't users all coming up with new curves on
a regular basis, especially when creating new high-value
keys?  It may be too compute-intensive to do as part of a
handshake for a transient key, but for LONG-term keys, or
for a server key that lasts a week or more, why should
any two of them be on the same curve?
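Granted, generating a fresh curve at real sizes means counting points
with Schoof-Elkies-Atkin, which is exactly the compute cost in
question. But the shape of the procedure is simple. Here's a toy-field
sketch (brute-force counting stands in for SEA, names are mine):

```python
import random

def count_points(p, a, b):
    """Brute-force |E(GF(p))| for y^2 = x^3 + a*x + b.  Toy sizes only;
    real parameter generation uses Schoof-Elkies-Atkin instead."""
    sqrt_counts = {}  # value -> number of y with y^2 = value (mod p)
    for y in range(p):
        v = y * y % p
        sqrt_counts[v] = sqrt_counts.get(v, 0) + 1
    total = 1  # the point at infinity
    for x in range(p):
        total += sqrt_counts.get((x**3 + a * x + b) % p, 0)
    return total

def is_prime(n):
    """Trial division; fine for toy group orders."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def random_prime_order_curve(p, seed=1):
    """Sample (a, b) until the curve is nonsingular, non-anomalous,
    and has prime group order."""
    rng = random.Random(seed)
    while True:
        a, b = rng.randrange(p), rng.randrange(p)
        if (4 * a**3 + 27 * b**2) % p == 0:
            continue  # singular, try again
        n = count_points(p, a, b)
        if is_prime(n) and n != p:
            return a, b, n

a, b, n = random_prime_order_curve(1009)  # a handful of O(p) trials
```

By Hasse's theorem the order lands within 2*sqrt(p) of p+1, and a
decent fraction of random curves have prime order, so the loop
terminates quickly; the per-trial cost is what SEA makes tolerable at
cryptographic sizes.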

				Bear


