[Cryptography] IETF discussion on new ECC curves.

Trevor Perrin trevp at trevp.net
Thu Jul 31 15:25:14 EDT 2014


On Thu, Jul 31, 2014 at 9:12 AM, Phillip Hallam-Baker
<phill at hallambaker.com> wrote:
> On Wed, Jul 30, 2014 at 4:42 PM, Bear <bear at sonic.net> wrote:
>>
>>> On Mon, Jul 28, 2014 at 7:48 PM, Bear <bear at sonic.net> wrote:
>>> > On Sat, 2014-07-26 at 14:32 -0400, Phillip Hallam-Baker wrote:
>>
>>> > So the first Fermat-test prime number below 2^512 is 2^512 - 569? It's
>>> > a nice nothing-up-my-sleeve number, anyhow. What's the problem with it?
>>> > Are there some requirements I don't know?
>>
>>> Yep, that is exactly what Microsoft did. The problem is that it is not
>>> exceptional speed wise. The fast moduli are 2^521 and 2^480.
>>
>> Despite the non-exceptional speed, 512 = 2^9 is THE next
>> nothing-up-my-sleeve number for a bit width, and 2^512 -
>> 569 is therefore THE next nothing-up-my-sleeve number for
>> an exponent modulus.
>>
>> The minute there is debate, there is reasonable suspicion
>> that someone is trying to influence the debate for purposes
>> of subverting security.
>
> Exactly.
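The claim quoted above (that 2^512 - 569 is the first prime below 2^512) is easy to replay mechanically. A minimal sketch in Python, using a Miller-Rabin test with a few fixed bases (a strengthened form of the Fermat test mentioned above; probabilistic at this size, not a proof):

```python
def is_probable_prime(n, bases=(2, 3, 5, 7, 11, 13, 17)):
    """Miller-Rabin test with fixed small bases."""
    if n < 2:
        return False
    # Write n - 1 as d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True

# Search downward from 2^512 for the first probable prime.
n = 2**512 - 1
while not is_probable_prime(n):
    n -= 1

print(2**512 - n)  # -> 569
```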

There's always room for debate.

E.g. why are you pushing so hard for 2^512-569?  What do you know
about it I don't?  Why do you want the high-security curve to be a
slow one that will be used less than a faster one?

And why primes with efficient reduction?  Aren't even slower random
primes THE choice for avoiding suspicion (like Brainpool)?

(Though of course "random" primes require a verifiably-pseudorandom
process, so more to debate there too, see BADA55....)


>> Gratz to Microsoft, whatever their past sins, for making a
>> recommendation that marks them out as DEFINITELY not playing
>> that role in the current round.
>
> It's important to remember that organizations are made up of people. In
> this case the person driving this is Brian LaMacchia who is well known
> in the community. He set up the PGP key server most people still use
> at MIT.
>
> It also goes the other way. Saying 'I trust the NSA' is ridiculous
> because it is an organization made up of people.

IMO we should focus on the curves instead of the people involved.

Many people (like myself) think it's unlikely that NIST/NSA backdoored
those curves.  There's no known algorithm that could have done so.
Even if you hypothesize an undiscovered 1-in-a-trillion "bad curve"
property, NIST/NSA choosing curves to match this property would leave
government systems vulnerable to any adversary who discovered that
property.

Many people are interested in new curves not due to the backdoor
issue, but due to the possibility of higher performance, including
high performance at "extra-strength" (>>128 bit) security levels.


> At the IRTF meeting I suggested we split the baby and adopt 2^255-19
> for fast encryption and 2^512-569 for signature and high security
> encryption. I am thinking that is the right approach.

I don't think you should make a decision ignoring performance
criteria.  There's not much reason to think that a 2^256 security
level will provide security over a longer or shorter duration than
2^224 or 2^260.  At that level you're worried about unpredictable
breakthroughs, e.g. quantum computing, which would destroy the
security of all of these.  Moore's law will never get there.

There's also not much reason to think that fixating on a particular
bit length and prime form is less arbitrary than choosing based on
objective performance criteria, IMO.
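For reference, the security levels being compared follow from the best generic discrete-log attack (Pollard's rho), which costs roughly the square root of the group order, i.e. about half the field size in bits. A quick back-of-the-envelope check for the curve sizes under discussion:

```python
import math

def rho_security_bits(field_bits: int) -> float:
    """Approximate cost, in bits, of Pollard's rho against a curve
    over a field of `field_bits` bits: about sqrt(pi * n / 4) group
    operations, where the group order n is close to the field size."""
    n = 2.0 ** field_bits
    return math.log2(math.sqrt(math.pi * n / 4))

for bits in (255, 448, 512, 521):
    print(bits, round(rho_security_bits(bits), 1))
```

So a 2^512-ish field buys roughly a 2^256 work factor, and Curve25519 roughly 2^128, which is where the numbers in this thread come from.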


> DJB does have some deployment momentum for his curve. There is a switching cost.
>
> If someone is worried that the NSA has bongoed the curves or has a way
> to bongo them then they should be using the 2^512 curve and not
> Curve25519. It is pointless worrying too much about the minutiae of
> lower security options.

That doesn't follow, and is unfair to 25519.  25519 was chosen based
on objective performance and rigidity criteria.


> Since I am doing PKI and static encrypted data, I have long term
> security concerns and any decision I take now will have long term
> consequences. I want 50+ years security on stored data keys and at
> least 20+ years on PKI key signing keys.
>
> So for me 2^512-569 is the logical choice.
>
>
> For the TLS group, there is a big performance issue and so a 2^128
> work factor is defensible, particularly for PFS keys.

Performance is an issue for many systems, not just the web; think
mobile or embedded, for example.

And these systems would *also* like to have a viable "extra-strength"
curve option.

An extra-strength curve that is twice as fast (which is the current
Goldilocks vs 2^512-569 ratio) would be applicable in more places.


> I would even
> defend deriving a 256 bit key for AES 256 from a 256 master secret and
> a 128 bit PFS mixin.

I'm not sure what key agreement you're imagining.  For long-term
confidentiality, there is a small argument that one might want
*larger* ephemeral DH keys compared to identity keys, as identity keys
are probably replaced more quickly, and confidentiality might have to
last for decades.
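One way to read the proposal above is as mixing a long-term 256-bit secret with a 128-bit ephemeral contribution through a KDF. A minimal sketch using HKDF (RFC 5869) over SHA-256; the variable names and the choice of which input plays salt vs. keying material are illustrative, not a proposed protocol:

```python
import hmac, hashlib, os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes,
                length: int = 32) -> bytes:
    """HKDF (RFC 5869) extract-then-expand with SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()   # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                             # expand
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Hypothetical illustration of the scheme discussed above: a 256-bit
# master secret combined with a 128-bit per-session ("PFS") mixin.
master_secret = os.urandom(32)   # 256-bit long-term secret
pfs_mixin = os.urandom(16)       # 128-bit ephemeral contribution

aes256_key = hkdf_sha256(ikm=master_secret, salt=pfs_mixin,
                         info=b"example AES-256 session key")
assert len(aes256_key) == 32
```

Note the derived key is only as strong as its inputs: once the master secret is compromised, forward secrecy rests entirely on the 128-bit mixin, i.e. a 2^128 work factor, which is exactly the trade-off being debated.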


Trevor

