[Cryptography] Opinions on signatures algorithms for post-quantum crypto?

Jerry Leichter leichter at lrw.com
Tue Dec 8 06:28:40 EST 2015

> I read the pseudo-code for NTRU.  I have not attacked this problem enough to get a feel for it, but my dumb arm-chair crypto knee-jerk reaction is that it is a bit scary.  There are plenty of NP-complete problems where it is difficult to state an instance that is hard to solve, such as graph isomorphism.
Ahem.  Babai has recently claimed a proof that graph isomorphism can be solved in quasipolynomial time - which in particular would rule out its being NP-complete, barring a major collapse.  This is a big enough result to have hit the mainstream press.  One link:  http://people.cs.uchicago.edu/~laci/quasipoly.html

>   This particular NP-complete problem looks harder, but I am concerned...
NTRU is "not known to be vulnerable to quantum attacks" - a much weaker statement than "NTRU is known not to be vulnerable".  There is surprisingly little literature on NTRU, given how long it's been around.  Most of the papers that show up on a quick search are about fast implementations - which is nice, but irrelevant from the point of view of security.
Jaulmes and Joux http://www.iacr.org/archive/crypto2000/18800021/18800021.pdf claimed a polynomial-time chosen-ciphertext attack.  That was quite some time ago, and presumably the scheme was modified to counter the attack.

http://dimacs.rutgers.edu/Workshops/Post-Quantum/Slides/Silverman.pdf is a good overview.  It includes the fact that by 2006 an attack (the transcript attack) was known that destroyed the then-extant version of the system.  It was modified again, and the new version is provably secure.

The general claims about NTRU follow an RSA-like pattern:

1.  The underlying problem is hard (much harder, in fact, than factoring)
2.  There's no known reduction from the cryptosystem to the hard problem, but we think there ought to be one.  (Unlike with RSA, some proposed variants *have* been shown not to have such reductions - i.e., they were attackable.)
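For readers who haven't looked at NTRU's setting: the system works with polynomials in the ring Z_q[x]/(x^N - 1), and its core operation is "star" multiplication, i.e., cyclic convolution of coefficient vectors mod q.  A minimal sketch (toy parameters only, far too small for security - this illustrates the arithmetic, not the cryptosystem):

```python
# Toy illustration of the ring Z_q[x]/(x^N - 1) underlying NTRU.
# N and q here are hypothetical toy values; real parameter sets use
# much larger N (hundreds) and q.
N, q = 7, 32

def star(a, b):
    """Star multiply: cyclic convolution of length-N coefficient
    lists a and b, with coefficients reduced mod q.  Exponents wrap
    around because x^N == 1 in this ring."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Sanity check: x * x^(N-1) == x^N == 1 in the ring, since x^N - 1 == 0.
x = [0, 1] + [0] * (N - 2)        # the polynomial x
x_pow = [0] * (N - 1) + [1]       # the polynomial x^(N-1)
print(star(x, x_pow))             # → [1, 0, 0, 0, 0, 0, 0]
```

Key recovery in NTRU amounts to finding short polynomials in this ring satisfying a star-product relation, which is where the lattice problem comes in.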

It's really hard to know how NTRU will hold up.  The pattern of successful attack followed by modification to avoid the attack has in the past often been an indicator that the underlying idea is unsound:  You close off one attack, and another appears a few years later.  Then again, it's not clear how else you might develop a secure system.  DH has survived multiple rounds of this kind of back and forth; ECC less so. (RSA has proven remarkably robust, in some senses.)

The fact that so much of what's out there spends so much time saying "NP-complete!" and then selling the speed and simplicity and small sizes is ... worrying.

Given who the NSA has proven to be, one needs to be very concerned about the way they are pushing cryptography.  Just as we're starting to get a handle on some of the subtle issues in ECC - e.g., that the choice of curve can have fundamental effects on the difficulty of writing correct, secure implementations - the NSA comes along and says "OK, time to move on to all-new algorithms" whose implications no one (or perhaps NOBUS - No One But Us) really understands, pushing us back into complexity and uncertainty.  Are they pulling us out of the quicksand or pushing us into another trap?
                                                        -- Jerry