[Cryptography] Which big-name ciphers have been broken in living memory?

ianG iang at iang.org
Sun Aug 17 07:53:14 EDT 2014


On 17/08/2014 09:37 am, Philipp Gühring wrote:
> Hi,
> 
>> Do we need NIST or IETF to put the dragonglass blade into them?  An RFC
>> that lists deprecated algorithms, updated on a yearly basis?
>>
>> (That's a serious question, btw.  As far as I know, they don't have an
>> answer to the overall question...)
> 
> I thought that the standardisation institutes were responsible for that.


What are the national standardisation institutes responsible for?
Seriously?

Not the net:

On 11/08/2014 19:03 pm, Jon Callas wrote:
> If you want to do serious work in Russia (banking, government)
> you must do a number of GOST things. If you want to work in
> Korea, you need to do SEED. If you want to work in Japan, you
> don't *have* to do Camellia etc., you can do AES, it's just
> that they'll pick a product that has those over yours.



I think we are all guilty of assuming that their interests have an
unchallengeable connection with the interests of the global community.

And guilty of letting them tell us what to do.  How many people do you
know who stood up and said NIST was wrong about the key lengths for CAs?

We assume too much.  We know what the net requires; they do not.  We
have a job to do, so why don't we do it?


> Isn't NIST regularly publishing what they currently think is acceptably
> useable?


NIST seems to publish a very short list.  http://www.keylength.com/en/4/
shows two-key and three-key Triple DES (which they call TDEA) and AES.

What they don't say anything about is everything else:  RC4?  MD5?
Camellia?  Skipjack?  IDEA?  Maybe the sense of this thread is that RC4
is still a big name because nobody bigger has told people to stop using it?


> In Austria, I think I regularly heard statements about the current
> viability and expiration plans of various algorithms and keylengths. (for
> commercial and governmental applications)


Yes.  Back in 2000 or so, Lenstra and Verheul published their now
classic paper, which drew a connection between the strengths of the
different types of algorithms.  This made it tractable to think in bit
numbers, which even managers could reason with.  "128 bits of strength"
now made sense.

The result of that is a series of national statements on what each
country thinks are the appropriate lengths.  For the rest of us, there
is http://keylength.com .
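As a rough illustration, the equivalences can be put in a few lines.  The
numbers below are the ones NIST SP 800-57 tabulates (and keylength.com
reproduces); treat them as planning figures, not law, and the helper
function is my own sketch:

```python
# Approximate strength equivalences between algorithm families, per
# NIST SP 800-57 Part 1.  Rough planning numbers only.
EQUIVALENTS = {
    # symmetric bits: (RSA/DH modulus bits, ECC field bits)
    80:  (1024, 160),
    112: (2048, 224),
    128: (3072, 256),
    192: (7680, 384),
    256: (15360, 512),
}

def rsa_size_for(symmetric_bits):
    """Smallest tabulated RSA modulus giving at least the requested strength."""
    for sym, (rsa, _ecc) in sorted(EQUIVALENTS.items()):
        if sym >= symmetric_bits:
            return rsa
    raise ValueError("beyond the table")

print(rsa_size_for(128))  # -> 3072
```

This is exactly the tractability point: a manager can be told "128-bit
security" and the table translates that into concrete key sizes per family.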

But that is about key lengths, assuming that the sizes of the keys are
reasonable proxies for the strength of the result.  This is probably a
good starting point if the algorithm is unchallenged, e.g., it is
big-name in the sense I was using the term.

(Yes, we don't have a really good title for that as yet.)


> Expirations are usually planned and notified ahead of time there.

Ah.  Tell me about the SSL process for that?  Or any IETF process?

> Recently I received a letter stating that they had to revoke a certificate
> in my citizen-card (smartcard), since the algorithms that were used for it
> had expired now, and that I would have to get a new card now if I wanted a
> new certificate.


:)  So, let's say they were doing a responsible design effort to produce
your next smartcard citizen's ID.  They would want to pick algorithms
with around 15 years left to go after rollout starts:  10 years for the
card to match the passport, issued over the following 5 years.  Now add
5 years before that for the rework.

So they need a 20 year timeframe.

But, if they are following national standards, they would likely then
have to add another 10 years to that, because that organisation has its
own lifecycle issues.

Which becomes tougher still once you consider the smartcard's
limitations:  you need a public-key algorithm that is strong out to 25
years and that will run fast on a smartcard?
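For what it's worth, the arithmetic above can be laid out explicitly.  The
figures are the ones from this post, not anyone's official numbers:

```python
# Back-of-envelope algorithm-lifetime budget for the citizen-card
# example; every number here is an assumption taken from the text.
rework   = 5    # design / rework before rollout begins
rollout  = 5    # years over which cards are issued
validity = 10   # card must match the 10-year passport

card_horizon  = rework + rollout + validity   # the 20-year timeframe
standards_lag = 10                            # the standards body's own lifecycle
total         = card_horizon + standards_lag

print(card_horizon, total)  # -> 20 30
```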


> But since those are all just national standards, perhaps it's really a
> good idea to write an RFC about it and update it yearly, yes.


If the IETF had a consensus about this, they would, yes; they would
have done it ages ago.

Problem is, the IETF has a consensus to keep multiple algorithms in
play, probably because of the game-theory aspects of securing rough
consensus.  What they do not have is any consensus on retiring old
stuff, probably for the same reason.

Meanwhile, your poor sysadmin has to trawl through openssl config
strings because he's not actually represented in the consensus of IETF
WGs.  It's his own fault, of course; he could always join, he knows how
to use email ;-)
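To make that concrete, here is a minimal sketch of the kind of
config-string trawling in question, using Python's stdlib ssl bindings to
OpenSSL.  The cipher-string syntax is OpenSSL's; the particular policy
string is my own hypothetical "no RC4, no MD5, no 3DES", not a
recommendation:

```python
import ssl

# A hypothetical deprecation policy expressed as an OpenSSL cipher
# string; which ciphers actually survive depends on the local build.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.set_ciphers("HIGH:!aNULL:!RC4:!MD5:!3DES")

# List what remains enabled on this machine.
for cipher in ctx.get_ciphers():
    print(cipher["name"])
```

The sysadmin's problem in a nutshell:  the RFCs keep the algorithms in
play, and it falls to whoever writes that string to retire them.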



iang


