[Cryptography] recommending ChaCha20 instead of RC4 (RC4 again)

ianG iang at iang.org
Sat Mar 15 11:31:05 EDT 2014


On 13/03/2014 22:43, Bill Frantz wrote:
> On 3/13/14 at 6:19 AM, dan at geer.org wrote:
> 
>> Let's stipulate that you are entirely correct.  How do we react if
>> we are to learn the lessons of history, etc.?  Can a lack of
>> speedups-to-come be itself relied upon enough to factor that into
>> design decisions yet to be made, such as to put aside any need to
>> design in resistance to a sped-up future or to demand specialized
>> chipsets for devices that will have no remote management interface?
> 
> First, we get no relief from the danger of exhaustive search. It is
> trivial to parallelize.
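
(An aside on why that is: exhaustive key search partitions the keyspace
into independent slices, so every extra core buys a roughly linear
speedup.  A minimal sketch in Python, assuming a hypothetical
check_key() oracle standing in for a trial decryption and a toy 24-bit
keyspace:)

    from multiprocessing import Pool

    KEY_BITS = 24            # toy keyspace; real ciphers use 128+ bits
    SECRET_KEY = 0xABCDE     # the key our pretend attacker is hunting

    def check_key(k):
        # Hypothetical stand-in for "trial-decrypt and test for sane
        # plaintext"; here it just compares against the planted key.
        return k == SECRET_KEY

    def scan_slice(bounds):
        # Each worker walks its own slice of the keyspace; no
        # coordination is needed beyond collecting the results.
        lo, hi = bounds
        for k in range(lo, hi):
            if check_key(k):
                return k
        return None

    if __name__ == "__main__":
        workers = 8
        total = 1 << KEY_BITS
        step = total // workers
        slices = [(i * step, (i + 1) * step) for i in range(workers)]
        with Pool(workers) as pool:
            hits = [k for k in pool.map(scan_slice, slices)
                    if k is not None]
        print("recovered key:", hex(hits[0]) if hits else "not found")

Doubling the worker count roughly halves the wall-clock time, which is
the point: brute force soaks up whatever hardware the attacker can buy.
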
> 
> If we are interested in security, then we must (a) be willing and
> financially able to throw away the device, (b) be able to upgrade it, or
> (c) be willing to lose security. The cynic in me says we will always
> choose (c), at least until we have been personally burned.

until we've been personally Feinsteined, yes.

It is for this reason that storytelling is so important.  Humans assess
risk on the basis of what happens to their friends.  If enough of them
get hit, then the personal probability assessment rises.  If not enough
of them are hit, then the risk is considered remote.

I suspect that this is essential, because what other objective metric is
there that doesn't get perverted or misused by someone or other?



To make that point, over on Crypto-Gram today, there are a couple of
unintended back-and-forths between Schneier and Doctorow over the "where
are we on the NSA/GCHQ thing?" question, which led the latter to this
analogy:

http://www.theguardian.com/technology/2014/mar/11/gchq-national-security-technology

“But: If I had just stood here and spent an hour telling you about
water-borne parasites; if I had told you about how inadequate
water-treatment would put you and everyone you love at risk of
horrifying illness and terrible, painful death; if I had explained that
our very civilisation was at risk because the intelligence services were
pursuing a strategy of keeping information about pathogens secret so
they can weaponise them, knowing that no one is working on a cure; you
would not ask me ‘How can I purify the water coming out of my tap?’”



Which is to say that we have failed as an industry, and our normal
metrics and objectives lie in tatters, so much so that we allowed a bad
actor to weaken our infrastructure so badly that it is beyond our
capability to roll back the damage.  This bad actor has now taken on
viral proportions and become the pathogen of strongest intent, such
that even we technically competent people are not capable of defeating
it or protecting ourselves, nor is there easy advice for grandma to
keep her house secure (or, more topically today, easy advice for one's
teenaged daughter to keep her nudity from being hoovered up by GCHQ).

Doctorow's point is that when individual defence fails across general
society, because of those viral proportions, the solution is "public
health": nationally administered and imposed protection, because a
free-choice approach does not account for infection.  (He doesn't
really explore what happens when the pathogens are the national
administration.)



Back to the personal Feinstein moments of individuals, when their
embedded devices fail at the behest of attackers whom we can now
identify as likely either inside the supply chain or outside the supply
chain, a complete set.

If people stop believing in institutions such as standards bodies,
certification bodies, and governments, the question is, what or whom
will they trust?  And what could actually deliver that trust?

It seems that without a good answer to that, there isn't much point in
choosing one technical approach versus another.  To me at least.  It is
possible to impress techies with tech, but ordinary people don't think
that way.



iang

