[Cryptography] Plan to End the Crypto War

Jerry Leichter leichter at lrw.com
Sat Jan 9 08:30:45 EST 2016


> With key split: 100 times every day (365/year) the 5 of the 10 people give their part to the (validated) requestor. What are they going to do now? The requestor has the single golden key. The part holders must watch how the golden key is used, and destroyed, to ensure it was used for the purpose it was intended and not stolen....
Am I the only person who's even *looked* at the paper?  If we're talking about Chaum's proposed system, it works *nothing like this*.  The nodes in the system reveal pieces of information that identify a *single user* of the system, and the messages he's sent. There's no single "golden key" which, once revealed, opens up the entire system.
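(For the record, the 5-of-10 "key split" in the quoted scenario is classic threshold secret sharing, not what Chaum's paper proposes. A minimal sketch of that escrow model - hypothetical, illustrative only, using Shamir's scheme over a prime field - shows why it produces a single reconstructable "golden key":)

```python
# Sketch of the quoted 5-of-10 escrow model (NOT Chaum's protocol):
# one master key split so any 5 of 10 holders can rebuild it in full.
import random

P = 2**127 - 1  # prime modulus; all arithmetic is in GF(P)

def split(secret, k=5, n=10):
    """Return n shares of `secret`; any k of them suffice to reconstruct."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret exactly."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = random.randrange(P)
shares = split(key)
# Any 5 holders together hand the requestor the whole key...
assert reconstruct(random.sample(shares, 5)) == key
```

Once any quorum cooperates, the requestor holds the complete key; nothing in the mathematics scopes the reveal to one user or one message. Chaum's nodes, by contrast, jointly reveal only the trace for a single targeted user.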

You might ask why users can't simply establish new identities.  That's actually another part of the assumptions:  That the nodes only allow the participation of users who provide "good identities":  The paper suggests mobile phone numbers, photos, and so on.  These are combined (by all the nodes - the user presents partial information to each node) to create the identity that the system uses; you can't just make up your own.  The *assumption*, of course, is that users can't effectively lie, and that the information available to the nodes will always effectively identify them.  Government "Internet access license", anyone?

The algorithms and techniques look nice, but the whole business with accountability really only works under the assumption that someone puts together a service that "follows the rules" - *and everyone, particularly all the bad guys, chooses to use that service rather than simply implementing their own*.  Granted, that does provide *some* degree of protection:  If *most* people use the government-approved system, anyone trying to reach mass audiences (spammers?) will be unable to do so anonymously.  But it seems like quite a lot of government involvement for relatively little effect.  The bad guys want to talk anonymously *to each other*, and will no more choose to use the government-approved system than they would choose to simply talk openly.

Note the difference between *anonymity* and *privacy*.  You could also have said, back in Clipper days, that the bad guys wouldn't use Clipper, they'd use their own stuff.  But if the vast majority of people had used the encryption delivered by Clipper chips in their consumer goods, the bad guys would be forced to use it at all their touchpoints with the outside world.  Yes, they would have secure communication among themselves, but when they went to rent a truck or buy a plane ticket, they would have to use a back-doored mechanism.

It's clear that this has been NSA's driving strategy since then:  Grab as much as you can from everywhere you can, because even if the bad guys do manage to exchange secure messages among themselves, they will have to also rely on (insecure-to-NSA) systems to actually *do* anything in the real world.

There's some validity to this approach, because everyone needs privacy in many of their messages.  But *anonymity* isn't really something most people want or need most of the time - and, in particular, most people almost never need or even *want* to receive anonymous messages.  The real-world interactions that most communications mediate inherently require that the parties identify themselves to each other.  There's not much point in buying something on Amazon if you don't provide them with sufficient information for them to deliver it to you!  (Mix networks for packages, perhaps?)  So it's hard to see how a government-approved anonymity system would find many customers.

The "accountable" part of Chaum's "accountable anonymity" strikes me *mainly* as a solution in search of a problem.
                                                        -- Jerry
