[Cryptography] North Korea and Sony
leichter at lrw.com
Thu Dec 11 16:44:37 EST 2014
On Dec 11, 2014, at 10:22 AM, dan at geer.org wrote:
> Jerry, you and I are in violent agreement that there is enormous
> fragility in the buildout of systems as presently deployed.
> So many things come to mind, the most pithy being Mike O'Dell's
> comment that "Left to themselves, creative engineers will deliver
> the most complicated system they think they can debug."
There's also the quote - I think from one of the Unix designers - that programmers build the most complex systems they can understand. Since debugging is widely understood to be much harder than programming, this means we regularly build systems we aren't smart enough to debug.
> The financial services industry has proven Mike's contention at scale, viz., that we humans can (will), in general, build systems too complex to then operate.
Good engineers look at systems that are too complex and think about how to make them better. Financial guys look at systems that are too complex and think "I see ways to make money off that."
Many years ago, a friend was starting out as a lawyer dealing with the financial industries. He described multiple ways that people had found to make money by exploiting loopholes in the laws and regulations. I came to a realization: "financial engineering and lawyering" and "software engineering" both have a large component that involves finding bugs in existing specifications and systems. The difference is that over on the finance side, finding loopholes is seen as a great way to make gobs of money, while over on the software side, it's seen as an opportunity to make the specs and systems better. Of course, on the software side we have the "bad hackers". But few of us celebrate them.
> Let us hope that mandated electronic health records plus
> 15K hospitals plus three dozen Federal agencies plus all the world's
> reinsurers will not re-prove that same contention.
The rollout of Obamacare is certainly not a reason for optimism.
Obamacare also revealed an old, but recently growing, problem with the perception of what software developers can actually do. A huge system that will deal with millions of people to implement a complex, sometimes badly written law - to be rolled out in one big bang in 18 months? Sure, no problem. A system that will block "inappropriate material" online, without blocking what shouldn't be blocked? "If they can figure out a way to do X, surely they can do *this*. Let's mandate it...."
> But as to Geithner and stress tests, my last column(*) was precisely
> on that point and I did, indeed, steal directly from the financial
> market experience in proposing a prescription for national-scale
> "digital life."
> Stress Analysis
Indeed, we're on the same page. But ... the largest Internet players do this kind of thing already. Netflix has its Chaos Monkey, which goes out and randomly breaks things on the live infrastructure, just to test that the overall system will survive. Google does exercises every year in which "bad things" happen - e.g., an earthquake knocks out communications with Mountain View. Having seen the Google exercises close up ... they are taken very seriously, and help make the overall system much more resilient.
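The core of the Chaos Monkey idea is small enough to sketch: pick members of a live fleet at random and kill them, so that any hidden single points of failure surface during working hours rather than at 3 AM. A minimal illustration in Python (the fleet names and kill function are hypothetical stand-ins; a real deployment would call the cloud provider's termination API and respect opt-in groups and schedules):

```python
import random

def chaos_monkey(instances, kill_probability, rng=None):
    """Choose instances to terminate, Chaos Monkey style.

    Each instance independently "fails" with kill_probability.
    Returns the list of instances selected for termination;
    actually killing them is left to the caller.
    """
    rng = rng or random.Random()
    return [inst for inst in instances if rng.random() < kill_probability]

# Hypothetical fleet of web-server instances.
fleet = ["web-%02d" % n for n in range(20)]

# Seeded RNG so a drill can be replayed deterministically.
victims = chaos_monkey(fleet, kill_probability=0.25, rng=random.Random(42))
for name in victims:
    print("terminating", name)  # in production: call the cloud API here
```

The point of the exercise, of course, is not the selection logic but what happens next: if the overall service degrades when a few randomly chosen instances vanish, the architecture has a fragility that would otherwise have waited for a real outage to reveal itself.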
One wonders, however, whether the willingness and cultural ability to embrace such exercises is tied to a certain kind of "by the seat of your pants" Internet way of doing things. How would deliberately induced chaos for testing and improvement fit into a top-down, control-everything culture like Apple's? I would think not well. Are there alternate models that *would* work? Does Apple have an approach? (It's not the kind of thing they would *ever* talk about.)