[Cryptography] The Laws (was the principles) of secure information systems design

Jon Callas jon at callas.org
Tue Jul 19 12:01:02 EDT 2016




> On Jul 12, 2016, at 1:30 PM, Peter Fairbrother <peter at m-o-o-t.org> wrote:
> 
> I've been revising the principles, and came up with this. It's an early version.
> 
> As ever, corrections and suggestions are welcome.
> 
> Calling them Laws is perhaps a bit overreaching - but on reflection I thought that's mostly what they are, break them and the system won't be secure.

I agree that it's overreaching. They're not laws, not in the way that a physicist or mathematician would consider laws. Relatively few are actionable. They're far more like aphorisms.

> 
> I will put the Laws up on the 'net shortly, hopefully with a link for suggestions and comments.
> 
> 
> 
> The Laws of secure information systems design:
> 
> 
> 
> Law 0: It's all about who is in control

Is it? I know what you mean, and yet I have no idea what you mean. 

> 
> Law 1: Someone else is after your data

Well, Robert Morris Sr. had a fantastic principle that security boils down to a 2x2 matrix of how much someone else wants your data, and how much you care if it's known. If you don't care about something and they don't care, then we have something close to a no-op. If you don't care and they do, then they're going to get it and you don't care. If you care and they don't, you're wasting effort -- and this is itself interesting. It's only in the quadrant where they care and you don't want them to get it that things get *really* interesting. Part of good security is learning to divert effort away from protecting the things they don't care about and toward the things they do.

In any event, this law seems to imply that only that interesting quadrant exists.

> 
> Law 2: If it isn't stored it can't be stolen

That's clever, but the more I think about it, the less I understand it. What does "store" mean? Is a value in a register stored? Is data in transit stored? I think I can argue that they both are and that it isn't unreasonable to think that everything is stored. If everything is stored, then the law is meaningless because it becomes a false-implies-true tautology.

It also seems to be in contradiction to your #13.

> 
> Law 3: Only those you trust can betray you

I know what you're trying to say. I'd think something better might be, "talking securely to a snitch is still insecure." Or something like that.

> 
> 
> 
> 
> Law 4: Attack methods are many, varied, ever-changing and eternal

Is this any different than the aphorism that the defender has to protect everything, but the attacker only has to win once?

Personally, I despise that particular aphorism because it is to me defeatist. It seems to basically say, "why bother, the attacker has already won."

But each of them, your #4 and the one I reference, is to me trying to say:

Attackers adapt, nature doesn't. Friction doesn't develop countermeasures to lubricants.

> 
> Law 5: The entire system is subject to attack

Well, no. Your system has to have an attack model and part of the attack model is a declaration of what's out of scope. We assume that the attacker can't read your mind. We assume that a value kept in registers won't end up on the memory bus. We assume that pinned memory won't get swapped. 

Sometimes those assumptions are thwarted. For example, pinned memory might not be pinned in a crappy VM, the CPU might be emulated, the compiler might have optimized something away (like clearing the key). In such a case, the problem is real but the lesson is different.

The attacker will attack the weakest point. The attacker will do the least work to win.

> 
> Law 6: A more complex system has more places to attack

But it also has more places to defend.

Complexity gets a bad rap for some good reasons. But let's also remember the Einstein quote that something should be as simple as possible and no simpler. 

In my experience as a designer, complexity arguments are the first refuge of amateurs.

In Don Norman's "Living with Complexity" there is much wisdom. He discusses what he calls "Tesler's Law," for Larry Tesler: the complexity of a system remains constant. You can shuffle it around but you can't eliminate it. He notes that two systems lauded for their simplicity are the classic "Macintosh" GUI and the Unix command line. The GUI is a few knobs that do a lot of complex things simply, and the command line is a lot of knobs, each of which is simple but which in combination are both powerful and subtle. Each is both simple and complex, depending on your point of view.

> 
> Law 7: Holes for good guys are holes for bad guys too
> 

I think this can be tighter. Words like "holes" are pejorative to start with. No one, no one puts in holes for the good guys. Even the LEA people whom I mock as wanting a "magic rainbow unicorn key" don't want a *hole*.

> 
> 
> 
> Law 8: Kerckhoffs's Principle rulez! - usually...

Usually? And what do you mean? Why not just state, "the only part of your system that should be secret is the key" (Kerckhoffs's principle) and be done with it?

> 
> Law 9: A system which is hard to use will be abused or unused

Here's my equivalent: The most secure system is the one people will actually use.

Compare and contrast with complexity. Simple systems are often hard to use.

> 
> law 10: Design for future threats

Great advice. Completely non-actionable.

You have to work with the tools you have. You have to play the cards that are in your hand.

> 
> Law 11: Security is a Boolean
> 

Not really.

This is in contrast to the old, "Security is a journey, not a destination." Booleans are destinations.

There are plenty of security systems that are good for X but not for Y. Threat models again are key.

At the end of it, security in general and cryptography in particular is about adaptation and picking the right thing.

Let's also discuss other things like usability, scope, appropriateness to the threat, and so on.

"Security is not a boolean" is in my opinion a better law than "security is a boolean."

Worst of all, they're both great laws.

> Law 12: People offering the impossible are lying

The old aphorism, "if something sounds too good to be true, it probably is" is better.

> 
> Law 13: Nothing ever really goes away

Well. Lots of things do. I could list lots of them. I can also list a lot of things I wish would go away.

> 
> Law 15: "Schneier's law" [1] holds illimitable dominion over all... including these laws
> 

Wouldn't it be better to just quote "Schneier's Law" as you put it, or to just simplify that to "it's easy to make a system you can't break, but hard to make one that others can't"? 


Here's a few of my own principles:

The first rule of security is "You can't protect everything."

No compromises is a compromise.

Layering is important. 

There's no sense in strengthening the strong links of the chain.

You can land a man on the moon, but you can't land a man on the sun.

	Jon

> 
> -- Peter Fairbrother
> 
> [1] "Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break. It's not even hard. What is hard is creating an algorithm that no one else can break, even after years of analysis. And the only way to prove that is to subject the algorithm to years of analysis by the best cryptographers around."
> _______________________________________________
> The cryptography mailing list
> cryptography at metzdowd.com
> http://www.metzdowd.com/mailman/listinfo/cryptography

