[Cryptography] What do we mean by Secure?

Ray Dillinger bear at sonic.net
Mon Feb 9 22:55:41 EST 2015


"Security" means the parts of the system which can be used against
anyone who doesn't understand them by anyone who does.

On 02/09/2015 06:16 AM, Kent Borg wrote:

> It really *is* a hard thing to build a secure connected car. No one has
> done it, and until we can hash out contradictory expectations about what
> the proper properties of this car are, it will remain impossible. (Does
> this car have an app? Oh, hell, now the boundaries of the system BMW has
> to defend probably just exploded. Does the AAA have access? Do the cops
> have access? Does your mechanic have access? Does the authorized BMW
> mechanic have access? Does some BMW engineer have access? Does the
> engine computer port have access? Does the handsfree bluetooth gizmo
> have access? Do the CAN-connected brakelights have access? Does the
> finance company have access? What are the security questions to recover
> access for the owner?)


The fact that anyone is trying to answer all these questions AT THE
SAME TIME, and without discussing each with the car owners for an
extended period first, is probably an indication that failure is
imminent.

BMW's failures with ConnectedDrive are nothing short of comical, but
that's the only possible result, in security terms, of trying to
change a hundred different things at once.  The complexity of the
security model changed too fast and exceeded the understanding of
the users.

With most systems, features that the user does not understand and
does not personally use are harmless.  Security, on the other hand,
is more or less defined by the fact that every last feature the
users do not understand can be used against them by someone who
does.

And that applies to ALL the users.  The car owners had no idea, of
course, so the system they didn't understand could be used against
them by thieves who did.  But they're just the tip of the iceberg.

The implementers didn't understand their use cases, either, or
they'd have realized that knowledge of the VIN is not something
that identifies an authorized user.  The people making the error
messages did not understand even that (admittedly laughable)
security model, or they'd have known that disclosing the VIN
in every error message amounted to making an authentication token
available.  And that's two more cases of people who didn't
understand something, so it could also be used against them by
anyone who did.
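
To make the mistake concrete, here is a minimal sketch in Python.
The names and checks are purely illustrative, not BMW's actual
interface; the point is only the shape of the error:

    # Broken: treats the VIN, a value visible through the windshield,
    # known to dealers, mechanics, and finance companies, and echoed
    # back in every error message, as if it were a secret credential.
    def unlock_is_authorized_broken(request, car):
        return request["vin"] == car.vin

    # Minimally sane: the VIN only *identifies* the car.  Authorization
    # needs proof of possession of an owner-held secret, e.g. a
    # signature over a server-issued challenge, which error messages
    # never disclose.
    def unlock_is_authorized(request, car, verify_signature):
        return (request["vin"] == car.vin
                and verify_signature(car.owner_public_key,
                                     request["challenge"],
                                     request["signature"]))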

The people to whom this authority was being extended didn't
understand it either.  If they'd understood how their "app" worked
to extend that so-called authority to literally everyone in
possession of a cell phone and an understanding of the app, they
would have known that the authority they had just been given meant
nothing and was in fact dangerous, and they would have demanded its
immediate removal.  And that which they did not understand could,
once again, be used against them by someone who understood it.

BMW tried to change a hundred things all at the same time. People
with no experience (no bone-deep understanding) of newly-changed
parts of the system were trying to design, implement, and use other
new parts of the system.  The result was foreordained.

Understanding, as measured across such large groups, changes only
gradually, so if you change a security model suddenly, or in a
hundred ways at once, you will fail just as hard as BMW did.

The existing security model is simple: one person has access, and is
authenticated via a mature technology using a physical token at
least moderately hard to duplicate (a key).  That person may
extend access to others by deliberately and knowingly having a
key made for them.

This is a security model that the users UNDERSTAND!
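
For contrast, that whole model fits in a few lines (again a sketch
in Python, with illustrative names only):

    class Car:
        def __init__(self, first_key):
            # The buyer gets one physical key at purchase.
            self.valid_keys = {first_key}

        def unlock(self, key):
            # Access is exactly "possession of a valid key".
            return key in self.valid_keys

        def have_key_cut(self, existing_key):
            # The owner extends access deliberately and knowingly, by
            # presenting a key they already hold and having another
            # made.  There is no remote path into this set.
            if existing_key not in self.valid_keys:
                raise PermissionError("only a key holder may add keys")
            new_key = object()  # stand-in for a hard-to-copy token
            self.valid_keys.add(new_key)
            return new_key

Lose a key and you visit a locksmith; lend access and you hand over
a physical object.  Every step is something the owner can see and
reason about.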

That understanding puts BMW ahead of most software companies,
and they can arrive at a good solution much faster in the long
run if they move slowly enough to keep that advantage.

Make *one* change.  Wait until all the people who need to understand
it do.  Wait for the thieves to point out the problems with it and for
people to understand how thieves interact with it.  Wait until people
have had time to make whatever adjustments in behavior (or demands
for limits on it) they need to make.  Then you can make
another change.  Repeat. It may seem excruciatingly slow, but trust
me, it's faster this way.

Every security "feature" is a potential hole until its users fully
understand it.  And every security feature designed in the presence
of, or depending on, features not yet fully understood compounds
the problem.


				Bear


