the joy of "enhanced" certs

Leichter, Jerry leichter_jerrold at emc.com
Wed Jun 4 17:50:49 EDT 2008


On Wed, 4 Jun 2008, Perry E. Metzger wrote:
| As some of you know, one can now buy "Enhanced Security" certificates,
| and Firefox and other browsers will show the URL box at the top with a
| special distinctive color when such a cert is in use.
| 
| Many of us have long contended that such things are worthless and
| prove only that you can pay more money, not that you're somehow more
| trustworthy.
| 
| An object lesson in this just fell in my lap -- I just got my first
| email from a spammer that links to a web site that uses such a cert,
| certified by a CA I've never heard of ("Starfield Technologies, Inc.").
| Doubtless they sell discount "Enhanced Security" certs so you don't
| have to worry about paying more money either. I haven't checked the
| website for drive by malware, but I wouldn't be shocked if it was
| there.
| 
| I'm thinking of starting a CA that sells "super duper enhanced
| security" certs, where we make the company being certified sign a
| document in which they promise that they're absolutely trustworthy.
| To be really sure, we'll make them fax said document in on genuine
| company letterhead, since no one can forge letterhead.
This message, coming shortly after our discussion of trust, makes me
think of the applicability of an aspect of linguistic theory, namely
speech acts.  Speech acts are expressions that go beyond mere
communication to actually produce real-world effects.  The classic
example:  If I say
"John and Sarah are married", that's a bit of communication; I've passed
along to listeners my belief in the state of the world.  When a
minister, in the right circumstances, says "John and Sarah are married",
those words actually create the reality:  They *are* now married.

There are many more subtle examples.  A standard example is that of
a promise:  To be effective as a speech act, the promise must be
made in a way that makes it clear that the promiser is undertaking
some obligation, and the promiser must indeed take on that obligation.
There's a whole cultural context involved here in what is needed for
an obligation to exist and what it actually means to be obligated.
(Ultimately, the theory gets pushed to the point where it breaks;
but we don't have to go that far.)

In human-to-human communication, we naturally understand and apply the
distinction between speech acts and purely communicative speech.  It's
not that we can't be fooled - a person who speaks with authority is
often taken to have it, which may allow him to perform speech acts he
should not be able to perform - but this is relatively rare.

When exchanging data with a machine, the line between communication and
speech acts gets very blurry.  (You can think of this as the blurry line
between data and program.)  When I go into a store and ask for
information, I see myself and the salesman as engaging in pure
communication.  There are definite, well-understood ways - socially and
even legally defined steps - that identify when I've crossed over into
speech acts and have, for example, taken on an obligation to pay for
something.  When, on the other hand, I look at a Web site, things are
not at all clear.  From my point of view, the data coming to my screen
is purely communication to me.  From the computer's point of view, the
HTML is all "speech acts," causing the computer to take some actions.
My clicks are all "speech acts" to the server.  Problems arise when what
I see as pure communication is somehow transformed, without my consent
or even knowledge, into speech acts that implicate *me*, rather than my
computer.  This happens all too easily, exactly because the boundary
between me and my computer is so permeable, in a Web world.

Receiving an SSL cert, in the proper context (corresponds to the URL
I typed, signed by a trusted CA), is supposed to be a speech act to
me as a human being:  It's supposed to cause me to believe that I've
reached the site I meant to reach.  (My machine, of course, doesn't
care - it has no beliefs and has nothing at risk.)  The reason the model
is so appealing is that it maps to normal human discourse.  If my friend
tells me "I'll bring dinner," I don't cook something while waiting for
him to arrive.
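
To make that "proper context" concrete:  roughly speaking, a client
only treats a certificate as meaningful after two checks - the chain
must end at a CA the client trusts, and the name in the certificate
must match the host the user meant to reach.  The sketch below (in
modern Python, purely illustrative; the hostname is hypothetical and
none of this is part of the original discussion) shows where those
two checks happen:

    import socket
    import ssl

    def fetch_verified_cert(hostname, port=443):
        # Load the system's trusted CAs and enable hostname checking --
        # the "proper context" described above.
        context = ssl.create_default_context()
        with socket.create_connection((hostname, port)) as sock:
            # The handshake raises ssl.SSLCertVerificationError if either
            # the chain check or the name check fails; only when both
            # succeed is the certificate "believed" at all.
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                return tls.getpeercert()

    print(fetch_verified_cert("example.com")["subject"])

Only after both checks pass does the certificate function as a speech
act toward the human; everything up to that point is just bytes.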

Unfortunately, as we've discussed here many times, the analogy is
deeply, fundamentally flawed.  SSL certs don't really work like trusted
referrals from friends, and the very familiarity of the transactions is
what makes them so dangerous:  It makes it too easy for us to treat
something as a speech act when we really shouldn't.

Enhanced security certs simply follow the same line of reasoning.  They
will ultimately prove just as hazardous.

Going back to promises as speech acts:  When a politician promises to
improve the economy, we've all come to recognize that, although that's
in the *form* of a promise, it doesn't actually create any obligation.
"Improving the economy" isn't something anyone can actually do - even if
we could agree on what it means.  Such a promise is simply a way of
saying "I think the economy should be better".  Politicians make
statements in this form because at some level, even though we know
better, we *do* treat them as speech acts.  It's a many-millennia-long
struggle between those trying to rouse the rabble and the "rabble"
trying to avoid being improperly roused.

Somehow, we're going to need to develop a better way for humans to
understand which computer communications are "just information," and
which should be treated as speech acts.
							-- Jerry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com


