Difference between TCPA-Hardware and other forms of trust

Seth David Schoen schoen at loyalty.org
Thu Dec 18 14:22:41 EST 2003


Jerrold Leichter writes:

> Given this setup, a music company will sell you a program that you must
> install with a given set of access rights.  The program itself will check
> (a) that it wasn't modified; (b) that a trusted report indicates that it
> has been given exactly the rights specified.  Among the things it will check
> in the report is that no one has the right to change the rights!  And, of
> course, the program won't grant generic rights to any music file - it will
> specifically control what you can do with the files.  Copying will, of course,
> not be one of those things.
> 
> Now, what you'll say is that you want a way to override the trusted system,
> and grant yourself whatever rights you like.  Well, then you no longer have
> a system *anyone else* can trust, because *you* have become the real security
> kernel.  And in a trusted system, you could certainly create a subject that
> would automatically be granted all access rights to everything.  Of course,
> since the system is trusted, it would include information about the
> override account in any reports.  The music company would refuse to do
> business with you.
> 
> More to the point, many other businesses would refuse to do business with you.

There's the rub.

The prevalence of systems that can be trusted by third parties who do
not trust their owners affects what applications are possible, and it
affects the balance of power between computer owners and others.

If very few such systems are deployed, it would be absurd to say that
"the music company would refuse to do business with you" -- because the
music company has to do business to keep its doors open.  Businesses
that refuse to serve customers will not last very long.

In the entertainment case, the antagonism is already clearly expressed.
("Open war is upon you", as I recall Theoden being told in that movie
last night.)  The more that so-called "legacy" systems don't do DRM, or
don't do it very well or easily, the more difficult it is for publishers
to apply DRM systems and succeed in the market.  The more that systems
do DRM natively, or do it easily or cheaply, the easier it is to succeed
in publishing things restricted with DRM.  In either case, the publishers
still have to publish; before the creation of DVD-A and SACD,
publishers of audio CDs couldn't very well say "CD-DA, we hates it!
Nasty, tricksy format!" (sorry, um) "We are going to stop publishing
in the CD-DA format because it isn't encrypted."  Even today, they
would be hard-pressed to do so, because DVD-A and SACD players are
extraordinarily rare compared to audio CD players.

The question of whether the supposed added profit that comes with being
able to enforce DRM terms provides an important "creative incentive"
comparable to that provided by copyright law goes back to the era
immediately before the adoption of the DMCA, when the Bruce Lehman
White Paper argued that it did (that copyright law's incentive was
becoming inadequate and that an additional control-of-the-public and
control-of-technology incentive would be required).  Indeed, the group
that pushed for the DMCA was called the Creative Incentives Coalition,
and it said that restricting customers in this way was really all a
matter of preserving and expanding creative incentives.

http://www.uspto.gov/web/offices/com/doc/ipnii/

I think Bruce Lehman was wrong then and is wrong now.  On the other
hand, the _structure_ of the argument that the prospect of restricting
customers provides an incentive to do something that one would not
otherwise do is not incoherent on its face.  The interesting question
about remote attestation is whether there are (as some people have
suggested) interesting and important new applications that customers
would really value that are infeasible today.

For example, Unlimited Freedom has argued that there would be
incentives to invest in useful things we don't have now (and things
we would benefit from) only if attestation could be used to control
what software we use to interact with those things.

In the entertainment case, though, there is already a large
entertainment industry that has to sell into a base of actually
deployed platforms (unless it wants to bundle players with
entertainment works) -- and its ability to refuse to do business with
you is constrained by what it can learn about you as a basis for
making that decision.  It's also constrained if its rationale for
refusing to sell to you would also imply that it needs to refuse to
sell to millions of other people.  Only if enormous numbers of people
can, in the future, keep third parties uncertain about the identity of
their software environment will entertainment publishers and others
lack the ability to discriminate against people who use disfavored
software.

> Yes, you can construct a system that *you* can trust, but no one else has
> any reason to trust.  However, the capability to do that can be easily
> leveraged to produce a system that *others* can trust as well.  There are
> so many potential applications for the latter type of system that, as soon
> as systems of the former type are fielded, the pressure to convert them to
> the latter type will be overwhelming.

I find this puzzling because I don't see how the leveraging happens.
I'm puzzled as a pure matter of cryptography, because if my computer
doesn't come with any tokens that a third party can use to establish
trust in it, I don't see how it can prevent me from doing a
man-in-the-middle attack whenever someone tries to exchange a key with
it.
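
To make the point concrete, here is a rough sketch in Python -- purely
illustrative, not TCG's actual protocol; the key handling, the "report"
format, and the function names are all made up for this example -- of
what a verifier is left with when no key it can check against was
provisioned in advance of the exchange:

import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The owner mints a fresh key pair.  Nothing about the key itself
# distinguishes it from one that might live inside tamper-resistant
# hardware.
owner_key = Ed25519PrivateKey.generate()

def owner_answers_challenge(nonce):
    # The owner's own software plays the part of the "trusted" component:
    # it writes whatever report it likes, binds it to the verifier's
    # nonce, and signs it with the key the owner controls.
    report = b"software state: exactly what the publisher hopes to see|" + nonce
    return report, owner_key.sign(report), owner_key.public_key()

# Verifier side: all it can do is check that the signature matches the
# public key it was just handed over the same untrusted channel.  The
# check succeeds, but it establishes nothing about the machine.
nonce = os.urandom(16)
report, sig, pubkey = owner_answers_challenge(nonce)
pubkey.verify(sig, report)   # raises InvalidSignature only on a mismatch
print("Signature verifies, yet the owner wrote the report.")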

In the TCG model, there are already keys shipped with the PC, inside
the TPM.  These keys make signatures that have a particular
significance.  People who are concerned about the effects of attestation
have proposed changes in which the meaning of these signatures is changed
slightly, or in which the keys are disclosed to the computer owner, or
in which the keys are not preloaded at all.  These changes are
proposed specifically in order to preserve an aspect of the status
quo: that other people can't trust your computer without also trusting
its owner.
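
For contrast, the same kind of sketch with a vendor key in the picture
-- again just an illustration with made-up names, not the actual TCG
endorsement / attestation-identity machinery -- showing both what a
pre-loaded, vendor-certified key buys a third party and why disclosing
that key to the owner restores the status quo:

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def raw(pubkey):
    # Raw 32-byte encoding of an Ed25519 public key, used as the thing
    # the vendor certifies.
    return pubkey.public_bytes(Encoding.Raw, PublicFormat.Raw)

vendor_root = Ed25519PrivateKey.generate()   # held by the TPM vendor
tpm_key = Ed25519PrivateKey.generate()       # sealed inside the TPM
tpm_cert = vendor_root.sign(raw(tpm_key.public_key()))  # ships with the PC

def verifier_accepts(report, sig, pubkey, cert):
    # Accept a report only if (a) the vendor certified this signing key
    # and (b) the report is signed by it.
    try:
        vendor_root.public_key().verify(cert, raw(pubkey))
        pubkey.verify(sig, report)
        return True
    except InvalidSignature:
        return False

honest = b"report: the software the firmware actually measured"
forged = b"report: whatever the owner would like to claim"
owner_key = Ed25519PrivateKey.generate()

# Key kept inside the hardware: only reports the TPM is willing to sign
# get through, and a key the owner minted fails the certification check.
print(verifier_accepts(honest, tpm_key.sign(honest),
                       tpm_key.public_key(), tpm_cert))              # True
print(verifier_accepts(forged, owner_key.sign(forged),
                       owner_key.public_key(),
                       owner_key.sign(raw(owner_key.public_key()))))  # False

# If the certified private key is disclosed to the owner (one of the
# proposed changes), the owner can sign any report under it, and the
# verifier can no longer tell the hardware from the owner.
print(verifier_accepts(forged, tpm_key.sign(forged),
                       tpm_key.public_key(), tpm_cert))               # True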

So if these proposals don't have that effect, I'd be glad to hear why
not.

-- 
Seth David Schoen <schoen at loyalty.org> | Very frankly, I am opposed to people
     http://www.loyalty.org/~schoen/   | being programmed by others.
     http://vitanuova.loyalty.org/     |     -- Fred Rogers (1928-2003),
                                       |        464 U.S. 417, 445 (1984)



