Privacy-enhancing uses for TCPA

Peter N. Biddle peternbiddle at hotmail.com
Tue Aug 6 22:08:25 EDT 2002


Neither of us really had the time to clearly articulate things last time, so
I am glad you brought it up. My perspective is primarily an architectural
one, and it boils down to this:

Platform security shouldn't choose favorites.

I don't want there to be any second-class data citizens, as the
determination of who is a "first class" citizen and who isn't seems
arbitrary and unfair, especially if you happen to be second class. The
technology should be egalitarian and should be capable of treating all data
the same. If a user wants data to be secure, or an application wants its
execution to be secure, they should be able to ask for and get the highest
level of security that the platform can offer.

You point out that legal and societal policy likes to lump some kinds of
data together and then protect those lumps of data in certain ways from
certain things. Policy may also leave the same data open for some kinds of
usage and/or exploitation in some circumstances. This is a fine and
wonderful thing from a policy perspective. This kind of rich policy is only
possible in a PC if that machine is capable of exerting the highest degree
of security over every object seeking it. You can't water the security up;
you can only water it down.

I don't think that the platform security functions should have to decide
that some data looks like copyrighted information and so it must be treated
in one way, while other data looks like national secrets and so should be
treated differently. The platform shouldn't be able to make that choice on
its own. The platform needs someone else (e.g. the user) to tell it what
policies to enforce. (Of course the policy engine required to automatically
enforce policy judgement on arbitrary data would be impossible to manage. It
would vary from country to country, and most importantly (from my
architectural perspective) it's impossible to implement because the only SW
with access to all data must be explicitly non-judgemental about what good
or bad policy is.)
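
To make that concrete, here's a rough sketch in Python of what a
non-judgemental enforcement layer looks like. All the names are mine and
hypothetical - this isn't TCPA or Pd code - but the point survives: the
enforcement layer never inspects the blob; the policy comes from outside.

    # All names here are hypothetical; this is not a real TCPA/Pd API.
    class PolicyViolation(Exception):
        pass

    class Platform:
        """Stores (blob, policy) pairs; enforces policy without reading the blob."""

        def __init__(self):
            self._store = {}

        def protect(self, handle, blob, policy):
            # The caller (user, app, service) asserts the policy; the
            # platform records it and makes no judgement about the blob.
            self._store[handle] = (blob, set(policy))

        def access(self, handle, requester):
            blob, policy = self._store[handle]
            if requester not in policy:
                raise PolicyViolation(f"{requester} may not access {handle}")
            return blob

    platform = Platform()
    # The same mechanism, unchanged, protects a diary or a song:
    platform.protect("diary", b"dear diary ...", policy={"alice"})
    platform.protect("song", b"audio bytes ...", policy={"player_app"})
    print(platform.access("diary", "alice"))   # allowed by asserted policy
    # platform.access("diary", "bob")          # raises PolicyViolation

Note that protect() is identical for the diary and the song; only the
asserted policy differs. That's what "a blob is a blob" means to me.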

More in-line:

----- Original Message -----
From: "Seth David Schoen" <schoen at loyalty.org>
To: <cypherpunks at lne.com>; <cryptography at wasabisystems.com>;
<mail2news at anon.lcs.mit.edu>
Sent: Tuesday, August 06, 2002 12:11 PM
Subject: Re: Privacy-enhancing uses for TCPA


> AARG!Anonymous writes:
>
> > I could go on and on, but the basic idea is always the same, and
hopefully
> > once people see the pattern they will come up with their own ideas.
> > Being able to write software that trusts other computers allows for an
> > entirely new approach to security software design.  TCPA can enhance
> > freedom and privacy by closing off possibilities for surveillance and
> > interference.  The same technology that protects Sony's music content
> > in a DRM application can protect the data exchanged by a P2P system.
> > As Seth Schoen of the EFF paraphrases Microsoft, "So the protection of
> > privacy was the same technical problem as the protection of copyright,
> > because in each case bits owned by one party were being entrusted to
> > another party and there was an attempt to enforce a policy."
> > (http://vitanuova.loyalty.org/2002-07-05.html, 3rd bullet point)
>
> I would just like to point out that the view that "the protection of
> privacy [is] the same technical problem as the protection of
> copyright" is Microsoft's and not mine.  I don't agree that these
> problems are the same.

You say above that you don't agree that the problems are the same, but you
don't specify in which domain - policy, technical, legal, all of the above,
something else? The examples you give below are not technical examples - I
think they are policy examples. What about from the technical perspective?

> An old WinHEC presentation by Microsoft's Peter Biddle says that
> computer security, copyright enforcement, and privacy are the same
> problem.  I've argued with Peter about that claim before, and I'm
> going to keep arguing about it.

The term I use is "a blob is a blob"...

> For one thing, facts are not copyrightable -- copyright law in the
> U.S. has an "idea/expression dichotomy", which, while it might be
> ultimately incoherent, suggests that copyright is not violated when
> factual information is reproduced or retransmitted without permission.
> So, for example, giving a detailed summary of the plot of a novel or
> a movie -- even revealing what happens in the ending! -- is not an
> infringement of copyright.  It's also not something a DRM system can
> control.

Isn't copyright a legal protection, and not a technical one? The efficacy of
copyright has certainly benefited greatly from the limitations of the media
it generally protects (e.g. books are hard and expensive to copy; ideas,
quotes, reviews and satires are allowed and also (not coincidentally) don't
suffer from the physical limitations imposed by the medium), and so those
limitations can look like technical protections, but really they aren't.

I agree that copyrighted material is subject to different policy from other
kinds of information. What I disagree on is that the TOR should arbitrarily
enforce a different policy for it because it thinks that it is copyrighted.
The platform should enforce policy based on an external (user, application,
service, whatever) policy assertion around a given piece of data.

Note that data can enter Pd completely encrypted, in a form that nothing but
a user-written app and the TOR can view. At that point the policy is that
the app, and thus the user, decides what can be done with the data. The TOR
simply enforces the protections. No one but the app and the TOR can see the
data to attempt to exert policy.
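
As a rough illustration of that property - the seal/unseal names and the
code-hash identity below are my own stand-ins, not the actual Pd interface -
the TOR can bind data to the hash of an application's code, so that no other
software, however privileged, gets the plaintext back:

    import hashlib

    class TOR:
        """Toy trusted operating root: seals data to a program's code identity.
        Real sealed storage would encrypt under a platform root key; here the
        handle is simply opaque state held inside the TOR."""

        def __init__(self):
            self._sealed = {}
            self._next = 0

        @staticmethod
        def code_id(program_source):
            # An app's identity is the hash of its code, not its name.
            return hashlib.sha256(program_source).hexdigest()

        def seal(self, data, code_id):
            handle = self._next
            self._next += 1
            self._sealed[handle] = (data, code_id)
            return handle   # meaningless to everyone else, including the OS

        def unseal(self, handle, caller_source):
            data, owner = self._sealed[handle]
            if self.code_id(caller_source) != owner:
                raise PermissionError("caller is not the sealing application")
            return data

    tor = TOR()
    app = b"def main(): ..."               # the user-written app's code
    h = tor.seal(b"X is gay", TOR.code_id(app))
    assert tor.unseal(h, app) == b"X is gay"
    # Any other program, however privileged, gets PermissionError.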

> But privacy is frequently violated when "mere" facts are redistributed.

I swear that *I* was arguing this very point last time, and you were saying
something else! Hmmm. Maybe we agree or something.

> It often doesn't matter that no bits, bytes, words, or sentences were
> copied verbatim.  In some cases (sexual orientation, medical history,
> criminal history, religious or political belief, substance abuse), the
> actual informational content of a "privacy-sensitive" assertion is
> extremely tiny, and would probably not be enough to be "copyrightable
> subject matter".  Sentences like "X is gay", "Y has had an abortion",
> "Z has AIDS", etc., are not even copyrightable, but their dissemination
> in certain contexts will have tremendous privacy implications.

The platform should treat this kind of data with the highest degree of
security and integrity available, and the level of security available should
support local policy like "no SW can have access to this data without my
explicit consent". The fact that the data is small makes it particularly
sensitive, as it is so highly portable; so there must be law to allow the
legal assertion of policy independently of the technical exertion of policy,
and there has to be some rationalization between the two approaches. While
bandwidth limits the redistribution of many kinds of content, it doesn't
limit this kind of information. (And of course bandwidth limitations aren't
really technical protections and are subject to the vagaries of increased
bandwidth. Not a good security model.)

Not only should the platform be able to exert the highest degrees of control
over this information on behalf of a user, it should also allow the user to
make smart choices about who gets the information and what policy governs
its usage remotely. This must be in a context where lying is both extremely
difficult and onerous.
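
Here's a hedged sketch of what "lying is difficult" might look like
mechanically: before releasing data, the user's side verifies an attestation
quote from the remote platform. I'm faking the quote with a shared-key HMAC
for brevity; a real platform would sign with a hardware-protected public-key
pair so the verifier never holds the secret. All names are hypothetical.

    import hmac, hashlib

    # Stand-in for a hardware-protected attestation key. Real attestation
    # would use a public-key signature so the verifier holds no secret.
    PLATFORM_KEY = b"hypothetical-hardware-key"

    def attest(app_hash):
        """Remote platform's quote: 'I am really running this exact app.'"""
        return hmac.new(PLATFORM_KEY, app_hash, hashlib.sha256).digest()

    def release_if_trusted(data, app_hash, quote, trusted_hashes):
        # User side: verify the quote, then check that the attested app
        # is one whose policy behavior the user has chosen to accept.
        expected = hmac.new(PLATFORM_KEY, app_hash, hashlib.sha256).digest()
        if hmac.compare_digest(quote, expected) and app_hash in trusted_hashes:
            return data    # the remote app will enforce the user's policy
        return None        # refuse; the remote machine may be lying

    good_app = hashlib.sha256(b"privacy-respecting app v1").digest()
    print(release_if_trusted(b"Y has had an abortion", good_app,
                             attest(good_app), {good_app}))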

Common sense dictates that the unlawful usage of some kinds of data is far
more damaging (to society, individuals, groups, companies) than other kinds
of data, and that some kinds of unlawful uses are worse than others, but
common sense is not something that can be exercised by a computer program.
This will need to be figured out by society and then the policy can be
exerted accordingly.

> "Technical enforcement of policies for the use of a file within a
> computer system" is a pretty poor proxy for privacy.
>
> This is not to say that trusted computing systems don't have interesting
> advantages (and disadvantages) for privacy.

I am not sure I understand the dichotomy; technical enforcement of
user-defined policies around access to, and usage of, their local data would
seem to be the right place to start in securing privacy. (Some annoying
cliche about cleaning your own room first is nipping at the dark recesses of
my brain; I can't seem to place it.) When you have control over
privacy-sensitive information on your own machine, you should be able to use
similar mechanisms to achieve similar protections on other machines which
are capable of exerting the same policy. You should also have an
infrastructure which makes that policy portable and renewable.
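
One way to picture "portable and renewable" - again a sketch under my own
assumptions, not a real design - is that the policy travels with the blob,
so a conforming platform on the other end takes on the same enforcement duty
the moment it receives the data; a version field is one place renewal could
hook in:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ProtectedBlob:
        data: bytes
        policy: frozenset    # app identities allowed to open the data
        version: int         # bumping this is where "renewable" hooks in

    class ConformingPlatform:
        """A machine capable of exerting the same policy as the origin."""
        def __init__(self):
            self._held = []

        def receive(self, blob):
            # The blob never arrives "bare"; the policy arrives with it
            # and is what this platform will enforce from now on.
            self._held.append(blob)

        def open(self, blob, app_id):
            if app_id not in blob.policy:
                raise PermissionError("the policy follows the data")
            return blob.data

    remote = ConformingPlatform()
    blob = ProtectedBlob(b"Z has AIDS", frozenset({"clinic_app"}), version=1)
    remote.receive(blob)
    print(remote.open(blob, "clinic_app"))   # same policy, different machine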

This is, of course, another technical / architectural argument. The actual
policy around data like "X is gay" must come from society, but control of
the information itself originates with user X, and thus the control on the
data that represents this information must start in user X's platform. The
platform should be capable of exerting the entire spectrum of possible
controls.

Peter
++++

> --
> Seth David Schoen <schoen at loyalty.org> | Reading is a right, not a feature!
>      http://www.loyalty.org/~schoen/   |      -- Kathryn Myronuk
>      http://vitanuova.loyalty.org/     |

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at wasabisystems.com


