dangers of TCPA/palladium
AARG!Anonymous
remailer at aarg.net
Tue Aug 6 14:55:24 EDT 2002
A few questions and comments for Peter Biddle. First, I want to
thank him for speaking up; it is great to get detailed answers from an
authoritative source.
Since this is the cryptography list, could you say something about the
crypto involved:
- Like TCPA, you have a secure chip, which you call the SCP, "secure
crypto processor," somewhat analogous to the TPM, right? And it
presumably generates a unique per-machine public key, of which the
secret component never leaves the chip? And then someone (who?)
issues a cert on this public key, and that is the basis for the trust
in the remote attestation?
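To make the question concrete, here is a toy sketch of the key-in-chip model being asked about (all names are hypothetical, and an HMAC key stands in for the real asymmetric endorsement key so the sketch stays stdlib-only):

```python
import hashlib
import hmac
import os

# Toy model of the SCP/TPM flow described above. A real chip would hold
# the private half of an asymmetric keypair; here an HMAC key stands in
# for the signing key, purely for illustration.
class ToySCP:
    def __init__(self):
        self._endorsement_key = os.urandom(32)  # secret never leaves the chip

    def public_identity(self) -> str:
        # Stand-in for the public half that a (hypothetical) cert would cover.
        return hashlib.sha256(self._endorsement_key).hexdigest()

    def quote(self, measurement: bytes) -> bytes:
        # "Sign" a software measurement for remote attestation.
        return hmac.new(self._endorsement_key, measurement, hashlib.sha256).digest()

scp = ToySCP()
m = hashlib.sha256(b"trusted-code-image").digest()
q = scp.quote(m)
```

The open question in the bullet above is who certifies `public_identity()` so that a remote party will believe `quote()` came from genuine hardware.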
- One feature missing from TCPA is secure memory. This is some memory
which can only be accessed by, what, trusted code? Does this mean
that only Microsoft-signed code can run securely? (I don't think so,
but many people have this idea.)
- To clarify: you said the TOR could load long after boot, even days
later? So it is not really a microkernel that runs below all
of Windows - it is more like a device driver which handles the secure
memory and the SCP hardware, so if you aren't using Palladium-aware
apps you don't need to load it?
- There is also a new CPU mode, which I think is related to the secure
memory. Could you explain how these work together? Is it that the
secure memory is only visible in the new CPU mode, and/or that the
new CPU mode can only run out of secure memory?
- At some point the SCP must produce a hash or fingerprint of an
application, or a portion of an application, for remote attestation
(and also for local sealing/unsealing). Is this done when the code
is loaded into secure memory? Or how/when?
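The fingerprinting being asked about can be illustrated with a TCPA-style "extend" operation, where each loaded component is chained into a running register so the final value fixes both the components and their load order (a sketch, not Palladium's actual mechanism):

```python
import hashlib

# Sketch of a measurement register: extending it with each component as
# it is loaded yields a value that fingerprints the whole load sequence.
def extend(register: bytes, component: bytes) -> bytes:
    return hashlib.sha256(register + hashlib.sha256(component).digest()).digest()

reg = b"\x00" * 32
for blob in (b"loader", b"TOR", b"trusted-app"):  # hypothetical components
    reg = extend(reg, blob)
```

Loading the same components in the same order always reproduces the same register value, which is what makes it usable for attestation and for sealing.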
- Regarding privacy: in TCPA the user reveals his per-chip key only
to the Privacy CA, who then gives him a cert on his newly-generated
identity key. Then remote entities only see the identity key cert.
However this means that the remote entities must trust the Privacy CA
since that is the only one who vouches for the legitimacy of the TPM.
So in practice the user doesn't have much choice of Privacy CA;
it has to be one of the ones that the remote service provider
has authorized to be trusted by it. Does your system avoid this
limitation?
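The indirection described above can be sketched as follows (HMAC again stands in for real certificate signatures, and all names are hypothetical):

```python
import hashlib
import hmac
import os

# Toy sketch of the TCPA Privacy CA flow: the user reveals the per-chip
# key only to the CA, which certifies a fresh identity key; remote
# services then see only the identity key and the CA's cert on it.
ca_key = os.urandom(32)                   # Privacy CA's signing key
endorsement_pub = b"per-chip-public-key"  # revealed only to the CA
identity_pub = b"fresh-identity-key"      # newly generated, unlinkable

def ca_issue(identity: bytes) -> bytes:
    # The CA would first verify the endorsement key (omitted here),
    # then certify the identity key.
    return hmac.new(ca_key, identity, hashlib.sha256).digest()

cert = ca_issue(identity_pub)
```

The remote service sees only `(identity_pub, cert)` and must trust whoever holds `ca_key`, which is exactly the trust bottleneck the question points at.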
- What if someone wanted to make a Palladium-enhanced Napster? They
could use the security features to pass around lists of songs and
users who had them, but not make this detailed information available
to users. This would make it harder for the record companies to get a
list of all the participants in order to sue and shut them down. The
program could also share secure hashes of songs and make it difficult
for the RIAA to load bogus songs into the system with rogue clients.
Extensions to facilitate software piracy are left to the imagination.
Is Palladium "open" enough even for this kind of hackerish application?
Or will Microsoft be a gatekeeper who can control which apps can run?
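The song-hash idea in this bullet amounts to clients refusing content whose hash doesn't match the entry circulated through the secure channel; a minimal sketch (hypothetical names, toy data):

```python
import hashlib

# Sketch: clients accept a file only if its hash matches the trusted
# index entry, so a rogue client can't inject bogus content under a
# known title.
trusted_index = {
    "some-song": hashlib.sha256(b"genuine audio bytes").hexdigest(),
}

def accept(title: str, data: bytes) -> bool:
    return hashlib.sha256(data).hexdigest() == trusted_index.get(title)
```

Whether a hackerish app could ship such an index through Palladium's secure channels without a gatekeeper's blessing is the question being asked.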
- Speaking of piracy, how exactly does Palladium solve the BORA problem?
(Break Once Run Anywhere). Would all apps be distributed sealed to
a particular machine? Would this mean that all programs would have
to run in secure memory? Otherwise when they are unsealed and loaded
into insecure memory in plaintext, someone could save that data, right?
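Sealing, as used in this question, can be sketched as binding a ciphertext to both a per-machine secret and a software measurement, so it unseals only on the same machine running the same code (the XOR keystream below is a toy cipher, not real crypto, and all names are hypothetical):

```python
import hashlib
import os

# Toy sketch of sealing: the keystream is derived from a per-machine
# secret plus a software measurement, so data sealed under one
# measurement cannot be recovered under another.
machine_secret = os.urandom(32)

def _keystream(measurement: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(
            machine_secret + measurement + counter.to_bytes(4, "big")
        ).digest()
        counter += 1
    return out[:n]

def seal(data: bytes, measurement: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(measurement, len(data))))

unseal = seal  # XOR is its own inverse

secret = b"app code or content"
blob = seal(secret, b"good-app-hash")
```

Note that the moment `unseal()` puts plaintext into ordinary memory, nothing in this sketch stops it being copied, which is the point of the question above.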
- How about DRM: if someone hacks their hardware and manages to break
the restrictions on a piece of content (i.e., they get a copy of the
unsealed form), then they can distribute that
anywhere, right? Anyone will be able to play it once one person
extracts it from the secure "vault". So don't you still have a
BORA problem? Many people charge that this means that Microsoft has
a secret plan to shut down pirate applications! See Ross Anderson's
"FAQ".
Thanks again for your informative comments.
---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at wasabisystems.com