Do You Need a Digital ID?

Jerrold Leichter jerrold.leichter at
Wed Mar 23 09:57:47 EST 2005

| Jerrold Leichter wrote:
| > I don't think the 3-factor authentication framework is nearly as
| > well-defined
| > as people make it out to be.
| > 
| > Here is what I've always taken to be the core distinctions among the three
| > prongs:
| > 
| > 	Something you know
| > 		Can be copied.
| > 		If copied illicitly, you can't tell (except by noticing
| > 			illicit uses).
| > 
| > 	Something you have
| > 		Cannot be copied.
| > 		Can be transferred (i.e., you can give it to someone
| > 			else, but then you no longer have it)
| > 		Hence, if transferred illicitly, you can quickly detect it.
| > 
| > 	Something you are
| > 		Cannot be transferred.
| > 		Cannot be changed.
| > 		Inherently tied to your identity, not your role.
| > 
| > This classification, useful as it is, certainly doesn't cover the space of
| > possible authentication techniques.  For example, an RFID chip embedded
| > under the skin and designed to destroy itself if removed doesn't exactly
| > match any of these sets of properties:  It's not "something you have"
| > because it can't be transferred, but it's not "something you are" because
| > it can be changed.  Attempting to force-fit everything into an incomplete
| > model doesn't strike me as a useful exercise.
| but business rules for public(/private) key infrastructure can describe that
| the associated authenticating entity is the only one in possession of the
| private key ("something you have") .... as a way of relating the objective
| of a specific entity's exclusive ability to access and utilize a private
| key to three factor authentication.
A business rule or law can choose to define anything it wishes.  "Separate but
equal" was the law of the land for many years.  Many people don't understand
what that meant.  It wasn't in the law as a *goal*; it was in the law as an
*assumption*:  If a state said it was creating two school systems, one for
blacks, one for whites, courts took their word for it that the two were
"equal", regardless of any actual facts.  What we are discussing is what rules
might be reasonably consistent with the real security properties of the
objects and methods involved.
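The mechanism the quoted text appeals to - proving exclusive possession of a
private key by signing a relying party's fresh challenge - can be sketched in
a few lines.  This uses textbook RSA with deliberately tiny, insecure toy
parameters (all numbers illustrative only; a real deployment would use a
vetted library with proper padding):

```python
import hashlib

# Toy textbook-RSA parameters (p=61, q=53) -- trivially breakable,
# for illustration only.
N, E, D = 3233, 17, 2753  # modulus, public exponent, private exponent

def digest(msg: bytes) -> int:
    # Reduce a hash into the toy modulus range.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sign(msg: bytes) -> int:
    # Only the holder of D ("something you have") can produce this.
    return pow(digest(msg), D, N)

def verify(msg: bytes, sig: int) -> bool:
    # Anyone holding the public key (N, E) can check possession of D.
    return pow(sig, E, N) == digest(msg)

challenge = b"relying-party nonce 42"
sig = sign(challenge)
print(verify(challenge, sig))  # True
```

Because the challenge is fresh each time, a valid signature demonstrates
current possession of the private key without ever transmitting it.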

| almost all of the existing "something you have" authentication objects are
| capable of being counterfeited to a greater or lesser degree. ... in
| general, you will find that almost all "something you have" authentication
| objects are subject to being copied ...
Of course.  These classifications are idealizations.  In fact, *any* security
property, as applied to a real-world system, is an idealization.  One of the
errors of the early burst of enthusiasm about RSA was the assumption that
mathematics could prove things about the security of real-world systems.  It
can give you upper bounds - the system is breakable by a mathematical attack
so it's no stronger than this - but that's pretty much all.  (We continue to
see a bit of the same hyperbole concerning quantum cryptography, though most
people have learned their lesson by now.)

When we say a smart card "can't be copied", all we are really saying is that
we will not be considering copying as a mode of attack to be defended against.
If you want to be a bit more quantitative, you might instead say that the
calculated cost of copying a smart card significantly exceeds the value that
an attacker could put at risk through that attack modality.
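The quantitative framing above reduces to a simple decision inequality.  A toy
expected-value check (all the numbers below are invented for illustration):

```python
def attack_is_rational(copy_cost, value_at_risk, success_prob):
    """Toy model: an attacker copies the card only if the expected
    payoff exceeds the cost of mounting the copying attack."""
    return value_at_risk * success_prob > copy_cost

# Invented numbers: the cloning rig costs $50k, the card protects $5k,
# and the attack succeeds 80% of the time -- not worth mounting.
print(attack_is_rational(50_000, 5_000, 0.8))  # False
```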

At a lower level of abstraction, someone *did* consider copying attacks, and
added defenses to the card, its reader, how it was used, etc., exactly to get
us to the point where we could stop worrying about that particular threat.
(And, of course, many such security decisions ultimately come down to ... very
little.  We can reasonably well rate the security of a safe because we have
many years of experience with attacks and defenses; further, safes are
fairly simple physical objects subject to well-understood physical attacks.
The same can't be said of smart cards or really any digitally based system.  I
expect to see a continuing see-saw between attack and defense for the rest of
my life.  That's not to say nothing is changing:  The barriers to entry for
attackers are much higher than they used to be.  But they can still be
overcome.)

| most models serve a useful purpose until somebody comes up with a better or
| more applicable model.
| many of the 3-factor authentication implementations actually use some
| representation that allows the actual occurrence to be implied by something
| else.
| for instance "something you know" authentication can be done as a
| "shared-secret" where both the originator and the relying party are both in
| possession of the shared-secret. an example is a key in symmetric key
| cryptography.
| however, it is possible to have "something you know" authentication where the
| secret is not shared. For instance if there is a hardware token that is
| certified to only operate when the correct PIN has been entered .... the PIN
| represents "something you know" authentication w/o having to share the secret
| with any other party (the relying party assumes that the correct PIN has been
| entered by a) being confident of the operation of the particular hardware
| token and b) the hardware
| token having done something known & expected)....
I think this concentrates too much on the implementation technique rather than
on the goals.
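Still, the quoted scheme - a token that releases its authentication function
only after a locally verified PIN - is worth making concrete.  The names and
the HMAC-based response below are my own illustration, not any particular
token's protocol; the point is that the PIN never leaves the token, while the
relying party shares only the token's key:

```python
import hmac, hashlib, os

class Token:
    """Toy PIN-gated hardware token (illustration only)."""
    def __init__(self, key: bytes, pin: str):
        self._key = key        # "something you have", sealed in the token
        self._pin = pin        # "something you know", checked locally
        self._unlocked = False

    def unlock(self, pin: str) -> bool:
        # The PIN is compared inside the token and never transmitted.
        self._unlocked = hmac.compare_digest(pin, self._pin)
        return self._unlocked

    def respond(self, challenge: bytes) -> bytes:
        if not self._unlocked:
            raise PermissionError("PIN not verified")
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

# The relying party shares the token key but never learns the PIN; a
# valid response implies (a) possession of the token and (b) knowledge
# of the PIN, on the assumption the token behaves as certified.
key = os.urandom(32)
token = Token(key, pin="4931")
challenge = os.urandom(16)
token.unlock("4931")
response = token.respond(challenge)
expected = hmac.new(key, challenge, hashlib.sha256).digest()
print(hmac.compare_digest(response, expected))  # True
```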

| in any case, for all of the deployed existing authentication systems
| involving any one of the three factor authentication paradigms, all of the
| methods are vulnerable to copying to one degree or another. as a result, I
| would assert that the criterion of being copyable or not is not useful
| .... in all of the different three factors, it isn't whether they are
| copyable .... it is the difficulty with which they can be copied....
Again, of course.  The question is:  What security properties do I want my
system to have?  What security properties can some physical realization
actually deliver, to an appropriate degree of assurance against an appropriate
set of attacks, and at an appropriate cost?

| i would further assert that the meaningful aspects represented by the
| three-factor authentication model is not the native characteristic of the
| components but how the individual being authenticated interacts with the
| components .... i.e.
| 1) something you know .... implies that the person has to know the value
| 2) something you have ... implies that the person is in possession of the
| thing or value ... but doesn't actually know or have it memorized
| 3) something you are .... implies that it represents some physical
| characteristic of the person ... w/o the person needing to either know or
| otherwise possess the object or value.
This is mainly restating the names of the modalities, without adding any real
content.

In designing a system, I certainly agree that the security properties are
not the only ones of relevance.  The distinctions as you draw them say nothing
about the security properties, but they certainly are relevant in other ways.
"Something you know" can't be a 100-digit string, because humans can't
generally remember such things.  "Something you are" runs into all kinds of
social acceptability issues - e.g., fingerprint identification systems have
an association in many people's minds with their use to identify criminals.
"Something you have" is likely to be very tightly constrained at a bar on a
nude beach!

Using two-factor authentication *can* give you "the best of both worlds" in
terms of security attributes, but it can never do anything but give you the
*worst* of the acceptability/usability properties of the two factors.  I
challenge you to argue for a two-factor system *simply on the basis of the
definitions you have above*, without reference to the differences in security
properties I list above.  A couple of years ago, Citibank (I think) started
adding photos to their credit cards.  Their ads focused on the security; the
general argument was something along the lines of "because no one but you
looks like you".  Even *advertising* discussed - in friendly terms - the
non-transferability of "something you are"!

| all three methods can be implemented as static value or shared-secret
| implementations ... where the characteristic of the particular
| authentication mode is expressed as some static value and is vulnerable to
| shared-secret eavesdropping or skimming. "Something you know" shared-secrets
| can be eavesdropped and fraudulently used. A magstripe plastic card
| "something you have" is expressed as transmission of the contents of the
| magstripe, which can be skimmed and used to create counterfeit/copied
| cards. A "something you are" feature is expressed as biometric template
| which can be eavesdropped and used in fraudulent transmissions (or
| counterfeited in things like the gummy bear attack).
Yes.  The distinction is only meaningful as a description of how a person
retains the authentication information.  To *use* it to authenticate, you
have to transfer *something* to another party, and they have to be able to
verify it, based on *something* they have stored.  None of those later
actions can meaningfully be described by the have/are/know trichotomy.
(Well, I suppose if you are authenticating to another person, he may KNOW
what you look like, HAVE a picture of you, BE your twin brother.  :-) )

| rather than interpreting 3-factor authentication as physical characteristics
| which are classified as being copyable or not-copyable ... 3-factor
| authentication is frequently interpreted as how the entity being
| authenticated relates to the authentication process.
That's fine for *describing* the system, and useful for analyzing its usability
or acceptability.  But it's not the whole story.

							-- Jerry

The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at

More information about the cryptography mailing list