Microsoft marries RSA Security to Windows

Arnold G. Reinhold reinhold at world.std.com
Tue Oct 15 09:54:45 EDT 2002


At 8:40 AM -0700 10/11/02, Ed Gerck wrote:
>"Arnold G. Reinhold" wrote:
>
>> I can see a number of problems with using mobile phones as a second
>> channel for authentication:
>
>Great questions. Without aspiring to exhaust the answers, let me comment.
>
>> 1. It begs the question of tamper resistant hardware. Unless the
>> phone contains a tamper resistant serial number or key, it is
>> relatively easy to clone. And cell phones are merging with PDAs. If
>> you have secure storage, why not implement a local solution on the
>> PDA side?
>
>Cloning the cell phone has no effect unless you also have the credentials
>to initiate the transaction. The cell phone cannot initiate the authentication
>event. Of course, if you put a gun to the user's head you can get it all but
>that is not the threat model.

If we're looking at high security applications, an analysis of a 
two-factor system has to assume that one factor is compromised (as 
you point out at the end of your response). I concede that there are 
large classes of low security applications where using a cell phone 
may be good enough, particularly where the user may not be 
cooperative. This includes situations where users have an economic 
incentive to share their login/password, e.g. subscriptions, and in 
privacy applications ("Our logs show you accessed Mr. Celebrity's 
medical records, yet he was never your patient." "Someone must have 
guessed my password." "How did they get your cell phone too?"). Here 
the issue is preventing the user from cloning his account or denying 
its unauthorized use, not authentication.

>
>A local solution on the PDA side is possible too, and may be helpful where
>the mobile service may not work. However, it has less potential for wide
>use. Today, 95% of all cell phones used in the US are SMS enabled.

What percentage are enabled for downloadable games? A security 
program would be simpler than most games.  It might be feasible to 
upload a new "game" periodically for added security.

>
>> 2. Even if the phone is tamperproof, SMS messages can be intercepted.
>> I can imagine a man-in-the-middle attack where the attacker cuts the
>> user off after getting the SMS message, before the user has a chance
>> to enter their code.
>
>Has no effect if the system is well-designed. It's possible to make 
>it mandatory
>(under strong crypto assurances) to enter the one-time code using the *same*
>browser page provided in response to the authentication request -- which
>page is supplied under server-authenticated SSL (no MITM).

You may be right here, though once you assume SSL is in place you 
have already solved a lot of the security problems associated with 
traditional password login.
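
For what it's worth, the server-side logic needed to enforce that
binding is small. A minimal sketch (Python; the names and the
two-minute lifetime are my own assumptions): the code sent by SMS is
remembered against the SSL session that asked for it, and is only
accepted back over that same session, within a short window:

    # Sketch of tying the SMS one-time code to the SSL session that
    # requested it, so an intercepted code is useless from any other
    # connection. All names and the lifetime are assumptions.
    import time, secrets

    PENDING = {}                 # ssl_session_id -> (code, expiry time)
    CODE_LIFETIME = 120          # seconds

    def start_authentication(ssl_session_id: bytes) -> str:
        code = f"{secrets.randbelow(10**6):06d}"
        PENDING[ssl_session_id] = (code, time.time() + CODE_LIFETIME)
        return code              # delivered to the user's phone by SMS

    def finish_authentication(ssl_session_id: bytes, submitted: str) -> bool:
        code, expiry = PENDING.pop(ssl_session_id, (None, 0.0))
        return code is not None and time.time() < expiry \
            and secrets.compare_digest(code, submitted)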

>
>> 3. Cell phones don't work everywhere. Geographic coverage is limited.
>> Most U.S. phones don't work overseas. Reception can fail inside
>> buildings and cell phone use is prohibited on commercial airplanes
>> in-flight (the airlines are planning to offer Internet access in the
>> near future). And what happens if I choose to TEMPEST shield my
>> facility?
>
>No solution works everywhere. Cell phones are no exception. But it is
>possible to design the system in such a way that the user can use 
>a different
>access class (with less privileges, for example) if the cell phone does
>not work. After all, the user is authenticated before the message is sent to
>the cell phone.
>
>That said, cell phone coverage is becoming ubiquitous and the solution also
>works with pagers (while they still exist), email accounts (blackberrys) and
>other means of communication -- including voice.

Security tokens work everywhere I can think of.  I'm not sure the 
cell companies are spending much to push into rural areas given the 
current economy.  Might be a new market for Iridium, but that doesn't 
work well inside buildings.

>
>> 4. The cell phone network can get clogged in times of high stress,
>> e.g. a snow storm at rush hour, a natural disaster or a terrorist
>> incident. Presumably some people who use two factor authentication
>> have important work to do. Do you want them to be locked out of their
>> computers at such critical times?
>
>Let's be careful with generalizations. During the tragic events of 9/11, cell
>phones emerged as the solution for communication  under a 
>distributed terrorist
>attack.

The WTC collapse took out a major portion of lower Manhattan's 
landline capacity. Cell phones were better than nothing, but many 
people experienced difficulty placing calls.  It is simply too 
expensive to design a switched system to handle all the calls people 
want to make in a major crisis. Military systems include priority 
tags to deal with this.

This does raise an interesting possibility: giving SMS messages 
priority over voice could be very useful in an emergency. SMS 
messages take much less bandwidth than voice and the entry mechanism 
on most cell phones is very slow. So existing cell infrastructure 
might be able to handle all the SMS traffic generated in a crisis. 
Anyone know if cell phone companies are doing this?

>
>Second, as I hint somewhere above, the important point here is not to rely on
>something that will never fail (which is, clearly, impossible) but to offer
>the user a recourse -- alternative ways to get the job done, even 
>under reduced
>privileges.

Reduced privileges won't do in a crisis. That's when people need 
maximum privileges: to authorize emergency expenditures, reroute 
circuits, reboot servers or reprogram firewalls to defeat 
cyber-attacks. The Internet is based on technology designed to 
survive nuclear war. Making its most valuable applications dependent 
for security on other, less robust communication systems seems a bad 
idea.

>
>> 5. Cell phones are vulnerable to denial of service attacks. A simple
>> RF jammer could prevent an individual or an entire building from
>> accessing their computers.
>
>See above.

Ditto.

>
>> 6. People are generally cavalier about their cell phones. They wear
>> them on belt pouches, leave them in cars and gym lockers, let
>> strangers borrow them. I left mine in a coat pocket that I checked at
>> a restaurant and ended up with a $40 long distance bill. Habits like
>> that are hard to change. On the other hand, a token that goes on a
>> key chain or is worn as jewelry taps into more security conscious
>> cultural behavior.  Human factors are usually the weak link in
>> security, so such considerations are important.
>
>I see your argument backwards -- you are more likely to notice that
>you lost or forgot your cell phone than a hardware token that you
>seldom use, or even notice.  Cell phones also prevent a silent compromise
>more effectively (as in theft, which involves no violence) because you
>have a tendency to notice its absence -- the cell phone is useful for many
>other purposes!  And you don't have to carry additional devices that you
>may lose or forget.

I'd like to see a personal token that is used for multiple 
applications. Particularly if it were on your key chain, you'd notice 
its loss soon enough.  It would be handy if it told the time of day 
and also included an LED flashlight (on a separate battery).

>
>Additionally, just having my cell phone does not get you anywhere. You
>also need my credentials to begin the authentication process and they are
>NOT in my cell phone.

Maybe not in YOUR cell phone, but I'll bet a lot of people store 
passwords in their cell phone address books. With the combined 
cellphone/PDAs of the future it will be common to do so. And for a 
high value attack, you have to assume the password is already 
compromised.

>
>> 7. It's a tax on logins. SMS messages aren't free.
>
>;-) nothing is free, but an SMS message is pretty close to that -- $0.05
>to $0.10 per message. It's pay-as-you-go versus investing money
>upfront on a hardware device that will also need replacement.

The price of an SMS message is actually twice that -- both the sender 
and receiver pay. If I am authorizing access to a page of content for 
which I am only charging a buck or so, that dime is a big expense.

For someone who needs access privileges many times a day, $0.10 per 
login can add up to several hundred dollars a year.  The latest 
MacMall catalog lists a 16MB USB pen drive for $29.99. A security 
token should cost less than that, patents aside.
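
The back-of-the-envelope arithmetic, using my own (purely assumed)
usage figures:

    # Rough cost of SMS-per-login; the usage figures are assumptions.
    logins_per_day = 10
    working_days = 250
    sms_cost_cents = 10                    # per message, per side
    one_side = logins_per_day * working_days * sms_cost_cents / 100
    print(one_side, one_side * 2)          # $250/year, $500 if both ends pay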

I also see a problem in organized labor (union) situations. There 
might be demands that the company pay for the workers' cell phones if 
they are required to have one.

>
>> 8. If I lose my token, I can use my cell phone to report it promptly.
>> If I lose my cell phone...
>
>You use a pay phone or your wired phone. You can also use email.

Tried to find a pay phone lately? But I admit this is a quibble.

>
>> 9. Improved technology should make authentication tokens even more
>> attractive. For one thing they can be made very small and waterproof.
>> Connection modes like USB and Bluetooth can eliminate the need to
>> type in a code, or allow the PIN to be entered directly into the
>> token (my preference).
>
>It's costly, makes you carry an additional thing and -- most important
>of all -- needs that pesky interface at the other end.

That "pesky interface at the other end" seems less of an issue in the 
context of a deal between RSA and Microsoft. This is an opportunity 
to do the right thing and build it into the operating system once and 
for all.  Support for USB and Bluetooth tokens that reveal a public 
key and can sign a nonce should also be included in the OS. There is 
no need for a PKI here. Token binding would be done by the 
application owner when access is initially arranged.
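
A rough sketch of what that OS-level check could look like (Python
with the third-party 'cryptography' package standing in for both the
token and the host; the Ed25519 choice and all names here are my
assumptions -- the point is only that the host records the token's
public key at enrollment and then verifies a signature over a fresh
nonce at each login):

    # The token's public key is recorded once, by the application owner,
    # when access is first arranged -- no PKI involved. The token itself
    # is simulated here by an in-process Ed25519 key.
    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    token_key = Ed25519PrivateKey.generate()          # lives inside the token
    registered_public_key = token_key.public_key()    # kept by the application owner

    def login_challenge() -> bytes:
        return os.urandom(32)                         # fresh nonce per login

    def verify_login(nonce: bytes, signature: bytes) -> bool:
        try:
            registered_public_key.verify(signature, nonce)
            return True
        except InvalidSignature:
            return False

    nonce = login_challenge()
    assert verify_login(nonce, token_key.sign(nonce))  # token signs, host verifies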

>
>> 10. There is room for more innovative tokens. Imagine a finger ring
>> that detects body heat and pulse and knows if it has been removed. It
>> could then refuse to work, emit a distress code when next used or
>> simply require an additional authentication step to be reactivated.
>> Even implants are feasible.
>
>There is always room for evolution, and that's why we shan't run out of
>work ;-)
>
>However, not everyone wants to have an implant or carry a ring on their
>finger -- which can be scanned and the subject targeted for a more serious
>threat. My general remark on biometrics applies here -- when you are the
>key (eg, your live fingerprint),  key compromise has the potential to be
>> much more serious and harmful to you.

If personal (as opposed to special-purpose, 
bank-vice-presidents-only) tokens become more common, their 
possession won't mean that much.  In a cellphone-based security 
world, traffic analysis of SMS messages could reveal a lot about who 
does important work.

>
>BTW, what is the main benefit of two-channel (as opposed to just two-factor)
>authentication? The main benefit is that security can be assured 
>even if the user's
>credentials are compromised -- for example, by writing their 
>passwords on stick-it
>notes on their screen, or under their keyboards, or by using weak 
>passwords, or
>even having their passwords silently sniffed by malicious software/hardware,
>problems that are very thorny  today and really have no solution but to add
>another, independent, communication channel. Trust on authentication 
>effectiveness
>depends on using more than one channel, which is a general 
>characteristic of trust
>( http://nma.com/papers/it-trust-part1.pdf  )

A second channel is very helpful in establishing trust, but for most 
applications, that trust can be stored in something the user carries. 
Often it is more important that the second channel be truly secure 
than that it operate contemporaneously with access.  I would hope 
that any Microsoft/RSA standard solution allows for both models.


Arnold Reinhold

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at wasabisystems.com


