Lorrie Cranor on privacy, online voting and Internet censorship

R. A. Hettinga rah at shipwright.com
Mon Feb 24 13:43:17 EST 2003



ACM: Ubiquity - At the Crossroads of Technology and Policy


Dr. Lorrie Faith Cranor is a Principal Technical Staff Member at AT&T
Labs-Research, where she has done work in a variety of areas where
technology and policy issues interact -- including online privacy,
electronic voting and spam. She is chair of the Platform for Privacy
Preferences Project (P3P) Specification Working Group at the World
Wide Web Consortium and author of the book Web Privacy with P3P
(O'Reilly 2002).

UBIQUITY: Let's start with a quote from your Web site: "It's always
difficult to explain what field I'm in." Why is it difficult, and what
field are you in?

CRANOR: I think of my work as being at the intersection of multiple
areas, so I never fit into one particular box. My work involves
knowledge of both the technical and social aspects of things. Some of
the labels people have used to describe this kind of work include
social informatics and value-sensitive design. Both of those are
reasonable labels for a lot of what I do, but most people don't really
know what they mean. So while appropriate, they don't really give the
whole picture.

UBIQUITY: How do you explain your interest in privacy? 

CRANOR: I'm interested in privacy as it relates to technology. There
are a variety of aspects to that. One is how technologies often erode
privacy, and how to prevent that erosion. Another aspect is looking at
ways that technology could help
increase privacy. It's not enough just to know a lot about
privacy. You should know the philosophical underpinnings and the
policy issues. It's not enough to understand the technology. You need
to think about both areas. They go together.

UBIQUITY: Can you give a good example of the intersection between
privacy and technology?

CRANOR: During the past few years I've spent a lot of time on a
project called P3P, the Platform for Privacy Preferences. With P3P,
our goal was to create a computer-readable way for Websites to
describe their privacy policies so that users wouldn't have to read
the long legalese privacy statements at every Website. The project was
developed by the World Wide Web Consortium. Two groups of people
worked on it. There was a group of lawyers and policy advocates, and
there was a group of technology people. The technology people focused
on the technical mechanism necessary to make it work. The policy
people focused on the vocabulary that was used to talk about privacy.
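The core idea behind P3P -- a site declares its data practices in
machine-readable form, and the user's browser compares them against the
user's stated preferences -- can be sketched roughly as follows. The
category names and preference structure here are illustrative, not the
actual P3P vocabulary, which is defined in the W3C specification:

```python
# Illustrative sketch of the P3P idea: a site declares its data practices
# in machine-readable form, and a user agent compares them against the
# user's stated preferences. The category names below are made up for
# illustration; the real P3P vocabulary is defined in the W3C spec.

site_policy = {
    "purposes": {"site-administration", "personalization"},
    "recipients": {"ourselves"},
    "retention": "stated-purpose",
}

user_preferences = {
    "forbidden_purposes": {"telemarketing", "profiling"},
    "forbidden_recipients": {"third-parties"},
}

def policy_acceptable(policy, prefs):
    """Accept the site only if it declares no practice the user forbids."""
    if policy["purposes"] & prefs["forbidden_purposes"]:
        return False
    if policy["recipients"] & prefs["forbidden_recipients"]:
        return False
    return True

print(policy_acceptable(site_policy, user_preferences))  # True: no conflict
```

The point of the machine-readable encoding is exactly this kind of
automatic check: the user never has to read the legalese because the
browser can decide, from declared practices alone, whether to warn them.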

UBIQUITY: Which group did you belong to? 

CRANOR: Throughout much of this project I was the interface between
these two groups. I went back and forth between them. Sometimes the
technical group would push back on the policy group and say, "These
things that you want in the vocabulary don't make sense from a
technical perspective. So get rid of them." And the vocabulary people
would say, "How can we get rid of them? These are really important."
Having the understanding of both sides was helpful in working out
solutions; figuring out what was important from the policy perspective
and figuring out ways of dealing with it technically.

UBIQUITY: Does the average person on the street worry much about privacy? 

CRANOR: I think privacy is one of the things that everybody cares
about, at least in Western societies. Do they worry about it? It
depends upon their knowledge of how their privacy might be invaded. I
think a lot of people are unaware of how easily their privacy might be
invaded. But once that has been pointed out to them, most people don't
like it.

UBIQUITY: How easy would it be for an individual with wicked
intentions to invade the files on somebody attached to the Internet?
For example, say you wanted to find out where the person searched on
the Web. I know there's no quantitative answer to that, but give it a try.

CRANOR: You can find out a lot if you know a few key pieces of
information. For example, if I know somebody's Social Security number,
then with some skill at impersonation, I can easily access financial
records, health records and various other records. As far as getting
somebody's clicks out of the computer, it depends on the
circumstances. How do they access the Internet? Does their ISP keep
records of those sorts of things? Can I install monitoring software
on their computer?

UBIQUITY: How easy is it to get to somebody's hard disk? 

CRANOR: It depends on how well the computer is secured and how you
access the Internet. It's fairly difficult to access files on a
computer that uses a dial-up modem. Even so, someone might e-mail
an attachment that included software that would cause the computer to
automatically e-mail the files back to the attacker. But that would
only work if the recipient opened the attachment. That's one way it
could be done. Or an attacker might exploit browser bugs and lure
someone into visiting a certain site. For people who have broadband
connections, if they haven't put up personal firewalls, and depending
on their operating system, it may be very easy for somebody to get on
their hard drive. If you don't have your sharing settings set
appropriately, then one of your neighbors could click through Network
Neighborhood in Windows and get on your hard drive.

UBIQUITY: Do you think that the danger that is suggested by the movies
and the popular press is exaggerated?

CRANOR: I think there are specific episodes in a lot of movies that
are exaggerated. But the overall danger is quite real. I view this
issue of people getting on to your hard drive as more of a security
issue than a privacy issue. It is more or less solvable by using good
security software. The more interesting problems from the privacy
perspective are the issues of where we have massive databases that
provide useful functions for people. If you pull together information
from different databases, you can basically build up files on people,
and from all those bits of data, derive new pieces of information.

UBIQUITY: Have you seen any real life examples of this? 

CRANOR: One example comes from a professor at Carnegie Mellon
University, Latanya Sweeney, who did a study where she accessed a
driver's license database. She was able to demonstrate that with a few
small pieces of information, not even a name, you could uniquely
identify people. For example, with somebody's date of birth and their zip code,
she uniquely identified something like 99 percent of the people in
Cambridge, Massachusetts. By putting those two pieces of information
together somebody can figure out who you are pretty easily.
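The attack Sweeney demonstrated is a linkage attack: join an
"anonymized" dataset against a public record on shared quasi-identifiers
such as date of birth and zip code. A minimal sketch, with entirely
fabricated records and names for illustration:

```python
# Sketch of a linkage attack: join an "anonymized" dataset with a public
# record (such as a license or voter list) on date of birth and zip code.
# All records and names below are fabricated for illustration.

anonymized_medical = [
    {"dob": "1961-07-15", "zip": "02139", "diagnosis": "asthma"},
    {"dob": "1955-03-02", "zip": "02138", "diagnosis": "diabetes"},
]

public_roll = [
    {"name": "Alice Example", "dob": "1961-07-15", "zip": "02139"},
    {"name": "Bob Sample", "dob": "1955-03-02", "zip": "02138"},
]

def reidentify(anon, roll):
    """Match records whose (dob, zip) quasi-identifier is unique in the roll."""
    index = {}
    for person in roll:
        index.setdefault((person["dob"], person["zip"]), []).append(person["name"])
    matches = {}
    for record in anon:
        names = index.get((record["dob"], record["zip"]), [])
        if len(names) == 1:  # quasi-identifier is unique: re-identified
            matches[names[0]] = record["diagnosis"]
    return matches

print(reidentify(anonymized_medical, public_roll))
# {'Alice Example': 'asthma', 'Bob Sample': 'diabetes'}
```

When the quasi-identifier is unique for nearly everyone in a town, as
Sweeney found, removing names from a dataset provides almost no
protection against this join.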

UBIQUITY: How would you characterize the main problem in online
privacy? What's the root of all this difficulty?

CRANOR: Look at what distinguishes online privacy issues from off-line
privacy issues. In some ways there's nothing new. We've had privacy
concerns forever. But with online privacy issues what is added is the
ability to quickly and easily connect databases that are not
physically located in the same place. Computerization allows us to
easily search multiple databases together. It used to be that if I
wanted records about somebody, I had to drive to the town hall, go
through the file cabinets and pull out records. If a person moved
around, I'd have to go to each of the town halls in all the places he
or she lived, and pull out birth certificates, tax records and all
that stuff. Now, all that information is online. You can get it
without leaving your house, if you know how to look for it.

UBIQUITY: Let me refer back to you. You did your doctorate on
voting. Is there a good connection there?

CRANOR: I looked at electronic voting, which is again an issue
concerning the combination of technical and the social issues. One
aspect of my work on voting was that we have secret ballots. We want
privacy in the ballot boxes. So I did some privacy-related work as
part of that.

UBIQUITY: In your work at AT&T Labs, are you focused primarily on
online privacy? What specific goal are you working toward?

CRANOR: For the past five years or so my goal has been to finish the
P3P specification, which I did. Not just me personally, but the whole
working group at W3C. And I wrote a book about it that was published
by O'Reilly a few months ago. Now I'm tying up some loose ends related
to P3P. We had developed some P3P software here at AT&T. We're
currently working on a new version of that. But thinking more
long-term, I've gotten very interested in how to make privacy
software, or privacy tools, accessible to users, because privacy is a
fuzzy, difficult concept. It's not obvious how to build good software
that's related to privacy, or also security. I'm looking generally at
those sorts of issues.

UBIQUITY: You said before that everybody appreciates and values
privacy. Do you think that they generally value other people's
privacy? For instance, do they think that surveillance tools in public
places pose any kind of problem? Or are people in general
fairly blasé about that?

CRANOR: I think it varies. We know most of these things from the
various public-opinion surveys that are done. Certainly the responses
you get change right after there's been some sort of scary
event. Right after September 11, people seemed more willing to give up
some privacy. Now, I believe most of the polls show that it's swinging
back the other way. I think some people are quick to say, "Given this
event, we're willing to put in whatever safeguards are needed, even if
that means giving up privacy." There are other people who will say,
"No, it's not worth the tradeoff." I also think there are many things
that are put in place in the name of security or safety that force us
to give up privacy, but it's not clear that they actually increase our
security or safety.

UBIQUITY: Let's dwell a minute over the word privacy, which I'm sure
you've given a lot of thought to. Is there a right to privacy when
you're going down an escalator in a public building? Do you have a
right not to have your picture taken?

CRANOR: Well first you have to back up and say, "Is there a right to privacy?" 

UBIQUITY: Go ahead, let's do that. 

CRANOR: In some countries there is a right to privacy that's written
into the constitution or laws. In the US that is not the case. There
is no explicit Constitutional right to privacy, although we do have
some Constitutional rights to some aspects of privacy. Many people
argue that even if we don't constitutionally have that right, we
should have privacy as a fundamental human right. Getting back to the
question of do you have a right to privacy in a public building, it
depends on what you mean by that. If you're talking about having a
picture taken of your face in a public building, I believe that US law
would say that you don't have a right to privacy in that context.

UBIQUITY: What do the privacy advocacy groups say about this? Do they
assert that there is a right to privacy in a public place, or that
there should be a right?

CRANOR: It also depends on who's taking the picture. There's a
difference between whether the private business owner is taking the
picture, or whether the police are taking the picture. It also depends
on what happens to the picture. Many businesses have surveillance
cameras all over the place that they never look at unless a crime is
committed in which case, they'll pull the tapes and investigate
it. But there's also a situation like the police setting up cameras
and taking pictures of everybody who goes in to watch the Super
Bowl. I think the privacy advocacy groups have different reactions to
these two situations.

UBIQUITY: Because of the motive? 

CRANOR: In part because of the motive, but also in part because of how
the information is used. In the Super Bowl case, the idea is to use
technology that is rather faulty to begin with to identify the faces
of potential terrorists. In part, due to how bad the technology is,
the system doesn't really work. It will have many false
positives. Looking at faces of people when 99.99 percent or perhaps
even 100 percent are completely innocent in the hopes of finding a
terrorist is very different than reviewing surveillance camera tapes
when a crime has actually been committed.
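The false-positive problem she describes is a base-rate effect. A
back-of-the-envelope calculation, with purely illustrative numbers,
shows why scanning a crowd of almost entirely innocent people swamps
the operators with false alarms:

```python
# Base-rate arithmetic with illustrative numbers: even a seemingly
# accurate face-matching system produces mostly false alarms when
# almost everyone scanned is innocent.

crowd_size = 100_000          # people entering the stadium
targets = 1                   # at most one actual target in the crowd
false_positive_rate = 0.01    # 1% of innocents wrongly flagged
true_positive_rate = 0.90     # 90% chance of flagging a real target

false_alarms = (crowd_size - targets) * false_positive_rate
true_hits = targets * true_positive_rate

print(f"false alarms: {false_alarms:.0f}")     # roughly 1000 innocents flagged
print(f"expected true hits: {true_hits:.1f}")  # under one real detection
```

Even with these generous accuracy assumptions, flagged faces are
overwhelmingly innocent people, which is exactly the contrast Cranor
draws with pulling tapes only after a crime has occurred.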

UBIQUITY: Conversationally you say tapes, but isn't it quite rapidly
converting to digital files?

CRANOR: Yes, in some cases it might be digital and in some cases it
might be tape.

UBIQUITY: Theoretically they could keep this information for a long time. 

CRANOR: It's not so much how long it's kept as how it's
accessed. Let's say that we have a store that keeps tapes. They don't
rewrite the tapes, they keep them forever. They have a huge closet
they just keep throwing the tapes into. Soon they have 10 years' worth
of tapes. It would be very difficult to go back and find something
that happened three years and 29 days ago. They would need a very
specific reason to look on a specific date for something. Otherwise,
the only way to find something is to have somebody sit there in real
time and look through tape after tape after tape. If you have all of
these tapes digitized and you have tools that let you automatically
scan the images looking for something, it makes the information a lot
more accessible. Now it can be used for more speculative hunting
instead of a very targeted investigation.

UBIQUITY: What's your personal opinion about this? In other words, are
you very strongly on one side or the other on the value of these
surveillance kinds of technologies, such as retina scanning, or are
you somewhere in the middle?

CRANOR: I think I come down fairly strongly on the privacy
protection/anti-surveillance side, although perhaps not as strongly as
some of the more extreme privacy activists.

UBIQUITY: Are the most extremist, to use your phrase, completely
against any of these technologies?

CRANOR: It's not just being for or against the technology. It's a
combination of technology and laws and a whole package of things. An
extreme view on one end would be to say there should be no
surveillance and we should have laws in the US that guarantee a right
to privacy in all situations. There are people who feel fairly
strongly that we need to have legislative guarantees of privacy and we
should have very minimal or no surveillance technology. My feeling is
that we should be very cautious when using surveillance
technologies. I wouldn't say that there's no place for them, but I
think they tend to be overused.

UBIQUITY: Let me ask you about an event that was in the news recently
about a small elementary school, I believe it was in London. The
school is introducing retinal scanning in the school cafeteria. The
punch line is that the reason for doing it was completely benign, if
you can believe the people. It was to prevent insensitive kids from
embarrassing poor kids who don't have the money to pay for their lunches.

CRANOR: So basically they are arguing that will protect the privacy of
the poor kids.

UBIQUITY: Do you buy that? 

CRANOR: Yes and no. Certainly I think it's a good idea to protect the
privacy of the poor kids. I'm not so sure about doing it with retinal
scan technology that could potentially be a privacy
invasion. Biometrics technology has multiple uses. There wasn't enough
information in the media reports to know fully what kind of system
they were using. The problem with a lot of these systems is that they
keep a database of the biometrics information. If somebody were to
break in and steal that database, they would have all of this
information about the kids' retinas. In the future when these kids
become adults there may be other applications of retina scans. The
person who stole the retina database now has basically the password
needed to get into these other systems. That would be a bad thing. On
the other hand, there are ways of implementing biometrics systems
where you never actually store that biometrics identifier. It gets
encrypted and used in a way that won't allow you to impersonate
somebody in the future. I also have a lot of concerns about having it
deployed in a low security area such as an elementary school. The
people there aren't necessarily experts in using biometrics
technology. It's possible that it's being done perfectly well, but I'm
very skeptical.

UBIQUITY: Let's move to the topic of your dissertation, and for that
matter to the general topic of your first appearance in Ubiquity. You
did research on electronic voting. Briefly explain the thrust of the research.

CRANOR: There were basically two aspects to the research that I
did. The first aspect was trying to figure out the logistics of
building a technical implementation of an Internet voting system that
would protect people's privacy, while at the same time making sure
that people could only vote once and only eligible voters could vote.

UBIQUITY: What was the other aspect? 

CRANOR: The other aspect was voting theory work that looked at
developing a new type of voting system that allowed people to
basically cast a contingent vote. In an election with three or more
candidates, it's sometimes not advantageous to vote for your first
choice if that candidate doesn't have much of a chance of winning. You
might want to vote for your second choice, because that could influence
the outcome of the election, but you'd also like to vote for your first
choice to get it on the record that that person had some support. So I
developed a type of voting system that would allow that.
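A contingent ballot of this kind can be tallied like an instant runoff:
count first choices, and when a voter's first choice is eliminated,
transfer the vote to the next choice on the ballot. This is a generic
sketch of that tallying idea, not the specific system from Cranor's
dissertation, whose rules differ in detail:

```python
# Minimal instant-runoff-style tally of contingent (ranked) ballots.
# A generic sketch of contingent voting, not Cranor's actual system.

def contingent_tally(ballots):
    """Repeatedly eliminate the last-place candidate, transferring each
    ballot to its next still-standing choice, until someone has a majority."""
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        counts = {c: 0 for c in remaining}
        for ballot in ballots:
            for choice in ballot:          # first still-standing choice
                if choice in remaining:
                    counts[choice] += 1
                    break
        total = sum(counts.values())
        leader = max(counts, key=counts.get)
        if counts[leader] * 2 > total or len(remaining) == 1:
            return leader
        remaining.remove(min(counts, key=counts.get))

ballots = [
    ["Longshot", "Runnerup"], ["Longshot", "Runnerup"],  # 2 contingent ballots
    ["Runnerup"], ["Runnerup"], ["Runnerup"],            # 3 voters
    ["Frontrunner"], ["Frontrunner"],
    ["Frontrunner"], ["Frontrunner"],                    # 4 voters
]
print(contingent_tally(ballots))
# "Longshot" is eliminated; its ballots transfer and "Runnerup" wins 5-4
```

The example shows the appeal Cranor describes: the two "Longshot"
supporters get their first choice on the record, yet their ballots still
count against "Frontrunner" once their candidate is eliminated.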

UBIQUITY: I believe that some voting theorists have opposed this
idea. It's strongly asserted that a system that has a lot of runoffs
when no candidate gets 50 percent of the vote is anti-minority because
a minority will almost never get 50 percent of the vote. So it
effectively says that minorities will not get elected.

CRANOR: Yes, to some extent. I think it becomes more of an issue in
the case where you're electing multiple people, say, a town
council. One way to do it is to have all of the seats be general
seats. And the other way to do it is to divide the town into
districts. Let's say 20 percent of your population is African
American, spread evenly throughout the town. If your town is divided
into districts, then 20 percent of the people in every district will
be African American. If we assume that people always vote for the
person of their same race, then no African Americans will ever win. In
the case of general seats -- let's say there are five seats and 20
percent of the town is African American -- the African American voters
can win one seat.
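Whether a cohesive 20 percent bloc can actually secure one of five
at-large seats depends on the voting rule, which the example leaves
unspecified; under a proportional rule such as the single transferable
vote, the quota arithmetic works out. A sketch with illustrative
numbers:

```python
# Droop quota arithmetic for a multi-seat election, with illustrative
# numbers. Under a proportional rule such as the single transferable
# vote, any cohesive bloc exceeding the quota is guaranteed a seat.

voters = 10_000
seats = 5
bloc_share = 0.20

droop_quota = voters // (seats + 1) + 1   # votes that guarantee one seat
bloc_votes = int(voters * bloc_share)

print(droop_quota, bloc_votes, bloc_votes >= droop_quota)
# 1667 2000 True -> a cohesive 20% bloc clears the quota for one of five seats
```

Under plain winner-take-all at-large voting, by contrast, a 51 percent
majority could sweep all five seats, which is why the choice of rule
matters as much as the choice between districts and general seats.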

UBIQUITY: On the more general question of Internet voting, how is it
progressing? Will it be a significant reality?

CRANOR: I am afraid that Internet voting will be a reality, and I
think it's a bad idea, which surprises people. Usually people think
that because I have done Internet voting work, I must be excited that
Internet voting is progressing. It's one of those things that the more
you know about, the less you like it.

UBIQUITY: What do you dislike about Internet voting? 

CRANOR: For one thing, there are huge security risks, especially when
people are talking about Internet voting from home or from work as
opposed to, say, going to a polling place that happens to be connected
to the Internet. That's no fun. If we have to go to the polling place,
it really doesn't matter whether it's connected to the Internet,
because we still have to go to a polling place. So when people,
especially in the press, talk about Internet voting they're talking
about voting from anywhere; from home in your pajamas is the classic
example. That means voting over the existing insecure Internet using
the existing insecure computers in my house that might have viruses or
mis-installed software on them. I think that it is really dangerous,
especially in elections where there's something really important at stake.

UBIQUITY: Are there cases where Internet voting has been done successfully? 

CRANOR: There has been a lot of experimentation with Internet
voting. It's been used fairly successfully for some association
elections, stockholder elections and things like that. Electronic
voting supporters point to these as evidence that it works. But there
wasn't much at stake in those elections. There wasn't much of a reason
for malicious people to disrupt those sorts of elections. There are
also many examples of elections that haven't worked. It's typically
not due to security problems, but due to either the software
malfunctioning or user interface issues; users just not being able to
figure out how to make it work.

UBIQUITY: Do you think of these problems as essentially unsolvable? Or
will they be solved a few years from now?

CRANOR: I don't like to say that problems are unsolvable. But I think
that in order to have the kind of comfort level that I would like to
have in the security of the system, we need a major change in the type
of infrastructure that we would be using. That is, we need people to
have computers in their homes that are more secure devices than what
we currently have. And we need an Internet infrastructure that's more secure.

UBIQUITY: Is it possible that your fear or distaste for Internet
voting is as much derived from social policy as it is from technology?

CRANOR: Yes. There definitely are some social issues as well. One
thing that comes up is, why are people excited about Internet voting
in the first place? Often people say that it will increase voter
turnout because it will be easier for people to vote, so more people
will participate. Well, there's very little evidence to show that that
would actually happen. There have been a number of attempts to
increase voter turnout -- increasing the number of hours that the
polls are open, increasing the number of polling places, and various
other initiatives. What they find is that the people who voted in
previous elections love this, because it makes it easier for them to
vote in the next election. But the people who didn't vote in the
previous election still don't vote. So it's not clear that having
Internet voting would, in fact, encourage more people to actually
vote. And if it did encourage additional voters, would people who
are not willing to take the time to go to a polling place be willing
to educate themselves about the issues they're voting on? So it's
not clear that there is a social benefit to Internet voting.

UBIQUITY: What is Publius? 

CRANOR: Publius is a system that Avi Rubin, Marc Waldman and I
developed to allow people to publish information on the Web in a way
that makes it very difficult to censor. Avi was one of my colleagues
here at AT&T, and Marc Waldman was a student at NYU.
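The mechanism that makes this kind of publishing hard to censor is key
splitting: the document is encrypted, and the key is divided into shares
held by different servers, so no single server can read or suppress the
content alone. Publius itself uses threshold (Shamir) secret sharing, in
which only a subset of shares is needed; the XOR-based n-of-n split
below is a simplified illustration of the splitting idea:

```python
# Sketch of the key-splitting idea behind censorship-resistant
# publishing: split an encryption key into shares held by different
# servers. Publius uses threshold (Shamir) secret sharing, where any
# sufficiently large subset of shares suffices; this XOR-based split,
# where all shares are required, is a simplified illustration.

import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list:
    """Split key into n XOR shares; all n are needed to recover it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, key))  # last share fixes the XOR
    return shares

def recover_key(shares: list) -> bytes:
    return reduce(xor_bytes, shares)

key = secrets.token_bytes(16)      # the document's encryption key
shares = split_key(key, 3)         # one share per hosting server
assert recover_key(shares) == key  # all three together recover the key
# Any proper subset of shares is indistinguishable from random bytes,
# so a single server (or censor) learns nothing about the key.
```

With the encrypted document replicated and the key shares dispersed, a
censor would have to compromise every hosting server at once, which is
what made the system controversial.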

UBIQUITY: How is it being received? 

CRANOR: We did a public trial where we released the software and set
up servers to let people play with the system, and it was very
interesting. A lot of civil libertarians thought this was just
wonderful. We had a lot of positive press articles and won a Freedom
of Expression award for this. But some people said that we were
irresponsible to put out such a system. There was an article in
"Scientific American" that had our pictures with a caption labeling
us as irresponsible.

UBIQUITY: Why irresponsible? 

CRANOR: Because we developed a system where anybody could publish
anything, and it would be very difficult for it to be stopped. People
suggested all sorts of nasty things that might be published that
people might be outraged about and asked, "Well, did you think of
that?" It is a double-edged sword, but we feel that on balance it is
better to allow people to publish ideas. You can always refute
objectionable things with more ideas.

UBIQUITY: Where does that stand now? 

CRANOR: Our prototype system is available for people to play
with. Marc Waldman has continued to work on new systems that grew out
of Publius. Publius was one of the first censorship-resistant
publishing systems. I think it inspired many others to do work in this
area and develop even better systems. It was also one of the
peer-to-peer systems that came out at a time when peer-to-peer was
closely associated with music swapping, and it demonstrated that there
are other uses for peer-to-peer.

R. A. Hettinga <mailto: rah at ibuc.com>
The Internet Bearer Underwriting Corporation <http://www.ibuc.com/>
44 Farquhar Street, Boston, MA 02131 USA
"... however it may deserve respect for its usefulness and antiquity,
[predicting the end of the world] has not been found agreeable to
experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'

The Cryptography Mailing List