[Cryptography] NY DA Vance's 'Smartphone Encryption and Public Safety'

Jerry Leichter leichter at lrw.com
Fri Nov 20 12:49:51 EST 2015


If you want to argue this to the already convinced - which is who you're talking to in this group - that's one thing.  If you want to argue it to the general public ... there are a couple of fundamental misunderstandings in what you say that will leave holes in your argument that the opposition will drive a truck through:

> Does DA Vance realize that NY State *requires the use of full-disk
> encryption* for all of its laptops?  "Full disk encryption is required
> for all State issued laptops that access or contain SE information.
> Full disk encryption products must use either pre-boot authentication
> that utilizes the device’s Trusted Platform Module (TPM), or Unified
> Extensible Firmware Interface (UEFI) Secure Boot."  Source: New York
> State Information Technology Standard No: NYS-S14-007, IT Standard:
> Encryption.
And they probably require that an unlock key be available to the state.  That's best practice for any commercial entity that provides equipment to its employees.  The employer owns the machine; the user has no privacy rights on that machine.  In fact, I would go so far as to say that it would be gross incompetence for a NY State government agency to issue a laptop and allow encryption on it that it cannot reverse.  (Yes, an employee can install their own encryption utilities.  That would be against policy and in some situations would constitute a firing offense.)

> 1.  It is my understanding that -- absent specific legislation, e.g.,
> CALEA -- manufacturers are under no obligation to build products that
> make it easier for law enforcement to gather information, whether
> subject to a warrant or not.  In short, Apple & Google are not
> extensions of the govt and/or its law enforcement wing(s), and this is
> a very good thing for any democracy.
It's not a completely settled question whether a CALEA-like law applying to cell phone makers would pass Constitutional muster.  CALEA has it easier because the companies it aims at are already highly regulated utilities.  Apple and Google are not.  But if you got into a debate on this, I guarantee you that your opposition will say "We have a long, long established understanding that regulations to promote safety - and public safety - are legitimate.  Why do you think there are brake lights on your car?  This is just another public safety regulation."

> 2.  The claims of a proper trade-off between "the loss in personal
> security" and the "gain in societal safety".  I'm not a Constitutional
> lawyer, but this particular trade-off has already been made in the
> Fourth Amendment, and just recently (June 25, 2014) underlined by the
> U.S. Supreme Court w.r.t. cellphone data (Riley v. California).
> 
> http://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf
Riley says that the police can't access the information *without a warrant*.  Vance is careful to say that he wants access *with a warrant*.  The opposition here is not stupid - they've been arguing that they don't need a warrant, but now that the Supreme Court has spoken, they are falling back from a lost position.

> I would seriously challenge this "gain in societal safety" on the
> grounds that a universal vulnerability in cellphones destroys the
> "herd immunity" of the entire cellphone community, and puts everyone
> else at *increased* risk of various worms and botnets.
You started off this item by talking about the courts.  Keep in mind that except in very unusual circumstances, courts do not decide matters like "allowing access to encrypted phones is/is not riskier than allowing fully secure encryption".  This is exactly the kind of "finding" on which they defer to the legislature.  Note that most "findings" by legislatures are based on ... nothing beyond what will sell well politically.  Nevertheless, these kinds of tradeoffs are generally left to the legislatures and the political process.  (Which *in principle* is exactly as it should be.  The problem is not with who is called upon to make such decisions - it's that they are doing such a piss-poor job of it.)

> 3.  Were there vulnerabilities in previous versions of the software
> that *didn't* use default encryption?  Does every vulnerability have
> to be devastatingly exploited before Apple/Google are allowed to
> utilize stronger security protocols?  I can think of a number of
> reasons for going ahead with so-called "full disk encryption" as a
> default option, having nothing to do with stymieing law enforcement.
There is absolutely nothing in this proposal requiring that makers bring back old versions of the software.

What he seems to have in mind is that, just as in the case of FDE for commercial laptops, there be a secondary key that only the manufacturer has access to.  In a perfect world - where only the good guys ever demand access, and only for good reasons, and the keys are kept forever secure - this would be a great solution.  Unfortunately that's not the world we live in.
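To make the mechanism concrete: the scheme Vance seems to envision is what's usually called key escrow, or "envelope encryption" with an extra recipient.  Here's a toy Python sketch of the idea - the XOR-with-hash "cipher" and all the names are purely illustrative, NOT any real product's design or real cryptography:

```python
# Toy sketch of key escrow: the data key that encrypts the disk is "wrapped"
# twice -- once under a key the user controls, once under a key only the
# manufacturer holds.  Either copy unwraps to the same data key, which is
# exactly the risk: a second door exists whether or not it's ever used.
# The XOR-with-hash wrap below is a stand-in for real key wrapping
# (AES-KW, RSA-OAEP, etc.) -- illustration only, NOT real cryptography.
import hashlib
import secrets

def toy_wrap(kek: bytes, key: bytes) -> bytes:
    # XOR the key with a pad derived from the key-encryption key.
    pad = hashlib.sha256(kek).digest()[:len(key)]
    return bytes(a ^ b for a, b in zip(key, pad))

toy_unwrap = toy_wrap  # XOR is its own inverse

data_key = secrets.token_bytes(32)    # encrypts the actual disk contents
user_kek = secrets.token_bytes(32)    # derived from the user's passcode
escrow_kek = secrets.token_bytes(32)  # held only by the manufacturer

wrapped_for_user = toy_wrap(user_kek, data_key)
wrapped_for_escrow = toy_wrap(escrow_kek, data_key)

# Either party independently recovers the same data key:
assert toy_unwrap(user_kek, wrapped_for_user) == data_key
assert toy_unwrap(escrow_kek, wrapped_for_escrow) == data_key
```

The design choice to notice: the security of everyone's disk now rests on the escrow key staying secret forever, for every phone ever sold.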

Sad to say, countering fear with understanding is usually ineffective.  Countering fear with fear works better.  It's important to keep pounding on all the data that's being leaked, every day.  Let people know that when someone like Vance says "we know Apple can keep the master key safe" he's speaking nonsense.

> Thirdly, because "everything is software", the mere existence of
> unlocking & decrypting code -- supposedly accessible only through
> physical access -- provides additional attack surface for *other,
> networked* attacks.  So it isn't true that "physical access only"
> doesn't also increase the risk of networked attacks.
Again, in an ideal world, there's no reason why that should be true.  The special access could require opening up the phone and wiring in to something inside it.

Again, you need to convince the general public that the real world is not like this.

> 5.  Why isn't Apple/Google cloud data encrypted?  Why, indeed.  I
> suspect that Apple/Google will soon fix this so that there is no
> distinction in encryptedness between data on the device and data in
> the cloud.
These are completely separate issues.

Data at Google *is* encrypted; that I can tell you from personal experience.  I have no direct knowledge of Apple, but data there is almost certainly encrypted as well.

The problem is that people want remote access to their data, and they want some of their data to be processed remotely as well.  In many scenarios, this requires that the cloud provider be able to access the data.  In some cases - where the data is accessed remotely but has to be decrypted locally - the provider might only have access *while the user is connected and has provided a password*, but that's enough.

Cell phones can protect the data in them because they only need to process it internally - and even then, they need specialized hardware to keep stuff secure in the case of a physical attack.

> 6.  Other nations' access to the data.  To a large extent this is
> irrelevant, since these other countries don't have a Fourth or Fifth
> Amendment.
Actually, it's very relevant.  If we provide access for the US government, every other government in the world will insist on getting access as well.

> 7.  Access to deleted data.  Vance is clearly overstepping here.  The
> *only* reason why "deleted data" is sometimes accessible is due to a
> *mistake* in the operating system and file system.  Just like
> upgrading SSL to TLS fixes many mistakes, an operating system and a
> file system that truly deletes data that is marked for deletion fixes
> many potential flaws -- including security flaws.
The courts disagree with you.  After Enron's massive destruction of documents in anticipation of the Feds coming after them, the law was changed (I'm going from memory here): destroying data that you know might be evidence of a crime - or data relevant to any investigation of a crime, even one you don't know about - is itself a crime.  The law was *intended* to cover cases of corporate malfeasance, but of course prosecutors have since tried to extend it as far as they can.

Exactly where the lines will end up being drawn isn't yet settled, but the excuse "the computer made me do it" won't hold up in the future.  There will be situations in which you *may not* destroy data - and your computer had better be configured accordingly.  (This has long been the case *once you've been informed* of the existence of legal processes that will require you to hold some data.  What's different now is that you don't even have to be informed.)

>  For example, it is
> critical for "forward secrecy" to be able to guarantee that "session
> keys" be securely erased.
So you've just proved that in some as-yet-undefined cases, you *may not use* forward secrecy - unless you log the data.
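The tension is easy to see in a toy Python sketch (again, the XOR keystream is a stand-in for a real cipher and every name here is illustrative, not a real protocol):

```python
# Toy sketch of forward secrecy: each session encrypts under a key derived
# from an ephemeral secret, and both are erased when the session ends.
# The XOR-with-hash keystream is a stand-in for a real AEAD cipher --
# illustration only, NOT real cryptography.
import hashlib
import secrets

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    # XOR with a hash-derived pad; applying it twice decrypts.
    pad = hashlib.sha256(key).digest()[:len(msg)]
    return bytes(a ^ b for a, b in zip(msg, pad))

ephemeral = secrets.token_bytes(32)  # per-session secret, never stored
session_key = hashlib.sha256(b"session" + ephemeral).digest()
ciphertext = toy_encrypt(session_key, b"hello")

# Forward secrecy means erasing the secrets once the session ends.
# A retention mandate is precisely a requirement to keep this copy:
logged_key = session_key
ephemeral = session_key = None

# With nothing retained, the ciphertext is unrecoverable; with a logged
# key, it opens right up -- logging the key *is* breaking forward secrecy.
assert toy_encrypt(logged_key, ciphertext) == b"hello"
```

In other words, "forward secrecy plus mandated access to past sessions" is a contradiction in terms: the only way to satisfy the mandate is to keep the very material whose erasure defines the property.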

> The biggest hole of all is "application".  If the cellphone user
> downloads & installs the Tor Instant Messaging Bundle app, how can
> Apple/Google possibly decrypt data -- especially data that isn't
> stored.  With the advent of so-called "containers" and "virtual
> machines", an application can be an entirely new operating system,
> with its own set of apps, etc.  Turing Machines can diagonalize, and
> diagonalization is pretty darn cheap these days.

This is the dumb thing about all these proposals:  The bad guys will easily get around them.  It's the average citizen who's most exposed.  This needs to be explained again and again, so that people understand it.  Distasteful as it is to say it, the NRA understands this kind of thing extremely well.  No matter how narrowly a law might be tailored to regulate only the guns useful to bad guys, they will attack it as "the Feds are trying to take away John Q. Public's guns".  These kinds of arguments, done right, are effective.

BTW, if you find yourself in a debate with one of the proponents of this stuff and he uses the recent argument that "Apple and Google are just doing this because they make a profit from it" - you can hit them hard.  Just go right-wing on them:  "We're a capitalist country.  They make a profit because that's what a free people want to buy.  What are you, some kind of socialist?  You want the government deciding what people are allowed to buy, for their own good?"  Trying to demonize some of the country's favorite companies by arguing against the profit motive - really, really dumb move by the anti-crypties.  Give them hell for it.
                                                        -- Jerry


