[Cryptography] Hope Apple Fights This!

Kevin W. Wall kevin.w.wall at gmail.com
Wed Feb 17 21:15:35 EST 2016


On Wed, Feb 17, 2016 at 6:29 PM, Jerry Leichter <leichter at lrw.com> wrote:
>> What they are asking Apple to do is to update code which is not
>> (AAUI) secured by that cryptography.  Specifically, they want
>> Apple to disable the OS code that counts the number of attempts
>> that the user has made to enter the password.  That OS code
>> is not customer data and isn't protected by the device encryption.
>>
>> If Apple *can* do this, then it cannot legally refuse the order
>> unless it wins in court.  Which I hope they do, but in this
>> specific case - owner known to have committed multiple murders,
>> search warrant pursuant to an ongoing investigation, and compliance
>> opens a single device instead of all devices - I don't think they
>> will.
> TBD.  Courts will require you to turn over any material that already exists -
> no issues there.  *In general*, they will not compel you to go out and create
> stuff that doesn't already exist.  The line is gray.  Note that the order
> itself tells Apple that it can push back if it believes the effort is too
> onerous.

Exactly, that is point #7:

    7. To the extent that Apple believes that compliance with this Order
    would be unreasonably burdensome, it may make an application to this
    Court for relief within five business days of receipt of Order.

I think the interesting aspect here is how the courts might measure
"unreasonably burdensome". If it is interpreted narrowly, covering only
the effort of developing, testing, and deploying the software, it is
unlikely to be considered burdensome. The FBI may even decide to partly
subsidize, or even completely pay for, those costs. (The only out there
would be if Apple could justify some expensive consulting firm that sends
in 17 PMs to "manage" the project on Apple's behalf, driving it over
budget and slipping schedules ad infinitum. You know, how most IT
projects work. :)

The more interesting way that I could see this playing out is if Apple
argues that their current iOS security mechanisms give them a substantial
competitive advantage in the marketplace, and that by being forced to
backdoor their own security measures, they would be at risk of losing
that competitive advantage.

In the first case, we're probably looking at a max of $1M. In the latter
case, it could run into the billions. I'm not holding my breath on this,
because I think there would be a substantial burden of proof for Apple
to show it is burdensome in this latter sense.

> This case will likely be precedent-setting on exactly the issue of how far the
> government can go in ordering a third party to do work for it.
>
Precisely; as I mentioned on Twitter, this is a slippery slope, and we're
just one NSL away from the FBI demanding that Apple generalize the technique
to work for all their iPhones (or at least all those of a certain model),
and all of a sudden we're back to the warrantless searches that we've been
trying to avoid. Surprisingly, none of the talking heads I've seen seem to
get that, not even the Libertarian ones. (But then, they don't seem to think
this is a backdoor either, so maybe it's just a case of being technologically
clueless.)

>> If Apple *can't* do this, then it means they have given up not
>> just the ability to get to their customers' data remotely but
>> also the ability to update the customers' operating systems
>> without the customers' express consent.
>
> Apple has never claimed that ability, and there's no evidence they have it.
> Normal updates are user-initiated, or at least user-approved.  Over-the-air
> updates require the phone to be unlocked.  Updates through iTunes require that
> the computer running iTunes be marked in the phone as "trusted".

There appears to be a Device Firmware Upgrade (DFU) mode that can be
entered even from a powered-off (and thus, presumably, locked) state. See
<https://www.theiphonewiki.com/wiki/DFU_Mode#Entering_DFU_Mode_on_iPhone.2C_iPad_or_iPod_touch>

> Only an unlocked
> iPhone can mark a new computer as "trusted".  In most cases like this, the FBI
> would have access to the subject's laptop or other computer, which has
> probably already been marked trusted.  (Macs don't have the kind of hardware
> protection that iPhones do; breaking into them is relatively much easier.)  In
> this case, the subjects apparently removed the hard disk from their computer,
> and it hasn't been found - so it's not clear if the FBI has access to a
> computer the phone considers "trusted".  (I *think*, even in the case of a
> trusted computer, the normal update process requires an unlocked phone; but
> I'm not sure of that.)
>
> So it's not clear exactly what Apple would have to do to get into this phone.
> We have the court ordering them to do that, and Apple pushing back on
> principle; we don't know if they could also push back with "we have no
> technical means to do what you want".  My guess is Apple can find a way to do
> it - but it would be difficult and would effectively reveal the existence of
> a security hole.

Apparently the security company Trail of Bits thinks it is doable, especially
since this particular model is a 5C and predates the Secure Enclave. See

<https://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/>

for details.
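
As a rough illustration of why the retry counter, the escalating delays,
and the 10-failure auto-erase are the whole ballgame here: once those are
out of the way, the limiting factor is Apple's passcode key derivation,
which Apple has said is tuned to take roughly 80 ms per attempt on the
device. Here's a quick back-of-the-envelope sketch (Python, purely
illustrative; the passcode lengths are hypothetical, since we don't know
what was actually set on this phone):

    # Back-of-the-envelope brute-force timing once the retry counter,
    # escalating delays, and auto-erase are disabled.
    # The ~80 ms figure is Apple's stated per-attempt key-derivation cost;
    # the passcode spaces below are examples, not facts about this phone.

    PER_ATTEMPT_SECONDS = 0.08  # hardware-bound key derivation, ~80 ms/guess

    def worst_case(keyspace):
        seconds = keyspace * PER_ATTEMPT_SECONDS
        if seconds < 3600:
            return "%.1f minutes" % (seconds / 60)
        if seconds < 2 * 86400:
            return "%.1f hours" % (seconds / 3600)
        return "%.0f days" % (seconds / 86400)

    for label, keyspace in [("4-digit PIN", 10**4),
                            ("6-digit PIN", 10**6),
                            ("8-char lowercase", 26**8)]:
        print("%-18s %15d guesses -> %s worst case"
              % (label, keyspace, worst_case(keyspace)))

That works out to roughly 13 minutes worst case for a 4-digit PIN, about a
day for 6 digits, and centuries for a decent alphanumeric passphrase, which
is presumably why the FBI wants the software locks removed rather than
trying to attack the crypto itself.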

>
>> Because having many different versions of the OS out there makes
>> software support into a huge expensive headache, I *REALLY*
>> doubt that Apple has given up the ability to push OS updates
>> regardless of customer consent. I also don't believe they've
>> invested in the kind of hardware security module that would
>> protect any individual part of the OS from updates without the
>> customer entering the code, while allowing updates to the rest
>> of the OS.
> a) They don't do such updates today.  There *may* be exceptions for phones set
> up for remote administration.  I don't think so, though - the main things
> remote administration can do forcibly are wipe a stolen phone and push policies
> on things like minimum password lengths - though for the latter, I think you
> still have to accept the modification.  The main enforcement mechanism is that
> if you don't accept the profiles, you're denied access to corporate assets
> like mail accounts.

If current Apple iPhones with Secure Enclaves allow firmware upgrades
to the Secure Enclave itself *while the device itself is LOCKED*,
and such an upgrade does not wipe the Secure Enclave keys, then Apple
may want to reconsider this in the future as an addition to their present
threat model. Of course, the downside of that is that more customers
accidentally get their phones bricked and/or their data wiped. Choose
your poison. (Or perhaps they arrange it so that the customer gets to
choose their own poison?)
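
Just to make that "choose your poison" point concrete, here's a tiny sketch
of the policy decision as I see it. This is hypothetical pseudo-logic, not a
claim about how Apple's actual update path works:

    # Hypothetical policy choices for a firmware update that touches the
    # Secure Enclave while the device is locked. Not Apple's real logic;
    # just a way to phrase the trade-off.

    REQUIRE_PASSCODE = "require passcode"  # safest, but updates can stall
    WIPE_KEYS = "wipe keys"                # update applies, user data is gone
    ALLOW = "allow"                        # the case the threat model should cover

    def handle_se_update(device_locked, policy):
        if not device_locked:
            return "apply update"                         # user already authenticated
        if policy == REQUIRE_PASSCODE:
            return "defer update until passcode entered"
        if policy == WIPE_KEYS:
            return "apply update and erase key material"  # data gone, but no backdoor
        return "apply update with keys intact"            # the scenario to worry about

Letting the customer pick between the first two would be the "choose their
own poison" option.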

> Since this is a county-owned phone, it *may* have been set up for remote
> administration - but in that case it's extremely unlikely the attackers would
> have stored anything even remotely incriminating on it.
>
> b) Apple *has* built such a hardware security module (the "secure enclave"
> that's part of the CPU chip), and all iPhones since the iPhone 5S include it.

Which seems to be irrelevant here since this is supposedly an iPhone 5C.

-kevin
-- 
Blog: http://off-the-wall-security.blogspot.com/    | Twitter: @KevinWWall
NSA: All your crypto bit are belong to us.

