[Cryptography] The FBI can (almost certainly) crack the San Bernardino iPhone without Apple's help

Jerry Leichter leichter at lrw.com
Wed Mar 2 20:23:22 EST 2016


>> I want the security.  But I don't want the system I'd get if
>> that security were a commodity that software or service
>> providers could use to create captive markets in their walled
>> gardens.
> 
> In practice, though, it appears that the average user seems to be
> better served by the security of the walled garden than by a system
> which anyone with physical access can sabotage. (I include myself, by
> the way, as an average user for these purposes.)
Frankly, as far as I can tell, pretty much *everyone* is more secure in the walled garden.

The days when any one person understood everything about what was happening on any useful computer system are long passed.  Hell, even the days when any one person was familiar with all the security-relevant research have long passed.  I could honestly say 25 years ago that I knew at least a bit of something about cryptography, access control, secure implementation techniques - pretty much everything being done in the field.  Today that's completely impossible.  You have sophisticated mathematical techniques, chip analysis techniques, bizarro attacks like ROP and JSFuck - the range of stuff out there is astounding.

So the idea that you can somehow build a secure environment for yourself - confidently picking and choosing exactly what to trust and what can safely be combined with what - is about as realistic as the idea that you could go out into the wilderness alone and, in a couple of years, build a car.

Attacks these days are team efforts.  So are defenses.  And if you think you can assemble your own team of like-minded individuals with any real degree of assurance that every one of them is exactly what he seems ... well, forget it.  There's a reason organized militaries have been beating disorganized but larger and fiercer groups since Roman times.

It's hard to let go of the dream, but it's necessary.

> This is bothersome to me because, ideologically, I prefer completely
> open systems, and in many practical respects I prefer them as well.
> However, my odds of downloading dangerous iOS software are much lower
> than my odds of downloading dangerous Android software, and it
> appears to require some significant sophistication to attack the
> physical security of an iOS device (though clearly not as much as has
> been portrayed by the law enforcement community.)
I've said this before, but perhaps not here:  "Open" - and particularly the lack of openness - as we use the term is to a large degree an illusion.  It misses what's really going on, which is a movement to higher levels of abstraction.

I've been around long enough to have seen the tail end of the period when hardware was open:  A decently equipped university lab could build a reasonably competitive computer from small parts.  And many did, to experiment with all kinds of architectural ideas.  But eventually designing new computers became uninteresting as an academic exercise, as the basic problems all acquired well-understood, good solutions.  The remaining problems moved onto silicon, and the challenges were at a scale where only a few large corporations could compete.

Operating systems went through a similar evolution.  So did UIs, compilers, etc.

In the 1980's, anyone could come up with a nifty new text-based Unix tool.  I wrote a bunch myself, some of which I still use.  But when was the last time you saw something new like this?  Again, the old problems have good solutions, and the new problems are in different areas:  Much larger scale, much higher levels of abstraction.

If you look at the most closed system in wide use today - iOS - indeed, everything from the innards of the chips to the OS to the system UI is fully controlled.  But ... on top of that base, there's an incredibly rich collection of apps.  App development is pretty much open.  (Yes, Apple controls it - but the interesting thing is that back in the days when jailbreaks were easily available, the apps you could find for jailbroken phones really didn't do all that much more than the Apple-approved apps.)  Even beyond that ... anyone can, for $100/year, become a developer.  Writing and using *your own apps, yourself* is essentially open.  Yes, the layer of abstraction provided by iOS is there and you can't change it - but above that, do your damnedest.  Any Apple policies only come into play if you want to *sell* your app.  (I find it fascinating - and hard to explain - why a culture of sharing app sources among hacker developers has never emerged.)

What we're seeing is directly analogous to the tradeoff between standards and non-standard development.  TCP/IP pinned down a standard.  If you want to play in the Internet world, you need to use it - even if it limits you.  But ... the very closedness of the TCP/IP world ensures that anyone can write code to use it, and it will work everywhere and talk to everything.
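To make that concrete: here's a minimal sketch in Python's standard socket API (an echo exchange over loopback, chosen just for illustration).  The point isn't the code - it's that these same few calls work unchanged on every platform, precisely because the protocol underneath is pinned down and not up for negotiation.

```python
import socket
import threading

# One round trip over TCP.  socket(), bind(), accept(), connect(),
# send(), recv() - the "closed" standard is what lets this identical
# code run anywhere and talk to anything that speaks TCP/IP.

def echo_once(server: socket.socket) -> None:
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo back whatever arrived

server = socket.create_server(("127.0.0.1", 0))   # port 0 = any free port
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, standard world")
    reply = client.recv(1024)

print(reply)  # b'hello, standard world'
```

The loopback server here is only a stand-in; the client half would talk just as happily to a machine on the other side of the planet, which is the whole argument.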

                                                        -- Jerry
