[Cryptography] About those fingerprints ...

Jerry Leichter leichter at lrw.com
Thu Sep 12 13:42:22 EDT 2013


[Perry - this is likely getting too far off-topic, but I've included the list just in case you feel otherwise.  -- J]

On Sep 12, 2013, at 12:53 AM, Andrew W. Donoho <awd at DDG.com> wrote:

> 
> On Sep 11, 2013, at 12:13 , Jerry Leichter <leichter at lrw.com> wrote:
> 
>> On Sep 11, 2013, at 9:16 AM, "Andrew W. Donoho" <awd at DDG.com> wrote:
>>> Yesterday, Apple made the bold, unaudited claim that it will never save the fingerprint data outside of the A7 chip.
>> By announcing it publicly, they put themselves on the line for lawsuits and regulatory actions all over the world if they've lied.
>> 
>> Realistically, what would you audit?  All the hardware?  All the software, including all subsequent versions?
> Jerry,
> 
> 
> 
> 	First I would audit that their open source security libraries, which every app has to use, are the same as I can compile from sources.
Well ... there's an interesting point here.  If it's an open source library - what stops you from auditing it today?
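
For concreteness, here's the naive version of that audit as a minimal Python sketch - the paths are hypothetical, and note the big caveat: compilers embed timestamps and build paths, so the hashes won't match without a reproducible-build toolchain, and a real comparison has to normalize or diff the binaries section by section:

    import hashlib
    import sys

    def sha256_of(path):
        # Hash the file in chunks so large binaries don't need to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    shipped = "/usr/lib/libcrypto.dylib"   # hypothetical: the vendor's binary
    rebuilt = "./build/libcrypto.dylib"    # hypothetical: your own build of the sources

    a, b = sha256_of(shipped), sha256_of(rebuilt)
    print("shipped:", a)
    print("rebuilt:", b)
    sys.exit(0 if a == b else 1)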

On OSX, at least, if I were to worry about this, I'd just replace the libraries with my own compiled versions.  Apple has a long history of being really slow about updating the versions of open source software they use.  Things have gotten a bit better, but often through an odd path:  Apple gave up on maintaining their own release of Java and dumped the responsibility back on Oracle (who've been doing a pretty miserable job on the security front).  For the last couple of years, Apple distributed an X client which was always behind the times - and there was an open source effort, XQuartz, which provided more up-to-date versions.  Recently, Apple decided to have people pull down the XQuartz version.  (For both Java and X, they make the process very straightforward for users - but the important point is that they're simply giving you access to someone else's code.)  They've gone this way with a non-open-source component as well, of course - Flash.  They never built it themselves; now they don't even give you help in downloading it.

But ... suppose you replaced, say, OpenSSL with your own audited copy.  There are plenty of ways for code that *uses* it to leak information, or just misuse the library.  On top of that, we're mainly talking user-level code.  Much of the privileged code is closed source.
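
Just to make the point concrete, here's a minimal sketch in Python - the standard ssl module standing in for OpenSSL (which it wraps), with example.com as a placeholder host.  Both connections go through the same, hypothetically audited, library; two lines in the calling code make the audit moot:

    import socket
    import ssl

    HOST = "example.com"  # placeholder host

    # A careful caller: certificate verification and hostname checking on.
    good = ssl.create_default_context()

    # A careless caller: the library itself is untouched, but verification
    # is silently gone - it will accept any certificate at all.
    bad = ssl.create_default_context()
    bad.check_hostname = False
    bad.verify_mode = ssl.CERT_NONE

    for name, ctx in (("verified", good), ("unverified", bad)):
        with socket.create_connection((HOST, 443)) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
                print(name, tls.version())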

So ... I'm not sure where auditing gets you.  If you're really concerned about security, you need to use trusted code throughout.  A perfectly reasonable choice, perhaps - though having regularly used both Linux and MacOS, I'd much rather use MacOS (and give up on some level of trustworthiness) for many kinds of things.  (There are other things - mainly having to do with development and data analysis - that I'd rather do on Linux.)

> Second, the keychain on iOS devices is entirely too mysterious for this iOS developer. This needs some public light shone on it. What exactly is the relationship between the software stack and the ARM TPM-equivalent?
I agree with you.  I really wish they'd make (much) more information available about this.  But none of this is open source.  I've seen some analysis done by people who seem to know their stuff, and the way keys are *currently* kept on iOS devices is pretty good:  They are encrypted using device-specific data that's hard to get off the device, and the decrypted versions are destroyed when the device locks.  But there are inherent limits on what can be done here:  If you want the device to keep receiving mail when it's locked, you have to keep the keys used to connect to mail servers around even when it's locked.
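
For what it's worth, the pattern those analyses describe looks roughly like the following - emphatically a sketch of the general shape, not Apple's actual design, written against the third-party Python 'cryptography' package:

    import os
    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    DEVICE_SECRET = os.urandom(32)  # stand-in for a fused, hard-to-extract hardware key
    PASSCODE = b"1234"              # stand-in for the user's passcode

    def wrapping_key():
        # Only derivable while the passcode is available, i.e. at unlock.
        return hashlib.pbkdf2_hmac("sha256", PASSCODE, DEVICE_SECRET, 100000)

    # While unlocked: wrap a stored secret under the derived key.
    nonce = os.urandom(12)
    blob = AESGCM(wrapping_key()).encrypt(nonce, b"imap-account-password", None)

    # On lock: unwrapped copies are discarded; only the blob survives.
    # A key the device needs while locked - say, for fetching mail - would
    # have to stay in memory here instead, which is exactly the limit above.

    # At next unlock: rederive the wrapping key and unwrap.
    print(AESGCM(wrapping_key()).decrypt(nonce, blob, None))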

> Third, in iOS 7, I can make a single line change and start syncing my customer's keychain data through iCloud. At WWDC this year, Apple did not disclose how they keep these keys secure. (As it is a busy conference, I may have missed it.)
Keychain syncing was part of the old .Mac stuff, and in that case it was clear: They simply synced the keychain files.  As I said, there is some information out there about how those are secured, and as best I've been able to determine, they are OK.  I wish more information was available.

It's not clear whether iCloud will do something different.  Apparently Apple removed keychain syncing from the final pre-release version of iOS 7 - it's now marked as "coming soon".  The suspicion is that in the post-Snowden era, they decided they need to do something more to get people to trust it.  (Or, I suppose, they may actually have found a bug....)  We'll see what happens when they finally turn it on.
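
One plausible shape for "something more" is end-to-end encryption of the synced data, where the service only ever stores ciphertext it cannot read.  A hypothetical sketch, again using the 'cryptography' package - whatever Apple actually ships may look nothing like this:

    import os
    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    cloud = {}  # stand-in for the sync service: it only ever holds ciphertext

    def upload(account, keychain_blob, passphrase):
        salt, nonce = os.urandom(16), os.urandom(12)
        key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200000)
        cloud[account] = (salt, nonce, AESGCM(key).encrypt(nonce, keychain_blob, None))

    def download(account, passphrase):
        salt, nonce, blob = cloud[account]
        key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200000)
        return AESGCM(key).decrypt(nonce, blob, None)  # fails loudly if tampered with

    upload("awd", b"serialized keychain items", b"a long passphrase")
    print(download("awd", b"a long passphrase"))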

> Fourth, does Apple everywhere use the same crypto libraries as developers are required to use?
Developers aren't *required* to use any particular APIs.  Could there be some additional crypto libraries that they've kept private?  There's no way to know, but it's not clear why they would bother.  The issue is presumably that NSA might force them to include a back door in the user-visible libraries - but what would Apple gain by maintaining a separate, private library, beyond the extra work of keeping two sets of code and likely future headaches on the legal front?  What's exposed either way is *your* data, not Apple's!

> Fifth, what is the path for the fingerprint data from sensor to TPM-like device? I'm sure there are many more questions we could ask.
Sure.  But you're highly unlikely to get detailed answers.  Then it's your choice whether the assurances you do get are good enough.

> 	And yes, subsequent versions need audits. That is why this is a formal part of the release process. It is just as important as an audited annual report. 
Apple doesn't see "people (or even organizations) who want to, and are able to, do detailed security audits on their hardware and software" as a significant market.  And they are probably right - we're talking *at most* thousands of people in the entire world.  So why should they cater to them - a process with significant cost?  Many large, highly security-conscious organizations are satisfied with the guarantees they get.

There are specialized vendors of phones that come with tighter security guarantees.  BlackBerry used to be one, though the recent NSA leaks have tarnished their image - perhaps without actual justification.  They operate on the "trust us" principle, too.  General Dynamics makes cell phones for the US government - see http://www.gdc4s.com/sectera-edge-(sme-ped)-proddetail.html
for the phone that was allegedly given to President Obama in place of his beloved BlackBerry.  One of these babies will set you back $3345, but it does come with what is likely the best assurance you can find anywhere that it's as secure as the NSA can make it against outsiders - though whether it's secure *against* the NSA is a whole other question.

>> This is about as strong an assurance as you could get from anything short of hardware and software you build yourself from very simple parts.
> 	I can imagine many stronger assurances than a promise in a video that is superseded by a click-wrap EULA. 
Well, you can always review the EULA.  Be prepared to set aside a *lot* of time.

> 	Per Perry's direction, I have elided the rest of Jerry's excellent comments. Thank you Jerry for making them.
Thanks.
                                                        -- Jerry

> 
> 
> Anon,
> Andrew
> ____________________________________
> Andrew W. Donoho
> Donoho Design Group, L.L.C.
> awd at DDG.com, +1 (512) 750-7596, twitter.com/adonoho
> 
> Download Retweever here: <http://Retweever.com>
> 
> No risk, no art.
> 	No art, no reward.
> 		-- Seth Godin
> 
> 
> 


