[Cryptography] "we need to protect [our dox] by at least encrypting them"
brk7bx at virginia.edu
Mon Nov 7 21:18:40 EST 2016
On Tue, 2016-11-08 at 02:04 +0100, ianG wrote:
> But this rabbit hole goes further - unless the app decodes in a
> [sandbox] and can prove that nothing else can get into the sandbox,
> the bottom line is that Huma's erstwhile 'end-node' is still a node.
That is true but is a somewhat different issue. It is difficult to
secure a box that has already been compromised, but it is not so
difficult (technically) to ensure that older, pre-compromise mail
remains secure (though the UI still needs a lot of work). By analogy,
it is definitely possible for someone to steam open your letters, but
that does not allow them to look back in time and see mail you received
before they started spying.
PGP actually solves several immediate security problems. Phishing is
much harder -- just knowing the password is not enough. Mass
surveillance is harder -- you cannot just compromise a single server,
you need to compromise all the clients. Older messages can remain
secret even after a successful attack.
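To make the "server compromise yields only ciphertext" point concrete, here is a toy textbook-RSA sketch. The tiny primes and lack of padding make it wildly insecure and purely illustrative (real PGP uses large keys, padding, and hybrid encryption), but it shows the asymmetry: the server only ever holds ciphertext, and only the private key, kept on the client or a token, can recover the plaintext.

```python
# Toy textbook RSA with tiny primes -- purely illustrative, NOT secure.
p, q = 61, 53
n = p * q                  # public modulus (part of the published key)
phi = (p - 1) * (q - 1)
e = 17                     # public exponent (part of the published key)
d = pow(e, -1, phi)        # private exponent (stays on the client/token)

message = 42                        # a message, encoded as a number < n
ciphertext = pow(message, e, n)     # all the mail server ever stores
assert pow(ciphertext, d, n) == message  # only the private key recovers it
```

An attacker who seizes the server gets `ciphertext` alone; without `d` there is no "look back in time" at mail delivered before the compromise.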
> Most all security work starts from the wire - it's that old CIA
> [triad] that's embedded into the consciousness. It's that box we have
> to [break] out of.
Again, PGP is not about the wire, it is about "the node." The most
important thing PGP does is to limit the scope of attacks to the users'
machines. True, those machines could be attacked, but then you can
just record what the victims are typing anyway (which is basically what
the government of Vietnam did with their keyboard driver backdoor). We
can come up with all kinds of things PGP will not protect you from, but
so what? PGP does solve several glaring security problems with email
that are immediately relevant.
> > I think insider threats are not really a cryptography problem,
> > although
> > certain approaches to dealing with insider threats call for some
> > sort
> > of cryptography. Modern cryptography is about dealing with the
> > security problems that arise when information flows across some
> > organizational or security boundaries. Such problems are inherent
> > to
> > any Internet-connected system and to any application that uses the
> > Internet:
> Well, no. Insider threats are beyond cryptography only when you're
> [in] cryptothinking's cryptobox. Consider Bitcoin - it's a
> cryptographic solution to an insider threat, that of Ivan issuing
> more money or accepting false tender.
Ignoring the issue of whether Bitcoin solves that problem, or frankly
any problem, and whether or not that even is a problem in need of a
solution (i.e. let's just skip the Bitcoin economics flamewar), how are
you defining "insider threat" here? One of the few things about
Bitcoin that actually is clear is that it is not meant to fit into some
kind of organization (anyone can join, without needing to authenticate
themselves in any way) -- so who is an "insider" and who is not?
> > How would using PGP fail to move the threat needle? If the mail on
> > the
> > server was encrypted the needle would have been moved, in the sense
> > that hacking just one server would not give you access to anything.
> ach - because to be usable, an email client has to connect to IMAP
> [and] analyse all the mail in the clear.
OK, but then the attack surface is reduced from clients and servers to
just clients. How is that not a big move of the threat needle?
> > If
> > the private keys are kept on a hardware token that requires a
> > periodic
> > button press to decrypt anything, the needle moves even more (and
> > the
> > usability benefits of having keys on a token are nice too).
> To be worth anything it has to work for 1++ bn people. The paper
> designs of the security industry are routinely crushed between the
> unstoppable force of the hacker and the immovable rock of
> Kerckhoffs' 6th.
Which is why carrying the key on a token is better than trying to keep
it on client machines. In my (humble) experience one of the biggest UX
failures of PGP is that there is no easy way for people to sit down at
some random computer and read their mail. That is how people use
email; a hardware token is one of the only good ways to align email
encryption with that usage pattern.
Maybe people would find that annoying, but I doubt it -- 2FA has been
pretty successful despite requiring people to carry around something
extra, and even if it is riskier in terms of compromise, a smartphone
could be a convenient "token."
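For what it's worth, the "something extra" that 2FA asks people to carry is cheap to implement: TOTP (RFC 6238) is just HMAC-SHA1 over a 30-second time counter. A minimal stdlib sketch (illustrative only; the function name and defaults are mine, not from any particular library):

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = (int(time.time()) if t is None else t) // step
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", time 59s.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))  # 94287082
```

The point is not that mail clients should use TOTP specifically, but that the machinery behind a token press is this small; the hard part is UX and deployment, not cryptography.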
> Idle question - did Hillary have a hardware token?
No, and in fact the failure to even deploy 2FA (i.e. some kind of
hardware token for reading mail) has been a criticism that I have seen
even in the typically clueless media.