[Cryptography] Zoom publishes draft cryptographic design for end-to-end encryption

Phillip Hallam-Baker phill at hallambaker.com
Thu Jun 4 23:27:37 EDT 2020


On Thu, Jun 4, 2020 at 5:10 PM Christian Huitema <huitema at huitema.net>
wrote:

>
> On 6/3/2020 9:22 PM, John Gilmore wrote:
> > John Young <jya at pipeline.com> wrote:
>
> > I *have* been advised by someone who ought to know better, that merely
> > moving off the most popular two or three platforms vastly reduces the
> > likelihood of penetration.  Like, don't use a Microsoft or Apple OS.
> > But: you have to think about who might want to attack and what their
> > goals are.  If breaking into servers is what matters to one of your top
> > adversaries, then Linux is likely the first platform attacked; if
> > smartphones, Android; etc.
>

When we were building stuff on Open Genera for the White House project, we
used to joke that if it was ever broken the list of suspects would be very
short.

Claims about the comparative hackability of operating systems tend to be
editor-war material rather than reasoned argument. An OS without memory
protection is intrinsically easier to break; an OS without accounts is easier
to break; and so on. But once we get to the commonly used platforms, it is a
matter of incentives, how many people have the necessary skill set, and which
systems they most like working on.

Oh and don't forget that the person who smugly lectures folk on how
insecure Windows is will probably be running PHP or the like on their Linux
Web server...

> Yet I think there is something to that argument, because widely used
> applications are often most vulnerable to nation-state compromises due
> to their business model. Take the example of Skype. The early versions
> of Skype were designed for end-to-end security, and law enforcement
> agencies in many countries were not happy. As Skype became widely used,
> it migrated from being managed by a small crew to being managed as part
> of a big business.


I think this is what people are really missing with Skype/Signal/Zoom etc.
End-to-end encryption makes no damn difference if the service is only
accessible from a single app, provided by the service provider, who can force
an automatic update.

Lawful intercept of a Signal or Zoom call is merely a matter of getting a
warrant that requires the service provider to drop a client with a backdoor
onto the specific users they want to intercept. Oh and of course a court
can and will tell you to lie about how many warrants you have been served.
I can't see a judge being remotely impressed by warrant canaries. If a
person intentionally constructs a situation that makes it impossible for
them to comply with a court warrant in good conscience, that is their
problem, not the court's.


> It then became much more vulnerable to pressure, and
> had to find creative ways to satisfy the requests of at least some law
> enforcement agencies. After Microsoft bought Skype they centralized the
> handling of the call set-up, and the centralized handling made it much
> easier to satisfy law enforcement requests. We are seeing the same
> process happening with Zoom.
>
> An app that just serves a small niche of users might escape these
> pressures -- until of course it becomes popular enough and noticed...
>

How many users did Lavabit have when the FBI went after them? It only takes
one customer to get you served with a warrant, if it is the wrong customer.

The only robust solution to this problem that I can see is an open standard
for end-to-end communications that covers all the common modalities, is
supported by multiple implementations, and has updates to those
implementations subject to some form of transparency controls.

NOBUS is the key here: NObody But US. NSA is not going to rubber hose my
company to force it to issue a backdoored version of the code if they think
the backdoor can be used by someone else. Nor are they likely to want to do
so if the compromise is likely to be discovered.

So let's say that I have a code updater that is the only tool authorized to
update certain apps on a device. It notices the existence of an update,
downloads the code, verifies the provider's signature, and then checks that
that particular build has been registered with a transparency log before it
OKs deployment.

We could even go a stage further and require that the build has been validated
as corresponding to a particular GitHub branch, compiler version, etc.
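
Something along these lines, with all the field names made up for
illustration: the updater accepts the binary only if an independent rebuild
from the named commit with the named toolchain produces the same hash.

# Sketch of a build attestation tying a binary to its claimed source and
# toolchain; accepted only if an independent rebuild reproduces the hash.
from dataclasses import dataclass


@dataclass
class BuildAttestation:
    binary_sha256: str   # hash of the shipped artifact
    source_repo: str     # e.g. the GitHub repository URL
    source_commit: str   # exact commit the build claims to come from
    compiler: str        # toolchain identifier, e.g. "clang 15.0.7"


def reproducible(att: BuildAttestation, rebuilt_sha256: str,
                 expected_repo: str, expected_commit: str) -> bool:
    """True only if rebuilding the named commit with the named toolchain
    produced the very same binary that is being shipped."""
    return (att.binary_sha256 == rebuilt_sha256
            and att.source_repo == expected_repo
            and att.source_commit == expected_commit)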