# [Cryptography] Scott Aaronson: NSA: Possibly breaking US laws, but still bound by laws of computational complexity

Eugen Leitl eugen at leitl.org
Mon Sep 9 08:40:10 EDT 2013

http://www.scottaaronson.com/blog/?p=1517

NSA: Possibly breaking US laws, but still bound by laws of computational
complexity

Last week, I got an email from a journalist with the following inquiry.  The
recent Snowden revelations, which made public for the first time the US
government’s “black budget,” contained the following enigmatic line from the
Director of National Intelligence: “We are investing in groundbreaking
cryptanalytic capabilities to defeat adversarial cryptography and exploit
internet traffic.”  So, the journalist wanted to know, what could these
“groundbreaking” capabilities be?  And in particular, was it possible that
the NSA was buying quantum computers from D-Wave, and using them to run
Shor’s algorithm to break the RSA cryptosystem?

I replied that, yes, that’s “possible,” but only in the same sense that it’s
“possible” that the NSA is using the Easter Bunny for the same purpose.  (For
one thing, D-Wave themselves have said repeatedly that they have no interest
in Shor’s algorithm or factoring.  Admittedly, I guess that’s what D-Wave
would say, were they making deals with NSA on the sly!  But it’s also what
the Easter Bunny would say.)  More generally, I said that if the open
scientific world’s understanding is anywhere close to correct, then quantum
computing might someday become a practical threat to cryptographic security,
but it isn’t one yet.

That, of course, raised the extremely interesting question of what
“groundbreaking capabilities” the Director of National Intelligence was
referring to.  I said my personal guess was that, with ~99% probability, he
meant various implementation vulnerabilities and side-channel attacks—the
sort of thing that we know has compromised deployed cryptosystems many times
in the past, but where it’s very easy to believe that the NSA is ahead of the
open world.  With ~1% probability, I guessed, the NSA made some sort of big
improvement in classical algorithms for factoring, discrete log, or other
number-theoretic problems.  (I would’ve guessed even less than 1% probability
for the latter, before the recent breakthrough by Joux solving discrete log
in fields of small characteristic in quasipolynomial time.)

Then, on Thursday, a big New York Times article appeared, based on 50,000 or
so documents that Snowden leaked to the Guardian and that still aren’t
public.  (See also the related commentary by security expert Bruce
Schneier, and accompanying Q&A.)  While a lot remains vague, there might be
more public information right now about current NSA cryptanalytic
capabilities than there’s ever been.

So, how did my uninformed, armchair guesses fare?  It’s only halfway into the
NYT article that we start getting some hints:

The files show that the agency is still stymied by some encryption, as Mr.
Snowden suggested in a question-and-answer session on The Guardian’s Web site
in June.

“Properly implemented strong crypto systems are one of the few things that
you can rely on,” he said, though cautioning that the N.S.A. often bypasses
the encryption altogether by targeting the computers at one end or the other
and grabbing text before it is encrypted or after it is decrypted…

Because strong encryption can be so effective, classified N.S.A. documents
make clear, the agency’s success depends on working with Internet companies —
by getting their voluntary collaboration, forcing their cooperation with
court orders or surreptitiously stealing their encryption keys or altering
their software or hardware…

Simultaneously, the N.S.A. has been deliberately weakening the international
encryption standards adopted by developers. One goal in the agency’s 2013
budget request was to “influence policies, standards and specifications for
commercial public key technologies,” the most common encryption method.

Cryptographers have long suspected that the agency planted vulnerabilities in
a standard adopted in 2006 by the National Institute of Standards and
Technology and later by the International Organization for Standardization,
which has 163 countries as members.

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered
by two Microsoft cryptographers in 2007, was engineered by the agency. The
N.S.A. wrote the standard and aggressively pushed it on the international
group, privately calling the effort “a challenge in finesse.”
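For context (an identification the article itself doesn’t make, but which is widely accepted): the 2006 NIST standard in question is the Dual_EC_DRBG pseudorandom generator, and the weakness that Shumow and Ferguson described in 2007 can be sketched as follows, in simplified notation of my own (truncation details omitted).

```latex
% Dual_EC_DRBG, simplified: the standard fixes two curve points P and Q.
% Secret state s_i updates and produces output r_i via:
s_{i+1} = x(s_i P), \qquad r_i = x(s_i Q)
% If whoever chose the constants knows a trapdoor scalar e with P = eQ,
% then lifting one output r_i back to the point R_i = s_i Q gives
x(e R_i) = x(s_i \, e Q) = x(s_i P) = s_{i+1},
% i.e. the generator's entire future state.
```

So the suspicion was never that the math was broken, but that the constants P and Q might have been generated with such a trapdoor in hand.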

So, in pointing to implementation vulnerabilities as the most likely
possibility for an NSA “breakthrough,” I might have actually erred a bit too
far on the side of technological interestingness.  It seems that a large part
of what the NSA has been doing has simply been strong-arming Internet
companies and standards bodies into giving it backdoors.  To put it bluntly:
sure, if it wants to, the NSA can probably read your email.  But that isn’t
mathematical cryptography’s fault—any more than it would be mathematical
crypto’s fault if goons broke into your house and carted away your laptop.
On the contrary, properly-implemented, backdoor-less strong crypto is
something that apparently scares the NSA enough that they go to some lengths
to keep it from being widely used.

I should add that, regardless of how NSA collects all the private information
it does—by “beating crypto in a fair fight” (!) or, more likely, by
exploiting backdoors that it itself installed—the mere fact that it collects
so much is of course unsettling enough from a civil-liberties perspective.
So I’m glad that the Snowden revelations have sparked a public debate in the
US about how much surveillance we as a society want (i.e., “the balance
between preventing 9/11 and preventing Orwell”), what safeguards are in place
to prevent abuses, and whether those safeguards actually work.  Such a public
debate is essential if we’re serious about calling ourselves a democracy.

At the same time, to me, perhaps the most shocking feature of the Snowden
revelations is just how unshocking they’ve been.  So far, I haven’t seen
anything that shows the extent of NSA’s surveillance to be greater than what
I would’ve considered plausible a priori.  Indeed, the following could serve
as a one-sentence summary of what we’ve learned from Snowden:

Yes, the NSA is, in fact, doing the questionable things that anyone not
living in a cave had long assumed they were doing—that assumption being so
ingrained in nerd culture that countless jokes are based around it.

(Come to think of it, people living in caves might have been even more
certain that the NSA was doing those things.  Maybe that’s why they moved to
caves.)

Anyway, enough of that; let me move on to discuss the implications of the Snowden revelations
for something that really matters: a 6-year-old storm in theoretical computer
science’s academic teacup.  As many readers of this blog might know, Neal
Koblitz—a respected mathematician and pioneer of elliptic curve cryptography,
who (from numerous allusions in his writings) appears to have some
connections at the NSA—published a series of scathing articles, in the
Notices of the American Mathematical Society and elsewhere, attacking the
theoretical computer science approach to cryptography.  Koblitz’s criticisms
were varied and entertainingly-expressed: the computer scientists are too
sloppy, deadline-driven, self-promoting, and corporate-influenced; overly
trusting of so-called “security proofs” (a term they shouldn’t even use,
given how many errors and exaggerated claims they make); absurdly overreliant
on asymptotic analysis; “bodacious” in introducing dubious new hardness
assumptions that they then declare to be “standard”; and woefully out of
touch with cryptographic realities.  Koblitz seemed to suggest that, rather
than demanding the security reductions so beloved by theoretical computer
scientists, people would do better to rest the security of their
cryptosystems on two alternative pillars: first, standards set by
organizations like the NSA with actual real-world experience; and second, the
judgments of mathematicians with … taste and experience, who can just see
what’s likely to be vulnerable and what isn’t.

Back in 2007, my mathematician friend Greg Kuperberg pointed out the irony to
me: here we had a mathematician, lambasting computer scientists for trying to
do for cryptography what mathematics itself has sought to do for everything
since Euclid!  That is, when you see an unruly mess of insights, related to
each other in some tangled way, systematize and organize it.  Turn the tangle
into a hierarchical tree (or dag).  Isolate the minimal assumptions (one-way
functions?  decisional Diffie-Hellman?) on which each conclusion can be
based, and spell out all the logical steps needed to get from here to
there—even if the steps seem obvious or boring.  Any time anyone has tried to
do that, it’s been easy for the natives of the unruly wilderness to laugh at
the systematizing newcomers: the latter often know the terrain less well, and
take ten times as long to reach conclusions that are ten times less
interesting.  And yet, in case after case, the clarity and rigor of the
systematizing approach has eventually won out.  So it seems weird for a
mathematician, of all people, to bet against the systematizing approach when
applied to cryptography.

The reason I’m dredging up this old dispute now, is that I think the recent
NSA revelations might put it in a slightly new light.  In his article—whose
main purpose is to offer practical advice on how to safeguard one’s
communications against eavesdropping by NSA or others—Bruce Schneier offers
the following tip:

Prefer conventional discrete-log-based systems over elliptic-curve systems;
the latter have constants that the NSA influences when they can.

Here Schneier is pointing out a specific issue with ECC, which would be
solved if we could “merely” ensure that NSA or other interested parties
weren’t providing input into which elliptic curves to use.  But I think
there’s also a broader issue: that, in cryptography, it’s unwise to trust any
standard because of the prestige, real-world experience, mathematical good
taste, or whatever else of the people or organizations proposing it.  What
was long a plausible conjecture—that the NSA covertly influences
cryptographic standards to give itself backdoors, and that
otherwise-inexplicable vulnerabilities in deployed cryptosystems are
sometimes there because the NSA wanted them there—now looks close to an
established fact.  In cryptography, then, it’s not just for idle academic
reasons that you’d like a publicly-available trail of research papers and
source code, open to criticism and improvement by anyone, that takes you all
the way from the presumed hardness of an underlying mathematical problem to
the security of your system under whichever class of attacks is relevant to
you.

Schneier’s final piece of advice is this: “Trust the math.  Encryption is your friend.”

“Trust the math.”  On that note, here’s a slightly-embarrassing confession.
When I’m watching a suspense movie (or a TV show like Homeland), and I reach
one of those nail-biting scenes where the protagonist discovers that
everything she ever believed is a lie, I sometimes mentally recite the proof
of the Karp-Lipton Theorem.  It always calms me down.  Even if the entire
universe turned out to be a cruel illusion, it would still be the case that
NP ⊂ P/poly would collapse the polynomial hierarchy to its second level, and I can tell you
exactly why.  It would likewise be the case that you couldn’t break the GGM
pseudorandom function without also breaking the underlying pseudorandom
generator on which it’s based.  Math could be defined as that which can still
be trusted, even when you can’t trust anything else.
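The GGM construction mentioned above can be sketched in a few lines. This is a toy illustration, not a secure implementation: SHA-256 here is a stand-in for a genuine length-doubling pseudorandom generator, and all function and parameter names are mine.

```python
import hashlib

def prg(seed: bytes) -> bytes:
    """Toy length-doubling 'PRG': 32-byte seed -> 64 bytes.
    (SHA-256 is only a stand-in; GGM assumes a real PRG.)"""
    return hashlib.sha256(seed + b"L").digest() + hashlib.sha256(seed + b"R").digest()

def ggm_prf(key: bytes, x: str) -> bytes:
    """GGM: evaluate the PRF at bit-string x by walking a binary tree,
    taking the left or right half of the PRG output at each level."""
    s = key
    for bit in x:
        out = prg(s)
        s = out[:32] if bit == "0" else out[32:]
    return s
```

The security argument is exactly the reduction Aaronson alludes to: any distinguisher against `ggm_prf` is converted, via a hybrid argument over the tree levels, into a distinguisher against `prg`.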

This entry was posted on Sunday, September 8th, 2013 at 11:31 am and is
filed under Complexity, Nerd Interest.

24 Responses to “NSA: Possibly breaking US laws, but still bound by laws of
computational complexity”

Aaronson on crypto. Schneier: “elliptic-curve systems; the latter have
constants that the NSA influences when they can.” | Gordon's shares Says:

Comment #1 September 8th, 2013 at 1:22 pm

[…] Link. Trust math, but not NSA mathematicians. […]

Douglas Knight Says:

Comment #2 September 8th, 2013 at 1:35 pm

Could you be more specific about what you mean by the hypothetical “big
improvement” on number theory algorithms that is covered by your 1%?

Do elliptic curve algorithms count? Does an L(1/4) algorithm count, or only
quasi-polynomial? What if they can’t break all instances but, as has
happened before, can recognize that particular keys are weak? Breaking a
random half of all keys is almost as good as breaking all of them.
Schneier’s condemnation of ECC seems to require more than a 1% chance that
NSA knows something special about ECC.

PS – David Jao, commenting on Schneier’s blog, says that we can and do use
cryptography to prevent NSA from meddling with mystery constants. He says
that the ECC standard curves are generated from a seed via SHA-1, so to
meddle, NSA would have to break the hash function. (But if half of all
curves are bad, that’s easy.)
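Jao’s point can be illustrated with a simplified sketch of the “verifiably random” idea. This is not the exact ANSI X9.62 / FIPS 186 derivation procedure; the function name and the counter-based expansion are illustrative only.

```python
import hashlib

def derive_candidate(seed: bytes, n_bytes: int = 32) -> int:
    """Expand a published seed into a candidate curve coefficient by
    iterated SHA-1 (a simplified stand-in for the X9.62 procedure)."""
    out, counter = b"", 0
    while len(out) < n_bytes:
        out += hashlib.sha1(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return int.from_bytes(out[:n_bytes], "big")
```

Anyone can re-run the derivation on the published seed and compare the result against the standardized coefficient, so planting one *specific* bad value would require inverting the hash. But, as Knight notes, this only helps if bad curves are rare: a designer free to grind through many seeds could still land on a weak curve if, say, half of all curves were weak.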

Anonymous Says:

Comment #3 September 8th, 2013 at 1:45 pm

You are making good and interesting points. However, Koblitz also has some
valid criticisms of TCS, even if his conclusions are not valid. The
mathematical models we build in TCS are useless if they don’t relate to
practice; we know many of our standard models are not good enough
approximations of reality, and arguably there isn’t enough effort to deal
with these issues. Technical heavy lifting is used as the ultimate
criterion for judging the value of research projects inside the community.

Also, I think you are exaggerating what most cryptographers expected the NSA
was doing. I have heard that several famous crypto experts were quite
surprised by these revelations, and that it has shaken their trust in
government institutions. I never understood why some people presume that
government is a benevolent entity; such belief in government institutions
seems like ideology to me.

Daniel Armak Says:

Comment #4 September 8th, 2013 at 2:06 pm

You can trust the math itself, and so can Bruce Schneier and a few tens of
thousands of other people. But everyone else, who can’t grok the entire
mathematical argument for each cryptographic system, or doesn’t want to
spend a long time studying it, must trust the word of people like you. And
since the NSA can and does subvert people like you, who do original work and
analyze others’ work and sit on standards committees, not to mention the
programmers who implement it all in code, what are we to do?

Daniel W. Says:

Comment #5 September 8th, 2013 at 2:33 pm

In my mind, the best circumstantial evidence that the NSA has not practically
broken any of the major cryptosystems is the following: if they had, they
would most likely keep this a highly guarded secret, to be used only
against high-value targets rather than as a means of monitoring potential
terrorists. It would most likely be contained within a small circle and not
mentioned in PowerPoint presentations to low-level analysts.

Of course, the above argument may be flawed by assuming the NSA has too high
of a level of competence.

T H Ray Says:

Comment #6 September 8th, 2013 at 2:43 pm

Scott,

” … the clarity and rigor of the systematizing approach has eventually won
out.”

No doubt. In Euclid’s time as well as the present, though, it is helpful to
have something to systematize. Making that assumption available and
convenient is what mathematicians do.

Scott Says:

Comment #7 September 8th, 2013 at 3:02 pm

Daniel Armak #4:

You can trust the math itself, and so can Bruce Schneier and a few tens of
thousands of other people. But everyone else … must trust the word of people
like you.

You raise an excellent point, which I think applies even more broadly than
you say. For one thing, I merely understand some of the general ideas: I
haven’t gone through every detail of the math used by the crypto in my web
browser, and I dare say that most professional cryptographers haven’t
either.

For another, the point is much broader than cryptography: how can you trust
quantum mechanics, if you haven’t done the requisite experiments yourself?
The physicists could’ve all been bought off by some anti-realist cabal. :-)
Or how can you trust that the government isn’t putting mind-control drugs
into the fruit you buy in the supermarket, etc. etc.

So we’re extremely lucky that science hit on a solution to these problems—the
only workable solution, really—back in the 17th century. The solution is to
open up every question to scrutiny, discussion, and challenge by any
interested person. Assertions gain credibility by surviving public
criticism—and that’s just as true in math as it is in experimental sciences.
I believe many theorems even though I haven’t checked the proofs myself,
because I know that if there were an error, then someone else could’ve made a
name for themselves by finding it.

Now, for this Popperian dynamic to work, the whole process has to be carried
out in the open: if I thought someone who found a fatal flaw in a proof would
only tell their friends, then that doesn’t do me any good. That’s why the
dividing line between “crypto as black art” and “modern crypto” happened
precisely when new discoveries started being published in the open
literature, rather than being filed in a drawer at NSA or GCHQ.

wolfgang Says:

Comment #8 September 8th, 2013 at 3:20 pm

Unfortunately, xkcd.com/538/ had it right, imho.

Scott Says:

Comment #9 September 8th, 2013 at 3:20 pm

Daniel W. #5: If the NSA had really broken strong cryptosystems, then why
would they have resorted to so many covert tactics (or, in the case of the
Clipper Chip, overt attempts) to prevent people from using strong crypto,
unless NSA has a backdoor? I suppose it’s all elaborate psychological
warfare, to prevent us from discovering the fact that these cryptosystems
were broken? And that even Snowden himself is part of the NSA’s master plan?
:-)

At least in my book, every time you claim that what looks on its face like
evidence for X, is really evidence for a powerful cabal trying to prevent
everyone from discovering not(X), the plausibility of your theory gets cut by
a factor of maybe 50,000. This is directly related to the fact that I don’t
believe any conspiracy theories—as in zero, not one.

Scott Says:

Comment #10 September 8th, 2013 at 3:32 pm

Douglas Knight #2: Sure, dramatic improvements in elliptic-curve algorithms
would certainly count—as would “merely” subexponential algorithms, were the
improvements large enough to threaten key sizes that the academic
cryptographers considered safe.
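For reference, the L-notation used in this exchange is the standard way of grading subexponential running times for factoring and discrete-log algorithms:

```latex
L_N[\alpha, c] \;=\; \exp\!\bigl( (c + o(1)) \,(\ln N)^{\alpha} \,(\ln\ln N)^{1-\alpha} \bigr)
```

Here α = 0 means polynomial in ln N and α = 1 means fully exponential; the general number field sieve factors N in roughly L_N[1/3, (64/9)^{1/3}] time, so "L(1/4)" or "L(1/10)" would be dramatic improvements.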

More broadly, though, you’re entirely right that there’s not a sharp line
between “improved number-theory algorithms” and “implementation
vulnerabilities.” Often, what’s happened in practice is that an
implementation vulnerability has opened the way for an attack that still
requires interesting and nontrivial number theory. But I suppose that sort of
thing would still belong to the “99%” part of my probability estimate. In the
“1%” part, I really had in mind “something that would give theoretical
cryptographers a heart attack” (like, I dunno, factoring in L(1/10), or …).

Scott Says:

Comment #11 September 8th, 2013 at 5:03 pm

Anonymous #3:

You are making good and interesting points. However, Koblitz also has some
valid criticisms of TCS even if his conclusions are not valid.

I completely agree that Koblitz has some valid criticisms.

However, I’ve read pretty much all of his and Menezes’s anti-TCS screeds, and
to me what he’s doing seems, if you like, too easy to be helpful. Koblitz’s
favorite M.O. is to recount various slip-ups by people in the “Goldreich
school of crypto” and laugh at them: “haha, they talk about ‘provable
security,’ but there was a bug in their proof! or their security definition
left out an important class of side-channel attacks!” Then, with even more
glee, Koblitz relates how the hapless computer scientists put out a new paper
supposedly fixing the problem, but that paper had its own problems, and so
on.

The trouble is, that is indeed what a bunch of incompetent buffoons would
look like, but it’s also what science looks like! :-) Koblitz never seems to
want to acknowledge that the end result of the process is better scientific
understanding and more secure cryptosystems than before (even if still not
perfect).

Also, of course, Koblitz almost defiantly refuses to suggest any better
mathematical foundations for cryptography to replace the reduction-based
foundations that were built up over the last 30 years. I.e., it’s not that,
instead of adaptive chosen ciphertext attack, he has a better definition to
propose, or that instead of “bodacious” new hardness assumptions, he can give
a single assumption that suffices for everything. Instead, what he appears to
want is simply a return to the “black art” era of cryptography, when security
arguments boiled down to “we tried to break it and failed” or “trust us, we
have better mathematical taste than you.”

The trouble is, I can’t think of a single case in the history of science when
mathematical foundations as well-developed as cryptography’s now are, were
simply abandoned wholesale without better mathematical foundations to replace
them. So intellectually, Koblitz strikes me as someone who’s throwing spears
at battle-tanks. Being the excellent marksman that he is, he actually scores
some hits—but the reduction-encrusted battle-tanks are still going to win in
the end.

The mathematical models we built in TCS are useless if they don’t relate to
the practice and we know many of our standard models are not good enough
approximation of the reality and arguably there isn’t enough effort to deal
with these issues.

Would one also say that the mathematical foundations of
topology—open sets, Urysohn’s Lemma, etc.—are useless if they don’t relate to
the practice of tying and untying knots? I think that’s a pretty close
analogy for the relationship between what, say, Goldreich or Goldwasser or
Micali do, and the actual practice of cryptography. In both cases, yes,
there’s some relation between the intellectual foundations on the bottom and
the beautiful ornaments on top, but not surprisingly there are many floors in
between. Starting from a one-way function, for example, you first have to
construct a quasi-regular one-way function, then a pseudoentropy generator,
then a pseudorandom generator, then a pseudorandom function, and then maybe
you can start to think about building (say) a rudimentary private-key
cryptosystem or signature scheme.
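As one illustration of the top of that tower: once you have a pseudorandom function F, the textbook construction Enc_k(m) = (r, F_k(r) ⊕ m), for fresh random r, gives a CPA-secure private-key cryptosystem. In this sketch HMAC-SHA256 stands in for the PRF, and all names and the 32-byte message bound are my own choices.

```python
import hashlib
import hmac
import os

def prf(key: bytes, x: bytes) -> bytes:
    # HMAC-SHA256 as a stand-in pseudorandom function (illustrative choice)
    return hmac.new(key, x, hashlib.sha256).digest()

def encrypt(key: bytes, msg: bytes) -> tuple[bytes, bytes]:
    """Pick fresh randomness r and mask the message with F_k(r).
    Messages up to 32 bytes in this toy version."""
    assert len(msg) <= 32
    r = os.urandom(32)
    pad = prf(key, r)
    return r, bytes(a ^ b for a, b in zip(pad, msg))

def decrypt(key: bytes, ct: tuple[bytes, bytes]) -> bytes:
    r, c = ct
    pad = prf(key, r)
    return bytes(a ^ b for a, b in zip(pad, c))
```

The reduction is the same shape as before: an adversary who distinguishes two encrypted messages yields a distinguisher against the PRF, which (via GGM and the rest of the tower) yields one against the underlying one-way function.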

Also I think you are exaggerating what most cryptographers expected that NSA
was doing. I have heard several famous crypto experts quite surprised by
these revelations and it has shaken their trust in the government
institutions. I never understood why some people presume that government is a
benevolent entity, such beliefs in government institutions seems like
ideology to me.

My situation is different: I never had any real doubt that
NSA was doing such things; the thing I genuinely don’t know is whether they
have good reasons to be doing them. I consider it conceivable that the NSA
has indeed stopped many terrorist attacks or other international disasters
that we never hear about—in which case, the strongest case in their favor
might be stronger than the strongest case that can ever be made publicly. The
fact that President Obama, who’s so reasonable on so many issues, has implied
as much is evidence for that view from my perspective. On the other hand, I
also consider it conceivable that the current eavesdropping regime is purely
a result of the universal tendency of bureaucracies to expand, justify
themselves, and zealously guard their power and privileges. Or it could be
some combination of the two.

For me, though, the deciding consideration is that, even in a fantasy world
where the NSA’s actions had always been 100% justified, I’d still want them
to be more accountable to the public than they are now. “Trust that we have
our reasons, even though we can’t tell you what they are” simply doesn’t work
over the long term in a democracy, even if the trust is justified at any
particular time or in any particular case (and of course, often it hasn’t
been).

Anonymous Says:

Comment #12 September 8th, 2013 at 8:05 pm

I agree with you that his attitude is not constructive criticism. I would
even go further than you and say it would be stupid to forget the science of
crypto and go back to treating it purely as an engineering art.

Regarding the reasonableness of what NSA does: NSA and its backers would of
course claim these tools are useful. To be honest, security was a weak point
of Obama’s campaign; he is not really knowledgeable about these issues, and
he has not gone and will not go against his advisers if they tell him these
tools are necessary to fight terrorism. However, as far as I have heard,
they have a hard time convincing anyone outside the executive branch that
these tools have been as useful as they claim. How many major terrorist
plots have been uncovered and prevented using these tools? It seems that
they are using these tools for a very wide range of activities, including
industrial and political espionage on foreign governments and companies, to
gain political and commercial advantage (what they call US national
interests, not just securing Americans against terrorists). Does anyone
really believe that the EU or Brazil or liberal NGOs will launch a terrorist
attack on the US? The FBI’s actions against Dr. King are telling of how far
they would go. They use the fear of possible terrorist attacks to justify
these actions to the public; however, the laws allow them to do whatever
they want, and when there are restrictions (like the Fourth Amendment) they
find ways to circumvent them (e.g. by colluding with foreign intelligence
services like GCHQ to spy on American citizens) or change the
interpretations of those laws. We are very lucky that many influential
Americans in previous generations had a negative view of the federal
government and wanted to restrict its powers as much as possible,
restrictions which are now being removed in practice (partly because some
people want to settle the country’s sociopolitical disputes using the
government’s power). I don’t see why so much power should be vested in a
single authority with almost no real public supervision and scrutiny (a
role the media played to some extent in previous decades, but which is
coming under heavy pressure from government, as the Manning, Swartz,
Snowden, … cases demonstrate). And even when courts find that someone in
the government has seriously violated the laws, the president forgives them
and they avoid real punishment (as the Scooter Libby case demonstrates).

It is not just the US government; there is a trend across western liberal
democracies. It is simply unbelievable that the UK security forces used a
law passed to fight terrorism to hold the partner of a Guardian journalist
for 9 hours without a lawyer and without the protection of Miranda rights
against self-incrimination. Anyone who thinks that security forces will only
use the authority and tools they obtain to the limited extent of the
original goal suffers from extreme naivety. They will use any tools at their
disposal to the fullest extent they can to achieve what they perceive to be
the goals of their institution. When they perceive journalists like
Greenwald as a threat to the national interest, they use these tools to
fight them, which includes intimidating the partner of a journalist using
terrorism-fighting powers. I still find it really hard to believe that we
have gone so far in the direction of an Orwellian society.

What can theoretical computer science offer biology? | Theory, Evolution, and
Games Group Says:

Comment #13 September 9th, 2013 at 2:16 am

[…] the aid that cstheory can offer to biological understanding. In
yesterday’s post on the NSA and computational complexity, Aaronson — with
attribution to mathematician Greg Kuperberg — provided the following […]

Paul Beame Says:

Comment #14 September 9th, 2013 at 2:45 am

Some of the NSA revelations have been no surprise at all. It was well known
in the 1980s, particularly after the publication of The Puzzle Palace, that
the NSA was tapping all the trans-Atlantic telephone cables; gathering up
all e-mail to foreign addresses seems like more of the same.

The relationship of the NSA with TCS cryptographers has been pretty shaky. I
recall attending a theory of cryptography workshop at MIT’s Endicott House in
June 1985 with one or two official NSA attendees. At the time, there were one
or two TCS attendees known to have NSA funding and the NSA people wanted to
recruit more. In announcing their desire to sponsor more TCS cryptographers,
one of the NSA people cast a pall over the meeting by saying: “If you are
interested, just mention it in a phone conversation with one of your friends
and we’ll get back to you.” This didn’t exactly endear them to anyone.

J Says:

Comment #15 September 9th, 2013 at 2:51 am

“Math could be defined as that which can still be trusted, even when you
can’t trust anything else.”

Wait till someone shows that multiplication and addition have the same
complexity, or Voevodsky’s/Nelson’s worst nightmare comes true.

Refer:

http://mathoverflow.net/questions/40920/what-if-current-foundations-of-mathematics-are-inconsistent

http://mathoverflow.net/questions/36693/nelsons-program-to-show-inconsistency-of-zf

Scott Says:

Comment #16 September 9th, 2013 at 4:20 am

J #15: Multiplication and addition having the same complexity (and yes, it’s
conceivable that there’s a linear-time multiplication algorithm) wouldn’t do
anything whatsoever to undermine my trust in math—why would it?

Also, even if ZF set theory were shown to be inconsistent (and it won’t be
:-) ), that wouldn’t do anything whatsoever to undermine my trust in theorems
about (say) finite groups, or low-dimensional topology, or theoretical
computer science—in fact, about anything that doesn’t involve transfinite
sets. It would “merely” tell me that there was a need (and, of course, an
exciting opportunity) to rethink the foundations. That’s something that
already happened 100+ years ago (the renovations causing virtually no damage
to the higher floors), and that could conceivably happen again.

Vitruvius Says:

Comment #17 September 9th, 2013 at 4:58 am

I agree, Scott, with your general position that any time one claims that
“evidence for x is really evidence for a powerful cabal trying to prevent
everyone from discovering not(x)” one’s credibility drops by an irrecoverably
large factor, and I agree with you that “math can be defined as that which
can still be trusted, even when you can’t trust anything else” (as you put
it), yet that still leaves open the question of how we the people decide
what to trust as valid math.

Similarly, while your suggestion to “open up every question to scrutiny,
discussion, and challenge by any interested person” may be necessary in order
to establish public trust, it isn’t sufficient because we still have the
problem of deciding which such interested persons to trust, and which to
write off as conspiracy theorists in their own right. How do we feasibly
decide, in effect, whether Ehrenhaft is a crackpot (as it were), and whether
“Snowden himself is part of the NSA’s master plan” (as you playfully alluded
to)?

To that end you may be interested in Why Doesn’t the Public Trust
Scientists?, a lecture by The Right Honourable Professor The Baroness O’Neill
of Bengarve, Emeritus Professor of Philosophy at the University of Cambridge
and past Principal of Newnham College, Cambridge, which she presented in 2005
as part of the Science Futures series by the San Diego Science and Technology
Council’s Center for Ethics in Science and Technology.

Note that while “scientists” are the titular and exemplary subject of that
lecture, Baroness O’Neill’s talk actually considers a range of questions in
regard to public trust, including the roles of professional organizations,
trustworthiness (which can’t replace trust, because of the quis custodiet
ipsos custodes problem), statutory regulation, post hoc accountability, &c,
which apply more broadly to matters of public trust in any and every
profession and institution, including politics and the law.

O’Neill argues, if I may be so bold as to suggest a précis, that going back
through the 17th century (as you noted) western liberal democracies have
indeed evolved a multipartite methodology that does tend to work in practice
and that may well be the best we can get in principle, though it remains
unclear to me how well we are applying those techniques to matters of state
security in general, and how effectively you folks in the United States of
America are applying those techniques to your vaunted Agency in particular.

Scott Says:

Comment #18 September 9th, 2013 at 5:01 am

Paul Beame #14: I’ve actually heard that joke many times, in other variants.
(“Interested in career opportunities at the NSA? Call your mom and let her
know!”) I didn’t know that NSA people themselves used the joke at
conferences, but it doesn’t surprise me at all.

J Says:

Comment #19 September 9th, 2013 at 6:39 am

“Multiplication and addition having the same complexity (and yes, it’s
conceivable that there’s a linear-time multiplication algorithm) wouldn’t do
anything whatsoever to undermine my trust in math—why would it?”

I thought I read somewhere that if addition and multiplication turn out to be
similar in complexity, it would imply something is wrong with mathematics.

In the same vein, think of the generalization of scheme theory that Mochizuki
claims to have undertaken to take apart + and × in the ring structure.

I would think something fundamental would have changed in our picture if they
turn out to be similar in complexity.

J Says:

Comment #20 September 9th, 2013 at 6:47 am

At least for computational purposes, the multiplicative group structure and
additive group structure of $\Bbb Z$ seem to be coinciding. This seems wrong.
I cannot directly relate it to $\Bbb Z \bmod p$, but this seems to have
implications for Discrete Log. An implication for this may not be beyond
reach for at least a few other rings as well.
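[Editor's aside: the contrast J gestures at can be made concrete. In the
additive group of $\Bbb Z \bmod p$ the "discrete log" problem is easy, while
in the multiplicative group no comparably fast general method is known. A toy
sketch, using a small illustrative prime and generator of my own choosing
(not cryptographic parameters, and not from the thread):]

```python
# Contrast the additive and multiplicative groups of Z mod p.
# Additive DLP: given g and h = k*g mod p, recover k. One modular
# inversion suffices, so the problem is easy.
# Multiplicative DLP: given g and h = g^k mod p, recover k. Here we
# can only brute-force it, taking up to p-1 steps in the worst case.

p = 10007            # a small prime (illustrative only)
g, k = 5, 1234       # arbitrary generator-like element and secret exponent

# Additive case: solved instantly via the inverse of g mod p.
h_add = (k * g) % p
k_add = (h_add * pow(g, -1, p)) % p      # pow(g, -1, p): modular inverse
assert k_add == k

# Multiplicative case: exhaustive search is the best we do here.
h_mul = pow(g, k, p)
k_mul = next(e for e in range(p) if pow(g, e, p) == h_mul)
assert pow(g, k_mul, p) == h_mul
```

The asymmetry is exactly what Diffie-Hellman-style cryptography leans on: if
the two group structures really did "coincide" computationally, the
multiplicative side would inherit the additive side's easy solution.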

Scott Says:

Comment #21 September 9th, 2013 at 7:02 am

J #19: Well, we already have a remarkable O(n log n log log n) multiplication
algorithm (due to Schönhage and Strassen, since improved by Fürer, and
building on many previous works), and it hasn’t created any problem for the
foundations of mathematics that I know about. Meanwhile, just like for most
problems, we currently have no lower bound for multiplication better than the
trivial Ω(n). I suppose I’d guess that Ω(n log n) is some sort of barrier,
but not with any strength of conviction: if a linear-time algorithm were
discovered, it certainly wouldn’t cause me to doubt the consistency of ZF set
theory. :-)
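[Editor's aside: the simplest step past schoolbook O(n^2) multiplication,
well before the Schönhage-Strassen and Fürer algorithms Scott mentions, is
Karatsuba's trick of trading four recursive products for three. A minimal
illustrative sketch (not the algorithms discussed above, and not optimized):]

```python
# Karatsuba multiplication: splits each operand in half and uses
# THREE recursive multiplications instead of four, giving roughly
# O(n^1.585) digit operations instead of the schoolbook O(n^2).

def karatsuba(x: int, y: int) -> int:
    if x < 10 or y < 10:                  # base case: tiny operand
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    mask = (1 << half) - 1
    a, b = x >> half, x & mask            # x = a*2^half + b
    c, d = y >> half, y & mask            # y = c*2^half + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    mid = karatsuba(a + b, c + d) - ac - bd   # equals a*d + b*c
    return (ac << (2 * half)) + (mid << half) + bd
```

The key identity is that a*d + b*c = (a+b)(c+d) - ac - bd, so the cross terms
come for free from products already computed; the open question Scott raises
is how much further below O(n log n) this chain of ideas can be pushed.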

Scott Says:

Comment #22 September 9th, 2013 at 7:16 am

Vitruvius #17:

it remains unclear to me … how effectively you folks in the United States of
America are applying those techniques to your vaunted Agency in particular.

As long as we’re trading mild national barbs, you’re Canadian? You guys do
have the Communications Security Establishment, which according to the NYT
article is one of only four foreign agencies (along with Britain’s,
Australia’s, and New Zealand’s) that “knows the full extent” of the NSA’s
decoding capabilities and is cleared for its “Bullrun” program. Though I
confess that, when I try to imagine Canada’s CSE, I come up with something
like the following:

Read this gentleman’s private email? Ooo, nooo, that doesn’t sound terribly
polite, eh?

J Says:

Comment #23 September 9th, 2013 at 7:21 am

Professor, I am well aware of all the $n^{1+\epsilon}$ algorithms and
Schönhage’s $O(n)$ algorithm on multitape machines. I cannot find the
reference I am thinking of. It was written by a TCS theorist. I would
seriously think that the standard ring structure on $\Bbb Z$ could be modeled
differently. I do not know if ZF would be affected. However, the question of
treating × and + differently for computational purposes, compared to
mathematical purposes, arises, making things murky.

I am not implicating ZF with $O(n)$ algorithms for standard × operations on
the standard structure of $\Bbb Z$. The ZFC comment was a second piece of
mathematical conundrum that some reputed folks have raised awareness about,
arguing for a need to be more well-grounded, and it rang well with your
statement on truth in math as we know it. (Unrelated, but bringing it in:
$\Bbb Z$ has been a puzzle before as well; it is the simplest ring with a
spectrum of prime ideals whose dimension is unclear to interpret in a
standard way.)

Scott Says:

Comment #24 September 9th, 2013 at 7:23 am

Wolfgang #8:

Unfortunately, this xkcd.com/538/ had it right imho.

YES! I especially liked the mouseover text (“Actual actual reality: nobody