[Cryptography] Some bits from the 1981 NSA COMSEC Guide

Moritz Bartl moritz at headstrong.de
Fri Dec 18 20:48:16 EST 2015


https://www.fredericjacobs.com/blog/2015/12/18/COMSEC/

Some bits from the NSA COMSEC Guide

Dec 18, 2015

A few weeks ago, additional parts of the “A History of U.S.
Communications Security” books from the NSA were declassified. A little
intrigued, I decided to read the second volume that was originally
published in 1981 under the SECRET classification. I wondered what we
could learn, in 2015, from the dated document.

Note: For legibility reasons I transcribed the original notes. Italics
were in the original text. I’ve highlighted segments in bold and added
links for important references.

Why COMSEC/OPSEC?

The main argument for COMSEC in military operations that the author
brings forward is that COMSEC is a crucial element in maintaining the
element of “surprise” in operations.

    Operations Security (OPSEC) is a discipline designed fundamentally
to attain and maintain surprise, particularly in military operations.

    COMSEC is a key ingredient of OPSEC - it may help achieve surprise,
nor with the correlative assertion that fewer and fewer major operations
can be planned and executed these days without a large amount of
supporting communications to coordinate, command and control them, nor
even with the assertion that, without security for those communications,
surprise is unlikely.

On Threat Modeling

No COMSEC tool (messaging app, radio, …) is a silver bullet. Ideally,
it will minimize “what could go wrong”, but there will always be ways of
using it in an unsafe manner. Authentication is one of the hardest
challenges yet to be solved. Asking people to manually verify
fingerprints comes with a huge usability trade-off. The following
segment highlights the important idea of first trying to find technical
solutions to a problem before going after procedural solutions. Nobody
likes manual key verification, but no technical solution has yet been
found that offers the same level of security without that procedural
step.

    In olden times, most of our programs, whether in equipment
development, TEMPEST, or security procedures were driven largely by our
view of COMSEC weaknesses - our vulnerabilities - more or less
independent of judgements made on the ability of an opponent to exploit
them. We assumed hostile SIGINT to be at least as good as ours, and used
that as a baseline on what might happen to us. If we perceived a
weakness, we would first try for a technical solution - like new
crypto-equipment. If the state of the art did not permit such a
solution, or could do so only at horrendous expense, we’d look for
procedural solutions and, those failing, would leave the problem alone.

    So our priorities were developed more or less in the abstract, in
the sense that they related more to what we were able to do
technologically or procedurally than the probabilities that a given
weakness would be exploited in a given operating environment. In short,
we did not do much differentiation between vulnerabilities which were
usually fairly easy to discover and threats (which were more difficult
to prove) - where threats are are fairly rigorously defined to mean
demonstrated hostile capabilities, intentions, and/or successes against
U.S. communications. The accusations of overkill touched on earlier in
part stemmed from that approach.

    The thrust towards gearing our countermeasures to threat rather than
theoretical vulnerability was healthy, and driven by a recognition that
our resources were both finite and, for the foreseeable future,
inadequate to fix everything. In fact, one of the reactions of an
outside analyst to our earlier approach was, “These nuts want to secure
the world”. Some still think so.
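
To make the fingerprint-verification trade-off mentioned above concrete,
here is a toy Python sketch of what that procedural step amounts to; the
key bytes and the grouping format are my own illustration, not anything
the guide prescribes.

    import hashlib
    import hmac

    def fingerprint(public_key: bytes) -> str:
        """Derive a short, human-comparable fingerprint from a public key."""
        digest = hashlib.sha256(public_key).hexdigest()
        # Group the first 128 bits so two people can read them to each other.
        return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

    def fingerprints_match(mine: str, theirs: str) -> bool:
        """Constant-time comparison, so the check itself leaks nothing."""
        return hmac.compare_digest(mine.encode(), theirs.encode())

    # Both parties compute the fingerprint of the peer's public key and
    # compare the strings out-of-band (in person, over a phone call, ...).
    alice_sees = fingerprint(b"\x04" + b"\x01" * 64)   # placeholder key bytes
    bob_sees = fingerprint(b"\x04" + b"\x01" * 64)
    print(alice_sees)
    print(fingerprints_match(alice_sees, bob_sees))    # True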

If your crypto doesn’t work, nobody will use it

During the Vietnam war, encryption was reducing the transmission range
and introducing unacceptable delays in radio communications. Some
pilots got frustrated with this and fell back to plaintext
communications. A commander had to threaten to ground his pilots to get
them to use NESTOR, the encryption system layered on top of their radio
communications.

    Finally, our own engineers sent to Vietnam reported back: “Sorry
about that, S2; the system reduces range - typically by 10% or more.”
And it, in fact, did. It turned out that NESTOR did not affect range
only if the associated radio was perfectly tuned, “peaked,” matched to
the NESTOR equipment (as we naturally did here at home). In the field,
maintenance personnel were neither trained nor equipped for such
refinement - the test instrumentation simply did not exist there, and we
had not anticipated those real world conditions when we sent it out.

    In tactical air, it was claimed that the delay - up to 3/5 of a
second of required wait between pushing to talk and ability to
communicate - was intolerable when air-to-air warnings among pilots had
to be instantaneous. A survey showed, by the way, that most pilots
judged this time to be on the order of three seconds; so, in fact, the
wait must have seemed interminable when one wanted to say “Bandit at two
o’clock.”

    Most US SIGINT assets in Vietnam used NESTOR heavily and
successfully almost from the outset. Towards the end of the war, so did
most in-country Naval forces, particularly airborne assets. In the
SIGINT user’s case, it was because they were already equipped when they
got in country; had used it previously, knew, accepted or circumvented
its peculiarities, and, of course, because they believed their traffic
required protection. In the Navy case, it was the result of Draconian
measures by the Commander, Naval Forces, Vietnam (COMNAVFORV). That
Admiral happened to be a COMSEC believer; so he told his pilots that if
they didn’t use the equipment, he’d ground them. Some didn’t and he did.
There is, I understand, no comparable trauma for a fighter pilot.

Dealing with compromise

Every time I read a headline that says “X is broken”, where X is a
messaging app, an anonymity tool or something else, I fear that people
might just give up on using encryption, thinking that everything is
broken anyway and that it is therefore useless to take any defensive
measures.

Unless there is a good alternative, it is misleading to simply blame
the tool that offers the best protection.

    A caveat: While nothing gets a user’s attention like documented
proof that communications he thinks are sensitive are being read by an
opponent, several things should be borne in mind before telling him
about it. Foremost is the fragility of the source of the information
(the “proof”) you have. Secondly, it is worse than useless to go out and
impress a user with a problem unless you have a realistic solution in
hand. No matter how dramatic the evidence of threat, if we simply go out
and say, “Stop using your black telephone”, it’s likely to be effective
for about two weeks. Don’t jeopardize a good source for that kind of payoff.

    Finally, the results of our own monitoring and analysis of
communications, which at best prove vulnerability, not threat, are often
remarkably ineffective. Nothing brought this home more persuasively than
the Vietnam experience. Monitoring elements of all four Services
demonstrated the vulnerability of tactical voice communications again
and again. This did not show that the NVA or VC could do it. It was
first argued that they weren’t engaged in COMINT at all. Next, that even
if they were able to intercept us, they couldn’t understand us,
especially given our arcane tactical communications jargon. Third, even
given interception and comprehension, they could not react in time to
use the information.

    It took years to dispel those notions with a series of proofs in the
form of captured documents, results of prisoner and defector
interrogations, some US COMINT and, finally, the capture of an entire
enemy COMINT unit: radios, intercept operators, linguists, political
cadre and all. Their captured logs showed transcriptions of thousands of
US tactical voice communications with evidence that their operators were
able to break our troops’ home-made point-of-origin, thrust line, and
shackle codes in real time. The interrogations confirmed their use of
tip-off networks (by wire line or courier) to warn their commanders of
what we were about to do - where, when, and with what force.

De-classification

NSA thought that declassifying crypto algorithms would lead to more
integrated options for communication devices. Sadly, a lot of privacy
tools are still a hack on top of an existing protocol (SMS, Jabber,
email, …) rather than being well integrated. As long as crypto is the
topmost layer of the OSI model, we’re going to lose users.

    The two major reasons for declassification were the “inhibition of
use” argument, and the vision of full integration of COMSEC circuitry
into radios of the future - full integration being defined as
inseparable and shared radio and crypto-circuitry. In that
configuration, our COMSEC tail would be wagging the communications
system dog with the controls classification denotes - how would such
equipments be shipped, stored, and particularly, how would they be
maintained? “Integration” has thus far not turned out to be the wave of
the future. COMSEC modules will by and large be separable from their
associated radios because the designers found it more efficient to do it
that way. […]
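
This is the “hack on top of a protocol” pattern mentioned above: the
encryption layer just produces an opaque blob that rides inside whatever
plaintext protocol already exists. A minimal sketch, assuming the PyNaCl
library and a stand-in transport function of my own:

    from nacl.public import PrivateKey, Box
    from nacl.encoding import Base64Encoder

    # Long-term keypairs; a real tool would generate these once and have
    # their fingerprints verified out-of-band.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    def send_over_legacy_transport(payload: str) -> str:
        """Stand-in for SMS/XMPP/email: the transport only sees this text."""
        return payload

    # Alice encrypts for Bob, then hands the opaque blob to the old protocol.
    sealed = Box(alice_key, bob_key.public_key).encrypt(
        b"bandit at two o'clock", encoder=Base64Encoder)
    wire_text = send_over_legacy_transport(sealed.decode())

    # Bob reverses the process; the transport never learned the plaintext.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(
        wire_text.encode(), encoder=Base64Encoder)
    print(plaintext)  # b"bandit at two o'clock"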

Offence is easy

Like the rest of the security community, the NSA is good at offence,
but has no clue how to secure a system.

    It seems to me that NSA does not yet have much expertise in computer
security. Rather, we are expert in computer insecurity. We do much
better in finding security vulnerabilities in any computer complex than
in proposing security architecture for them. Somehow the attack seems
more challenging (fun) than the defence, and this seems true in the
general business of crypto system design as well. A spin-off of this
syndrome manifests itself when a security modification is needed for an
existing crypto-equipment. In my experience, most design engineers would
much rather attack a brand new problem - meet a new and difficult
requirement - starting from scratch, pushing the electronic state of the
art, exercising opportunities for innovations, and so on than go through
the drudgery of a mere “fix” accepting the constraints of configuration
and technology in some pre-existing piece of hardware.

The Cost of Transparency

The Soviets were apparently not revealing any information about their
COMSEC programs. I’m not sure we would know much about the NSA’s without
whistleblowers.

    NSA spends tens of millions of dollars and tens of thousands of
man-hours trying to discover what Soviet COMSEC is like. Despite
all-source research dating back to more than 30 years, the incidence of
any unclassified statements by the Soviets on any aspect of their COMSEC
program is so trivial as to be virtually non-existent. In other words,
the Soviets protect (classify) all information about their cryptography
and associated communications security measures. The effect of this
stone wall has been either to greatly delay U.S. ability to exploit some
Soviet communications or frustrate it altogether. Viewed as an element
of economic warfare, we are losing hands down as we expend enormous
resources to acquire the same kind of information from the Soviets that
we give them free - i.e., without classification. Clearly, the Soviets’
classification program costs them something, just as ours costs us. But,
they have a cost advantage because they still operate in an essentially
closed society with a well-established security infrastructure and with
many of their officials already well attuned psychologically to the
concept of secrecy.

Value of crypto-analytic information

For the NSA, the best way to break a crypto system is to break the
math, not the implementation. I assume this argument is based on the
fact that a passive attack is superior to an active one.

    The optimum attack on any crypto system (if you can hack it) is
cryptanalytic - you need only operate on cipher text; your risk is low
or non-existent unless you have to position yourself dangerously to
perform the interception. You don’t need to steal keys or penetrate
cryptocenters or subvert people and, if you succeed, the return on
investment is likely to be rich - all the secrets committed to the
cryptosystem in question. The one essential pre-requisite to such attack
is knowledge of the cryptologic - which may have been the reason why the
Soviets were (reportedly) willing to offer $50,000 for PARKHILL several
years ago.

    The “SIGINT” argument for protecting our cryptologics is well known
- the COMSEC arguments much less so, despite their reiteration for some
decades:

    - With the exception of true one-time systems, none of our logics is
theoretically and provably immune to cryptanalysis - the “approved” ones
have simply been shown to adequately resist whatever kinds of
crypto-mathematical attacks we, with our finite resources and brains,
have been able to think up. We are by no means certain that the Soviet
equivalent of A Group can do no better. But no attack is likely to be
successful - and certainly cannot be optimised - without preliminary
diagnostics - discovery of how it works.

    - Systems which have no known cryptanalytic vulnerabilities may
still be exploited if, and usually only if, their keying materials have
been acquired by the opposition or if their TEMPEST characteristics
permit it. In either of these contingencies, however, the logic, the
machine itself, or both may be required for exploitation to be
successful.

    Because the thrust for unclassified when unkeyed equipments is lying
fallow at the moment, all of the above may seem like beating a dead
horse as far as our mainline equipment are concerned. But the matter
will assuredly rise again.
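
As a toy illustration of “operating only on cipher text”: the simplest
possible ciphertext-only attack, recovering a Caesar-shift key from
letter-frequency statistics alone. The cipher, the frequency table and
the sample message are my own, and obviously nothing like the systems
the guide is discussing.

    from collections import Counter
    import string

    # Rough English letter frequencies (percent), enough to score guesses.
    ENGLISH_FREQ = {"e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "i": 7.0,
                    "n": 6.7, "s": 6.3, "h": 6.1, "r": 6.0, "d": 4.3}

    def shift(text, key):
        """Caesar-shift the lowercase letters of `text` by `key` places."""
        return "".join(
            chr((ord(c) - ord("a") + key) % 26 + ord("a"))
            if c in string.ascii_lowercase else c
            for c in text)

    def englishness(text):
        """Higher score means the letters look more like English."""
        counts = Counter(c for c in text if c in string.ascii_lowercase)
        return sum(counts[letter] * weight
                   for letter, weight in ENGLISH_FREQ.items())

    def break_caesar(ciphertext):
        """Ciphertext-only attack: try all 26 keys, keep the best score."""
        return max(((key, shift(ciphertext, -key)) for key in range(26)),
                   key=lambda pair: englishness(pair[1]))

    intercept = shift("the enemy will attack the northern bridge at dawn", 7)
    print(break_caesar(intercept))  # should recover key 7 and the message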

Public Cryptography

The COMSEC guide also talks about the rise of non-governmental
encryption, and more specifically about the discovery of “RSA” by GCHQ a
few years before its public invention.

    While commercial equipment has been around for many decades, their
quantity and variety was relatively small. Most were manufactured
overseas - particularly in Switzerland, and no huge market existed for
them after World War II because many Governments (like our own) began
increasingly to use systems exclusively of their own design and under
their own control. Similarly, the amount of published literature on
cryptography, and particularly on sophisticated cryptanalytic ideas was
sparse. In the U.S., the Government (specifically, NSA) enjoyed a
near-monopoly on the subject by the early ’50’s. That persisted until
about 1970, when a dramatic change occurred. A handful of U.S. companies
interested in computers, in communications, or in electronics began to
perceive a market for electronic crypto-equipments. A few other American
companies began building crypto-equipment in competition with the Swiss
and others in Europe, supplying devices to some Governments in Africa,
South America, and the Middle East and to a few major corporations -
notably some oil companies seeking to protect vital industrial secrets.
At about the same time, the question of computer security, which had
been on the back burner since the late 50’s, began to get a great deal
of attention from computer manufacturers themselves and from some of
their customers. Computer fraud had become more common, and its impact,
particularly on the banking world, became significant.

    One of the interesting outgrowths of the burgeoning interest in
cryptography in the private sector was the “invention” of a concept
called “Public Key Cryptography” (PKC). All conventional cryptography
requires the pre-positioning of shared keys with each communicant. The
logistics for the manufacturing and delivery of those keys keeps S3 in
business and forces users to maintain a large secure crypto-distribution
system. (Remote keying eases but does not eliminate the problem.) The
thought was, cryptography would be revolutionized if a system could be
devised in which people could communicate securely without prior
exchange of keys. The main idea that came forward was an effort to
capitalize on the fact that some mathematical functions are easy to
carry out in one “direction,” but difficult or impossible to reverse. A
classic example of these so-called one-way functions is the phenomenon
that it is not hard to multiply two very large prime numbers together,
but given only their product, no elegant way has been put forward for
determining what the two original numbers were. So the original numbers
could be considered to be part of the one man’s secret “key:” their
product could be published; an encryption algorithm could be specified
operating on that product which could not be efficiently decrypted
without knowledge of the “key”; and all messages addressed to that
person would be encrypted by that algorithm. By coincidence, the
identical idea had been put forward by one of our British colleagues
five years earlier, and we and they had been studying it ever since. We
called it non-secret encryption (NSE) and were trying to solve the
same problem of key distribution. We treated our work on it as SECRET
and still do. We did not leap to its adoption for a variety of reasons.
Foremost, we were uncertain of its security potential. The fact that
mathematicians had not yet found a way to factor large numbers did not
mean that there was no way. It was an interesting mathematical puzzle,
first put forward centuries ago, but no great incentives for its
solution beyond the satisfaction of intellectual curiosity, no perceived
commercial applications, and so on. So there was no evidence of a
great many brains having worked on the problem over the years; nor did we
go all out against it because, apart from theoretical doubts, there were
other drawbacks. The most obvious - although perhaps not the most
important - was the fact that the encryptor himself could never decrypt
his own message - he would be using the crypto system of the recipient
who was the only one holding the secret decrypting key - he would have
no means to verify its accuracy or correct an error. More or less
elaborate protocols involving hand-shaking between the communicants were
put forward to get around this difficulty - usually entailing the
receiver having to re-encrypt the received message in the sender’s key
and asking if it was right. A clumsy business. Next, each user would
have to keep his primes absolutely secret, forcing on each some of the
secure storage and control problems inherent within conventional
schemes. Known (or unknown) loss would compromise all of his previously
received messages. To get around that, relatively frequent change would
be necessary. This would move him towards the conventions of keying
material supersession; generation and selection of suitable primes and
their products, and their republication to all potential correspondents.

Embassies and SIGINT

The Soviets were suspected of intercepting communications in the
Washington area and elsewhere in the United States. The NSA was
therefore pressured by the Government to disclose and share more of its
knowledge about cryptography.

Soviet SIGINT got out of control, and the weakly protected telecom
infrastructure was so vulnerable that Vice-President Rockefeller said:
“If you don’t want it known, don’t use the phone”. Take that as opsec
advice.

    Another major factor arose bringing great pressure on NSA to let
some of our cats out of the bag. This was the matter of Soviet
interception of communications in the Washington area and elsewhere in
the United States. By 1966, we were pretty sure that they were doing
this work from their Embassy on 16th Street and perhaps from other
facilities as well - but we had no clear idea of its scope, its targets,
possible successes, nor of the value of the information they might be
collecting.

This paragraph is, to this day, followed by a large redacted area. It
then continues:

    By the early 70’s, evidence had accrued confirming the Soviet
intercept effort, and it was shown to be large and efficient. [REDACTED]
characterised their intercept effort from their Washington Embassy as
the single most valuable covert SIGINT activity in their arsenal. The
vulnerable communications obviously extended beyond those of our
traditional military and diplomatic customers. Privacy for individual
citizens could be violated; technological information communicated by
Defense contractors could be compromised, and information useful in
economic warfare could be obtained. The threat became public and
explicit in the waning days of the Ford Administration with
Vice-President Rockefeller announcing in a press conference that “If you
don’t want it known, don’t use the phone”. Later on, in a press
conference, President Carter acknowledged the existence of the problems,
characterized the Soviets’ efforts as passive, noted that all “his”
communications were secured and did not appear too upset. Senator
Moynihan, however, expressed outrage, wanted to jam the Russians, expel
them, or something because of their outrageous invasion of US citizens’
privacy.

The reason the President wasn’t as outraged as the Senator is probably
that he knew the US embassies were doing the same …

    By this time, we had bitten the bullet, deciding to seek a generic
COMSEC solution. This was a decision of enormous consequence for us. The
notion of injecting Communications Security into the commercial world in
a big way was unprecedented, with serious policy, political, and
technical implications for all involved. Principal players became
ourselves, the telephone companies, the White House, DCA, the now
defunct Office of Telecommunications Policy in OMB, FCC and, ultimately
many users of the commercial telephone system.

    The project was (and is) called HAMPER. At the outset, it had three
main thrusts: a greatly expanded program to move “sensitive” circuits to
cable in Washington, NYC and San Francisco where major Intercept
activities were now known to exist; a program to bulk encrypt some of
the tower-to-tower microwave links in their entirety, re-inforced by
end-to-end encryption for some particularly critical lines; and the
definition of “Protected Communications Zones” to circumscribe those
areas - e.g., Washington and its environs - from which microwave
interception would be relatively safe and easy. It took several years of
concerted effort simply to sort out how communications were routed
through the enormously complex telephone system.

With that deployment, cryptography became increasingly prominent in the
private sector, raising some concerns for the NSA.

    The doctrinal problems were large and intractable because they
involved the provision of cryptography in unclassified environments
where many of our traditional physical security measures were thought to
be inapplicable. How would the crypto-equipments be protected? How to
protect the keys? How do you effect key distribution with no secure
delivery infrastructure such as we enjoy in the Government COMSEC world?
Problems of this kind led to a campaign to use the DES - the only
unclassified Government-approved crypto system available, thus solving
the physical security problem insofar as the crypto-equipment itself was
concerned. The root difficulty with this proposal from the security
analysts’ viewpoint lay in the fact that the DES algorithm was
originally designed and endorsed exclusively for the protection of
unclassified data, fundamentally to insure privacy, and without a SIGINT
adversary with the power of the Soviet Union having been postulated as a
likely attacker. Accordingly, the system was not designed to meet our
high grade standards, and we were not interested in educating the world at
large in the best we can do.

This is really interesting. It’s another example of intelligence
agencies struggling to reconcile their Information Assurance and SIGINT
sides. They thus admit to leaving citizens and US companies exposed to
Soviet spying, with the sole goal of preserving their own ability to
keep spying on the Soviets.

    Nonetheless, the system is very strong; has stood up to our
continuing analysis, and we still see no solution to it short of a brute
force exhaustion of all its 2^56 variables. It is good enough, in fact,
to have caused our Director to endorse it not only for its original
computer privacy purposes, but for selected classified traffic as well.
Cynics, however, still ask “Are we breaking it?” The answer is no. But
could we? The answer is “I don’t know; if I did I wouldn’t tell you.” And
there’s a good reason for this diffidence. A “No” answer sets an upper
limit on our analytic power. A “Yes” answer, a lower limit. Both of
those limits are important secrets because of the insights the
information would provide to opponents on the security of their own systems.

The NSA won’t reveal whether they can break something. We all knew this
was happening, but it is interesting to read an internal document
confirming the practice.
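
To put the 2^56 figure from the excerpt above into perspective, a quick
back-of-the-envelope calculation; the trial rate is an arbitrary
assumption for illustration, not a claim about any real hardware:

    keyspace = 2 ** 56              # 72,057,594,037,927,936 key variables
    trials_per_second = 10 ** 9     # assumed rate, purely illustrative

    worst_case_years = keyspace / trials_per_second / 86_400 / 365
    print(f"{keyspace:,} keys")
    print(f"about {worst_case_years:.1f} years to exhaust at that rate")
    # Roughly 2.3 years worst case (half that on average), which is why
    # exhaustion only became practical with massive, purpose-built hardware.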

    the Hamper program faltered because of uncertainties on who was
charged with, responsible for, authorized to, or capable of moving
forward. Big money was involved and we didn’t know who should budget for
it. Should the common carriers pay for it themselves, or its customers?
Or the government? It is, after all, a security service that most may
not want or perceive a need for.

    By and large, most people in both the COMSEC and SIGINT
organizations in NSA believe that we can accomplish our missions more
effectively in considerable secrecy because it helps us to conceal our
strengths and weaknesses and to achieve technological surprise. DoC
[Department of Commerce], on the other hand, is in business, in part, to
encourage private enterprise, to maximize commercial markets at home and
abroad, and to exploit the products of our own Industry for use in
Government rather than having the Government compete with Industry and
this does not exclude cryptography.

Academic Freedom vs National Security

    In any event, production and marketing of equipment by U.S.
commercial vendors is not our biggest problem with public cryptography
because there are various Government controls on such equipment -
particularly, export controls - and Industry itself is usually
disinterested in publishing the cryptanalytic aspects of their research
in any detail. The central issue that continues to fester is
encapsulated in the phrase: “Academic Freedom versus National Security”.

    Our Director has made a number of overtures to various academic
forums and individuals in an effort to de-fuse this issue, but has stuck
to his guns with the statement that unrestrained academic research and
publication of results can adversely affect National Security. While a
few academicians have been sympathetic, the more usual reaction - at
least that reaching the press - has been negative.

    The principal reason that there is an NSA consensus that
unrestrained academic work has the potential for harm to our mission is
because, if first-class U.S. mathematicians, computer scientists, and
engineers begin to probe deeply into cryptology, and especially into
cryptanalytics, they are likely to educate U.S. SIGINT target countries
who may react with improved COMSEC. Less likely, but possible, is their
potential for discovering and publishing analytic techniques that might
put some U.S. crypto systems in jeopardy.

    The academicians’ arguments focus on absolute freedom to research
and publish what they please, a rejection of any stifling of intellectual
pursuit, and concerns for the chilling effect of any requests for
restraint. Their views are bolstered by the real difficulty of
differentiating various kinds of mathematical research from
“crypto-mathematics” - notably in the burgeoning mathematical field of
Computational Complexity, often seeking solutions to difficult
computational problems not unlike those posed by good cryptosystems.

    As a practical matter, Government “leverage,” if any, is rather
limited. We have made some half-hearted attempts to draw an analogy
between our concerns for cryptology with those for private research and
development in the nuclear weapons field which led to the Atomic Energy
Act that does - at least in theory - constrain open work in that field.
But there is no comparable public perception of clear and present danger
in the case of cryptology and, despite the “law,” academicians have
sanctioned research revelatory of atomic secrets including publications
on how to build an atomic bomb.

That’s where it gets interesting.

    Another wedge, which as yet has not been driven with an appreciable
force, is the fact that - overwhelmingly - the money underwriting
serious unclassified academic research in cryptography comes from the
Government itself. Among them are the National Science Foundation (NSF),
the Office of Naval Research (ONR) and the Defense Advanced Research
Projects Agency (DARPA). NSA supplies a little itself. The wedge is
blunted because Government officials administering grants from most of
these institutions have been drawn largely from the academic community
who believe strongly in the value of research performed outside
Government, and are sympathetic to concerns about abridgement of
Academic Freedom.

In the following paragraphs, the author explains that the NSA has tried
to build more bridges with academia and to convince them to be
“reasonable”.

Burn after reading

The NSA has mastered pyrotechnic techniques for burning crypto equipment
when it is at risk of compromise. After previous failures to keep
crypto-equipment safe from foreign reverse-engineering, they had a big
success during the 1979 Iranian revolution, when they burned all of the
US equipment that was at risk of being compromised.

    The most successful use of pyrotechnics (thermate slabs, thermite
grenades, and sodium nitrate barrels) in Tehran occurred at the major
Army Communications Center there. It had a number of crypto-equipments,
but also served as a depot for pyrotechnic materials for the whole area.
They piled all of their classified crypto material in a shed; covered
them with their pyrotechnical material (some 300 devices), lit off the
whole enchilada, and took off. The result was probably the largest
single conflagration during the entire revolution. Observers reported
seeing flames shooting hundreds of feet into the air from posts several
miles away. The building was, of course, consumed and we assume only a
slag pile remains. […]

There are many more topics covered that I have decided to leave out of
this summary. If this sounded interesting, you might want to take a look
at the COMSEC guide yourself!

