[Cryptography] Imitation Game: Can Enigma/Tunney be Fixed?

John Denker jsd at av8n.com
Mon Jan 5 10:20:17 EST 2015


On 01/04/2015 04:56 PM, Henry Baker wrote:
> Since The Imitation Game is playing & is quite likely to win some
> awards, I was wondering if anyone has written an analysis of the
> Enigma & Lorenz encryption systems using 2015 eyes?
> 
> What would be required to "fix" these codes for modern usage, e.g.,
> converting the mechanical bits into software, adding more wheels,
> etc. ?

I suggest that any such analysis should proceed in two
stages:  
 A) synchronistic (i.e. "fair") criticism, and
 B) anachronistic (i.e. "unfair") criticism.

In category (A), it is entirely fair to compare Enigma to
the contemporary US M209 /and/ SIGABA.  The first thing you
notice is that it is a one-to-two comparison.  
 -- M209 was more portable, more reliable, less complicated,
  and less secure than Enigma.  It was used as a low-level
  field cipher.  By 1943, the Germans could break M209, but
  it took several hours.  This meant that the M209 was good
  enough for its intended purpose.
 -- SIGABA was larger and more complicated than Enigma.
  It was compact enough to be usable in a submarine, embassy,
  or intermediate HQ.  It was not broken during the war,
  and probably not for many years after that.

The first thing we learn from this is that you really need
to make the distinction between field cipher and HQ cipher.
Enigma was too big for one purpose and not big enough for
the other.

Also in category (A), Enigma used a "reflector".  This was
clever and stupid:  It was a cheap way to double the number 
of rounds, but it meant that no letter could map to
itself, which was a devastating weakness.
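The reflector property is easy to demonstrate in modern terms.  Here
is a minimal toy sketch (my own construction, not real Enigma wiring,
and ignoring rotor stepping): because the signal passes through the
same rotor both ways around a fixed-point-free involution, the overall
map can never send a letter to itself.

```python
import string

# A reflector is a fixed-point-free involution.  Pairing adjacent
# letters (A<->B, C<->D, ...) gives one such involution.
ALPHABET = string.ascii_uppercase
REFLECTOR = {}
for i in range(0, 26, 2):
    a, b = ALPHABET[i], ALPHABET[i + 1]
    REFLECTOR[a] = b
    REFLECTOR[b] = a

def toy_enigma(ch, rotor):
    """Forward through one rotor, reflect, back through the inverse.

    `rotor` is any permutation of the alphabet (a dict).  If the
    output equaled the input, the reflector would have to fix
    rotor[ch], which it never does.
    """
    inv = {v: k for k, v in rotor.items()}
    return inv[REFLECTOR[rotor[ch]]]

# Any rotor permutation at all -- here a simple shift by 7.
rotor = {c: ALPHABET[(i + 7) % 26] for i, c in enumerate(ALPHABET)}

# The devastating property: no letter ever encrypts to itself.
assert all(toy_enigma(c, rotor) != c for c in ALPHABET)
```

This is exactly the property the codebreakers exploited to slide cribs
along a ciphertext: any alignment where some letter encrypts to itself
can be ruled out immediately.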

Also in category (A): Parts of the Enigma system (notably
the initial rotor positions) provided what was essentially
an initialization vector.  This IV was not long enough and 
(in practice) not random enough.

This brings up a point that transcends categories:  Code
clerks make mistakes!  For example, German code clerks
tended to use an IV that was not much different from the
previous IV.  If they had been provided a pair of dice
and required on pain of death to choose truly random IVs,
cryptanalysis would have been more difficult.
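In modern terms, "truly random IVs" means drawing each indicator from
a uniform distribution with no memory of the previous one.  A sketch,
with Python's `secrets` standing in for the pair of dice (the 3-rotor,
letter-valued indicator is an assumption of this sketch; real keying
procedures varied by network and year):

```python
import secrets
import string

def random_indicator(n_rotors=3):
    """Draw rotor start positions uniformly at random.

    `secrets` uses the OS CSPRNG, so successive indicators are
    independent -- unlike a tired code clerk nudging yesterday's
    setting by one position.
    """
    return ''.join(secrets.choice(string.ascii_uppercase)
                   for _ in range(n_rotors))

print(random_indicator())   # e.g. 'QXB' -- no relation to the last IV
```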

Similarly, German code clerks were fond of sending test
messages consisting of proverbs or common jokes.  This
provided lots of almost-known-plaintext cribs.  Given a
truly strong crypto system this wouldn't have mattered, 
but given a system with weaknesses this added to the
stress on the system.

Also in category (A):  It helps to add more rounds.  SIGABA
had 15 rotors against 4 or 5 for Enigma.

Steckering made Enigma considerably stronger.

I'm not sure what category this is in, but one can imagine
something not much more complicated than Enigma where a
random Steckerverbindung became part of the per-message 
IV (rather than part of the daily key).  This would have 
made life very unpleasant for the codebreakers.  It would
have significantly reduced the volume of traffic they 
could afford to break.
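To see why, count the settings.  The arithmetic below is my own
illustration (the function name and formula are not from any period
document): choose 2p letters to be cabled, then partition them into
unordered pairs.

```python
from math import comb, factorial

def stecker_count(pairs, letters=26):
    """Number of distinct plugboard settings using `pairs` cables.

    Choose 2*pairs letters, then pair them up:
    (2p)! / (p! * 2**p) distinct pairings.
    """
    p = pairs
    return comb(letters, 2 * p) * factorial(2 * p) // (factorial(p) * 2 ** p)

# With the standard 10 cables there are ~1.5e14 settings, dwarfing
# the 26**3 = 17576 rotor start positions.  Re-deriving this per
# message, rather than once per day, multiplies the work enormously.
print(stecker_count(10))   # 150738274937250
```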

Moving now to category (B):  Rotor machines as a class
have weaknesses.  Even SIGABA is breakable using modern
methods.  For starters, it has a key length of at most
72 bits, which can be brute-forced nowadays.  OTOH SIGABA 
could be scaled up to more rotors nowadays, using reliable
lightweight electronics, so the anachronism cuts both ways.
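A back-of-the-envelope check on that 72-bit figure.  The trial rate
below is a loud assumption -- real throughput depends entirely on the
hardware and on how cheaply one SIGABA step can be simulated -- but it
shows the keyspace is within reach of determined parallelism:

```python
# Exhaustive search of a 72-bit keyspace at an assumed aggregate
# rate of 1e12 trials/second (the rate is an assumption; parallel
# hardware scales it linearly, and on average you find the key
# halfway through).
keys = 2 ** 72
rate = 1e12
years = keys / rate / (365.25 * 24 * 3600)
print(f"about {years:.0f} years")   # ~150 years at this assumed rate
```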

For some discussion of this, and pointers to other work,
see:
  http://ucsb.curby.net/broadcast/thesis/thesis.pdf

Also in category (B):  Any letter-by-letter system will
have weaknesses compared to a block cipher with a reasonably
large block.

Similarly, German messages tended to have formulaic headers
and prolix salutations.  It would have helped to remove these,
or at least compress them using a codebook.  Preprocessing 
messages with a lossless compressor is always a good idea, 
since it increases the entropy-density of the plaintext.

  This weakness remains relevant even today, e.g. as applied
  to full-disk encryption where there are many /blocks/ all
  the same, e.g. blocks of all zeros.
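The identical-blocks leak is easy to demonstrate.  The toy "cipher"
below is my own stand-in built from a keyed hash (it is not even
invertible; it exists only to exhibit the structural leak that any
deterministic per-block encryption shares):

```python
import hashlib

def toy_ecb_encrypt(data, key, block=16):
    """Encrypt each block independently and deterministically,
    ECB-style.  Equal plaintext blocks yield equal ciphertext
    blocks, so the layout of the plaintext leaks straight through."""
    out = []
    for i in range(0, len(data), block):
        chunk = data[i:i + block]
        out.append(hashlib.sha256(key + chunk).digest()[:block])
    return b''.join(out)

disk = b'\x00' * 64            # four identical all-zero "sectors"
ct = toy_ecb_encrypt(disk, b'secret-key')
blocks = [ct[i:i + 16] for i in range(0, 64, 16)]
assert len(set(blocks)) == 1   # all four ciphertext blocks identical
```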

Compression followed by blocking is better than either one 
separately.
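A quick illustration of the entropy-density point, using a repetitive
formulaic header (the sample text and use of zlib are my own choices
for this sketch, not anything a 1940s system could have done):

```python
import zlib

header = b'AN OBERKOMMANDO DER WEHRMACHT ' * 4   # formulaic, repetitive
packed = zlib.compress(header, 9)

# The compressed form is much shorter, i.e. carries more entropy per
# byte, leaving less known-plaintext structure to crib on.
print(len(header), len(packed))
assert len(packed) < len(header)
```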

Just adding a bunch of random nulls at the front would 
have helped alleviate the known-header problem.  This would
have effectively increased the length of the IV, although 
there are other ways of achieving the same effect more 
efficiently.
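The mechanics are trivial; a sketch (the names and the choice of n=12
are mine, picked only to be at least as long as a stereotyped header):

```python
import secrets
import string

def pad_with_nulls(plaintext, n=12):
    """Prepend n random letters before encryption; the receiver
    simply discards the first n letters after decryption.  This
    pushes the predictable header away from a known position."""
    filler = ''.join(secrets.choice(string.ascii_uppercase)
                     for _ in range(n))
    return filler + plaintext

def strip_nulls(decrypted, n=12):
    return decrypted[n:]
```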

On 01/04/2015 07:13 PM, Ray Dillinger wrote:
> ... the M-209, which was successfully
> cryptanalyzed in the 1970s (though publication was suppressed
> by the NSA) by Dennis Ritchie, James Reeds, and Robert Morris.

That's true but misleading.  The M209 was broken by the
Germans as early as 1943.  The Ritchie/Reeds/Morris analysis
was suppressed for other reasons, presumably because the
/method/ of attack applied to other systems, including
systems that were still in use at the time.  The R/R/M 
paper remains unpublished to this day AFAIK.  See e.g.
  http://cm.bell-labs.com/cm/cs/who/dmr/crypt.html
From about the same time there is an M209 cryptanalysis by Barker.
  http://www.amazon.com/Cryptanalysis-Hagelin-Cryptograph-Wayne-Barker/dp/0894120883

In any case, SIGABA remains a more worthy topic of "modern"
analysis.
