Citibank e-mail looks phishy

Leichter, Jerry leichter_jerrold at emc.com
Tue Nov 14 18:21:38 EST 2006


| Every twenty years or so, a major accounting firm implodes in a
| scandal. In the 1980s it was Laventhal and Horvath. A few years ago it
| was Andersen. At intervals, the institutional memory of what can go
| wrong vanishes, someone pushes the edge, and it takes a bit of blood
| in the streets for people to remember why they were supposed to
| follow the rules. (By the way, this is a good reason why people should
| oppose the reduction of individual liability for partners in
| accounting firms -- it is an important check on accounting scandals.)
| 
| At intervals, there are also major explosions in other parts of
| finance. For example, does everyone remember how Barings melted down
| because of lax controls? There have been failures of this sort at
| intervals in trading operations -- Askin detonated even though it had
| correct models of the CMO market because the market remained
| irrational longer than it could remain liquid. Twenty years later, the
| memory forgotten, Long Term Capital Management had a similar problem.
| 
| I think that failures of this sort are, for good or ill, part of the
| natural order of things. Unless there are object lessons around,
| people forget the reason for the controls....
One of Henry Petroski's early books is "To Engineer Is Human: The Role
of Failure in Successful Design".  Petroski argues that we only learn
from failure.  Success tells us how to build exactly the same thing
the next time.  Failure is the inevitable result of pushing out beyond
what you already know.  (Wonderful book, highly recommended.)

It's a curiosity of the financial industries that they repeatedly
forget what they've learned!  Architects design buildings that stay
up.  Engineers build bridges that don't fail when the wind blows.
Doctors abandon treatments that kill patients and don't go back to
them.  In most fields, failures are translated into "best practices"
that are used to produce codes and rules and educational methods and
such that avoid repeating those failures - and remain in force
pretty much forever (sometimes beyond their useful lifetime, but
that's a different problem).  In "lower finance", there are plenty
of such safety rules - e.g., the person who authorizes the check is
never the person who signs the check - that are followed pretty
consistently.  But the guys in "high finance" all think they know
better....

|						     Right now, the systems
| technologies in use are too new for there to have been major failures,
| so many people in management do not understand why the technical
| people have pushed for certain kinds of controls. I suspect the
| failure of a major bank as a result of deep penetration of their
| systems or some similar failure will be rather educational for the
| ones that remain. Unfortunately it will also cause a lot of damage,
| but I'm not sure there is any way to help this....
With every successful shuttle flight, we "learned" that the shuttle
could be flown safely even with ring erosion, at low temperatures, etc.
It usually does take a disaster to change the mindset.

							-- Jerry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo at metzdowd.com