[Cryptography] How to De-Bollocks Cryptography?

Patrick Chkoreff pc at fexl.com
Thu Aug 8 15:19:22 EDT 2024


On 8/7/24 1:12 AM, Peter Gutmann wrote:
> Kent Borg <kentborg at borg.org> writes:

> John Downer covers this in his book Rational Accidents, which has only just
> come out and which I'm still waiting to get so relying on a reviewer's
> comments,
> https://www.technologyreview.com/2024/06/26/1093692/book-review-technological-complexity-problems/
> 
>    Finally is what might be the most interesting and counterintuitive factor:
>    Downer argues that the lack of innovation in jetliner design is an essential
>    but overlooked part of the reliability record. The fact that the industry
>    has been building what are essentially iterations of the same jetliner for
>    70 years ensures that lessons learned from failures are perpetually relevant
>    as well as generalizable, he says. ...

One of my favorite TV shows is "Air Disasters."  The common theme I 
see, in response to every horrible disaster, is an intense focus on 
identifying what went wrong as specifically as possible, then devising 
a specific set of countermeasures to thwart future occurrences as 
surely as possible and codifying them into "law," so to speak.

That process, when done correctly, tends to have a "ratchet" effect, 
leading toward order and life and away from disorder and death.  Each 
step should be cumulative in benefit, with no regression.

In some cases it is mechanical, such as improvements to the redundancy 
of hydraulic lines, the integrity of wiring harnesses, the maintenance 
schedule of jackscrews and engine mounts, or methods of reducing ice in 
fuel lines.  In other cases it is human factors, with improvements in 
cockpit and tower communication, sleep and duty schedules, 
documentation, and training for situations where automatic systems go 
awry or instruments give confusing or contradictory information.  In one 
case even the fine points of Korean culture played a role.

That process requires a lot of hard objective engineering and analysis, 
along with art and seasoned intuition to lead it in the right direction 
and keep the priorities straight.

The cited article suggests that this process might not apply well to the 
evolution of nuclear reactors, but perhaps the industry could employ 
experienced people, not affiliated with any one operation, who always 
look for the most plausible "opportunities" for Murphy's Law to prevail. 
In short: they need some way to avoid the complacency and 
self-satisfaction that can easily set in among people beholden to a 
specific plant or technology.

I'm not sure how even such an august group could guard against a Black 
Swan event like a 9.0 earthquake causing a tsunami that reaches 10 
meters above sea level.  Fukushima already had a variety of redundant 
systems.  Fortunately it appears that residents near the plant were not 
actually harmed by radiation exposure, and the tsunami itself was the 
real problem.  Three Mile Island turned out to be fairly benign. 
Ironically, the Chernobyl incident began during a safety test and was 
exacerbated by a cascade of design flaws sprinkled with a bit of human 
error.


-- Patrick
