Destruction -> creation

Image: A woman works on the nose of a bomber at the Douglas Aircraft Company plant in Long Beach, California, in 1942.


I wonder if we’d have computers today if World War II hadn’t been so terrifically horrible.

It's not hard to imagine that the death, destruction, and general frenzy that come with an existential crisis at the national level drive individuals to do remarkable things. Code breaking and the development of nuclear weapons during WWII undoubtedly pushed computing forward.

Alan Turing, the father (or at the very least the guy who delivered the baby) of modern computing, started work on the theoretical basis for the computer back in 1936, before WWII, but he spent the war holed up in Bletchley Park, putting into practice much of the theory about computing that he'd been developing beforehand:

The British needed mathematicians to crack the German Navy’s Enigma code. Turing worked in the British top-secret Government Code and Cipher School at Bletchley Park. There code-breaking became an industrial process; 12,000 people worked three shifts 24/7. Although the Polish had cracked Enigma before the war, the Nazis had made the Enigma machines more complicated; there were approximately 10^114 possible permutations. Turing designed an electromechanical machine, called the Bombe, that searched through the permutations, and by the end of the war the British were able to read all daily German Naval Enigma traffic.

Some sort of reputable source

It’s hard to argue that this sort of effort and rabid allocation of talent wouldn’t have moved “practical” computing forward by quite a lot. I’m sure there are many analogues for math and physics on the nuclear side of the story.

For a page-turning and mostly fictional read that gets deep into some of this stuff, see Cryptonomicon.

No free lunch?

Prepare yourself for a grim statistic: over 60 million people died in WWII. That’s an incomprehensible number of anything, let alone human beings. It would be deeply ironic if it took a toll on human life and happiness of that magnitude to till the soil for a potentially wonderful thing like computing.

I hope that isn’t the case. I hope that the way to technological prosperity isn’t some near zero-sum trade with stupefyingly tragic wars at the other end of the table. I’d like to imagine that if WWII hadn’t happened, some kind of peaceful potpourri of telecommunication and profit motive would’ve eventually netted us personal computers, but it’s hard to know.

It’s safe to say that the capabilities, value, and connectivity we enjoy because of widely dispersed computing wouldn’t have arrived nearly as quickly without the war, and that’s ironic.

Edit: @lucaswiman has kindly offered some perspective that calls into question that last, kinda grim statement:

Shannon’s work on information theory was pre-war, and unclassified. It had a much bigger practical impact, since Turing’s work remained classified for decades. I’d say the Cold War had a bigger effect on the history of computation than WWII due to the need for nuclear weapons simulations and the space race. Arguably the main innovations required for the first electronic computers came out of Bell Labs and pre-war academic research. It’s possible that absent WWII and the Cold War, we’d be a few years behind in technologies which happened to have military importance, but I’d take that trade off.