Turing's Cathedral
The Origins of the Digital Universe
4.2 • 9 Ratings
$14.99
Publisher Description
“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.
PUBLISHERS WEEKLY
An overstuffed meditation on all things digital sprouts from this engrossing study of how engineers at Princeton's Institute for Advanced Study, under charismatic mathematician John von Neumann (the book should really be titled Von Neumann's Cathedral), built a pioneering computer (called MANIAC) in the years after WWII. To readers used to thinking of computers as magical black boxes, historian Dyson (Darwin Among the Machines) gives an arresting view of old-school mechanics hammering the first ones together from vacuum tubes, bicycle wheels, and punch cards. Unfortunately, his account of technological innovations is too sketchy for laypeople to quite follow. The narrative frames a meandering tour of the breakthroughs enabled by early computers, from hydrogen bombs to weather forecasting, and grandiose musings on the digital worldview of MANIAC's creators, in which the author loosely connects the Internet, DNA, and the possibility of extraterrestrial invasion via interstellar radio signals. Dyson's portrait of the subculture of von Neumann and other European émigré scientists who midwifed America's postwar technological order is lively and piquant. But the book bites off more science than it can chew, and its expositions of hard-to-digest concepts from Gödel's theorem to the Turing machine are too hasty and undeveloped to sink in.