Q&A: Hacker Historian George Dyson Sits Down With Wired's Kevin Kelly

George Dyson talks to Wired about the big bang of the digital universe and his new book, Turing's Cathedral.
Photo: Joe Pugliese

The two most powerful technologies of the 20th century—the nuclear bomb and the computer—were invented at the same time and by the same group of young people. But while the history of the Manhattan Project has been well told, the origin of the computer is relatively unknown. In his new book, Turing's Cathedral, historian George Dyson, who grew up among these proto-hackers in Princeton, New Jersey, tells the story of how Alan Turing, John von Neumann, and a small band of other geniuses not only built the computer but foresaw the world it would create. Dyson talked to Wired about the big bang of the digital universe.

Wired: Because your father, Freeman Dyson, worked at the Institute for Advanced Study in Princeton, you grew up around folks who were building one of the first computers. Was that cool?

George Dyson: The institute was a pretty boring place, full of theoreticians writing papers. But in a building far away from everyone else, some engineers were building a computer, one of the first to have a fully electronic random-access memory. For a kid in the 1950s, it was the most exciting thing around. I mean, they called it the MANIAC! The computer building was off-limits to children, but Julian Bigelow, the chief engineer, stored a lot of surplus electronic equipment in a barn, and I grew up playing there and taking things apart.

Wired: Did that experience influence how you thought about computers later?

Dyson: Yes. I tried to get as far away from them as possible.

Wired: Why?

Dyson: Computers were going to take over the world. So I left high school in the 1960s to live on the islands of British Columbia. I worked on boats and built a house 95 feet up in a Douglas fir tree. I wasn't antitechnology; I loved chain saws and tools and diesel engines. But I wanted to keep my distance from computers.

Wired: What changed your mind?

Dyson: If you spend time alone in the wilderness, you get very attuned to living things. I learned to spot the trails left by life. When I looked at the digital universe, I saw the tracks of organisms coming to life. I eventually came out of the Canadian rain forest to study this stuff because it was as wild as anything in the woods.

Wired: You write about "digital organisms." Is this what you mean?

Dyson: Digital organisms, while not necessarily any more alive than a phone book, are strings of code that replicate and evolve over time. Digital codes are strings of binary digits—bits. A Pixar movie is just a very large number, sitting idle on a disc, while Microsoft Windows is an even larger number, replicated across hundreds of millions of computers and constantly in use. Google is a fantastically large number, so large it is almost beyond comprehension, distributed and replicated across all kinds of hosts. When you click on a link, you are replicating the string of code that it links to. Replication of code sequences isn't life, any more than replication of nucleotide sequences is, but we know that it sometimes leads to life.
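To make that image concrete, here is a minimal sketch in Python (the file name is a placeholder, not a real asset) showing that any file's bits can be read as one enormous integer:

```python
# A toy illustration: any file on disk is, at bottom, one very large number.
# "movie.mp4" is a placeholder path, not a real asset.
from pathlib import Path

data = Path("movie.mp4").read_bytes()      # the raw string of bits
as_number = int.from_bytes(data, "big")    # the same bits read as a single integer

print(f"{len(data) * 8} bits, i.e. a number roughly {len(str(as_number))} decimal digits long")
```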

Wired: What other kinds of digital organisms can we see?

Dyson: Besides obvious ones like computer viruses, we have large, slow-moving megafauna like operating systems and now millions of fast-moving apps, almost like microbes. Recently we've seen enormous conglomerations of code creeping up on us, these giant, multicellular, metazoan-level code-organisms like Facebook or Amazon. All these species form a digital universe.

Wired: Are we in that digital universe right now, as we talk on the phone?

Dyson: Sure. You're recording this conversation on a digital recorder—into an empty matrix of addresses on a chip that is being filled up at 44 kilobytes per second. That address space full of numbers is the digital universe.

Wired: But haven't we written down numbers for centuries?

Dyson: Yes, we had clay tablets, counting pebbles, the abacus, ledgers, and punch cards. Writing and retrieving numbers isn't new. But the computer accelerated information processing to the speed of light. That velocity fundamentally changed everything.

Wired: So how did this parallel light-speed universe begin?

Dyson: This vast digital world, where we can get almost anything anytime, goes back to the very first address matrix in the MANIAC. Its dimensions were only 32 x 32 x 40 bits. That's 5 kilobytes, or room to record just a fraction of a second of this conversation!
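The arithmetic is easy to verify; a few lines, assuming the 44-kilobytes-per-second recording rate mentioned earlier in the conversation, make it concrete:

```python
# Checking Dyson's figures: a 32 x 32 matrix of 40-bit words.
bits = 32 * 32 * 40              # 40,960 bits
kilobytes = bits / 8 / 1024      # 5.0 kilobytes

# At the 44-kilobytes-per-second recording rate mentioned above,
# the whole memory holds only a moment of speech:
seconds = kilobytes / 44         # roughly 0.11 seconds
print(bits, kilobytes, round(seconds, 2))
```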

Wired: And how fast is this universe expanding?

Dyson: Like our own universe at the beginning, it's more exploding than expanding. We're all so immersed in it that it's hard to perceive. Last time I checked, the digital universe was expanding at the rate of 5 trillion bits per second in storage and 2 trillion transistors per second on the processing side.


Wired: How did the MANIAC project get started?

Dyson: The von Neumann project was funded to do H-bomb calculations. It was a deal with the devil: If they designed this ultimate weapon, they could have this fantastic machine.

Wired: So the creation of digital life was rooted in death?

Dyson: In some creation myths, life arises out of the earth; in others, life falls out of the sky. The creation myth of the digital universe entails both metaphors. The hardware came out of the mud of World War II, and the code fell out of abstract mathematical concepts. Computation needs both physical stuff and a logical soul to bring it to life. These were young kids who had just come through World War II, who could repair the electronics on airplanes and get them flying the same day, and von Neumann put them together with mathematical logicians who could imagine a universe created entirely out of 0s and 1s.

Wired: Who were some of the heroes in this creation myth?

Dyson: Alan Turing was the logician with the original idea. Julian Bigelow was the engineer who built the actual machine, and biologist Nils Barricelli saw where it was all going. Johnny von Neumann had the government money and confidence to make it happen, and his wife, Klari von Neumann, wrote the code.

Wired: What was Turing's vision?

Dyson: Turing, as a 23-year-old graduate student, derived the principles of modern computation more or less by accident—as a byproduct of his interest in something called the Entscheidungsproblem, or Decision Problem. It can be stated as: Is there a formula or mechanical process that can decide whether a string of symbols is logically provable or not? Turing's answer was no. He restated the answer in computational terms by showing that there's no systematic way to tell in advance what a given code is going to do. You can't predict how software will behave by inspecting it. The only way you can tell is to actually run it. And this fundamental unpredictability means you can never have a complete digital dictatorship with one government or company controlling our digital lives—not because of politics but because of mathematics. There will always be codes that do unpredictable things. This is why the digital universe will never be a national park; it will always be an undomesticated, unpredictable wilderness. And that should be reassuring to us.
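The unpredictability Dyson is paraphrasing is Turing's halting argument. A toy sketch, in which the halts predictor is purely hypothetical, shows why no such inspector can exist:

```python
# A toy restatement of Turing's argument. Suppose we had a predictor
# that could inspect any program and say, without running it, whether
# it eventually halts. No such function can exist in general; it is
# written here only to set up the contradiction.
def halts(program, argument) -> bool:
    ...

def contrary(program):
    # Do the opposite of whatever the predictor forecasts for a
    # program handed its own text as input.
    if halts(program, program):
        while True:
            pass      # loop forever
    else:
        return        # halt immediately

# What should halts(contrary, contrary) return? Either answer is wrong,
# because contrary is built to do the opposite of the prediction.
```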

Wired: Where did the idea of the address matrix come from?

Dyson: Turing gave us a one-dimensional model, a linear sequence on an unlimited tape. John von Neumann and colleagues, starting in 1945, gave us a two-dimensional implementation—the address matrix, which stored strings of bits at randomly accessible coordinates as if they were locations on a chessboard—a design that underlies all computers today. In von Neumann's scheme, you used 10 bits to specify an address and 10 bits to specify an instruction for what to do at that address. Each address stored 40 bits, which might be data or further instructions. So when you put 20 bits into the matrix you would get at least 40 bits back. The moment the electricity was turned on for the first time in the summer of 1951, computers started generating new numbers, and those numbers generated more numbers, and the cycle hasn't stopped since.
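A rough model of the word layout Dyson describes, with field names and packing order chosen for illustration rather than taken from the machine's actual order code, might look like this:

```python
# A rough model of the MANIAC-style layout Dyson describes: 1,024 randomly
# addressable words of 40 bits, each word holding two 20-bit instructions
# (10-bit order code plus 10-bit address). Names and packing are assumptions.

MEMORY_WORDS = 32 * 32          # 1,024 addresses, like squares on a 32 x 32 board
memory = [0] * MEMORY_WORDS     # each entry stands for one 40-bit word

def pack_instruction(order: int, address: int) -> int:
    """Pack a 10-bit order code and a 10-bit address into 20 bits."""
    assert 0 <= order < 1024 and 0 <= address < 1024
    return (order << 10) | address

def pack_word(left: int, right: int) -> int:
    """Two 20-bit instructions side by side make one 40-bit word."""
    return (left << 20) | right

def unpack_word(word: int):
    """Recover both (order, address) pairs from a 40-bit word."""
    left, right = word >> 20, word & 0xFFFFF
    return (left >> 10, left & 0x3FF), (right >> 10, right & 0x3FF)

# Example: store a word at address 0 and read it back.
memory[0] = pack_word(pack_instruction(3, 100), pack_instruction(7, 200))
print(unpack_word(memory[0]))   # ((3, 100), (7, 200))
```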

Wired: Were there alternative schemes in the beginning?

Dyson: Yes, there were many alternatives. Von Neumann's architecture just happened to have a head start. And it turns out he was consulting for IBM, so all these ideas were being mainlined right into IBM headquarters, without any patent restrictions.

Wired: Did they envision what their creation would eventually become?

Dyson: They would have completely understood what is going on today in computers. But they would have been utterly amazed to see that, 60 years later, everything is still running exactly as they left it in 1952! They all thought their design would last a year or two, and then a better one would come along.

Wired: Could their design still be replaced?

Dyson: Probably not at the fundamental, cellular level. The microprocessor is just so cheap, powerful, and effectively self-replicating that it is unlikely to be replaced—though it may become the substrate for something else.

Wired: What were these guys like?

Dyson: They were hackers. They were young men and women, mostly in their twenties. The ones in their thirties were considered old. They did all the things hackers do: working all night, living for their code, arguing over whether a problem was due to software or hardware. They kept logbooks where they left notes telling the next shift what they had done, and they filled them with dark nerd humor, the same sort of sarcastic jokes you find on email and comment threads today. Von Neumann was warned by the director that his "computer people" were consuming too much sugar—when sugar was still rationed. And they were fast. They completed the entire project in less time than it took me to write about it!

Wired: In your story you give the engineers as much credit as the theoretical geniuses.

Dyson: They were trying to build this machine in a place—a former farm on the outskirts of Princeton—that didn't have any workshops or engineering facilities. They had to build their own electric power supplies and do their own wiring. Right after World War II, everything was rationed, including lumber, so they bought unrationed firewood to use as lumber. They were just making the whole thing up as they went along.

Wired: Other than bombs, what did they use the MANIAC for?

Dyson: They were gung ho on numerical weather prediction, and they made amazingly good progress in the first few years. Remember, all they had was 5 kilobytes of memory, running at a speed of 8 kilocycles. Yet by 1954 they were predicting weather for the northern hemisphere.

Wired: What were they wrong about?

Dyson: In 1948, von Neumann gave a speech on the future of computing in which he said that the proper order of magnitude for a computing machine seemed to be about 10,000 switches. He was way wrong. Your laptop has billions of switches.

Wired: What did they get right?

Dyson: Vacuum tubes in the early machines had an extremely high failure rate, and von Neumann and Turing both spent a lot of time thinking about how to tolerate or even take advantage of that. If you had unreliable tubes, you couldn't be sure you had the correct answer, so you had to run a problem at least twice to make sure you got the same result. Turing and von Neumann both believed the future belonged to nondeterministic computation and statistical, probabilistic codes. We've seen some success recently in using that kind of computation on very hard problems like language translation and facial recognition. We are now finally getting back to what was envisioned at the beginning.
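That run-it-at-least-twice discipline translates directly into code. Here is a hedged sketch in which a deliberately flaky addition stands in for a calculation passing through failing vacuum tubes:

```python
import random

# A sketch of the "run it at least twice" discipline, with a deliberately
# flaky addition standing in for a calculation corrupted by failing tubes.
def unreliable_add(a, b):
    result = a + b
    if random.random() < 0.05:               # an occasional "tube failure"
        result += random.choice([-1, 1])     # silently corrupt the answer
    return result

def checked_add(a, b):
    """Repeat the calculation until two consecutive runs agree."""
    previous = unreliable_add(a, b)
    while True:
        current = unreliable_add(a, b)
        if current == previous:
            return current
        previous = current

print(checked_add(2, 3))   # almost always 5, despite the unreliable parts
```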

Wired: Did you discover anything in your research that surprised you?

Dyson: I was surprised by the close calls between failure and success. The ENIAC was the American wartime computer project built to calculate ballistic trajectories. The two main guys who invented and built the original ENIAC, Presper Eckert and John Mauchly, formed their own company to build and sell computers. They were doing quite well and then mysteriously had their security clearance revoked, which cost them their government contracts and put them out of business—to the great benefit, eventually, of IBM. Success in technology was as unpredictable at the beginning as it is today.

Wired: Did they see computation in the quasi-biological ways you do?

Dyson: The moment von Neumann got the computing machine running, Nils Barricelli showed up, trying to evolve self-replicating, crossbreeding digital organisms. He encouraged strings of code to replicate with small variations and compete to solve a problem or a simple game. The winning code gained computing resources. Like biological life, no one designed them. Fifty years later, I went back to the basement storeroom where the project had started, which by then had become the institute's main network server room. One of the servers was working full-time to keep out all the self-replicating computer viruses trying to get in. Barricelli's vision had come true!
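Barricelli's numerical organisms were far more intricate than this, but a toy sketch in the same spirit, with an arbitrary target string standing in for the problem being solved, conveys the replicate, mutate, and select loop he ran:

```python
import random

# A toy sketch in the spirit of Barricelli's experiments, not his actual
# scheme: bit strings replicate with small variations, and those scoring
# best on a simple problem win the next round's computing resources.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]      # an arbitrary stand-in problem

def fitness(organism):
    return sum(a == b for a, b in zip(organism, TARGET))

def mutate(organism, rate=0.1):
    return [bit ^ 1 if random.random() < rate else bit for bit in organism]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]                        # winners keep resources
    population = [mutate(random.choice(survivors)) for _ in range(20)]

best = max(population, key=fitness)
print(best, fitness(best))               # usually converges on TARGET
```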

Wired: Did they have any concerns about what they were creating?

Dyson: Shortly before Barricelli's experiments, mathematician Stan Ulam asked von Neumann if they should be worried about using the term organism to refer to code. Von Neumann didn't shy away from hydrogen bombs and end-of-the-world weapons, but he was hesitant about "creating life" and other Frankenstein stuff.

Wired: Where is this digital universe headed?

Dyson: We have created this expanding computational universe, and it's open to the evolution of all kinds of things. It's cycling faster and faster, and it's way, way, way more than doubling in scale every year. Even with the help of Google and YouTube and Facebook, we can't consume it all. And we aren't really aware what this vast space is filling up with. From the human perspective, computers are idle 99 percent of the time, just waiting for the next instruction. While they're waiting for us to come up with instructions, more and more computation is happening without us, as computers write instructions for each other. And as Turing showed mathematically, this space can't be supervised. As the digital universe expands, so does this wild, undomesticated side.

Wired: If this is true, what's the takeaway?

Dyson: Hire biologists! It doesn't make sense for a high-tech company to have 3,000 software engineers but no biologists.

Wired: One last question: What do you mean by "Turing's cathedral"?

Dyson: In Turing's 1950 paper, "Computing Machinery and Intelligence," he argued that when we build intelligent machines, we will not be creating souls but building the mansions for the souls that God creates. When I first visited Google, right about the time it went public, I walked around and saw what they were doing and realized they were building a very large distributed AI, much as Turing had predicted. And I thought, my God, this is not Turing's mansion—this is Turing's cathedral. Cathedrals were built over hundreds of years by thousands of nameless people, each one carving a little corner somewhere or adding one little stone. That's how I feel about the whole computational universe. Everybody is putting these small stones in place, incrementally creating this cathedral that no one could even imagine doing on their own.

Senior maverick Kevin Kelly ([email protected]) interviewed James Gleick, author of The Information, in issue 19.03.