Von Neumann had one piece of advice for us: not to originate anything.
I don't know about you, but when I read about the exploits of people like John von Neumann, Alan Turing, J. Robert Oppenheimer, and Kurt Gödel in *Turing's Cathedral: The Origins of the Digital Universe* by George Dyson, I can't help but flash back to the Age of Heroes, where the names are different--Achilles, Odysseus, Agamemnon, and Ajax--but the larger-than-life story they lived is familiar. Dyson's book is the *Iliad* of our times, telling the story of great battles of the human mind: the atomic bomb, Turing machines, programmable computers, weather prediction, genetic modeling, Monte Carlo simulation, and cellular automata.
Which brings up another question I can't help but ponder: is it the age that makes the person, or the person that makes the age? Do we have these kinds of people today? Or can they only be forged in war?
Anyway, I found this advice from John von Neumann, as told by Julian Bigelow, about how to go about building the IAS computer. This advice still echoes down project management halls today:
“Von Neumann had one piece of advice for us: not to originate anything.” This helped put the IAS project in the lead. “One of the reasons our group was successful, and got a big jump on others, was that we set up certain limited objectives, namely that we would not produce any new elementary components,” adds Bigelow. “We would try and use the ones which were available for standard communications purposes. We chose vacuum tubes which were in mass production, and very common types, so that we could hope to get reliable components, and not have to go into component research.”
They did innovate on architecture, making it possible to store programs in the same memory as data and run them. Here are some interesting quotes from the book about that development:
- The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
- Electronic components were widely available in 1945, but digital behavior was the exception to the rule. Images were televised by scanning them into lines, not breaking them into bits.
- Three technological revolutions dawned in 1953: thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA.
- The new computer was assigned two problems: how to destroy life as we know it, and how to create life of unknown forms.
- …in the best of all possible worlds, where the computable functions make life predictable enough to be survivable, while the noncomputable functions make life (and mathematical truth) unpredictable enough to remain interesting.
- “Words coding the orders are handled in the memory just like numbers,” explained von Neumann, breaking the distinction between numbers that mean things and numbers that do things. Software was born. Numerical codes would be granted full control—including the power to modify themselves. (A toy sketch of a self-modifying stored program follows the list.)
- Instead of being in a world of expensive multiplication and cheap storage, we were thrown into one in which the former was very cheap and the latter very expensive. Virtually all the algorithms that humans had devised for carrying out calculations needed reexamination. (A small table-versus-recompute example of this trade-off follows the list.)
- The ENIAC was programmed by setting banks of 10-position switches and connecting thousands of cables by hand. Hours, sometimes days, were required to execute a programming change.
- Data and instructions were intermingled within the machine. “A pulse, no matter for what use, had the same physical definition in almost all situations throughout the ENIAC,” Mauchly explains. “Some pulses were used to control operations and others to signify data … but a pulse representing an algebraic sign for some data, or one representing a digit value, could be fed into a control circuit and expected to function just as any control pulse might function.”
- “I have often been asked, ‘How big was the ENIAC storage?’ ” says Mauchly. “The answer is, infinite. The punched-card output was not fast, but it was as big as you wished.”
- The difficulty, as Mauchly described it, was that “the fast memory was not cheap and the cheap memory was not fast.”
- A vacuum-tube flip-flop had a response time on the order of a microsecond, whereas it took on the order of one second to read or write an IBM card. A gap of six orders of magnitude lay in between.
- Acoustic delay-line memory was used in many first-generation stored-program computers, although, as British topologist Max Newman complained, “its programming was like catching mice just as they were entering a hole in the wall.”
- “The idea of the stored program, as we know it now, and which is a clear-cut way of achieving a universal computer, wasn’t invented overnight,” explains Rajchman. “Rather it evolved gradually. First came manually changeable plug-ins, relays, and finally the modifying contacts themselves became electronic switches. Next came the idea of storing the state of those switches in an electronic memory. Finally this resulted in the idea of the modern stored program in which ‘instructions’ and ‘data’ are stored in a common memory.”
- The functional elements of the computer were separated into a hierarchical memory, a control organ, a central arithmetic unit, and input/output channels, making distinctions still known as the “von Neumann architecture” today.
- Leibniz had invented the shift register—270 years ahead of its time. In the shift registers at the heart of the Institute for Advanced Study computer (and all processors and microprocessors since), voltage gradients and pulses of electrons have taken the place of gravity and marbles, but otherwise they operate as Leibniz envisioned in 1679. (A sketch of the shift-and-add multiplication a shift register enables follows the list.)
- What Gödel (and Turing) proved is that formal systems will, sooner or later, produce meaningful statements whose truth can be proved only outside the system itself. This limitation does not confine us to a world with any less meaning. It proves, on the contrary, that we live in a world where higher meaning exists.
- We owe the existence of high-speed digital computers to pilots who preferred to be shot down intentionally by their enemies rather than accidentally by their friends.
- This was achieved by following the same principle that Bigelow and Wiener had developed in their work on the debomber: separate signal from noise at every stage of the process—in this case, at the transfer of every single bit—rather than allowing noise to accumulate along the way. This, as much as the miracle of silicon, is why we have microprocessors that work so well today. The entire digital universe still bears the imprint of the 6J6. (A small simulation of this principle follows the list.)
- The use of display for memory was one of those discontinuous adaptations of preexisting features for unintended purposes by which evolution leaps ahead.
- The digital universe and the hydrogen bomb were brought into existence at the same time. “It is an irony of fate,” observes Françoise Ulam, “that much of the high-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man’s monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.”
- Barricelli “insisted on using punched cards, even when everybody had computer screens,” according to Gaure. “He gave two reasons for this: when you sit in front of a screen your ability to think clearly declines because you’re distracted by irrelevancies, and when you store your data on magnetic media you can’t be sure they’re there permanently, you actually don’t know where they are at all.”
- If humans, instead of transmitting to each other reprints and complicated explanations, developed the habit of transmitting computer programs allowing a computer-directed factory to construct the machine needed for a particular purpose, that would be the closest analogue to the communication methods among cells.
- Barricelli believed in intelligent design, but the intelligence was bottom-up. “Even though biologic evolution is based on random mutations, crossing and selection, it is not a blind trial-and-error process,” he explained in a later retrospective of his numerical evolution work. “The hereditary material of all individuals composing a species is organized by a rigorous pattern of hereditary rules into a collective intelligence mechanism whose function is to assure maximum speed and efficiency in the solution of all sorts of new problems.… Judging by the achievements in the biological world, that is quite intelligent indeed.”
- The von Neumann model might turn out to be similarly restrictive, and the solutions arrived at between 1946 and 1951 should no more be expected to persist indefinitely than any one particular interpretation of nucleotide sequences would be expected to persist for three billion years. The last thing either Bigelow or von Neumann would have expected was that long after vacuum tubes and cathode-ray tubes disappeared, digital computer architecture would persist largely unchanged from 1946.
- “It was all of it a large system of on and off, binary gates,” Bigelow reiterated fifty years later. “No clocks. You don’t need clocks. You only need counters. There’s a difference between a counter and a clock. A clock keeps track of time. A modern general purpose computer keeps track of events.” This distinction separates the digital universe from our universe, and is one of the few distinctions left.
- “Science thrives on openness,” he reflected in 1981, “but during World War II we were obliged to put secrecy practices into effect. After the war, the question of secrecy was reconsidered … but the practice of classification continued; it was our ‘security,’ whether it worked or failed.… The limitations we impose on ourselves by restricting information are far greater than any advantage others could gain.”
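
A few of the ideas above are concrete enough to sketch in code. These are my own illustrations in Python, not anything from the book or from the IAS machine itself. First, the stored-program idea: below is a toy machine with invented opcodes (not the real IAS order code) in which each memory word encodes `op * 100 + address`, so a running program can do ordinary arithmetic on its own orders. The sample program sums three data words by rewriting the address field of its own ADD instruction on each pass, the trick early coders used for array indexing before index registers existed.

```python
# Toy stored-program machine: instructions and data share one memory,
# and every word is just a number of the form op * 100 + address.
LOAD, ADD, SUB, STORE, INCR, JNZ, HALT = 1, 2, 3, 4, 5, 6, 7

def run(mem):
    acc, pc = 0, 0
    while True:
        op, addr = divmod(mem[pc], 100)   # fetch and decode one word
        pc += 1
        if op == LOAD:    acc = mem[addr]
        elif op == ADD:   acc += mem[addr]
        elif op == SUB:   acc -= mem[addr]
        elif op == STORE: mem[addr] = acc
        elif op == INCR:  mem[addr] += 1            # works on data *and* instructions
        elif op == JNZ:   pc = addr if acc else pc  # jump if acc is nonzero
        elif op == HALT:  return acc

mem = [
    LOAD  * 100 + 23,  # 0: acc = running sum
    ADD   * 100 + 20,  # 1: acc += data word (this order's address field is rewritten)
    STORE * 100 + 23,  # 2: running sum = acc
    INCR  * 100 + 1,   # 3: self-modification: order 1 becomes ADD 21, then ADD 22
    INCR  * 100 + 24,  # 4: bump the trip counter
    LOAD  * 100 + 24,  # 5: acc = counter
    SUB   * 100 + 25,  # 6: acc = counter - 3
    JNZ   * 100 + 0,   # 7: loop until three passes are done
    LOAD  * 100 + 23,  # 8: acc = final sum
    HALT  * 100,       # 9: stop
] + [0] * 10 + [7, 8, 9, 0, 0, 3]  # 20-22: data, 23: sum, 24: counter, 25: limit

print(run(mem))  # -> 24
```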
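
Second, the inversion of costs in the multiplication-versus-storage quote. A pre-electronic recipe for a function like sine was a big printed table; once storage was dear and multiplication cheap, it paid to store a short formula and recompute instead. The functions below are my own illustration, not historical routines.

```python
import math

# Table era: storage is cheap, so precompute many values and look them up.
SINE_TABLE = {d: math.sin(math.radians(d)) for d in range(360)}  # 360 stored values

def sin_lookup(degrees: int) -> float:
    return SINE_TABLE[degrees % 360]

# Early-machine era: storage is dear and multiplication is cheap, so keep
# only a short Taylor series and recompute each value on demand.
def sin_series(x: float, terms: int = 6) -> float:
    term, total = x, x                    # sin x = x - x^3/3! + x^5/5! - ...
    for n in range(1, terms):
        term *= -x * x / ((2 * n) * (2 * n + 1))
        total += term
    return total

print(abs(sin_series(1.0) - math.sin(1.0)) < 1e-9)  # True
```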
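
Third, the shift register. What a shift register buys a binary machine is exactly what Leibniz's marbles were meant to do: multiplication by repeated shifting and adding. A minimal sketch, for non-negative integers:

```python
def shift_add_multiply(a: int, b: int) -> int:
    """Multiply by shifting and adding, as a shift register permits:
    examine the low bit of the multiplier, add the shifted multiplicand
    when that bit is 1, then shift both and repeat."""
    product = 0
    while b:
        if b & 1:        # low bit of the multiplier is 1
            product += a
        a <<= 1          # shift the multiplicand left one place
        b >>= 1          # shift the multiplier right to expose the next bit
    return product

assert shift_add_multiply(27, 14) == 27 * 14
```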
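
Finally, the Bigelow-Wiener principle of separating signal from noise at every stage. The noise model below is invented for illustration; the point it demonstrates is that re-deciding 0-or-1 at every stage keeps errors from accumulating, while a chain that decides only once at the end drowns in accumulated noise.

```python
import random

STAGES, SIGMA = 1000, 0.08

def noisy(v: float) -> float:
    return v + random.gauss(0.0, SIGMA)      # each stage adds a little noise

def analog_chain(bit: int) -> int:
    v = float(bit)
    for _ in range(STAGES):
        v = noisy(v)                         # noise accumulates stage by stage
    return int(v >= 0.5)                     # a single decision at the very end

def digital_chain(bit: int) -> int:
    v = float(bit)
    for _ in range(STAGES):
        v = 1.0 if noisy(v) >= 0.5 else 0.0  # regenerate a clean 0 or 1 each stage
    return int(v)

trials = 1000
print(sum(analog_chain(1) != 1 for _ in range(trials)))   # many flipped bits
print(sum(digital_chain(1) != 1 for _ in range(trials)))  # almost certainly zero
```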