How the information age really began
- August 25, 2022
- Ananyo Bhattacharya
While the foundations of computing may have been laid by the likes of Babbage and Lovelace, it was John von Neumann's report on the EDVAC system which transformed abstract musings into a canonical blueprint for the stored-program computer.
Charles Babbage, Ada Lovelace and Alan Turing are all celebrated as computer pioneers, but the name of John von Neumann, a brilliant Hungarian-American mathematician once nearly as well known in America as Albert Einstein, is more likely to elicit blank looks than knowing nods. Yet, for all their genius, the origin of the information age that revolutionised the way we work and play can be traced much more directly to von Neumann than to any of these more famous figures.
That story begins with an accidental meeting on a train platform one summer evening in 1944 between von Neumann and Herman Goldstine, a fellow mathematician who had taught at the University of Michigan in peacetime but enlisted when the United States entered the Second World War. Goldstine was stationed at the Aberdeen Proving Ground, a weapons-testing facility in Maryland, where he was assigned to a team calculating artillery-firing tables — long lists of trajectory data showing how far shells fly under different conditions. He was waiting for his train home that day when he spotted von Neumann on the platform. Goldstine, who had attended some of von Neumann’s lectures, recognised him instantly. He introduced himself, and the pair began to chat.
Von Neumann was 40 years old. A child prodigy born to a wealthy Jewish family in Budapest, he had published his first maths paper aged 17 and had a first draft of his PhD two years later. His thesis, published in 1925, had offered a solution to some deep problems in set theory that were baffling contemporary mathematicians more than twice his age, including Bertrand Russell. A powerful tool for manipulating and proving theorems, set theory is the language of mathematics, so the young von Neumann’s contribution to shoring up his discipline’s foundations sealed his reputation as a mathematician of the first rank. Shortly afterwards, at the University of Göttingen, he rubbed shoulders with another boy wonder, Werner Heisenberg, who was busily laying the groundwork for a bewildering but successful new science of the atom and its constituents that was soon named ‘quantum mechanics’. Von Neumann produced the first mathematically rigorous description of the new field in 1932, and one of the work’s earliest fans would be a teenage Turing, who ordered the book in the original German as his prize for winning a school competition.
Lured to the US by Princeton University with the offer of a huge salary, von Neumann had arrived in January 1930 with his first wife and fellow Budapestian, Mariette Kövesi. Feeling that János — his Hungarian name — sounded too foreign in his new home, he urged American friends to call him ‘Johnny’. Three years later, Johnny became one of the first professors to be hired by the elite new Institute for Advanced Study (IAS) in Princeton, along with Einstein. Aged 29, von Neumann was the youngest of the institute’s recruits.
In America, convinced another world war was around the corner, and fearing European Jews would suffer a genocide as Armenians had under the Ottoman Empire, von Neumann made himself an expert in the mathematics of explosions. In demand by the American military and government, he was sent on a mission to wartime Britain in 1943, only to be recalled to the US by a letter from American physicist Robert Oppenheimer, begging him to join a ‘somewhat Buck Rogers project’ in Los Alamos, New Mexico. He joined the project, America’s massive top-secret bid to build the atom bomb, and immediately made decisive contributions to the design of the ‘implosion device’, the more powerful weapon, which would eventually be detonated over Nagasaki. Shortly afterwards, von Neumann, who had a long-standing interest in computing machines, began scouring the country to find ways to help Los Alamos tackle a long roll of bomb-related calculations. And that was when he bumped into Goldstine.
Von Neumann was nearly as renowned for his wit and a way with dirty limericks as for his near superhuman intellect. But when Goldstine told him that his role involved liaising with a team at the Moore School in Philadelphia who were busy building an electronic computer capable of more than 300 multiplications per second, von Neumann instantly turned serious. ‘The whole atmosphere of our conversation changed,’ Goldstine remembered, ‘from one of relaxed good humour to one more like the oral examination for the doctor’s degree in mathematics.’
The machine being built at the Moore School was the ENIAC (Electronic Numerical Integrator and Computer), the brainchild of John W. Mauchly, a former physics teacher, and J. Presper Eckert, the whizz-kid son of a local real estate magnate who ran the teaching laboratory for the electronics course there. During the war, the task of computing artillery firing tables was consuming an ever-growing proportion of the Moore School’s resources. The ENIAC had been commissioned to help clear the backlog.
The computer occupied a room that was roughly 30 feet wide and 56 feet long. Arrayed along the walls were the machine’s innards: some 18,000 vacuum tubes, and banks of wiring and switches arranged in panels eight feet tall. But when von Neumann arrived on the scene in August 1944, the ENIAC, plagued by parts shortages, was still more than a year away from completion.
One of his first contributions to the project was to keep the army’s money flowing in. He argued convincingly that the machine’s usefulness would extend far beyond the purpose for which it had been designed. But from the moment he first saw it, von Neumann was thinking of a radically different kind of computer altogether. While the ENIAC was born as a machine of war, built for a single task, he understood that the future lay in a far more flexible device that could be easily reprogrammed. More importantly, von Neumann saw more clearly than anyone on the ENIAC team — and perhaps more clearly than anyone in the world — the best way to structure such a machine.
The result of his musings, First Draft of a Report on the EDVAC, would become the most influential document in the history of computing. ‘Today, it is considered the birth certificate of modern computers,’ says computer scientist Wolfgang Coy. Curiously, von Neumann was mentally prepared for this cutting-edge contribution to computing by his early abstruse mathematical work on set theory.
The problems in set theory that von Neumann had helped to address as a youth were part of the ‘foundational crisis’ which swept through mathematics during the early twentieth century. David Hilbert, the most eminent mathematician of the period, was troubled that his discipline seemed to be founded on shifting sands. ‘If mathematical thinking is defective,’ he asked, ‘where are we to find truth and certitude?’
In 1928, he challenged his followers to ensure that the foundations of mathematics were secure once and for all. To do so, he said, they would need to demonstrate that mathematics is complete, consistent and decidable. By complete, Hilbert meant that all true mathematical theorems and statements can be proved from a finite set of assumptions or axioms. By consistent, he was demanding a proof that the axioms would not lead to any contradictions. The third of Hilbert’s demands, that mathematics should be decidable, became widely known as the Entscheidungsproblem (decision problem): is there a step-by-step procedure — an algorithm — that can be used to show whether or not any particular mathematical statement can be proved? Mathematics would only be truly safe, said Hilbert, when, as he expected, all three of his demands were met.
Soon after Hilbert issued his challenge, the intellectually dynamic but psychologically frail Austrian mathematician Kurt Gödel would dash the first two of his hopes: any consistent system of axioms rich enough to contain arithmetic must leave some true statements unprovable, and can never prove its own consistency. Five years after Gödel’s breakthrough, a 23-year-old Turing would attack Hilbert’s ‘decision problem’ in a way completely unanticipated by any other logician, conjuring up an imaginary machine to show that mathematics is not decidable.
Hilbert’s quest for a perfectible mathematics ran aground. But he had forced mathematicians to think extremely rigorously about what can or cannot be proved. The systematic procedures they used to tackle Hilbert’s challenges would soon become familiar — through von Neumann’s work — in a very different guise, as computer programs.
To prove that maths was incomplete, for example, Gödel devised a system in which logical statements (that were very much like computer commands) could be rendered as numbers, dissolving the rigid distinction between syntax and data. Or, as von Neumann would put it in 1945 while describing the computer he was planning to build at the IAS: ‘“Words” coding the orders are handled in the memory just like numbers.’ That is the essence of modern-day coding, the concept at the heart of software.
‘Someone knowledgeable about modern programming languages today looking at Gödel’s paper…will see a sequence of forty-five numbered formulas that looks very much like a computer program,’ the mathematician Martin Davis explains. ‘The resemblance is no accident. Gödel had to deal with many of the same issues that those designing programming languages and those writing programs in those languages would be facing.’
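The flavour of Gödel’s trick can be conveyed with a toy sketch (this is an illustration in the spirit of his method, not his actual 1931 scheme): give each symbol a number, then pack a whole formula into a single integer by raising successive primes to those numbers. Because the packing is reversible, statements become data that arithmetic can manipulate.

```python
# Toy Gödel numbering (illustrative only, not Gödel's own scheme):
# each symbol gets a code, and a formula becomes one integer by
# raising successive primes to those codes.

SYMBOLS = {'0': 1, 'S': 2, '=': 3, '(': 4, ')': 5, '+': 6}
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def encode(formula):
    """Map a short string of symbols to a single Gödel number."""
    n = 1
    for prime, ch in zip(PRIMES, formula):
        n *= prime ** SYMBOLS[ch]
    return n

def decode(n):
    """Recover the symbol string by factoring out each prime in turn."""
    codes = {v: k for k, v in SYMBOLS.items()}
    out = []
    for prime in PRIMES:
        power = 0
        while n % prime == 0:
            n //= prime
            power += 1
        if power == 0:
            break
        out.append(codes[power])
    return ''.join(out)

g = encode('S0=S0')          # the statement '1 = 1' as one number
assert decode(g) == 'S0=S0'  # and back again: syntax has become data
```

The round trip is the point: once a statement is a number, statements about statements become statements about numbers, which is exactly the self-reference Gödel exploited.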
Turing’s achievement, by contrast, was to describe in abstract terms imaginary machines that could read, write or erase symbols on an infinitely long tape. In his paper, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, Turing painstakingly builds a ‘universal machine’. When fed the coded description of any other Turing machine, the universal machine can simulate it exactly.
The whole logical apparatus of Turing’s paper was assembled to answer Hilbert’s Entscheidungsproblem. With it, he proved there can be no general, systematic process for deciding whether or not any particular mathematical statement is provable, dashing the last of Hilbert’s dreams. Though no one recognised it as such at the time, Turing’s ‘universal machine’ is now considered an abstract prototype of a general purpose ‘stored program’ computer — one that can, like any laptop or smartphone today, execute any application in the computer’s memory. The proofs of ‘On Computable Numbers’ arrived while Turing was in Princeton as a visiting fellow, a post he had secured in part because of a letter of recommendation from von Neumann. Turing was disappointed with his paper’s reception there. But one person did take notice. ‘Turing’s office was right near von Neumann’s, and von Neumann was very interested in that kind of thing,’ says Goldstine. ‘I’m sure that von Neumann understood the significance of Turing’s work when the time came.’
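The universal machine's essential move — one simulator that treats the description of any particular machine as mere data — can be sketched in a few lines. The transition-table format and the example 'bit-inverting' machine below are invented for illustration, not drawn from Turing's paper.

```python
# A minimal Turing-machine simulator: the table describing any particular
# machine is just data fed to one general-purpose loop, which is the idea
# behind Turing's universal machine. (An illustrative sketch, not
# Turing's own formalism.)

def run(table, tape, state='start', blank='_'):
    cells = dict(enumerate(tape))   # the infinite tape, stored sparsely
    pos = 0
    while state != 'halt':
        symbol = cells.get(pos, blank)
        write, move, state = table[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == 'R' else -1
    return ''.join(cells[i] for i in sorted(cells)).strip(blank)

# One particular machine, described purely as data: invert every bit,
# then halt on reaching the first blank cell.
INVERT = {
    ('start', '0'): ('1', 'R', 'start'),
    ('start', '1'): ('0', 'R', 'start'),
    ('start', '_'): ('_', 'R', 'halt'),
}

print(run(INVERT, '1011'))  # -> 0100
```

Swap in a different table and the same `run` loop becomes a different machine — which is why a single physical computer can, in principle, compute anything any other one can.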
With the EDVAC report, von Neumann turned Gödel’s and Turing’s abstract musings into the canonical blueprint for the stored-program computer. Because he did not wish to get bogged down in the specifics of engineering, and because the ENIAC was still classified as confidential, von Neumann described his computer in terms of idealised neurons, shorn of their physiological complexities, which came from work published by the neurophysiologist Warren McCulloch and the mathematician Walter Pitts in 1943. This seems odd today, but von Neumann, Turing, Norbert Wiener and other thinkers who contributed to the foundations of the field that became known as artificial intelligence did think about computers as electronic brains. Today, using ‘brain’ or ‘neuron’ in the context of computers seems laughably naive. Yet we accept the similarly anthropomorphic use of ‘memory’ to mean ‘storage’ without blinking an eye.
McCulloch-Pitts neurons are vastly simplified electronic versions of a neuron. In his EDVAC report, von Neumann wires networks of these ‘neurons’ together to make five components or ‘organs’, each with a different function. The first three components he described were a ‘central arithmetic’ unit for performing mathematical operations, such as addition and multiplication; a ‘central control’ unit to ensure that instructions were executed in the proper order; and a ‘memory’, a single organ that would store both computer code and numbers. The fourth and fifth components were the input and output units, for shipping data into or out of the machine.
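The interplay of those organs can be compressed into a toy machine: one memory holding instructions and numbers side by side, a control loop fetching them in order, and an arithmetic unit doing the sums. The instruction set here is invented for illustration and is far cruder than anything in the EDVAC report.

```python
# A toy stored-program machine in the EDVAC mould: a single memory holds
# both code and data, a control loop fetches instructions in sequence,
# and an accumulator does the arithmetic. The instruction set is invented
# for illustration.

def run(memory):
    acc = 0   # the 'central arithmetic' organ, reduced to one register
    pc = 0    # the 'central control' organ: which instruction comes next
    while True:
        op, arg = memory[pc]        # fetch from the one shared memory
        pc += 1
        if op == 'LOAD':
            acc = memory[arg]
        elif op == 'ADD':
            acc += memory[arg]
        elif op == 'STORE':
            memory[arg] = acc
        elif op == 'HALT':
            return memory

# Program (addresses 0-3) and data (addresses 10-12) in the same memory:
# compute 2 + 3 and store the result at address 12.
memory = {
    0: ('LOAD', 10), 1: ('ADD', 11), 2: ('STORE', 12), 3: ('HALT', 0),
    10: 2, 11: 3, 12: 0,
}
print(run(memory)[12])  # -> 5
```

Because instructions live in the same memory as numbers, a program could in principle overwrite its own instructions — the flexibility that set the EDVAC design apart from the single-purpose ENIAC.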
Computer designers now refer to the whole configuration as the ‘von Neumann architecture’, and nearly all computers in use today — smartphones, laptops, desktops — are built according to its precepts. The design’s fundamental drawback, now called the ‘von Neumann bottleneck’, is that instructions or data have to be found and fetched serially from memory, like standing in a line and being able to pass messages only forwards or backwards. That task takes much longer than any subsequent processing. That handicap is outweighed by the architecture’s considerable advantages, which stem from its simplicity. The ENIAC had 20 modules that could add and subtract, for example; in the EDVAC, there would be one. Less circuitry means less that can go wrong, and a more reliable machine. Historian Thomas Haigh and colleagues describe the aesthetics of von Neumann’s report as ‘radical minimalism, akin to high modernist architecture or design’. ‘His intellectual response to ENIAC,’ they say in their book, ENIAC in Action, ‘might be likened to that of a Calvinist zealot who, having taken charge of a gaudy cathedral, goes to work white-washing frescos and lopping off ornamental flourishes.’
When Goldstine received the report, he was in raptures. He congratulated von Neumann for providing the first ‘complete logical framework for the machine’ and contrasted the streamlined design with the ENIAC, which was ‘chuck full of gadgets that have as their only raison d’être that they appealed to John Mauchly’. He had the report typed up, and sent out dozens of copies to nascent computer groups all over the world.
Not everyone was pleased. The report Goldstine circulated only had the name John von Neumann on its title page. Eckert and Mauchly, who were hoping to patent aspects of computer design, were furious. The ENIAC’s inventors accused von Neumann of inflating his contribution to the project and rehashing their work.
Von Neumann, for his part, feared that the commercial route the ENIAC’s inventors were pursuing would stifle progress. The purpose of the EDVAC report, von Neumann testified in 1947, was ‘to contribute to clarifying and coordinating the thinking of the group’ and ‘further…the art of building high speed computers’ by disseminating the work as quickly and as widely as possible. ‘My personal opinion was at all times, and is now, that this was perfectly proper and in the best interests of the United States,’ he observed.
‘I certainly intend to do my part to keep as much of this field “in the public domain” (from the patent point of view) as I can,’ von Neumann wrote as he made plans for building his own computer at the IAS. Patent rights to the IAS machine were in large part handed over to the government in mid-1947. Von Neumann and Goldstine sent a stream of detailed progress reports to about 175 institutions in several different countries, spawning the first generation of modern computers across the world.
Britain’s Small-Scale Experimental Machine, nicknamed the ‘Manchester Baby’, sputtered into life in 1948. Often said to be the world’s first electronic stored-program computer, the Baby was based on the EDVAC design and on 21 June that year cycled through 17 commands over 52 minutes to determine that the highest proper factor of 262,144 is 131,072. But weeks earlier, von Neumann’s second wife, Klára Dán, had helped to rewire the ENIAC into a sort of EDVAC-style computer. Klára, who describes herself as a ‘mathematical moron’ in her memoirs, would also write the first code to run on the ENIAC in its new configuration. Her 800-command program, simulating the paths of neutrons inside an atom bomb, was the first truly useful, complex modern program ever to have been executed on a computer.
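The Baby had no divider, so a divisibility test had to be done the slow way, by repeated subtraction — part of why 17 instructions took 52 minutes. A sketch of that brute-force approach (an illustration of the idea, not a transcription of the Baby's actual program):

```python
# Find the highest proper factor of n the way a machine with no divider
# must: try candidates downwards, testing divisibility by repeated
# subtraction. (A sketch of the approach, not the Baby's actual code.)

def highest_proper_factor(n):
    for candidate in range(n - 1, 0, -1):
        remainder = n
        while remainder > 0:        # divisibility test, one subtraction at a time
            remainder -= candidate
        if remainder == 0:          # candidate divides n exactly
            return candidate

print(highest_proper_factor(262_144))  # -> 131072
```

Since 262,144 is 2^18, the search only succeeds at 131,072 — after checking and rejecting every candidate above it.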
Von Neumann’s own computer at the IAS finally began work in 1951. By this time, the IAS machine’s numerous offspring, built with the aid of von Neumann and Goldstine’s stream of detailed updates, were snapping at its heels. Perhaps most important among them was the IBM 701, the company’s first commercial computer and practically a carbon copy of the one built at the IAS. By the 1960s, IBM manufactured about 70% of the world’s electronic computers. ‘Probably, the IBM company owes half its money to Johnny von Neumann,’ von Neumann’s friend, the Hungarian physicist Edward Teller, would tell his biographers.
In 1965, Intel’s co-founder, Gordon Moore, predicted that the number of components on an integrated circuit would double every year. His observation became known as ‘Moore’s law’, but von Neumann had beaten him to it a decade earlier, by noting that the overall capacity of computers had nearly doubled every year since 1945 and implying, in conversation, that he expected that trend to continue.
The battle for ownership of the intellectual property and patent rights relating to the ENIAC and EDVAC would drag on for decades. Long after von Neumann’s death in 1957, the judge’s verdict, delivered on 19 October 1973, held the automatic electronic digital computer to be in the public domain. The open source movement, born a decade or so later, would soon shun corporate secrecy, lauding the benefits of freely sharing information to drive forward innovation. But thanks to von Neumann, those principles were baked into computing from the very beginning.
Babbage designed the ‘Analytical Engine’, a mechanical general purpose computer that was never built. Lovelace wrote a program for it that never ran and her work was rediscovered only long after the first EDVAC-style stored-program computers were built. Turing would cite the EDVAC report nine months after its publication, as he made his own plans for a computer, the Automatic Computing Engine (ACE), and his theoretical work was retrospectively appropriated as a foundation stone of computer science. Yet the names of John von Neumann and Klára Dán, innovators who helped birth the computer age, have faded inexplicably from view.