The electronic computer is the defining technology of the modern era. For many of us it is difficult, if not impossible, to imagine life without computers: we use computers to do our work, to help us study, to create and access entertainment, and to communicate with friends and family. And those are just the ways in which computers are most obviously visible in our society: millions of other tiny computing devices, called microprocessors, are hidden inside other products and technologies, quietly gathering data, controlling processes, and communicating between components. Your automobile almost certainly has its own computer (in fact, probably several), as does your cell phone, and perhaps even your refrigerator. Computers are everywhere. But where did they come from?
The history of the computer is, like the computer itself, complicated but fascinating. It encompasses many of the great events of the 19th and 20th centuries: the industrial and communications revolutions, the Second World War, the Space Race, the emergence of the electronics and plastics industries, the establishment of a truly global economy. Some of the actors in this history have become famous — the IBM Corporation, for example, and Apple Computer, and Bill Gates — while others have yet to be widely recognized for their contributions. This exhibit explores some of the less well-known but nevertheless extremely important pioneers of the computer era: Herman Hollerith, the inventor of the electric tabulating machine; John Mauchly and Presper Eckert, who designed and built the ENIAC, one of the earliest and most influential electronic computers; John Bardeen and Walter Brattain, whose work on the point-contact transistor won them a Nobel Prize in Physics; and Claude Shannon, who defined for the world a theory of information and communication that has profoundly shaped not only the science and technology of computing, but also fields as diverse as biology, ecology, economics, and physics.
One way to introduce the history of the computer is to begin with what would seem to be a simple and straightforward question: who invented the first computer? This is a question that is often asked, quite understandably, but which is in fact surprisingly difficult to answer. To begin with, the word "computer" has been with us a long time: it was first used in the third century AD to describe the calculations used to determine the constantly shifting date of the Easter holiday.1 More recently, "computer" was used to describe not a machine but a person: well into the 20th century, these "computers" were employed by a wide variety of scientific, governmental, and commercial organizations to make calculations, either by hand or with the assistance of calculating machines.2 But while these "human computers" played an important role in the larger history of computation, they are not what most of us would consider to be true computers; we associate the computer revolution with machines, rather than people.
Even if we confine ourselves to the more conventional understanding of the computer as an electronic, digital, and programmable device (more on all of these characteristics later), our search for the "first" computer is complicated. The shift from human to machine-based computing can be traced back to the early 17th century and beyond, as clever individuals developed new tools for manipulating numbers. It is easy to imagine why numbers were important: they are essential to business, science, and warfare alike. But it was not until the 19th century that the search for large-scale mechanical computation began in earnest. It was in this period that population growth, economic expansion, and the rise of powerful nation-states created new demands for information processing techniques and technologies. In the United States, for example, innovations in communication and transportation allowed for the emergence of large national (and eventually international) corporations whose seemingly insatiable demand for data spurred numerous innovations in information technology. Many of the most important players in the early computer industry — including Burroughs, Remington Rand, National Cash Register (now NCR), and most importantly, International Business Machines (IBM) — were creations of the burgeoning business machines industry of the late 19th century. Although the IBM name did not appear until 1924, the company traces its origins directly to the 19th-century Tabulating Machine Company, founded in 1896 by the inventor and engineer Herman Hollerith.3
The story of Herman Hollerith and his Tabulating Machine provides valuable insights into the early origins of the electronic computer. In the early 1880s Hollerith had worked as a statistician for the United States Census Bureau. At that time the Census Bureau was facing a problem typical of many Industrial Era governments and corporations: it had more data than it knew what to do with. In the case of the Census Bureau, of course, this was population data: for the 1880 census, the Bureau had to gather and enumerate data on more than 50 million US citizens. The 1880 census report was 21,000 pages long and took seven years to compile. It was clear that, without some dramatic change in the way the Census Bureau dealt with its data, the 1890 census was going to prove too much to handle.
The case file on Herman Hollerith describes how this remarkable young inventor harnessed ideas from science and technology (some quite new, some well-established) to help contain the information explosion that threatened the Census Bureau. His tabulating machines provided an industrial-strength solution to the problem of information processing. Indeed, they formed essential components of an "information factory" approach to computing that gradually replaced older, human-based methods. In many respects, the earliest electronic computers were simply evolutionary extensions of Hollerith's 19th-century technology — "glorified tabulating machines," as contemporaries sometimes dismissively called them.4 But in another very real sense, tabulating machines, despite their importance in the history of computing, were not, by modern standards, true computers. Which returns us to our original question: who, then, actually invented the computer?
Many observers have identified the ENIAC (the Electronic Numerical Integrator And Computer) as the first true electronic computer. And indeed, the ENIAC has a strong claim to this title: not only was it digital, electronic, and programmable (and therefore looked a lot like a modern computer), but the designers of the ENIAC, John Mauchly and Presper Eckert, went on to form the first commercial computer company in the United States. The ENIAC and its commercial successor, the UNIVAC, were widely publicized as the first of the "giant brains" that presaged the coming computer age. But even the ENIAC had its precursors and competitors: in the late 1930s, for example, a physicist at Iowa State College (now Iowa State University), John Atanasoff, had worked on an electronic computing device, and had even described it to John Mauchly. Others were working on similar devices. During the Second World War in particular, a number of government and military agencies, both in the United States and abroad, developed electronic computing devices, many of which also have a plausible claim to being, if not the first computer, then at least a first computer.5
What is most significant about the ENIAC is that it was electronic. Earlier computing devices, including tabulating machines, were either mechanical or electromechanical, meaning that they contained numerous moving parts. These moving parts were complicated to manufacture, difficult to maintain, and, above all, relatively slow. By replacing them with completely electronic components, Eckert and Mauchly were able to dramatically speed up the process of computation. Whereas the electromechanical Harvard Mark I (completed in 1943), which was of similar complexity to the ENIAC, could perform two or three additions per second, and one multiplication every six seconds, the ENIAC (completed just three years later) could perform 5,000 additions, or 333 multiplications, per second.
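To give a rough sense of scale, a back-of-the-envelope comparison (taking the midpoint of the Mark I's two to three additions per second, and its one multiplication every six seconds) shows a speedup of roughly two thousandfold for both operations:

$$\frac{5000\ \text{additions/s}}{2.5\ \text{additions/s}} = 2000, \qquad \frac{333\ \text{multiplications/s}}{1/6\ \text{multiplications/s}} \approx 2000.$$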
These extraordinary improvements in performance came at a price, however. In the 1940s, electronic components were expensive and unreliable, and required a great deal of space and power. The ENIAC alone required almost 18,000 vacuum tubes, weighed nearly 30 tons, and occupied an entire room. Its enormous cost could only be justified by the exigencies of the Second World War. The ENIAC was specifically designed to calculate ballistic trajectories for the United States Army. It was later also used for computations essential to the development of the atomic bomb. The urgent demands of the war effort enabled massive investment in electronics and computing that would otherwise have been unthinkable. Later, the Cold War and the Space Race would provide similar incentives for further investment in computing.
But once the ENIAC proved to the world the utility of the electronic computer, other innovations quickly followed. In 1946, at the Moore School of Electrical Engineering at the University of Pennsylvania, Eckert and Mauchly hosted a series of lectures that helped spread the word about electronic computing. They also went on to form the Electronic Control Company (later the Eckert-Mauchly Computer Corporation), which produced the first commercial computer available in the United States, the UNIVAC I (first delivered in 1951).
Over the course of the 1950s, computer manufacturers such as Remington Rand (which purchased the Eckert-Mauchly Computer Corporation in 1950), Burroughs, IBM, Honeywell, and others worked to refine the technology of electronic computing. For the most part, however, computers remained large, expensive, and somewhat unreliable. Many were still used for the kind of work formerly done by tabulating machines: strictly numerical calculations of the sort needed by engineering design firms, scientific and weapons laboratories, and insurance companies. More novel uses of the electronic computer required smaller and less expensive machines.
Which brings us to the work of John Bardeen and Walter Brattain. In the 1940s, Bardeen and Brattain were physicists working at Bell Laboratories. They were charged with studying a group of materials called semiconductors, which their employer hoped could be used to develop technologies to replace bulky and power-hungry vacuum tubes. Vacuum tubes were used by Bell Telephone as amplifiers, which explains Bardeen and Brattain's interest in improving on them, but they were also critical components of the emerging technology of electronic computing. When, in December 1947, Bardeen and Brattain invented the point-contact transistor, they helped to revolutionize the computer industry. Transistor-based computers were smaller, faster, and less expensive than their vacuum tube-based predecessors, and could therefore be put to a wider range of applications. The transistor led directly to the integrated circuit, which led to the microprocessor, which in turn led to the personal computer. Although Bardeen and Brattain have been overshadowed (undeservedly) by their controversial colleague William Shockley, their contribution to the history of computing is substantial.6
Which brings us back yet again to our original question. By now it should be clear that the computer is not a single development but many: some of them large and impressive, like the ENIAC; some, like the transistor, almost invisible; still others, such as Claude Shannon's work on information theory, are not things at all, but ideas. One of the defining characteristics of the modern computer — what sets it apart from all of its predecessor information technologies — is that it is programmable. It is a machine that can be easily transformed into other machines. By installing new software, we can make our computer serve alternatively as a word processor, a video game player, a digital photo album, or a gateway to the Internet. Its uses are limited only by our imagination, which is what makes the computer so unique and so ubiquitous.
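For readers who want to see this idea in miniature, here is a toy sketch (not part of the historical record, with entirely hypothetical function names) of what programmability means: the "hardware" stays fixed, and handing it a different program turns it into a different machine.

# A toy illustration of programmability, written in Python.
# The function `computer` below never changes; only the program it runs does.
# All names here are hypothetical, invented for this sketch.

def word_processor(data: str) -> str:
    """Program #1: treat the input as a document and count its words."""
    return f"document with {len(data.split())} words"

def photo_album(data: str) -> str:
    """Program #2: treat the same input as a comma-separated list of photos."""
    return f"album with {len(data.split(','))} photos"

def computer(program, data: str) -> str:
    """The general-purpose machine: it simply runs whatever program it is handed."""
    return program(data)

print(computer(word_processor, "the quick brown fox"))     # document with 4 words
print(computer(photo_album, "beach.jpg,dog.jpg,sky.jpg"))  # album with 3 photos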
And so it may be that there is no good answer to our question about who invented the first computer. The technology, or set of technologies, that we call the computer is too complicated, too diverse, and too important to reduce to a single moment or act of invention. The history of the computer is as broad as the range of uses to which we put the technology. As each of the case studies in this exhibit reveals, there are many inventors, and many firsts, each worthy of our attention and appreciation.