THE AGE OF INTELLIGENT MACHINES | Chapter 5: Mechanical Roots
September 24, 2001
- Ray Kurzweil
What if these theories are really true, and we were magically shrunk and put into someone’s brain while he was thinking. We would see all the pumps, pistons, gears and levers working away, and we would be able to describe their workings completely, in mechanical terms, thereby completely describing the thought processes of the brain. But that description would nowhere contain any mention of thought! It would contain nothing but descriptions of pumps, pistons, levers!
Gottfried Wilhelm Leibniz (a contemporary and rival of Isaac Newton), commenting on theories that the brain was “just” a complicated mechanical computer.
The human imagination for emulating human thought by machine did not stop at mere philosophical debate and thought experiment. From Platonic times, inventors were eager to apply whatever technology was available to the challenge of recreating human mental and physical processes.1 Before the taming of the electron, this meant harnessing the state of the art in mechanical techniques.2
Early Automata and Calculating Engines
As far back as the times of ancient Greece, machines that could emulate the natural movements of living creatures were built as a source of delight and apparent magic. They were also constructed by philosophers and their associates as a way of demonstrating that the natural laws were capable of producing complex behavior. This in turn fueled speculation that the same deterministic laws governed human behavior.3
Archytas of Tarentum (c. 400-350 B.C.), a friend of Plato, constructed a pigeon whose movements were controlled by a jet of steam or compressed air. Even more elaborate automata, including an entire mechanical orchestra, existed in China at the same time.4
The technology of clock and watch making produced far more elaborate automata during the European Renaissance, including human androids notable for their lifelike movements.5 Famous examples include the mandolin-playing lady, built in 1540 by Giannello Torriano (1515-1585), and a child automaton capable of writing passages with a real pen, built in 1772 by Pierre Jaquet-Droz (1721-1790).6
Perhaps of greater significance in the development of intelligent machines were early attempts to reduce the laborious efforts required for calculation. The abacus, developed more than 5,000 years ago in Asia, is of particular interest in its similarity to the arithmetic processing unit of a modern computer. It consists of movable beads on rods, which together implement a digital number store. Using prescribed “algorithms,” a user can perform computations ranging from simple addition to evaluation of complex equations. The algorithms are performed directly by the user, not by the machine, but the methods are nonetheless mechanistic.7
In 1617 John Napier (1550-1617), who is generally considered the discoverer of logarithms, invented a method of performing arithmetic operations by the manipulation of rods, called “bones” because they were often constructed from bones and printed with digits. Napier’s innovation was of direct significance for the subsequent development of calculating engines in which the “algorithms” were implemented in the mechanism of the device, rather than by human manipulation.8
Beginning at the age of 19 and continuing until he was 30, Blaise Pascal (1623-1662) worked to perfect his mechanical calculator. After more than 50 models were constructed and discarded, using materials ranging from wood and ivory to a variety of metals, Pascal completed the first working version of the world’s first automatic calculating machine in 1642.9 The machine was considered automatic in that the algorithm was performed by the machine and not the user, at least for addition and subtraction. The Pascaline, replicas of which are still used by many schoolchildren, uses rotating wheels inscribed with the digits and a ratchet mechanism that controls the overflow from one place position to the next. When a wheel completes a full revolution, the ratchet adds one to (or, in subtraction, borrows one from) the digit in the next-highest place.
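The carry mechanism just described is easy to model in software. The sketch below is a modern illustration, not a description of Pascal's actual gearing: each wheel holds one decimal digit, least significant first, and a wheel turning past 9 ratchets the next wheel forward by one.

```python
def add_on_wheels(wheels, addend_digits):
    """Add a number to a Pascaline-style register of decimal wheels.

    Digits are stored least significant first.  When a wheel turns past
    9 and returns to 0, the ratchet advances the next wheel by one.
    (A real Pascaline had a fixed number of wheels; here we simply grow
    the register so the result always fits.)
    """
    result = list(wheels)
    width = max(len(result), len(addend_digits)) + 1
    while len(result) < width:
        result.append(0)
    carry = 0
    for place in range(len(result)):
        turn = addend_digits[place] if place < len(addend_digits) else 0
        total = result[place] + turn + carry
        result[place] = total % 10   # the digit now showing on this wheel
        carry = total // 10          # the ratchet engages the next wheel
    return result

# 275 + 196 = 471, digits least significant first
print(add_on_wheels([5, 7, 2], [6, 9, 1]))  # [1, 7, 4, 0]
```

Subtraction on the real machine was performed with the same one-way ratchet by adding the nines' complement, which is why the mechanism needed only to carry forward, never backward.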
The perfection of this computing machine created a stir throughout Europe and established the fame of its inventor. The device stimulated Pascal’s own philosophical reflections, and in his last (unfinished) major work, Pensées (“Thoughts”), Pascal writes, “The arithmetical machine produces effects which approach nearer to thought than all the actions of animals. But it does nothing which would enable us to attribute will to it, as to the animals.”10
The advent of an automatic calculating machine also led to controversy about its impact and created fear that it would lead to the unemployment of bookkeepers and clerks. The excitement generated by the Pascaline encouraged Pascal and his father to invest most of their money in an advertising campaign to market the invention. Unfortunately, Pascal was a better philosopher than businessman, and problems with reliability and service caused the venture to fail (apparently, many of the production models required repair services to be performed by Pascal himself).11
The Leibniz computer
Inspired by the Pascaline, Gottfried Wilhelm Leibniz attempted to add multiplication, division, and the extraction of square roots to the capabilities of a machine. After studying the largely unsuccessful attempts of Sir Samuel Morland (1625-1695), master of mechanics to King Charles II of England, Leibniz was able to perfect a multiplying machine based on repetitive additions, an algorithm still used in modern computers.12
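The principle Leibniz mechanized, multiplication as repeated addition of the multiplicand with a shift of one decimal place for each digit of the multiplier, can be sketched as follows (an illustrative model of the algorithm, not of his stepped-drum mechanism):

```python
def multiply_by_addition(a, b):
    """Multiply non-negative integers the way Leibniz's machine did:
    for each decimal digit of the multiplier, add the (shifted)
    multiplicand that many times, then move one place to the left."""
    product = 0
    shifted = a                   # multiplicand, shifted left as we go
    while b > 0:
        for _ in range(b % 10):   # repeated addition for this digit
            product += shifted
        b //= 10
        shifted *= 10             # the "carriage shift" to the next place
    return product

print(multiply_by_addition(127, 46))  # 5842
```

Modern binary hardware uses the same shift-and-add scheme, with the advantage that each binary digit calls for at most one addition.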
Leibniz recognized that the complicated mechanisms of his calculator could be greatly simplified if the decimal system were replaced with a binary notation.13 Leibniz’s contemporaries resisted the idea, but his writings on binary arithmetic and logic were the inspiration, almost two centuries later, for George Boole (1815-1864) to develop the theory of binary logic and arithmetic, still the basis of modern computation.14
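Boole's insight can be made concrete: a one-bit "full adder" built from AND, OR, and XOR alone produces a sum bit and a carry bit, and chaining such adders yields multi-bit binary addition. A minimal sketch:

```python
def full_adder(a, b, carry_in):
    """One bit of binary addition, from Boolean operations alone."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_binary(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists, LSB first."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)   # a possible final carry becomes the top bit
    return out

# 6 ([0, 1, 1] LSB first) + 5 ([1, 0, 1]) = 11 ([1, 1, 0, 1])
print(add_binary([0, 1, 1], [1, 0, 1]))  # [1, 1, 0, 1]
```

This is essentially the arithmetic Leibniz's binary notation made possible: two symbols and a handful of logical rules replace the ten-toothed wheels of the decimal machines.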
Charles Babbage and the World’s First Programmer
It is not a bad definition of man to describe him as a tool-making animal. His earliest contrivances to support uncivilized life were tools of the simplest and rudest construction. His latest achievements in the substitution of machinery, not merely for the skill of the human hand, but for the relief of the human intellect, are founded on the use of tools of a still higher order.
Charles Babbage
One evening I was sitting in the rooms of the Analytical Society at Cambridge …with a table of logarithms lying open before me. Another member, coming into the room, and seeing me half asleep called out, “Well, Babbage, what are you dreaming about?” to which I replied, “I am thinking that all these tables might be calculated by machinery.”
Charles Babbage
We may say most aptly that the Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves.
Ada Lovelace
The Difference Engine
In 1821 Charles Babbage wrote a paper entitled “Observations on the Application of Machinery to the Computation of Mathematical Tables.”15 His ideas were well received, and he was awarded the first Gold Medal of the Astronomical Society (later the Royal Astronomical Society), which hoped to use the technology Babbage had proposed to compute astronomical tables.16 With funding from the Royal Society and the British government, Babbage attempted to build his ambitious Difference Engine. He worked on the project to the point of exhaustion, almost never seeing his children. The complexity of the machine was at the edge of what was technically feasible, and it exhausted Babbage’s financial resources and organizational skills. He ended up in a dispute over ownership with the British government, had problems getting the unusually precise parts fabricated, and saw his chief engineer fire all of his workmen and then quit himself. He was also beset by personal tragedies, including the deaths of his father, his wife, and two of his children.17
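The Difference Engine takes its name from the method of finite differences: a polynomial of degree n has constant nth differences, so once the first few table entries are computed by hand, every further entry follows by addition alone, exactly the operation a train of mechanical adders can perform. A sketch of the method (not of Babbage's hardware):

```python
def difference_table(poly_values, n_more):
    """Extend a table of polynomial values using only addition, as the
    Difference Engine would.  `poly_values` must contain enough initial
    entries to reach the constant top-level difference."""
    # Build the columns of successive differences by hand, once.
    diffs = [list(poly_values)]
    while len(diffs[-1]) > 1:
        col = diffs[-1]
        diffs.append([col[i + 1] - col[i] for i in range(len(col) - 1)])
    # From here on, each new entry is a cascade of additions.
    values = list(poly_values)
    lasts = [col[-1] for col in diffs]   # the "wheels" of each difference
    for _ in range(n_more):
        for level in range(len(lasts) - 2, -1, -1):
            lasts[level] += lasts[level + 1]
        values.append(lasts[0])
    return values

# x^2 for x = 0..3, extended by four more entries: 16, 25, 36, 49
print(difference_table([0, 1, 4, 9], 4))
```

For x², the second differences are the constant 2, so each new table entry costs only two additions, which is precisely why the scheme suited a machine that could only add.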
The Analytical Engine
With the Difference Engine still not completed, Babbage was inspired by what he regarded as another mental breakthrough. Rather than a machine that could only perform specific calculations, he conceived of what he called the Analytical Engine, which could be programmed to solve any possible logical or computational problem.18 The Analytical Engine, although based entirely on the mechanical technology of the nineteenth century, was a remarkable foreshadowing of the modern computer.
It had a random-access memory consisting of 1,000 “words” of 50 decimal digits each. In today’s terminology this is equivalent to about 175,000 bits. A number could be retrieved from any location, modified, and stored in any other location. It had a punched-card reader inspired by the Jacquard looms, automatic weaving machines controlled by punched cards, which used a similar mechanism.19 Babbage’s forward-looking design also included a printer, even though it would be another 50 years before either typesetting machines or typewriters were to be invented. It had an arithmetic “mill” (Babbage’s term) with registers that could perform a variety of logical and arithmetic operations similar to the central processing unit of a modern computer. Most important, it had a special storage unit for the instructions, or program, with a machine language very similar to those of today’s computers. One decimal field specified the type of operation, and another specified the address in memory of the operand. Babbage recognized the critical importance of the “conditional jump” instruction and provided for this capability.20
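The instruction format described, one field naming the operation and another giving an operand's address, together with a conditional jump, is already enough for general computation. The toy interpreter below illustrates the idea; the opcode names and the single-accumulator design are modern inventions for this sketch, not Babbage's actual instruction set.

```python
def run(program, memory):
    """Interpret a stored program of (opcode, address) pairs over an
    addressable memory, with one accumulator and a conditional jump,
    the minimum Babbage identified for general computation."""
    acc = 0
    pc = 0
    while pc < len(program):
        op, addr = program[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "SUB":
            acc -= memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "JUMP_IF_POS":   # the conditional jump
            if acc > 0:
                pc = addr
        elif op == "HALT":
            break
    return memory

# Sum the integers 5 down to 1 into cell 2.
# Cells: 0 = counter, 1 = the constant 1, 2 = running total.
mem = run(
    [("LOAD", 2), ("ADD", 0), ("STORE", 2),   # total += counter
     ("LOAD", 0), ("SUB", 1), ("STORE", 0),   # counter -= 1
     ("JUMP_IF_POS", 0),                      # loop while counter > 0
     ("HALT", 0)],
    {0: 5, 1: 1, 2: 0},
)
print(mem[2])  # 15
```

Without the conditional jump, the program above could only run straight through once; with it, the machine can loop, which is what separates a computer from a calculator.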
Lady Ada Lovelace
Though Babbage was a lonely man obsessed with his vision of a programmable computer, he formed a close working partnership with Ada Lovelace, the only legitimate child of the poet Lord Byron. She became as obsessed with the project as Babbage and contributed many of the ideas for programming the machine, including the invention of the programming loop and the subroutine.21
Although Babbage was too busy to communicate his ideas to the rest of the world, he allowed an associate, L. F. Menabrea (1809-1896), to describe the machine and its principles. Lovelace translated Menabrea’s paper from French to English and in her added notes (which were longer than the original paper) extended Babbage’s ideas. She included in these notes extensive discussions of programming techniques, sample programs, and the potential of this technology to emulate intelligent human activities.22 She described the speculations of Babbage and herself on the capacity of the Analytical Engine and machines like it to play chess and compose music. She finally concluded that though the computations of the Analytical Engine could not properly be regarded as “thinking,” they could nonetheless perform activities that would otherwise require the extensive application of human thought.23
Ada Lovelace is regarded as the world’s first computer programmer and has been honored by the United States Defense Department, which named its primary programming language, Ada, after her. She died a painful death from cancer at the age of 36, leaving Babbage alone again to pursue his quest.
The mechanisms his machines required were so complex that Babbage had to make significant advances in the machinist’s craft. Despite his ingenious constructions and exhaustive effort, neither the Difference Engine nor the Analytical Engine was ever completed. Near the end of his life he remarked that he had never had a happy day in his life. Only a few mourners were recorded at Babbage’s funeral in 1871.24
It is possible that if Babbage had had more enthusiastic sponsorship during his life, his efforts might have borne fruit, but the Analytical Engine is generally regarded as having been beyond the means of nineteenth-century engineering to realize. J. W. Mauchly (1907-1980) commented that Babbage might have succeeded had he been willing to “freeze” his design, but his continual attempts to improve it doomed the project. Babbage’s concepts were finally implemented some seventy years after his death, when the first American programmable computer, the Mark I, was completed in 1944 by Howard Aiken (1900-1973) of Harvard University and IBM, using an architecture very similar to Babbage’s.25 Babbage was a man distinctly ahead of his time. Despite his failure to complete the implementation of his machine, his concepts of a stored program, self-modifying code, addressable memory, conditional branching, and computer programming itself still form the basis of computers today.
The Practical Path
This apparatus works unerringly as the mills of the gods, but beats them hollow as to speed.
The Electrical Engineer in a review of Hollerith’s Tabulating Machine
What is a computer?
I would define a computer as a machine capable of automatically performing (that is, without human intervention) sequences of calculations, and of choosing between alternate sequences of calculations based on the results of earlier calculations. The description of the sequence of calculations to be performed, which includes all alternate paths and the criteria for choosing among them, is called a program. A programmable or general-purpose computer is one in which we can change the program. A special-purpose computer is one in which the program is built in and unchangeable.
A calculator is distinguished from a computer by its inability to perform more than one (or possibly a few) calculations for each human intervention and its inability to make decisions to choose from among multiple paths of computation. With the advent of today’s programmable calculators, the distinction between calculators and computers has become blurred. The distinction was clear enough in the 1940s: calculators were generally capable of only a single calculation for each manually entered number. Tabulating machines, such as sorters, were capable of multiple calculations (of a certain limited kind) but were not able to alter the sequence of computations based on previous results.26
Note that these definitions say nothing about the underlying technology, which, at least in theory, might be mechanical, electronic, optical, hydraulic, or even flesh and blood. Indeed, the era of practical computation began not with electronics but with mechanical and electromechanical automata.
The age of calculating machines
Economic expansion in the late nineteenth century, made possible by innovations in manufacturing and transportation, created a demand for the efficient and accurate calculation of numbers. The same painstaking and error-prone process of human calculation that originally motivated Babbage was threatening to block further economic and scientific progress. Demands for census data were growing more complex and an emerging interest in social research provided additional motivations for an entire generation of inventors.27
Yet producing a truly reliable mechanical calculator proved to be a daunting task. Many ingenious devices were created, but like the Pascaline and Leibniz’s Stepped Reckoner of earlier centuries, most were plagued by severe problems of accuracy and reliability.28 One of the inventors selling unreliable calculators was William Burroughs (1857-1898). His first production run of fifty Adding and Listing Machines quickly sold out, but all had to be recalled because of inconsistent performance.29 Burroughs persisted, however, and after years of exhausting work, the world’s first dependable key-driven calculator was brought to market. The machine had begun to receive widespread acceptance by the time of the inventor’s death from tuberculosis in 1898.30 While Burroughs’s machine was substantially simpler than Babbage’s, and not programmable, the perfection of mechanical calculation succeeded by the early twentieth century in transforming the conduct of business, government, and scientific investigation.
The age of tabulating machines
The taking of the national census every ten years is required by the U.S. Constitution, and by 1890 the Census Bureau was in a crisis. The 1880 census had been completed only a couple of years earlier. The population had burgeoned from a new wave of European immigration, and commitments had been made to Congress to provide extensive new types of information about the population. It looked as if the 1890 census would still be in full swing when it would be time to start again in 1900.31
To address the problem, a competition was held. Among the many innovative ideas for more efficient manual tabulation were schemes using multicolored cards and coded paper chips. However, no manual method prevailed. The winner was Herman Hollerith (1860-1929), a young Census Bureau engineer.32 Inspired by an off-hand suggestion of his supervisor and borrowing the idea of using holes punched in cards to represent information from Babbage’s Analytical Engine and the loom of Joseph-Marie Jacquard (1754-1834), Hollerith was able to demonstrate a solution at least eight times faster than the other finalists in the competition.33 The key to Hollerith’s breakthrough was a paper card remarkably similar to modern punch cards. As with modern computer cards, one corner was clipped so that the card orientation could be quickly determined. Each card contained 288 locations for possible holes to represent up to 288 bits of information.34 The equipment included a keypunch machine for encoding information and a card reader called a pin press. The latter consisted of 288 spring-loaded rods, each of which would make contact with a small container of mercury to complete an electric circuit if the corresponding hole was punched. The machine provided for multiple counters and logic circuits that could respond to relatively complicated patterns of information. The equipment turned out to be relatively fast and reliable: cards could be passed through at a rate exceeding one per second.35
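The tabulator's counting logic can be modeled simply: a card is the set of its punched positions, and each counter fires when every hole in its configured pattern is present. The position numbers and field meanings below are hypothetical, invented for illustration; only the counting scheme follows Hollerith's design.

```python
def tabulate(cards, counters):
    """Count cards matching hole patterns, as a Hollerith tabulator did.

    Each card is the set of its punched positions (0..287); each named
    counter advances when all of its required positions are punched."""
    totals = {name: 0 for name in counters}
    for card in cards:
        for name, required in counters.items():
            if required <= card:      # every required hole is punched
                totals[name] += 1
    return totals

# Hypothetical encoding: hole 10 = male, 11 = female, 40 = farmer.
cards = [{10, 40}, {11}, {10}, {11, 40}]
print(tabulate(cards, {
    "male": {10},
    "female": {11},
    "farmers": {40},
    "male farmers": {10, 40},   # a compound pattern, like the machine's logic circuits
}))
```

A compound counter such as "male farmers" corresponds to the wired patterns of contacts the text describes: several pins must all close their mercury circuits before the counter advances.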
The 1890 census represented the first time that electricity was used for a major data-processing project. The punched card itself survived as a mainstay of computing until quite recently. In view of the scale of the project, Hollerith’s innovation was implemented with remarkably few problems. Despite a 25 percent increase in population from the 1880 census (to 63 million people) and a dramatic increase in the complexity of the analysis provided, Hollerith’s electromechanical information machine did the job in only two and a half years, less than a third of the time required for the previous census.36
With the success of the 1890 census in hand, Hollerith set up the Tabulating Machine Company in 1896.37 The business grew rapidly, with his tabulators being used for a variety of business applications, census analysis in numerous European countries, and even for Russia’s first census in 1897.38 For the 1900 U.S. census, Hollerith introduced another innovation, an automatic card feed.39 This turned out to be Hollerith’s last contract with the Census Bureau. A dispute arose over rental charges, and for the 1910 census, the Census Bureau once again sponsored an internal project to develop an alternative technology. The result was improved tabulating equipment and another commercial concern called the Powers Accounting Machine Company.40
The two concerns became fierce competitors, a rivalry that lasted well into the modern era. Hollerith’s Tabulating Machine Company was combined with several other firms in 1911 to form the Computing-Tabulating-Recording Company (CTR).41 In 1914, offering a salary of $25,000 and a stock option for 1,220 shares, CTR hired a forty-year-old executive named Thomas J. Watson (1874-1956), who had built a strong reputation for aggressive marketing at National Cash Register.42 After only three months Watson was named president, and over the next six years the firm’s revenues more than tripled, from $4 million to $14 million. In 1924 Watson was named chief executive officer, and he renamed the company International Business Machines (IBM), a reflection of his ambition and confidence.43
The Powers Accounting Machine Company also went through a series of mergers by 1927 to become Remington Rand Corporation, which merged with Sperry Gyroscope to become Sperry-Rand Corporation in 1955. Sperry-Rand was one of IBM’s primary competitors when computers really took off in the late 1950s.44
The mechanical roots of computation
As we have seen, automatic computation did not start with the electronic computer. The architecture of the modern computer, as well as some of the major players of today’s computer industry, have their roots in the mechanical and electromechanical automata of the late nineteenth century. The design and computational theory inherent in Babbage’s unrealized programmable computer provided the inspiration for the first realized programmable computers of the twentieth century (though not for all of them). Babbage’s ideas were conceived in terms of mechanical technology and were realized only when electromechanical and later all-electronic technology was perfected during World War II. Howard Aiken, the developer of the first American programmable computer, commented, “If Babbage had lived seventy-five years later, I would have been out of a job.”45 The computer industry itself was also conceived in an era of largely mechanical technology. It had to wait for modern electronics to provide the price-performance ratio required for it to flourish. It is interesting to note that Sperry Rand, the company that introduced the first commercial computer, as well as IBM, the modern industry’s leader and one of the largest industrial corporations in the world, were both spin-offs of the U.S. Census Bureau.46