THE AGE OF INTELLIGENT MACHINES | Prologue: The Second Industrial Revolution
September 24, 2001
author | Ray Kurzweil
year published |
On May 26, 1733, John Kay, a twenty-nine-year-old inventor, received the news that the English Patent Office had awarded him a patent for his New Engine for Opening and Dressing Wool, now known as the flying shuttle.1 To Kay this was good news, for he hoped to start a small business supplying his new machine to the burgeoning English textile industry. What neither Kay nor his contemporaries realized at the time was that his innovation in the weaving of cloth represented the launching of the Industrial Revolution.
Like many innovations that come at the right time in the right place, the flying shuttle caught on quickly. Unfortunately, Kay was more talented as an inventor than as a businessman, and after losing most of his money in litigation attempting to enforce his patent, he moved to France, where he died in poverty.
Kay nonetheless had a lasting impact. The widespread adoption of the flying shuttle created pressure for more efficient spinning of yarn, which led to James Hargreaves's spinning jenny, patented in 1770. In turn, machines to card and comb the wool to feed the new mechanized spinning machines were developed in the 1780s. By the turn of the century all aspects of the production of cloth had been automated. The cottage industry of English textiles was rapidly being replaced by increasingly efficient centralized machines.2
Good ideas catch on, and innovators in other industries took note of the dramatically improved productivity that mechanization had brought to English textiles. The process of industrialization spread to other industries and to other countries. Major innovations that followed included Henry Ford’s (1863-1947) concept of mass production and Thomas Edison’s (1847-1931) harnessing of the electron. Ultimately Europe, the United States, Japan, and other parts of the world shifted from an agrarian and craft economy to one dominated by machines. The succession of increasingly efficient generations of automation has continued to this day. The changing patterns of production and employment, together with related scientific advances, have had dramatic effects on all aspects of modern life, profoundly affecting our social, cultural, educational, economic, and political institutions.
The Industrial Revolution was not without its controversies. Emerging, appropriately enough, from the English textile industry, the Luddite movement was founded in Nottingham in 1811.3 The movement posed a serious and violent challenge to what its members perceived as a diabolical danger to the textile workers’ livelihoods. In one sense, the fears of the Luddites were accurate. Jobs they thought were threatened by the new machines did indeed disappear. At the same time, however, new jobs were created as new industries emerged and economic activity increased, although this was often of little consequence to those displaced. The Luddite movement itself was ended within a decade of its founding due to a combination of repression and prosperity, although its name has remained very much alive as a symbol of a still lingering issue.4 Automation versus jobs is still a particularly controversial issue in Europe, where it has had a noticeable impact on the rate at which new automated technologies are introduced. In the United States the issue simmers beneath the surface of political debate but rarely affects the pace of change. In Japan the issue is virtually unknown, due partly to a tradition in which the prosperous “first tier” industrial corporations provide lifetime employment, although employment guarantees are generally not extended by the less powerful “second tier” corporations and cottage industries.
Let us examine the Luddite issue for a moment. It is generally acknowledged that new jobs result as new industries are created by the advent of automation. The critical question then becomes, How do these jobs compare to the jobs that are displaced? In particular, for every ten jobs that are eliminated by automation, are we creating twelve new jobs or eight? Do the new jobs pay more or less than the older ones? Are they more or less fulfilling? What about those who are displaced; can they be retrained for the new jobs? Are they?
We now have over a century of extensive industrialization to look back on, and an examination of some clear economic trends over the past century can provide insights into at least some of the above questions. With regard to the numbers of jobs, the answer is closer to twelve than eight. In 1870 only twelve million Americans, representing 31 percent of the population, had jobs.5 By 1985 the figure rose to 116 million jobs held by 48 percent of the population.6 This substantial increase in the number of jobs occurred despite the dramatic shift away from the labor content of agriculture. In 1900 more than a third of all American workers were involved in food production.7 Today Americans are better fed and America is a major food exporter, with only 3 percent of the workforce involved.8
In the economic power of jobs we see the most dramatic change. The gross national product on a per capita basis and in constant 1958 dollars went from $530 in 1870 to $3,500 in 1970.9 There has been a similar change in the actual earning power of the available jobs. This 600 percent increase in real wealth has resulted in a greatly improved standard of living, better health care and education, and a substantially improved ability to provide for those who need help in our society. At the beginning of the Industrial Revolution life expectancy in North America and northwestern Europe was about 37 years. Now, two centuries later, it has doubled.
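The growth figures above can be checked with a little arithmetic. A minimal sketch follows; the dollar values and the hundred-year span are taken from the text, while the implied annual growth rate is computed here rather than cited:

```python
# Sanity-check the per-capita GNP figures cited in the text
# (constant 1958 dollars; years and values from the source).
start, end = 530.0, 3500.0   # dollars per capita, 1870 and 1970
years = 100

growth_factor = end / start                      # roughly 6.6x
percent_increase = (growth_factor - 1) * 100     # roughly 560%, i.e. about the "600 percent" cited
annual_rate = growth_factor ** (1 / years) - 1   # compound annual growth rate

print(f"growth factor: {growth_factor:.2f}x")
print(f"increase: {percent_increase:.0f}%")
print(f"implied annual growth: {annual_rate:.2%}")
```

The striking point is how modest the annual figure is: a sustained growth rate of under 2 percent per year, compounded over a century, is enough to multiply real per capita wealth more than sixfold.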
The jobs created have also been on a higher level, and indeed much of the additional employment has been in the area of providing the higher level of education that today’s jobs require. For example, we now spend ten times as much (in constant dollars) on a per capita basis for public school education as we did one hundred years ago.10 In 1870 only 2 percent of American adults had a high school diploma, whereas the figure is 76 percent today.11 There were only 52,000 college students in 1870; there are 7.5 million today.12 Attempts to project these trends into the future, including a recent study by the Institute for Economic Analysis at New York University, where a detailed input-output model of the U.S. economy was studied, indicate a continuation of these same trends.13 While there will be ebbs and flows in economic development, the trend over the next two decades indicates that employment and productivity will continue to increase, as will the average educational level of the population. The study indicated, for example, that the share of jobs going to professionals will increase from 15 percent today to 20 percent by the end of the next decade, with engineers and teachers accounting for virtually all of the increase.14
From these trends it would seem that the concerns of the Luddite movement are not well founded. From a macroeconomic point of view, it is clear that automation and other related technological advances have fueled over a century of dramatic economic development. There are nonetheless difficult, if often temporary, dislocations that result from rapid technological change.15 As our smokestack industries contract, workers with one set of skills do not necessarily find it easy to develop new careers. With the pace of change accelerating, we as a society need to find a way to provide viable avenues for displaced workers to reenter the economic mainstream with something more than a new dead-end job.
As profound as the implications of the first Industrial Revolution were, we are now embarking on yet another transformation of our economy, based once again on innovation. The Industrial Revolution of the last two centuries, the first Industrial Revolution, was characterized by machines that extended, multiplied, and leveraged our physical capabilities. With these new machines, humans could manipulate objects for which our muscles alone were inadequate and carry out physical tasks at previously unachievable speeds. While the social and economic impact of this new technology was controversial, the concept of machines being physically superior to ourselves was not. After all, we never regarded our species as unequaled in this dimension. Jaguars can run faster than we can, lions are better hunters, monkeys are better climbers, whales can dive deeper and longer, and birds are better fliers; indeed, without machines we cannot fly at all.
The second industrial revolution, the one that is now in progress, is based on machines that extend, multiply, and leverage our mental abilities. The same controversies on social and economic impact are attending this second great wave of automation, only now a new and more profound question has emerged. Though we have always regarded our species as relatively mediocre in physical capacity, this has not been our view with regard to our mental capacity. The very name we have given ourselves, Homo sapiens, defines us as the thinking people. The primary distinction in our biological classification is the ability of our species to manipulate symbols and use language.
Before Copernicus (1473-1543), our “species centricity” was embodied in a view of the universe literally circling around us in a testament to our unique and central status. Today our belief in our own uniqueness is a matter not of celestial relationships but of intelligence. Evolution is seen as a billion-year drama leading inexorably to its grandest creation: human intelligence. The spectre of machine intelligence competing even tangentially with that of its creator once again threatens our view of who we are.
This latest revolution, based on machines that expand the reach of our minds, will ultimately have a far greater impact than the revolution that merely expanded the reach of our bodies. It promises to transform production, education, medicine, aids for the handicapped, research, the acquisition and distribution of knowledge, communication, the creation of wealth, the conduct of government, and warfare. The cost-effectiveness of the key ingredients in our new technological base (computers and related semiconductor technology) is increasing at an exponential rate. The power of computer technology now doubles (for the same unit cost) every 18 to 24 months.16
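As a rough illustration of what that doubling rate implies: the 18-to-24-month figure is from the text, while the ten-year horizon is an assumption chosen for the example:

```python
# Illustrative only: how a fixed doubling period compounds over a decade.
# The 18-24 month doubling range is cited in the text; the rest is arithmetic.

def improvement(months_elapsed: float, doubling_period_months: float) -> float:
    """Multiplicative gain in price/performance after months_elapsed."""
    return 2 ** (months_elapsed / doubling_period_months)

decade = 120  # months
print(f"doubling every 24 months -> {improvement(decade, 24):.0f}x per decade")
print(f"doubling every 18 months -> {improvement(decade, 18):.0f}x per decade")
```

At the slow end of the range, five doublings yield a 32-fold gain per decade; at the fast end, the gain is roughly 100-fold. Either way, a constant doubling period means the absolute improvement in each period dwarfs everything that came before it.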
Unlike some revolutions, this latest transformation of our industrial base will not arrive after one brief period of struggle. It will be a gradual process, but it is one already under way. The potential exists to begin to solve problems with which the human race has struggled for centuries. An example is the application of computer technology to the needs of the handicapped, a personal interest of mine. It is my belief that the potential exists within the next one or two decades to greatly ameliorate the principal handicaps associated with sensory and physical disabilities such as blindness, deafness, and spinal cord injuries. New bioengineering techniques that rely on expert systems and computer-assisted design stations for biological modeling are fueling a new optimism for effective treatments of a wide range of diseases, including genetic disorders.17 The increase in real per capita wealth (600 percent in the past 100 years) is projected to continue.18 There are many other examples of anticipated benefit.
The potential for danger is also manifest. We are today beginning to turn over our engines of war to intelligent machines, whose intelligence may be as flawed as our own.19 Computer technology is already a powerful ally of the totalitarian.
The advent of intelligent machines is altering global trade relationships. A remarkable aspect of this new technology is that it uses almost no natural resources. Silicon chips use infinitesimal amounts of sand and other readily available materials. They use insignificant amounts of electricity. As computers grow smaller and smaller, the material resources utilized are becoming an inconsequential portion of their value. Indeed, software uses virtually no resources at all. The value of the technology lies primarily in the knowledge governing the design of the hardware, software, and databases that constitute our intelligent machines, and in the ability to continue advancing these designs. This decreasing importance of material resources has allowed Japan, a country very poor in natural resources but rich in knowledge and expertise, to become one of the two wealthiest nations on the planet. There is the potential for emerging nations to largely skip industrialization and develop postindustrial societies based on an information economy.20 While the first Industrial Revolution increased the demand for and the value of natural resources, the second industrial revolution is doing the opposite.
In the case of computer software, it is apparent that one is paying for the knowledge inherent in the design and not for the raw materials represented by the floppy disk and users’ manual. What is sometimes less apparent is that the same economic model holds for most computer hardware as well. An advanced chip generally costs no more to produce than a floppy disk. As with a software program, the bulk of the cost of a chip is neither raw materials nor manufacturing labor, but rather what accountants call amortization of development, and what philosophers call knowledge.
It is estimated that raw materials comprise less than 2 percent of the value of chips (which is about the same estimate as for software) and less than 5 percent of the value of computers. As our computers become more powerful, the percentage of their value accounted for by raw materials continues to diminish, approaching zero. It is interesting to note that the same trend holds for most other categories of products. Raw materials comprise about 20 percent of the value of musical instruments, with this figure rapidly declining as acoustic musical-instrument technology is being replaced with digital-electronic technology. George Gilder estimates that the cost of raw materials for automobiles is now down to 40 percent of total costs (see “A Technology of Liberation” in this book). Again, this figure will continue to decline with the increasing use of computers and electronics as well as the replacement of expensive and relatively simple body materials such as steel with inexpensive yet relatively complex alternative materials such as new high-tech plastics.
With regard to the world of defense, military engagements such as the Israeli destruction of Soviet-built SAM sites in Syria, the use of “smart” missiles in the Falklands War, and others have illustrated the growing importance of artificial intelligence in the military.21 Many military observers now predict that in the 1990s, artificial intelligence technology will be of greater strategic importance than manpower, geography, and natural resources.22 A major program called SCI (Strategic Computing Initiative) envisions the soldier of the future relying on a vast network of intelligent computers to make tactical decisions, fly planes, aim weapons, and avoid enemy fire.23