The New York Times | Why do we need predictions? Technology 25 years hence
By Ray Kurzweil | December 27, 2010
Thirty years ago, I realized that timing was the key to success as an inventor. Most inventions fail because the timing is wrong — the innovation needs to make sense for the world that will exist when the project is finished.
Consider how quickly the world changes; just a few years ago, most people didn’t use social networks, wikis or blogs. As an engineer, I gathered a lot of data to try to make sense of technology trends, and found a significant exception to the notion that “you can’t predict the future.”
If you plot the basic measures of the price to performance and capacity of information technologies (for example, computer instructions per second per constant dollar, bits of memory per dollar, or the total number of bits being moved around over the Internet), they follow remarkably smooth — and foreseeable — trajectories. This observation goes well beyond Moore’s Law (which says you can place twice as many transistors on an integrated circuit every two years); in the case of computation, it goes back to the 1890 American census, long before Gordon Moore was even born.
What’s predictable is that these measures grow exponentially, not linearly. Our intuition about the future, however, is linear, hard-wired into our brains, and the difference is remarkable: 30 steps taken linearly gets you to 30, whereas 30 steps taken exponentially (2, 4, 8, 16…) gets you to a billion.
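The arithmetic behind that contrast can be checked in a few lines; this is just a sketch of the two growth rules named above, nothing more:

```python
# 30 steps of linear growth: add 1 each step.
linear = sum(1 for _ in range(30))

# 30 steps of exponential growth: double each step (2, 4, 8, 16, ...).
exponential = 2 ** 30

print(linear)       # 30
print(exponential)  # 1073741824 -- roughly a billion
```

The doubling sequence overtakes the linear one almost immediately, which is exactly why linear intuition misjudges exponential trends.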
This “law of accelerating returns,” as I call it, tells us that any area of information technology will grow enormously in power while becoming ever smaller in size. This law has continued for the three decades since I first noticed it, and goes back decades before that.
And it’s not just electronics and communications that follow this exponential course. It applies as well to health and medicine and their related field of biology. The Human Genome Project, for instance, saw the amount of genetic sequencing double and the cost of sequencing per base pair come down by half each year.
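A yearly halving compounds quickly. The sketch below illustrates the shape of that curve; the $10 starting cost and ten-year horizon are made-up numbers for illustration, not figures from the article:

```python
# Illustrative compounding: cost per base pair halves each year.
# The $10.00 starting cost and 10-year horizon are assumptions
# chosen only to show how fast repeated halving accumulates.
cost = 10.0
for year in range(10):
    cost /= 2

print(round(cost, 6))  # 0.009766 -- about a thousandth of the start
```

Ten halvings divide the cost by 2^10 = 1,024, so any starting price drops by roughly a factor of a thousand per decade under this rule.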
In the early 1980s, I saw the ARPANET, the predecessor to the Internet, double in size each year. In “The Age of Intelligent Machines,” which I wrote in the mid-1980s, I used that exponential trend to describe a vast worldwide communications web emerging in the mid-to-late ‘90s. At the time, this seemed absurd to many observers, since the entire Department of Defense network could provide digital communication to only a few thousand scientists. But the Web exploded by the late ‘90s, right on that exponential schedule.
Over the past two decades, I’ve made hundreds of predictions based on the law of accelerating returns. Of 147 predictions for 2009 that I made in “The Age of Spiritual Machines,” which I wrote in the 1990s, 78 percent were correct as of the end of 2009, and an additional 8 percent were off by a year or two. The closer I stayed to just predicting the underlying price/performance and capacity of information technologies, the more accurate the predictions were.
The law of accelerating returns is the only reliable method I know that allows us to forecast at least certain aspects of the future. A computer that fit inside a building when I was a student now fits in my pocket, and is a thousand times more powerful despite being a million times less expensive.
In another quarter century, that capability will fit inside a red blood cell and will again be a billion times more powerful per dollar.
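A billion-fold improvement corresponds to roughly 30 doublings, since 2^30 is just over a billion. The sketch below simply inverts the doubling rule; the article does not state a doubling period, so the calculation is kept as a function of the target factor alone:

```python
import math

def doublings_needed(factor):
    """Number of doublings required to reach a given improvement factor."""
    return math.log2(factor)

# How many doublings produce a billion-fold gain in price/performance?
print(math.ceil(doublings_needed(1e9)))  # 30
```

Divide those 30 doublings by however many occur per year, and you get the calendar time the forecast implies.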