JavaOne Conference Proceedings | Kurzweil keynote transcript for Oracle’s 2010 JavaOne Conference

September 23, 2010

Also see the JavaOne wrap-up article: “Extreme Technologies at an Extreme Event: The 2010 JavaOne Conference was rich in ideas, innovation, and entertainment.”

Transcript of Ray Kurzweil’s keynote for Oracle’s JavaOne Conference 2010: “The Age of Embedded Computing, Everywhere.”

I started using computers in 1960; that is 50 years ago. I was 12 years old. That’s not so amazing today, but it was unusual then for anybody to be working with computers; there were only a dozen computers in all of New York City. A few years later, I went to MIT, and I went there because MIT was so advanced in 1965 that it actually had a computer. I think Harvard had one also. There were only a few in all of Massachusetts. For the technically minded among you, which I think is all of you: it was an IBM 7094 with 32K of memory, which was a lot in those days; 36-bit words, so about 150,000 bytes of memory, and about a quarter of a MIP.
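The memory figure is easy to verify; here is a minimal sketch of the arithmetic in Java, assuming “32K” means 32,768 words of 36 bits each:

```java
// Quick check of the IBM 7094 memory figure quoted above:
// 32K words of 36 bits each, converted to 8-bit bytes.
public class Ibm7094Memory {
    public static void main(String[] args) {
        long words = 32 * 1024;               // assumed meaning of "32K of memory"
        long bitsPerWord = 36;                // 36-bit words
        long bytes = words * bitsPerWord / 8;
        System.out.println(bytes + " bytes"); // 147456, roughly the 150,000 bytes quoted
    }
}
```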

This computer I carry around, which I understand has Java in it, is a million times cheaper. It’s thousands of times more powerful in terms of bits, MIPS, and bits of communication; multiply those together and that’s a billion-fold increase in price-performance, in computing per dollar, since I was a student. And we’ll do it again in another 25 years. Even the rate of exponential growth is increasing. And a number of years later, in the 1980s, computers were still big centralized devices. I wrote that within a few decades computers would be massively distributed, very small devices running embedded computing, and I described a language very much like Java. There would be billions of them around the world. That seemed crazy back in the 1980s, but that’s exactly what has happened.

Now, I foresee there will be thousands, and ultimately millions, of devices running Java inside your body, and that’s no more crazy today than these projections were back in the 1980s. I saw the ARPANET developing exponentially, but nobody noticed it because there were only a few thousand scientists being connected. And so I did the math and projected this worldwide communication network emerging in the mid-1990s, tying together tens, and ultimately hundreds, of millions of people to each other and to vast knowledge resources, and ultimately these would be small devices you carried in your pocket. That seemed crazy. People said, well, it will happen, but it will take centuries. But that’s the power of exponential growth, and people say, well, yes, sure, Moore’s Law.

Moore’s Law is really just one example among many of a much broader phenomenon, which I call the law of accelerating returns, and which pertains to anything having to do with information. And even within computing, Moore’s Law was not the first paradigm to bring exponential growth to computers. The exponential growth of computing started decades before Gordon Moore was even born, and many decades before he did his back-of-the-envelope projection of transistors on an integrated circuit. We’ve had five different paradigms. One of the criticisms I get is, oh, Kurzweil takes these exponentials and projects them out, and we all know exponential growth can’t go on forever. You have two rabbits in Australia. You get 4 rabbits, 8 rabbits, 16 rabbits, but that can’t go on forever. Finally, the rabbits run out of things to eat.

Isn’t that true also of information technology? The answer is yes, it’s true for specific paradigms, but what happens is that when we run out of steam with a particular paradigm, it creates research pressure to create the next paradigm. The third paradigm was shrinking vacuum tubes. I have a little museum, and we have a computer built with tiny little vacuum tubes; in 1952, CBS used one to predict the election of Eisenhower, the first time the networks did that. And then every year they were shrinking the vacuum tubes, making them smaller and smaller. Finally, they got to a point in the late ’50s where they couldn’t shrink the vacuum tubes anymore and keep the vacuum, and that was the end of the shrinking of vacuum tubes. It was not the end of the exponential growth of computing. It just went to another paradigm: transistors, and then, finally, Moore’s Law and integrated circuits.

There have been regular predictions that that will come to an end. Gordon Moore originally said 2002. Intel now says 2022, but that will lead to the sixth paradigm, which is 3-dimensional computing, particularly with self-organizing molecular circuits, and we already see early steps in 3-dimensional chips with multiple layers and plans for thousand-layer circuitry. In terms of self-organizing circuits, if you speak to Justin Rattner, the CTO of Intel, he’ll tell you they have these circuits working in their labs. They’ll see the crossover in the teen years, well before we run out of steam with Moore’s Law. So it’s not just Moore’s Law, and it’s not just computers. We can look at magnetic data storage density. That’s not Moore’s Law; it’s not transistors. Different engineers, different companies, same progression. The same holds for communication technologies.

Take, say, the number of bits being moved around in wireless networks. Going back 100 years, from Morse code transmissions up through 4G networks today, you see very smooth exponential growth. And this is really a key thing to contemplate, because there is a big difference between our intuition, which is not exponential but linear, and the reality of information technology, which is exponential. It’s not everything that progresses exponentially; it’s only information technology, and the reason is that there are no inherent material limits to information technology. It can progress by taking innovations and creating a set of tools, and we then use those tools to create the next set of tools. We use today’s computers to design the next generation of computers. And because we’re constantly changing these platforms, we need software that can run independently of these various changes in platforms. That’s why a language like Java, which actually originated not so long ago, I think 15 years ago, is really the right way to go, so that you can write software and have it run on this multiplicity of platforms.
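That platform independence is simple to illustrate; the small Java program below is an illustrative sketch rather than anything from the talk. The same compiled bytecode runs unchanged on any JVM and merely reports whatever platform happens to be underneath it:

```java
// The same compiled bytecode runs unchanged on any platform with a JVM;
// only the runtime properties it reports differ from machine to machine.
public class WhereAmIRunning {
    public static void main(String[] args) {
        System.out.println("Operating system: " + System.getProperty("os.name"));
        System.out.println("Architecture:     " + System.getProperty("os.arch"));
        System.out.println("Java version:     " + System.getProperty("java.version"));
    }
}
```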

What’s the difference between linear intuition and the exponential reality? Well, if I take 30 steps linearly, which is our intuition about the future, I get to 30. If I take 30 steps exponentially: 2, 4, 8, 16, I get to a billion. It makes a very profound difference, and this is not a theoretical speculation about the future. As I mentioned, just since I was a student we’ve seen a billion-fold increase in the power of computers per unit currency, and that’s going to continue, and it’s not just computers. It’s everything we care about: health and medicine, biology. Biology was not an information technology; it was just hit or miss, and therefore progressed linearly and not exponentially. It has now become an information technology because we have the software of life, and we have the means of changing and updating this outmoded software. We see the same progression there.
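A few lines of Java make the 30-step comparison concrete; this is just a sketch of the arithmetic behind the figures quoted above:

```java
// Thirty steps of linear growth versus thirty doublings.
public class LinearVsExponential {
    public static void main(String[] args) {
        long linear = 0;
        long exponential = 1;
        for (int step = 1; step <= 30; step++) {
            linear += 1;        // 1, 2, 3, ..., 30
            exponential *= 2;   // 2, 4, 8, ..., 2^30
        }
        System.out.println("Linear after 30 steps:      " + linear);      // 30
        System.out.println("Exponential after 30 steps: " + exponential); // 1073741824, about a billion
    }
}
```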

The Genome Project was considered a failure halfway through the project. Seven and a half years into this 15-year project, mainstream skeptics said, I told you this wasn’t going to work; you’re halfway through the project, and you’ve finished 1% of it. But that’s actually right on schedule for an exponential progression. Exponentials are seductive; they are surprising. It starts out looking like nothing is happening, because you are doubling these tiny little numbers. By the time you get to 1%, you’ve got some traction: you’re only 7 doublings away from 100%. In the case of the Genome Project, it continued to double every year and was finished 7 years later, much to the surprise of the skeptics, and there are many different areas of biology that are scaling up in this exponential manner.
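The “7 doublings” figure is straightforward to verify; here is a short sketch of that arithmetic in Java:

```java
// Starting from 1% complete and doubling every year, count the doublings
// needed to reach (or pass) 100%.
public class GenomeDoublings {
    public static void main(String[] args) {
        double percentComplete = 1.0;
        int doublings = 0;
        while (percentComplete < 100.0) {
            percentComplete *= 2;   // 1 -> 2 -> 4 -> 8 -> 16 -> 32 -> 64 -> 128
            doublings++;
        }
        System.out.println(doublings + " doublings"); // prints "7 doublings"
    }
}
```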

So I’d like to show you quickly, since our time is limited this morning, how pervasive this exponential growth is: how information technology is invading more and more industries, how it’s going to profoundly change them, and how our expectations really need to be aligned with this exponential growth. Sometimes we actually get ahead of the curve. In the 1990s, people looked at the Internet and finally woke up: wow, this is a worldwide web connecting all these people. This is going to change everything. And we had the dot-com boom, with people thinking every business model was going to be turned on its head. And then a few years later, around the year 2000, they came back, the investment community that is, and said, gee, you know, it hasn’t changed everything; it hasn’t changed anything. I guess we were wrong, and all the valuations went the other way.

Meanwhile, it was progressing exponentially, but it was at that early stage where you don’t really notice an exponential. Now we do have dot-coms like Google with $20 billion of revenue. There is $2 trillion of e-commerce. The dot-coms have changed every business model; look at all the media companies, for example. In fact, this boom/bust psychology is an accurate harbinger of what ultimately is a profound transformation. It happened in communications in the ’90s. It happened in Artificial Intelligence in the ’80s. It happened with the railroads in the 19th century. So let me just show you how pervasive this is, and how it’s going to lead computers to become smaller and smaller. That, by the way, is another exponential progression. We’re shrinking the size of both computers and mechanical technologies, like microelectromechanical systems, by a factor of 100 in 3-D volume per decade. So what used to fill a room, maybe half the size of this one, a very low-powered computer, can now be in our pockets and on our bodies. Ultimately, it will be the size of blood cells and go inside our bodies. That’s another exponential progression, all while we make them more powerful and less expensive, and ultimately this is going to transform everything we care about.
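To give a rough sense of that shrinkage rate, here is a back-of-the-envelope sketch in Java; the starting volume of about 10 cubic meters for a room-filling machine and the blood-cell volume of about 1e-16 cubic meters are illustrative assumptions, not figures from the talk:

```java
// Back-of-the-envelope: at a 100x reduction in 3-D volume per decade, how many
// decades does it take to go from a room-sized machine to blood-cell scale?
// Both volumes below are assumed, order-of-magnitude figures.
public class ShrinkageEstimate {
    public static void main(String[] args) {
        double volume = 10.0;        // assumed room-sized machine, in cubic meters
        double bloodCell = 1e-16;    // assumed blood-cell volume, in cubic meters
        int decades = 0;
        while (volume > bloodCell) {
            volume /= 100.0;         // factor of 100 in volume per decade
            decades++;
        }
        System.out.println(decades + " decades"); // 9 decades at these assumed volumes
    }
}
```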

(Original transcript provided by Oracle’s JavaOne Conference.)