Are We Becoming an Endangered Species? Technology and Ethics in the Twenty-First Century
November 20, 2001 by Ray Kurzweil
Ray Kurzweil addresses questions presented at Are We Becoming an Endangered Species? Technology and Ethics in the 21st Century, a conference on technology and ethics sponsored by Washington National Cathedral. Other panelists are Anne Foerst, Bill Joy and Bill McKibben.
Originally presented on November 19, 2001 at Washington National Cathedral. Published on KurzweilAI.net November 19, 2001. See the briefing paper, which contains questions posed to all panelists. Also see news item.
Ray Kurzweil: Questions and Answers
Ray Kurzweil, how do you respond to Mr. Joy’s concerns? Do scientific and technological advances pose a real threat to humanity, or do they promise to enhance life?
The answer is both, and we don’t have to look further than today to see what I call the deeply intertwined promise and peril of technology.
Imagine going back in time, let’s say a couple hundred years, and describing the dangers that lay ahead, perils such as weapons capable of destroying all mammalian life on Earth. People in the eighteenth century listening to this litany of dangers, assuming they believed you, would probably think it mad to take such risks.
And then you could go on to describe the actual suffering that lay ahead: for example, the 100 million people killed in the two great twentieth-century world wars, made possible by technology, and so on. Suppose further that we offered these eighteenth-century listeners a choice to relinquish these then-future technologies; they just might choose to do so, particularly if we were to emphasize the painful side of the equation.
Our eighteenth-century forebears, if provided with the visions of a reliable futurist of that day, and if given a choice, might very well have embraced the view of my fellow panelist Bill McKibben, who says today that we “must now grapple squarely with the idea of a world that has enough wealth and enough technological capability, and should not pursue more.”
Judy Woodruff interviews Ray Kurzweil at Washington National Cathedral.
Now I believe that implementing such a choice would require a Brave New World type of totalitarian government, in which the government uses technology to ban the further development of technology. But let’s put that perspective aside for a moment, and pursue this scenario further. What if our forefathers and foremothers had made such a decision? Would that have been so bad?
Well, for starters, most of us would not be here today, because life expectancy would have remained what it was back then, which was about 35 years of age. Furthermore, you would have been busy with the extraordinary toil and labor of everyday life, with many hours required just to prepare the evening meal. The vast majority of humanity pursued lives that were labor-intensive, poverty-stricken, disease-ridden, and disaster-prone.
This basic equation has not changed. Technology has to a great extent liberated at least many of us from the enormous difficulty and fragility that characterized human life up until recent times. But there is still a great deal of affliction and distress that needs to be conquered, and that indeed can be overcome by technological advances that are close at hand. We are on the verge of multiple revolutions in biotechnology — genomics, proteomics, rational drug design, therapeutic cloning of cells, tissues, and organs, and others — that will save tens of millions of lives and alleviate enormous suffering. Ultimately, nanotechnology will provide the ability to create any physical product. Combined with other emerging technologies, we have the potential to largely eliminate poverty which also causes enormous misery.
And yes, as Bill Joy, and others, including myself, have pointed out, these same technologies can be applied in destructive ways, and invariably they will be. However, we have to be mindful of the fact that our defensive technologies and protective measures will evolve along with the offensive potentials. If we take the future dangers such as Bill and others have described, and imagine them foisted on today’s unprepared world, then it does sound like we’re doomed. But that’s not the delicate balance that we’re facing. The defense will evolve along with the offense. And I don’t agree with Bill that defense is necessarily weaker than offense. The reality is more complex.
We do have one contemporary example from which we can take a measure of comfort. Bill Joy talks about the dangers of self-replication, and we do have today a new form of fully nonbiological self-replicating entity that didn’t exist just a few decades ago: the computer virus. When this form of destructive intruder first appeared, strong concerns were voiced that, as they became more sophisticated, software pathogens could overwhelm, even destroy, the computer network medium they live in. Yet the immune system that has evolved in response to this challenge has been largely effective. The injury is but a small fraction of the benefit we receive from computer technology. That would not be the case if one imagines today’s sophisticated software viruses foisted on the unprepared world of six or seven years ago.
One might counter that computer viruses do not have the lethal potential of biological viruses or self-replicating nanotechnology. Although true, this only strengthens my observation. The fact that computer viruses are usually not deadly to humans means that our response to the danger is that much less intense. Conversely, when it comes to self-replicating entities that are potentially lethal, our response on all levels will be vastly more serious.
Having said all this, I do have a specific proposal that I would like to share, which I will introduce a little later in our discussion.
Mr. Kurzweil, given humanity’s track record with chemical and biological weapons, are we not guaranteed that terrorists and/or malevolent governments will abuse GNR (Genetic, Nanotechnology, Robotics) technologies? If so, how do we address this problem without an outright ban on the technologies?
Yes, these technologies will be abused. However, an outright ban, in my view, would be destructive, morally indefensible, and in any event would not address the dangers.
Nanotechnology, for example, is not a specific, well-defined field. It is simply the inevitable end result of the trend toward miniaturization, which permeates virtually all technology. We’ve all seen pervasive miniaturization in our lifetimes. Technology in all forms — electronic, mechanical, biological, and others — is shrinking, currently by a factor of about 5.6 per linear dimension per decade. The inescapable result will be nanotechnology.
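To make the pace of that trend concrete, here is a minimal arithmetic sketch of the compounding, assuming the factor of 5.6 per linear dimension per decade cited above (the starting size and time span are illustrative, not from the text):

```python
# Illustration of exponential miniaturization: linear feature
# sizes shrink by a factor of about 5.6 per decade.

def feature_size(initial_nm: float, decades: float, rate: float = 5.6) -> float:
    """Linear feature size (in nm) after the given number of decades."""
    return initial_nm / (rate ** decades)

# A hypothetical 100-micron (100,000 nm) feature, three decades on:
# the cumulative shrink factor is 5.6**3, roughly 176x.
print(f"{feature_size(100_000, 3):.0f} nm")  # on the order of hundreds of nm
```

Three decades of compounding at this rate take a feature from the tenth-of-a-millimeter scale down toward the nanometer regime, which is the point of the argument: nanotechnology is simply where the curve leads.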
With regard to more intelligent computers and software, it’s an inescapable economic imperative affecting every company from large firms like Sun and Microsoft to small emerging companies.
With regard to biotechnology, are we going to tell the many millions of cancer sufferers around the world that although we are on the verge of new treatments that may save their lives, we’re nonetheless canceling all of this research?
Banning these new technologies would condemn not just millions, but billions of people to the anguish of disease and poverty that we would otherwise be able to alleviate. And attempting to ban these technologies won’t even eliminate the danger because it will only push these technologies underground where development would continue unimpeded by ethics and regulation.
We often go through three stages in examining the impact of future technology: awe and wonderment at its potential to overcome age-old problems, then a sense of dread at a new set of grave dangers that accompany these new technologies, followed by the realization that the only viable and responsible path is to set a careful course that can realize the promise while managing the peril.
The only viable approach is a combination of strong ethical standards, technology-enhanced law enforcement, and, most importantly, the development of both technical safeguards and technological immune systems to combat specific dangers.
And along those lines, I have a specific proposal. I do believe that we need to increase the priority of developing defensive technologies, not just for the vulnerabilities that society has identified since September 11, which are manifold, but for the new ones attendant to the emerging technologies we’re discussing this evening. We spend hundreds of billions of dollars a year on defense, and the danger from abuse of GNR technologies should be a primary target of these expenditures. Specifically, I am proposing that we set up a major program to be administered by the National Science Foundation and the National Institutes of Health. This new program would have a budget equaling the current budget for NSF and NIH. It would be devoted to developing defensive strategies, technologies, and ethical standards aimed at specific, identified dangers associated with the new technologies funded by the conventional NSF and NIH budgets. There are other things we need to do as well, but this would be a practical way of significantly increasing the priority of addressing the dangers of emerging technologies.
If humans are going to play God, perhaps we should look at who is in the game. Mr. Kurzweil, isn’t it true that both the technological and scientific fields lack broad participation by women, lower socioeconomic classes and sexual and ethnic minorities? If so, shouldn’t we be concerned about the missing voices? What impact does the narrowly defined demographic have on technology and science?
I think it would be great to have more women in science, and it would lead to better decision making at all levels. To take an extreme example of the impact of not having sufficient participation by women, the Taliban have had no women in decision-making roles, and look at the quality of their decision-making.
To return to our own society, there are more women today in computer science, life sciences, and other scientific fields compared to 20 years ago, but clearly more progress is needed. With regard to ethnic groups such as Afro-Americans, the progress has been even less satisfactory, and I agree that addressing this is an urgent problem.
However, the real issue goes beyond direct participation in science and engineering. It has been said that war is too important to leave to the generals. It is also the case that science and engineering are too important to leave to the scientists and engineers. The advancement of technology from both the public and private sectors has a profound impact on every facet of our lives, from the nature of sexuality to the meaning of life and death.
To the extent that technology is shaped by market forces, then we all play a role as consumers. To the extent that science policy is shaped by government, then the political process is influential. But in order for everyone to play a role in playing God, there does need to be a meaningful dialog. And this in turn requires building bridges from the often incomprehensible world of scientific terminology to the everyday world that the educated lay public can understand.
Your work, Anne (Foerst), is unique and important in this regard, in that you’ve been building a bridge from the world of theology to the world of artificial intelligence, two seemingly disparate but surprisingly related fields. And Judy (Woodruff), journalism is certainly critical in that most people get their understanding of science and technology from the news.
We have many grave vulnerabilities in our society already. We can make a long list of exposures, and the press has been quite active in reporting on these since September 11. This does, incidentally, represent somewhat of a dilemma. On the one hand, reporting on these dangers is the way in which a democratic society generates the political will to address problems. On the other hand, if I were a terrorist, I would be reading the New York Times, and watching CNN, to get ideas and suggestions on the myriad ways in which society is susceptible to attack.
However, with regard to the GNR dangers, I believe this dilemma is somewhat alleviated because the dangers are further in the future. Now is the ideal time to be debating these emerging risks. It is also the right time to begin laying the scientific groundwork to develop the actual safeguards and defenses. We urgently need to increase the priority of this effort. That’s why I’ve proposed a specific action item that for every dollar we spend on new technologies that can improve our lives, we spend another dollar to protect ourselves from the downsides of those same technologies.
How do you view the intrinsic worth of a “post-biological” world?
We’ve heard some discussion this evening on the dangers of ethnic and gender chauvinism. Along these lines, I would argue against human chauvinism and even biological chauvinism. On the other hand, I also feel that we need to revere and protect our biological heritage. And I do believe that these two positions are not incompatible.
We are in the early stages of a deep merger between the biological and nonbiological world. We already have replacement parts and augmentations for most of the organs and systems in our bodies. There is a broad variety of neural implants already in use. I have a deaf friend who I can now speak to on the telephone because of his cochlear implant. And he plans to have it upgraded to a new version that will provide a resolution of over a thousand frequencies that may restore his ability to appreciate music. There are Parkinson’s patients who have had their ability to move restored through an implant that replaces the biological cells destroyed by that disease.
By 2030, this merger of biological and nonbiological intelligence will be in high gear, and there will be many ways in which the two forms of intelligence work intimately together. So it won’t be possible to come into a room and say, humans on the left, and machines on the right. There just won’t be a clear distinction.
Since we’re in a beautiful house of worship, let me relate this impending biological-nonbiological merger to a view of spiritual values.
I regard the freeing of the human mind from its severe physical limitations of scope and duration as the necessary next step in evolution. Evolution, in my view, represents the purpose of life. That is, the purpose of life — and of our lives — is to evolve.
What does it mean to evolve? Evolution moves toward greater complexity, greater elegance, greater knowledge, greater intelligence, greater beauty, greater creativity, and more of other abstract and subtle attributes such as love. And God has been called all these things, only without any limitation: all knowing, unbounded intelligence, infinite beauty, unlimited creativity, infinite love, and so on. Of course, even the accelerating growth of evolution never quite achieves an infinite level, but as it explodes exponentially, it certainly moves rapidly in that direction. So evolution moves inexorably closer to our conception of God, albeit never quite reaching this ideal. Thus the freeing of our thinking from the severe limitations of its biological form may be regarded as an essential spiritual quest.