Bill Joy (1954 – ) is an American computer scientist who co-founded Sun Microsystems in 1982 and served as the company's chief scientist until 2003. His now-famous Wired magazine essay, "Why the Future Doesn't Need Us" (2000), sets forth his deep concerns over the development of modern technologies.[i]
Joy traces his concern to a discussion he had with Ray Kurzweil at a conference in 1998. Taken aback by Kurzweil's predictions, he read an early draft of The Age of Spiritual Machines: When Computers Exceed Human Intelligence and found it deeply disturbing. Subsequently he encountered arguments by the Unabomber, Ted Kaczynski. Kaczynski argued that if machines do all the work, as they inevitably will, then we can either: a) let the machines make all the decisions; or b) maintain human control over the machines.
If we choose "a" then we are at the mercy of our machines. It is not that we would give them control or that they would take control; rather, we might become so dependent on them that we would have to accept their commands. Needless to say, Joy doesn't like this scenario. If we choose "b" then control would be in the hands of an elite, and the masses would be unnecessary. In that case the tiny elite would either: 1) exterminate the masses; 2) reduce their birthrate so that they slowly become extinct; or 3) become benevolent shepherds to the masses. The first two scenarios entail our extinction, but even the third option is no good. In this last scenario the elite would see to it that all the physical and psychological needs of the masses are met, while at the same time engineering the masses to sublimate their drive for power. In this case the masses might be happy, but they would not be free.
Joy finds these arguments convincing and deeply troubling. About this time Joy read Hans Moravec's book, where he found more of the same kind of predictions. He was especially concerned by Moravec's claim that technologically superior species always defeat their inferiors, as well as by his contention that humans will become extinct as they merge with the robots. Disturbed, Joy consulted other computer scientists, who basically agreed with these technological predictions but were themselves unconcerned. Joy was stirred to action.
Joy's concerns focus on the transforming technologies of the 21st century: genetics, nanotechnology, and robotics (GNR). What is particularly problematic about them is their potential to self-replicate. This makes them inherently more dangerous than 20th-century technologies such as nuclear, biological, and chemical weapons, which were expensive to build and required rare raw materials. By contrast, 21st-century technologies allow small groups or even individuals to bring about massive destruction. Joy accepts that we will soon achieve the computing power needed to implement some of the dreams of Kurzweil and Moravec, but he worries that we overestimate our design abilities. Such hubris may lead to disaster.
Robotics is primarily motivated by the desire to be immortal by downloading ourselves into machines. (The terms uploading and downloading are used interchangeably here.) But Joy doesn't believe that we would still be human after the download, or that the robots would be our children. As for genetic engineering, it will create new crops and plants, and eventually new species, including many variations of the human species; but Joy fears that we do not know enough to safely conduct such experiments. And nanotechnology confronts the so-called "gray goo" problem: self-replicating nanobots out of control. In short, we may be on the verge of killing ourselves! Is it not arrogant, he wonders, to design a robot replacement species when we so often make design mistakes?
Joy concludes that we ought to relinquish these technologies before it's too late. Yes, GNR may bring happiness and immortality, but should we risk the survival of the species for such goals? Joy thinks not.
Summary – Genetics, nanotechnology, and robotics are too dangerous to pursue. We should relinquish them.
[i] Bill Joy, "Why the Future Doesn't Need Us," Wired, April 2000.