I have been following Bill Joy's comments on a coming biotech / nanotech / robotics apocalypse for the last year and a half or so. His premise is that the potential for extreme violence on a global scale is very high, due to the democratization of destructive power that these three technologies offer.
I have to say, first, that I find his comments disquieting, at a minimum. Essentially, he believes that the only realistic option open to us is to choose not to develop these technologies. What I find interesting, though, is the similarity between these arguments and an older apocalyptic argument presented in The Limits to Growth (1972) and Beyond the Limits (1992). Those books draw an analogy between over-population-induced die-offs in other animals and (so they claim) a soon-to-arrive massive die-off in the human population caused by ever-expanding resource consumption. The weakness in the Limits argument, I think, is that its authors inadequately model the influence of technology, and ironically, I think Bill Joy makes the same mistake. He (reasonably) points out the immense dangers that these technologies make possible, without pointing out that they also make possible immensely more powerful defenses.
So, for example, when he talks about the possibility of very nasty terrorist-engineered viruses that could home in on weaknesses in the human genome, he doesn't talk about the possibility of white-hat-engineered monitors that could scan for, and counter, these viruses in real time. (Norton Anti-Virus Personal Edition 2025 might have a very different meaning.)
I suppose I have to hope that those sorts of defensive developments will happen, because I firmly believe that widespread rejection of these new technologies has a vanishingly small probability of occurring. We do not live under the kinds of economic and political systems that would make such a rejection a feasible answer to the dystopia that Joy envisions.