By Lawrence Krauss
Shortly after the end of World War II, Albert Einstein, referring to the new global danger of nuclear weapons, uttered his now famous warning: “Everything has changed, save the way we think.” Accordingly, he and Robert Oppenheimer established the Bulletin of the Atomic Scientists to help warn the public about the dangers of nuclear war.
Perhaps the most visible face of the bulletin—for which I am currently co-chair of the board of sponsors—is the “Doomsday Clock.” Created in 1947, the clock graphically reflects how close humanity might be to human-induced apocalypse, in terms of the “number of minutes to midnight”—at which time, presumably, time itself will no longer matter.
In total, the clock has been adjusted 20 times, moving as close as two minutes to midnight in 1953, after the United States and the Soviet Union each first tested thermonuclear devices, and as far as 17 minutes to midnight in 1991, after the United States and the Soviet Union signed the Strategic Arms Reduction Treaty. Currently, it is set at five minutes to midnight.
Nuclear weapons continue to be the most urgent global threat to humanity: Recent developments in Iran, the continued tension between Pakistan and India, and the United States’ consideration of developing a new generation of nuclear weapons are all cause for great concern. But in the 60-odd years since the creation of the Doomsday Clock, the world has changed, in no small part due to technological and scientific advances, making it even more dangerous. Unfortunately, there is little evidence that our way of thinking about global catastrophes has evolved for the 21st century. That’s why the bulletin decided, in 2007, to factor other threats to humanity into the Doomsday Clock.
Since then, we have run three “Doomsday Symposia,” during which key scientists and policymakers assess ongoing global threats to humanity in three areas: nuclear proliferation and nuclear weapons, climate change, and biotechnology and bioterrorism. The last issue has generated a lot of heat in the media in recent years, and the specter of new lethal viruses that might wipe out populations suggested to us that there might be compelling new reasons to move the clock forward again.
Indeed, as biotechnology has undergone over the past 35 years the same explosive growth that physics underwent in the decades before it, the possibility of biologically engineered weapons has grown. We now have the ability to artificially recreate genetic sequences, including entire viruses. DNA “hacking” has become a pastime at institutions such as MIT, among the same kind of people who were once so enamored of computer hacking. Finally, the holy grail of genetic manipulation now lies at the frontiers of synthetic biology, where researchers are attempting not merely to build up genetic sequences base pair by base pair, but also to explore the possibility of building novel life forms from scratch.
These developments are thrilling for scientists and technologists who love to take things apart and put them back together. But there remains the terrifying prospect that smart pranksters, DIYers, laboratories, or more sinister groups could, by accident or by design, create a new supervirus with the potential to wipe out all other life on Earth. (Hence the furious debate surrounding experiments to artificially develop forms of the avian flu virus H5N1 that are transmissible between mammals.) Indeed, just this week, a host of external watchdog organizations called for a moratorium on synthetic biology.
We should encourage the vigilance and rigorous discussion that have accompanied these developments. Happily, however, the bulletin’s experts, including Harvard biologist Matthew Meselson and human genome pioneer and synthetic biologist Craig Venter, suggest that the above scenarios are, in the near term, unlikely at best and pure fiction at worst.