In 1969 President Nixon ordered the destruction of U.S. biological weapons, and two years later he submitted to the Senate the new convention prohibiting the production and stockpiling of bio-weapons. Peripheral issues delayed ratification until 1974, when it became my job as Director of the Arms Control and Disarmament Agency to urge the Senate Foreign Relations Committee to consent to this BW Convention. I warned the senators up-front that “verification of compliance with this convention in countries with relatively closed societies is difficult.” After a couple of minutes’ discussion, the senators nonetheless agreed the Convention should be ratified. Yet later on, the Convention’s unverifiability became a nagging issue. In the 1990s it was discovered that the Soviet Union had massively violated the Convention from the first day it signed it. And after Saddam Hussein had lost the Gulf War in 1991, he was also forced to admit to massive violations.
To assess what arms agreements can do to prevent biowarfare, we need to keep in mind the “dual use” problem. It makes detection well-nigh impossible in authoritarian nations and dictatorships—precisely the countries where violations are most likely to take place. Advances in the life sciences spread throughout the world because, almost without exception, they are intended for peaceful uses. But the boundaries between destructive and beneficial purposes are easily blurred. For instance, a new pharmaceutical vector that helps to transmit a medication to the diseased tissue might be indistinguishable, for practical purposes, from vectors that can be used to magnify the lethality of a biological weapon. The very fact that one of the two uses is beneficial, and hence considered humanitarian, would make it politically difficult to impose stringent controls on the worldwide transfer of such pharmaceutical vectors.
Unfortunately, the BW Convention offers little protection because biological weapons can be developed under the guise of peaceful use and are easy to deliver clandestinely. At least two signatories of the Convention—the Soviet Union and Iraq—admitted that they violated it. In 1992, Russia’s President Boris Yeltsin revealed that the Soviet Union had been developing biological weapons, an illicit program that apparently started right after Moscow had signed the Convention. In 1995, when the head of Iraq’s military industries defected, Saddam Hussein was forced to admit his massive violations of the Convention. Oblivious to these stubborn facts, the UN arms control conference in Geneva was tasked with writing a new treaty, a “Protocol” to the BW Convention that would deter such violations.
By 2001, when this Protocol had grown to some 200 pages, the Bush administration called a halt to the negotiations. The diplomats who had enjoyed their many pleasant sojourns in Geneva understandably reacted with outrage and insisted that the negotiations be resumed. Less understandable was their rationale for negotiating this Protocol. It would be “legally binding,” they explained, and therefore effective. But if a dictator is willing to violate the BW Convention—presumably also a legally binding treaty—why on earth would he suddenly feel “legally bound” not to violate this Protocol as well? Evidently, as long as an illusion is politically correct it remains impervious to logic and evidence.
3
FIVE LESSONS OF THE NUCLEAR AGE
Those who’ve governed America throughout the nuclear age and we who govern it today have had to recognize that a nuclear war cannot be won and must never be fought.
—RONALD REAGAN (1982)
THE DRAMA OF THE NUCLEAR AGE teaches painful lessons. The continuing spread of nuclear technology is turning into a disaster of unimaginable proportions. It is moving beyond the control of any national policy or international agreements. It is the quintessential expression of mankind’s cultural split—the inability of institutions to rein in runaway science. How did we get pulled into this awful maelstrom? Specifically, how has the United States, originally the possessor of a nuclear monopoly, ended up facing a crisis of extreme vulnerability, a world where ruthless dictators, terrorist organizations, even doomsday cults and anarchists can some day possess a few nuclear bombs?
Eleven American presidents—from Harry S. Truman to George W. Bush—tried to prevent this from happening. At the beginning, the United States assumed the principal responsibility for the nuclear question, appropriately so since it emerged from the Second World War as the strongest power and the only nation that had built and used atomic bombs. Since then, Americans have devoted an immense effort to the nuclear problem—an intellectual, political, and military endeavor that has no parallel in all of military history. As a longtime participant in this effort, both inside and outside the Pentagon, I feel free to state that much of it took the form of abstract, cold-blooded theorizing of an eerily academic nature. Nonetheless, when all is said, a stellar accomplishment spans the entire period from 1945 to date. Nuclear war, and indeed any destructive use of nuclear bombs, has been averted.
Lesson One: Benevolence Is Not Enough
Drawing the most useful lessons from the nuclear age will require immersion in the rich and complex history of the last sixty years. I shall select only the most instructive episodes, but to convey the essence I need to start at the beginning.
During the Second World War, public opinion had become inured to devastating bombing attacks on cities—until the nuclear destruction of Hiroshima and Nagasaki. That event thrust a new emotive impulse upon strategic thinkers everywhere. Just one single bomb, oh Lord, could now destroy a major city! The wrenching revelation that one of nature’s most powerful forces had been unlocked slashed like a flaming sword into people’s consciousness, prompting statesmen and military leaders to search out a new approach to war and peace. For months to come, a flow of information deepened the emotive impact of the atomic bomb: first the gruesome photographs, then