I was born under the threat of nuclear devastation, midway between President Kennedy’s inaugural address and the Cuban Missile Crisis, an event that came the closest of any in history to ending human life on Earth. My childhood and young adulthood were spent with a constant uneasiness, the knowledge that—by miscalculation or sheer accident—any day could bring ultimate destruction. One joke I remember from my teenage years concerned what to do in case of a nuclear attack: bend way, way over and kiss your ass goodbye.
At his inauguration, Kennedy described the United States as engaged in a “long twilight struggle,” a conflict different from any before it, one that would be won by diplomacy and an ideological struggle for hearts and minds as much as by military might. Although Kennedy quickly blundered into a botched invasion of Cuba, the United States did, somehow, make it through those years, whether by keen strategic calculation, dumb luck, or a combination of both. While it was not understood at the time, the Soviet Union spent the next 25 years decaying from within, losing the ideological struggle between communism and capitalist democracy by failing to deliver material prosperity to its own people. When the Berlin Wall came down in 1989, the West had won. Most important, the threat of dying any day by nuclear fire was over—or at least greatly reduced.
Humanity seemed ready to surge forward into a harmonious future. Francis Fukuyama was only the most prominent of a number of voices predicting the “end of history,” arguing that the totalitarianism of the Nazis and communists was through, that liberal democracies had won, and that what remained was a kind of mopping-up operation as the entire world, bit by bit, adopted the freedoms of the West. The 1990s did appear to be a time of great economic gain and of democratization, notably in Latin America and South Africa, although dangers lurked beneath the surface of events. The 21st century quickly brought to the fore two counternarratives to the end of history. The first was the 9/11 attack and the subsequent War on Terror—along with more conventional wars in Afghanistan and Iraq. The second was the 2008 financial crisis, from which the world continues to reel. Still, neither of these reversals poses the kind of existential threat to our very survival as a species that nuclear war did. A longer, slower process is doing that—climate change, in conjunction with other environmental threats, notably biodiversity loss. Indeed, stories have begun to leak out that an ominous report is forthcoming from the Intergovernmental Panel on Climate Change, confirming that global warming is likely to rise more than had been expected just a few years ago, and that its effects are extremely unpredictable. This seems less devastating than the nuclear threat, yet, eventually, it might prove just as dangerous. We are out of the fire of nuclear devastation and into the frying pan of slow heating, a longer twilight struggle than the cold war proved to be, one that tests us in new ways.
Because the threat is measured in decades and centuries rather than hours and minutes, it does not have the same urgency as nuclear apocalypse. Climate change is also impossible to pin on a single enemy or moment of human folly, making it harder to mobilize against. Our collective actions today might mean devastation in fifty years. Furthermore, for many years the emissions of the largest overall emitter, the United States, took place when global warming was only a controversial theory, so it is hard to blame individuals of that time, and therefore difficult to assign responsibility. And we can never be absolutely certain of the long-term consequences, so cynics can cast doubt, playing on the human tendency to avoid action unless pressed. Cause and effect are difficult to connect. What lessons can we learn from our previous, miraculously successful, encounter with the nuclear threat?
One factor is to make use of time and patience and to have a long-term plan, as the United States did in containing the Soviet Union (although the execution was messier than is often remembered). The world began to face up to global warming in a series of international conferences, culminating with the 1997 Kyoto Protocol, the first meaningful international agreement to reduce fossil fuel emissions. The mere existence of such a document shows commendable foresight—following the disasters of the twentieth century, the world seemed to be coming together to solve a conundrum, showing concern for future generations. Yet Kyoto was flawed, excusing developing countries from capping carbon emissions. Since these countries were not responsible for the crisis, this seems defensible, but it gave room for the United States to refuse to sign. Nor did the sole remaining superpower come up with a reasonable alternative plan. Historians might blame American leadership in the 1990s for failing to stop the growth of al Qaeda. And certainly, the rush to deregulate the financial sector will be condemned. Yet, in five or six decades, these will likely be dwarfed by our pathetic response to the burgeoning climate crisis, which will be seen as America’s greatest failure.
In a way, the problem stems from the fact that our scientific and technological capacity has grown faster than our wisdom in using it. Technology can be blamed for the disasters of the 20th century, from the trench and gas warfare of World War I to the bombings and blitz of World War II to the mass killing technologies of the Holocaust. In his Battle of Britain speech, with Nazi power at its zenith and air assaults on London killing civilians daily, Winston Churchill spoke of “the lights of perverted science,” with warfare brought to new heights of destructiveness (and this was before widespread public disclosure of the Nazi death camps and the use of Zyklon B, developed as an insecticide, on humans). Twenty years later, Kennedy asked that “both sides seek to invoke the wonders of science instead of its terrors.” Shortly after, Rachel Carson asked for a wiser use of pesticides, which were at the time devastating wildlife, and an understanding of humanity’s place among ecosystems. Indeed, concern about nuclear radiation was a prime motivation behind the environmental movement (Ropeik, 2012).
Yet, at times, we do seem to have found new wisdom to go with new technological power. In 1948, in the wake of the Holocaust, Eleanor Roosevelt led the movement that culminated in the signing of the Universal Declaration of Human Rights at the United Nations, borrowing from the rights that Thomas Jefferson elucidated in the Declaration of Independence. And the environmental movement of the 1960s really did clean up the air and water in the United States, and led to similar efforts around the world. If I am often embarrassed by my country’s many shortcomings and outright failures, I can also be proud of our leadership. By 1987, the foresight of the environmental movement was mixed with a dollop of human rights in the Brundtland Declaration, the bulwark of the sustainability movement.
Yet our success has also led in the direction of complacency and greed, toward the disasters the 21st century is bringing upon us. And the fire of nuclear threat has not been eliminated. As environmental conditions worsen, they may very well lead to new conflicts. The greatest danger is that the frying pan we are all in will exacerbate tensions, leading to “grease fires,” local nuclear exchanges. India and Pakistan are a notable flashpoint among many. Presidents from Reagan to Obama have called for the eventual elimination of nuclear weapons, but we still have a ways to go. Meanwhile, the quest for an international solution to climate change lies in tatters, ruined by American intransigence, global consumerism, and China’s economic explosion. As James Baldwin quoted from an old slave song, “God gave Noah the rainbow sign / No more water, the fire next time.” We face the floods of rising sea levels and monster hurricanes, as well as a continuing nuclear threat. The choice might not be between the frying pan and the fire—we might have both.
Ethan Goffman is Associate Editor of Sustainability: Science, Practice, & Policy. His publications have appeared in E: The Environmental Magazine, Grist, and elsewhere. He is the author of Imagining Each Other: Blacks and Jews in Contemporary American Literature (State University of New York Press, 2000) and coeditor of The New York Public Intellectuals and Beyond (Purdue University Press, 2009) and Politics and the Intellectual: Conversations with Irving Howe (Purdue University Press, 2010). Ethan is a member of the Executive Committee of the Montgomery County (Maryland) Chapter of the Sierra Club.