Jennifer Smart is a 2D animator and colorist who has produced projects for the Middle East Institute and National Geographic. Currently a Cinema and Media Studies student at the University of Southern California, her long-term goal is to bring intellectually and emotionally challenging subject matter into the public eye through a marriage of illustrative animation and documentary film. She spends her free time listening to NPR and claiming she is “fighting the patriarchy” while her roommates roll their eyes.
How We Learned to Stop Worrying and Forget the Bomb
By Jennifer Smart
At 8 p.m. on Sunday, November 20, 1983, just over half of the adult population of the United States collectively tuned into ABC television to witness Armageddon. Nearly 100 million people forwent Thanksgiving preparations to confront a harrowing two-hour dramatization of the ultimate—and seemingly inevitable—denouement of the atomic age: nuclear holocaust in the silo-rich American heartland. Unlike previous unsuccessful attempts to visualize a global nuclear exchange via mismatched stock footage of fires and conventional warfare, the American television film The Day After (1983) gave the fodder of Nuclear Freeze Campaign pamphlets a seven-million-dollar production value. The film meticulously imitated an actual doomsday scenario with detailed and accurate images of mushroom clouds looming high over the Midwest, instantaneous vaporization, full-body first-degree burns, slow and agonizing starvation, mass late-stage radiation sickness, and inescapable societal degeneration. When viewed today, it is difficult to deny the B-movie elements of the production, like when victims of the initial fiery blast zap off the screen in a montage of low-tech yet disturbing “x-ray” silhouettes; however, as a primer on the horror of thermonuclear war, this was a graphic rendering of the abyss.
However, despite the movie’s careful research and horrific scenarios, public hand-wringing surrounding The Day After highlighted the paradoxes that characterize American dogma about nuclear war to this day. The film proved too grim for the war hawks, who believed in the prospect of limited and contained nuclear war, and too sanitized for the activist scientists, who forecast nothing but corpses and cockroaches for the next world war. Like most congressional studies of the time, the film conceived of nuclear war as a matter of a couple thousand bombs all deployed within a few days, yet individuals outside the military establishment postulated that a nuclear war could drag on for months or years and involve tens of thousands of warheads. That same year, a coalition of scientists helmed by Carl Sagan had warned the public that, were such a war to occur, smoke from countless burning landscapes could cast the planet into “nuclear winter” (Weart 233). In a war involving no more than one-third of the American and Soviet arsenals, one billion people would die immediately and another billion would be critically injured. The rest, the scientists reported, would be smothered under nuclear clouds in an earthly hell of darkness, cold, and death. In the middle of July, the temperature would descend to 40 degrees below zero (Goodman, “Do You Feel Safer?”). Such a scenario is arguably much more difficult, if not impossible, to portray with the necessary gravitas in a narrative format. If a story allows human agency to triumph and propagates the illusion that the bomb can be rendered toothless, it risks trivializing the nuclear danger. However, if that story faithfully demonstrates the reality of nuclear warfare as an existential threat to human survival, the author has to present a series of events so hopeless that it assails the agency of the protagonists and the viewers who identify with them.
After all, what sort of gripping tale could you tell about the aftermath of nuclear attack if no one survives? The human imagination cannot help but stagger under the moral weight of two billion lives.
This weight registers in the slump of my grandmother’s shoulders when I ask her whether she remembers seeing The Day After when it originally aired. She does not recall the film, but its mushroom cloud fables cast a pall over her formative memories. She quietly tells me about the 50s—how she dreaded her father’s turns standing sentinel on Lake Michigan, scouring the night skies for Russian bombers. She tells me about the 80s—the bomb shelter rusting in her backyard, and the years of canned food gathering dust in her medical school’s basement. These preparations seem vaguely absurd and naive in retrospect, but at the time, they symbolized the reassurance that life as she knew it could continue against all odds, even during the bleakest of nuclear winters. Over decades-long cycles of activism and apathy, coping with the Bomb shaped the American psyche the way deep-sea mountain chains contour ocean currents and weather patterns—in all kinds of hidden ways.
Even for those with an excellent intellectual grasp of the damage a nuclear weapon can do, its destructiveness is so psychologically unreal that it barely registers as a blip on our emotional radars. As the survivors of Hiroshima and Nagasaki dwindle, so does the salience of their testimonies. They transform into artifacts of the only instance of nuclear bombing in human history. No living American has seen their city reduced to rubble by modern warfare; for Americans, nuclear weapons only ever threatened to annihilate cities in distant countries. Because of this temporal and spatial distance, magnified by the Cold War, the Bomb’s psychological effects on successive generations were evident from very early on. As far back as the 1960s, a college teacher noted a distinct progression towards greater and greater ambivalence when he asked his students each semester about their feelings of nuclear fear. Those who had entered adolescence prior to the bombing of Japan “frankly admitted anxiety, but the next generation did not,” despite what the teacher determined to be an acute awareness of the ever-present danger (Weart 155). In the absence of any tangible reference point from which these students could imagine a nuclear attack, a potential nuclear war’s emotional impact dimmed year after year.
However, when the horrors of nuclear weaponry sporadically do penetrate the public consciousness—as in the autumn of 1983—a common psychological mechanism for coping with the resultant anxiety is denial. According to psychologist Jerome Frank, denial refers to “the exclusion from awareness of certain aspects of reality which, if allowed to enter consciousness, would create strong anxiety or other painful emotions,” and other psychological heuristics, particularly habituation, only bolster this denial (Frank 396). Humans, like all living creatures, cease attending to stimuli that persist unchanged over a certain period of time (Frank 396). We simply do not have the intellectual processing capacity to consciously deal with everything hitting us at once. No matter how objectively threatening, a continuing stimulus will blend into the environmental milieu given time. The first atomic bomb dropped on Hiroshima generated a shock wave that galvanized global efforts to ban nuclear weapons, as did atmospheric testing, the leap from fission to fusion technology in the 1950s, and the Cuban Missile Crisis in the 1960s. Yet as nations refrained from dropping more bombs, the images of war stagnated in the abstract. The end of the Cold War inaugurated a new age of nuclear affairs with multiple hopeful effects: the disintegration of the Soviet Union, the détente between capitalist America and Communist China, and the flurry of new treaties that promised to slash nuclear stockpiles from 70,000 warheads in 1986 to just 4,000 by 2012 all eased the minds of citizens and helped us stop worrying about the Bomb. Even the Bulletin of the Atomic Scientists relaxed its famous “Doomsday Clock” to seventeen minutes to midnight. Americans, for the first time in four decades, could step out of the cold shadow of imminent nuclear annihilation and breathe a sigh of relief.
With the nuclear threat dimmed, we now seem to live in a different reality that no longer puts human existence on a hair trigger.
It takes no feat of imagination, then, to see that nuclear imagery and terminology have lost their fearful resonance in contemporary American culture. My generation inherited a world in which talk of nuclear weapons, radiation, and reactors seldom showed up in media coverage or in serious private conversations. Grainy stock footage of incinerated trees and mushroom clouds, mingled in memory with “Bert the Turtle” instructing 1950s children to “duck and cover,” rooted my conception of nuclear war in the black-and-white tedium of history. Today, we associate doomsday with seemingly more pressing problems, like global climate change, post-9/11 terrorism, widespread epidemics and superbugs, and cyber-warfare. We witnessed no Day After-like national controversies about the ethics of depicting war between nuclear powers precisely because that brand of apocalyptic scenario was passé, presented mostly in 1960s and 80s pastiche and cheesy science-fiction fare. If the box office is anything to go by, nukes are so last century! By 2014, the concept of a nuclear holocaust was anachronistic enough to spur “adaptation” on the part of the Terminator franchise to “current cultural anxieties” (Sperling). The filmmakers opted to turn Skynet into a hulked-out NSA doppelgänger, emphasizing its terrifying surveillance network rather than its plan to bomb humanity back to the Stone Age. Rise of the Planet of the Apes (2011) and Mad Max: Fury Road (2015), both Cold War-era reboots, took similar revisionist tacks—swapping their mushroom clouds for melting ice caps.
When contemporary media does depict nuclear devastation, it rarely divorces itself from the Cold War context. The proliferation of post-apocalyptic video games, like Duke Nukem (1991), Metal Gear Solid (1998), and most notably the Fallout franchise, had established a new, irreverent tone toward the Bomb by the early 2000s. Set in an alternate world in which society prioritizes nuclear technologies over the miniaturization of electronics, Fallout is a retro-futuristic role-playing game that allows players to wander the post-nuclear wastelands of various American metropolises. The third installment, Fallout 3 (2008), opens by spawning the player in Vault 101, a gigantic bomb shelter replete with icons of 1950s culture. The character of the “Vault Dweller” celebrates their tenth birthday in what resembles a classic American diner, complete with vinyl-upholstered booths, bar stools, and a Wurlitzer 1015 jukebox. Juvenile delinquents, called “Tunnel Snakes,” don leather jackets and slicked hair in true greaser fashion. The game’s mascot, “Vault Boy,” issues grave warnings about nuclear blast effects in propaganda posters and the manual, all the while maintaining his cheery grin in the tradition of “Bert the Turtle.” Fallout plunders this Cold War kitsch to comment on the ironies of the early atomic age, when apocalyptic violence stewed beneath a white picket fence, and in doing so forces a critical distance between the player and the scenario presented. Historian William Knoblauch writes that “the game’s reliance on 1950s imagery suggests that nuclear war was only ever really possible during the early Cold War. Put simply, Fallout 3’s apocalypse is born of a distant, but culturally familiar, 1950s era,” so players never confront the modern-day implications of nuclear warfare (Blouin 134). The game’s satirical approach also discourages emotional engagement with in-game characters affected by the fallout.
Rather than empathizing with the Vault Dwellers, the game encourages players to scoff at the past’s naïveté and take comfort in our cynical knowingness. We patronize and think: Look at those idealistic schmucks. Of course their faith in the atom was their downfall. But we know better. We ran the gauntlet of the atomic age and came out wiser for it, when, in reality, we would probably have acted the same way under those circumstances.
By confining the nuclear age to the temporal extent of the Cold War, the people who suffered through the hideously prolonged expectation of disaster and the false alarms of the Cuban Missile Crisis and the mid-80s escalation can consign those traumas to the past tense. The predominance of nostalgia in popular culture and the memorialization of deteriorating nuclear monuments only exacerbate this tendency. New York Times journalist George Johnson recently wrote a piece ruminating on his visit to the “Russian Woodpecker” in Ukraine, a “gargantuan steel structure” lined with radio towers built to give the Soviet Union early warning if the U.S. launched a nuclear onslaught. He describes his penchant for collecting memorabilia, like graphite from the first nuclear reactor and rocks from the Trinity Site in New Mexico, as a way to “make the abstractions of nuclear fission and nuclear politics feel more real,” as though grasping for anchors in the present brought him nothing but rust (Johnson, “A Time for Revisiting Real Fears”). One can easily find analogues in modern-day Los Angeles: about 225 Civil Defense sirens decay atop poles throughout the city, often stripped of their paint and hidden among palm trees and skyscrapers. Even though the last siren tests took place in the late 1980s, self-described siren hunters and Cold War history buffs now seek out and photograph them like rare birds. The rediscovery and exhibition of nuclear icons as sites of cultural heritage often cultivates these artifacts’ “chaotech” aesthetic of corrosion and decay (Brummett). It is as though the reclamation of the Cold War apparatus by natural forces in some ways retrospectively subverts the climate of immense, Sisyphean conflict that suffocated Cold War culture. Although this process of embalming does enable historical reflection, it also perpetuates the notion that society has successfully and permanently exorcised the specter of nuclear attack.
However, while nuclear symbols root themselves in the iconography of the 20th century, Americans should realize that nuclear weapons are far from obsolete.
Now, more than two decades since the end of the Cold War, around 14,900 nuclear weapons, some far more powerful than the Fat Man and Little Boy bombs dropped on Japan, continue to pose an escalating threat to humanity and the biosphere. Globally, these nukes are divvied up among nine nations: the U.S., Russia, the U.K., France, China, India, Pakistan, Israel, and North Korea. Recent studies by atmospheric scientists show that even a “contained” regional nuclear conflict between two of these countries, like India and Pakistan, each using just half of its current arsenal, could produce climate repercussions even more severe than those predicted by the 1980s doomsday models (Robock and Toon 71). Additionally, the recent resurgence of far-right, authoritarian strongmen, like President Donald J. Trump, in most, if not all, nuclear-armed countries places the world in a new Cold War environment, as these leaders openly entertain the use of such weapons. To quote the editors of the literary magazine n+1, “the installation of an extralegal and extrajudicial personality into the presidency — an office that has been expanded, through Republican and Democratic administrations, decade after decade, to dangerous excesses of power”—has put more than 7,000 warheads at the whims of a thin-skinned real estate mogul (Breitman, “Unpredictability”). There is no failsafe: the president has the sole legal authority to order a unilateral and arbitrary nuclear strike. The only barrier is a web of norms, taboos, and phobias, which are feeble psychological deterrents for a figure who has advocated for an arms race with Russia and thinks the military should be more “unpredictable” with nuclear weapons (Breitman, “Unpredictability”).
To historians familiar with the Bomb’s psychological impact during the Cold War years, the possibility of sleepwalking into nuclear annihilation comes with a powerful sense of déjà vu. During the first wave of nuclear awareness in the late 1940s, images of mass destruction were anticipatory, summoning visions of crumbling civilizations, a reality that would not become possible for another two decades. By contrast, the holocaust scenarios of the 1980s were conservative to a fault, continually outpaced by an ever-shifting playing field. Yet we have adapted accordingly. In 2017, as the Doomsday Clock approaches midnight, our would-be nuclear prophets shout scientific near-certainties at elected officials, press, and public alike, only to hear their warnings returned in mocking echo.
Works Cited
Bartlett, Andrew. “Nuclear Warfare in the Movies.” Anthropoetics – The Journal of Generative Anthropology, vol. 10, no. 1, 2004.
Blouin, M., M. Shipley, and J. Taylor, eds. The Silence of Fallout: Nuclear Criticism in a Post-Cold War World. Newcastle-upon-Tyne, United Kingdom: Cambridge Scholars Publishing, 2013. ProQuest ebrary. Web. 7 Feb. 2017.
Breitman, Kendall, and Kevin Cirilli. “‘Unpredictability’ on Nukes Among Trump Keys to Muslim Respect.” Bloomberg.com. Bloomberg, 23 Mar. 2016. Web. 9 Feb. 2017.
Brummett, Barry. Rhetoric Of Machine Aesthetics. 1st ed. Westport, Conn.: Praeger, 1999. Print.
Fallout 3. Rockville, MD: Bethesda Softworks, 2009. Computer software.
Frank, Jerome D. “Nuclear Arms and Prenuclear Leaders: Sociopsychological Aspects of the Nuclear Arms Race.” Political Psychology, vol. 4, no. 2, 1983, pp. 393–408. <www.jstor.org/stable/3790947>.
Goodman, Ellen. “Do You Feel Safer?” The Washington Post, 5 Nov. 1983. Web. 2 Feb. 2017.
Johnson, George. “A Time for Revisiting Real Fears: Sobering Relics of a Nuclear Threat That Has Spread.” The New York Times, 17 Nov. 2014.
Robock, A., and O. B. Toon. “Local Nuclear War, Global Suffering.” Scientific American, Jan. 2009, pp. 74-81.
Sperling, Nicole. “See Our ‘Terminator: Genisys’ Cover Story.” Entertainment Weekly, 2017. Web. 2 Feb. 2017.
Weart, Spencer R. The Rise of Nuclear Fear. Cambridge, US: Harvard University Press, 2012. ProQuest ebrary. Web. 2 February 2017.