Masters at the Movies
A Life in American Cinema: The Nuclear Option
Paul Boyer, November 2008
It’s a bit ironic that films loom large in some of my scholarly work, because I came late to the movies. From childhood through my late teens, I attended the Brethren in Christ Mission in Dayton, Ohio, founded in 1912 by my grandfather. In those days the Brethren in Christ Church, a small Mennonite-related denomination, imposed strict rules of behavior and dress on its members, banning smoking, drinking, dancing, and moviegoing. (Years later, when I heard the joke that Southern Baptists oppose premarital sex because it can lead to dancing, I fully understood.) I recall one mission sermon in which the minister reported soberly that he had seen an acquaintance looking at a movie poster in front of a local theater.
On the other hand, my father loved technology, and he not only acquired a wire recorder (short-lived predecessor of the tape recorder) shortly after World War II, but also a 16-millimeter movie camera and projector. Thus, while commercial films remained taboo, we happily watched home movies in our darkened living room.
My initial exposure to Hollywood came when I went to southern California in 1951, at 16, to attend a church-affiliated private high school. My first movie was MGM’s 1951 Technicolor epic Quo Vadis, which I rationalized on the grounds that it portrayed early Christians heroically confronting Nero’s persecution. It was quite an introduction: Deborah Kerr played Lygia, the beautiful young Christian who converts her lover, the Roman commander Marcus Vinicius (Robert Taylor), to her faith. Peter Ustinov was Nero; Sophia Loren made her Hollywood debut as a slave girl; and Quo Vadis made film history for the most costumes in one movie: 32,000. In 1953 I fell in love with Leslie Caron in Lili. I still find myself humming “The song of love is a sad song, Hi Lili, Hi Lili, Hi lo.”
In Paris in 1955–57, where I served for two years as a conscientious objector on loan from the Mennonite Central Committee to UNESCO’s Coordination Committee for International Voluntary Work camps, I began to understand that films could play a cultural role beyond mere entertainment. James Dean in Rebel Without a Cause made a big impact, but more memorable was Alain Resnais’ harrowing documentary of the Nazi death camps, Night and Fog. As the audience watched in shocked silence, a woman suddenly screamed “Non!” and ran out of the theater.
Later, I encountered Humphrey Bogart and Ingmar Bergman at the film festivals held during Harvard’s exam week at the Brattle Theatre near Harvard Square. Casablanca and Wild Strawberries remain my all-time favorites. As Aljean Harmetz wrote in the New York Times in 1992, recalling those Brattle festivals, and specifically the repeated showings of Casablanca, “It was more than just going to the movies. It was sort of partaking in a ritual.”1
Though I didn’t realize it at the time, Stanley Kramer’s On the Beach (1959), based on Nevil Shute’s novel about humanity’s last days after a thermonuclear war, inaugurated the process by which my moviegoing and my scholarly interests began to intersect. I saw the film with friends on New Year’s Eve 1959 in Times Square, and as we emerged from the theater around midnight, the contrast between the revelers jamming the streets and the movie we had just seen struck me powerfully. (Much later, as a visiting professor at UCLA, I had the opportunity to relate this experience to Stanley Kramer, and to thank him for making On the Beach.)
In the early 1980s, when I began to explore the impact of nuclear weapons on American thought and culture, the importance of movies became obvious, and films became integral to my research, writing, teaching, and lecturing on this topic.2 Though not a specialist on nuclear films, I’ve benefited from the work of scholars such as Jack Shaheen, Mick Broderick, Jerome Shapiro, Robert Torry, and others.3
I soon realized that nuclear-age movies were not mere visual window-dressing. They influenced how Americans thought about nuclear issues, and they help one map the larger cultural and political trajectory of the nation’s nuclear history. MGM’s The Beginning or the End (1947), made with the Truman administration’s blessing, introduced egregious factual distortions to justify the atomic bombing of Japan.4 The Day the Earth Stood Still (1951), released as the early postwar movement for the international control of atomic energy gave way to cold-war imperatives, can be read as both an idealistic call for international control and a coercive insistence on America’s global hegemony.
Subsequent films both reflected and intensified successive waves of nuclear awareness and activism. The years from the mid-1950s through 1963 (when the Limited Nuclear Test Ban Treaty was signed) saw a surge of activism triggered not only by nuclear-war fears but also by the deadly radioactive fallout from atmospheric nuclear tests. As my generation well remembers, anxieties spiked during the 1961 Berlin crisis, when President Kennedy warned us that nuclear war could be imminent and urged everyone to build fallout shelters. Then came the October 1962 Cuban Missile Crisis, when the world teetered on the brink of nuclear catastrophe.5 Meanwhile, in On Thermonuclear War (1960) and Thinking about the Unthinkable (1962), strategist Herman Kahn of the RAND Corporation coolly discussed America’s capacity to survive an all-out thermonuclear war and explored the deterrent value of an automated nuclear-retaliation system, the so-called doomsday machine, operating beyond human control.
Unsurprisingly, filmmakers of this era addressed the nuclear danger. Alain Resnais’ Hiroshima Mon Amour (1959), with a screenplay by Marguerite Duras involving an affair between a Hiroshima architect and a French actress who had come to the city to make an antiwar film, juxtaposed documentary film clips of the devastated city of 1945 with the actress’s recollections of a wartime romance with a German soldier. As the images of destruction give way to scenes of bustling postwar Hiroshima, so the actress’s wartime memories fade. Though the theme is forgetfulness, the atomic-bomb scenes conveyed their own message to edgy audiences of 1959.
Nuclear fear inspired a wave of 1950s mutant movies. In Godzilla (1954), the fearsome monster emerges from Tokyo Bay, roused from its long slumber by atomic bombs. In Them! (1954), giant ants crawl from New Mexico’s A-bomb test site. In The Incredible Shrinking Man (1957), the unfortunate protagonist begins to shrink after his sailboat passes through a glowing radioactive cloud from a distant nuclear test.
Surging nuclear anxiety found more direct expression in On the Beach and in two 1964 films, Sidney Lumet’s Fail-Safe, based on the 1962 novel by Eugene Burdick and Harvey Wheeler, and Stanley Kubrick’s Dr. Strangelove, loosely based on Two Hours to Doom (1958, published in the United States as Red Alert), by Peter George, a British RAF officer. Both novels recount humanity’s narrow brush with catastrophe as U.S. nuclear bombers head toward Russia following breakdowns in America’s strategic-command structure.6
While Lumet’s earnest Fail-Safe, starring Henry Fonda, presented the story as a serious melodrama, Kubrick, collaborating with humorist Terry Southern, opted for black comedy. Beneath the humor, however, lay a deadly serious commentary on nuclear-deterrence theory, which Kubrick avidly followed, owning, as he wrote, “60 or 70 books” on the subject. In Lumet’s film, humanity survives (though Moscow and New York are lost) and a shaken U.S. president vows to work for world peace. Kubrick, by contrast, followed the logic of deterrence doctrine to its horrifying conclusion. As bomber pilot T. J. “King” Kong (Slim Pickens), hysterically waving his cowboy hat, rides a nuclear bomb down toward a target in Russia and Vera Lynn croons the World War II ballad “We’ll Meet Again,” mushroom clouds engulf the world. With a stellar cast including George C. Scott, Sterling Hayden, and a brilliant Peter Sellers in three different roles, Dr. Strangelove remains the unchallenged classic of the genre.
The next wave of nuclear movies came in the late 1970s and early ’80s, again paralleling a renewed surge of public awareness. The 1979 Jane Fonda film The China Syndrome, coinciding with the near-disastrous accident at Pennsylvania’s Three Mile Island nuclear power plant, intensified a campus-based anti-nuclear-power movement. Attention shifted to the nuclear arms race after 1980, with the Reagan administration’s nuclear-weapons buildup, belligerent cold-war rhetoric, and renewed emphasis on civil defense. (“If there are enough shovels to go around, everybody’s going to make it,” one Pentagon official famously commented.) Once again, movies reflected the public’s uneasiness and skepticism about official reassurances. The 1982 documentary The Atomic Cafe, alternately hilarious and frightening, evoked some of the more bizarre aspects of the Atomic Age. In WarGames (1983), a high-school computer geek (Matthew Broderick) hacks into the supercomputer at the NORAD (North American Aerospace Defense Command) base at Cheyenne Mountain, Colorado, and nearly triggers World War III. The Day After (1983), a made-for-TV film shown on ABC, portrayed the effects of a nuclear attack on Kansas.
The high-tech missile-defense shield proposed by Reagan in March 1983 was quickly dubbed “Star Wars”—an allusion to George Lucas’s 1977 space epic. As historian Stephen Vaughn would later remind us, Reagan’s wistful idea also weirdly echoed his 1940 Warner Brothers film Murder in the Air, featuring the Inertia Projector, a secret weapon capable of destroying incoming enemy aircraft with mysterious energy rays.7
With the cold war’s end, Americans’ nightmarish dread of global thermonuclear holocaust gave way to less apocalyptic dilemmas involving nuclear proliferation and radioactive waste disposal. As nuclear terror faded, so, too, did the morally engaged films of earlier decades. The H-bombs, ICBMs, and nuclear devices that litter such diverse 1990s action movies as Under Siege, John Travolta’s Broken Arrow, George Clooney’s The Peacemaker, Arnold Schwarzenegger’s True Lies, and Jackie Chan’s First Strike function mainly as plot devices or special-effects gimmicks. In the 1996 blockbuster Independence Day, the government’s attempt to use nuclear warheads to destroy a humongous alien spaceship hovering over Washington, D.C., is little more than a minor diversion amid the spectacular special effects. Such opportunistic exploitation of nuclear weapons for entertainment purposes contrasts starkly with earlier movies such as On the Beach, Dr. Strangelove, Fail-Safe, and The China Syndrome, which, while hardly ignoring box-office considerations, also took seriously the issues they addressed.
A rare exception to this escapist fare was Rhapsody in August (1991) by the octogenarian Japanese director Akira Kurosawa, in which an aging woman of Nagasaki flashes back to the moment when the bomb fell, while her visiting grandchildren, enjoying their holiday in the modern city, heedlessly ignore her.
My life at the movies, from Quo Vadis to Rhapsody in August and far beyond, took directions I could hardly have foreseen in 1951. As citizen, historian, and ordinary consumer of American popular culture, I have been shaped by more films than I can possibly remember. My minister grandfather, already in his forties when D. W. Griffith released The Birth of a Nation in 1915, never saw a commercial movie in his long life. (He died in 1971, just short of his hundredth birthday.) Nevertheless, to his credit, he never criticized the different choices I made on that matter, as on many others.
Paul Boyer is the Merle Curti Professor of History Emeritus and a former director of the Institute for Research in the Humanities at the University of Wisconsin-Madison.
2. For discussions of nuclear-age films, see my analyses in: By the Bomb’s Early Light: American Thought and Culture at the Dawn of the Atomic Age (New York: Pantheon Books, 1985; reissued, Chapel Hill: University of North Carolina Press, 1994), index entry “motion pictures related to atomic bomb”; Fallout: A Historian Reflects on America’s Half-Century Encounter with Nuclear Weapons (Columbus: Ohio State University Press, 1998), 95–110, passim; 176–77, 189–90, 200–35, passim; 242, 253; and “Dr. Strangelove,” in Mark C. Carnes, ed., Past Imperfect: History According to the Movies (New York: Henry Holt and Co., 1995), 266–69.
3. Jack G. Shaheen, ed., Nuclear War Films (Carbondale: Southern Illinois University Press, 1978); Mick Broderick, Nuclear Movies: A Filmography (Northcote, Victoria, Australia: Post-Modern Publishing, 1988), and “Is This the Sum of Our Fears? Nuclear Imagery in Post-Cold War Cinema,” in Scott C. Zeman and Michael A. Amundson, eds., Atomic Culture: How We Learned to Stop Worrying and Love the Bomb (Boulder: University Press of Colorado, 2004), 125–47; Jerome F. Shapiro, Atomic Bomb Cinema (New York: Routledge, 2002); Robert Torry, “Apocalypse Then: Benefits of the Bomb in Fifties Science Fiction Films,” Cinema Journal, 31:1 (fall 1991), 7–21. This is only a representative sampling of scholarship relating to nuclear-age film.
4. Mick Broderick, “The Buck Stops Here: Hiroshima Revisionism in the Truman Years,” in Rosemary Mariner and G. Kurt Piehler, eds., The Atomic Bomb and American Society: New Perspectives (Knoxville: University of Tennessee Press, forthcoming).
5. In 2007, on the NPR program Wait, Wait, Don’t Tell Me, White House press secretary Dana Perino (b. 1972) light-heartedly recalled her mystification when a reporter mentioned the Cuban Missile Crisis. “Wasn’t that like the Bay of Pigs thing?” she later asked her husband, according to reports.
6. In Fail-Safe, the crisis arises from a technological communications failure; in Two Hours to Doom (as in Dr. Strangelove), it involves a deranged SAC commander. Nevertheless, the similarities of the two novels resulted in a lawsuit, eventually settled out of court. See Vincent Lobrutto, Stanley Kubrick: A Biography (New York: D. I. Fine Books, 1997), 242–43.