The Cultural Cold War As History


ISSUE:  Summer 1993

“Without the cold war,” John Updike’s observant Harry Angstrom wonders in the final and most restful of the “Rabbit” novels, “what’s the point of being an American?” For nearly half a century, the geopolitical contest between two superpowers cast a giant shadow over public life, pervading it so fully that the national identity itself seemed to become disfigured. The Cold War defined the deepest fears and stirred the most disquieting anxieties, and its startling subsidence into more conventional international rivalries requires an historical perspective that does justice both to the intensity of that dread and to the reassuring strengths that democracy has revealed. Such double vision is especially necessary for the era—spanning roughly the decade of the 1950’s—when the global contest most forcefully impinged upon American society and culture. The confidence with which victory has been proclaimed should not distract the historian from the responsibility to assess the price of that eventual triumph—or to fathom how precariously the equilibrium between freedom and order was achieved.

Though “gentlemen do not read each other’s mail”—Colonel Henry L. Stimson’s famously quaint justification for abolishing the code-breaking Black Chamber of his Department of State in 1929—historians do; and among the letters of Bertrand Russell is a 1952 disclaimer to a Cambridge, Massachusetts, resident that “I have never been a Marxian. My very first book, published in 1896, contained a vigorous attack on Marx and I have never mentioned him except critically.” The author of The Practice and Theory of Bolshevism (1920), a volume based on a tour of Russia whose insights, had they been pondered, would have saved much suffering and grief, was appalled by the impact of anticommunism in suffocating liberal opinion. “In travelling about America,” the philosopher wrote his correspondent, “I have found that those who hold views similar to those which you express dare not give utterance to them except in the depths of the most inviolable privacy. . . . I think that if you were to live for some time in any Middle Western state or in California, you would realize that you have been living hitherto in a little island of enlightenment by no means typical of your country.”

The transference of power to the Republicans eight months later plunged Lord Russell into “views about world politics [that] are gloomy in the extreme,” he told another American correspondent. “Eisenhower appears to be a complete nonentity, and American policy during his presidential term will, I fear, be completely governed first by [Robert A.] Taft and [John Foster] Dulles and then by McCarthy. I do not know in what proportion to divide their conduct between wickedness and folly. It will, I fear, lead to World War before long.”

Such fear was not unique, spurring another Nobel laureate in literature to flee the United States in June 1952, after an earlier exile of 14 years, eight as an American citizen. “Barbarism is descending upon us,” Thomas Mann warned, “a long night perhaps and a deep forgetting.” The novelist envisioned Eisenhower as a “respectable” forerunner of fascism, another Hindenburg who would serve as a catalyst to empower a Hitler. Repression would deepen, and the whole ghastly cycle that had driven Mann from his native land might be repeated. Dying in Zurich in 1955, the venerated novelist could not know how false was the analogy that he had derived from recent German history, how wildly exaggerated his nightmares were. But it remains an historical problem how such pitch-black fears might have arisen, how the domestic Cold War itself could have become so menacing, and how and when its danger to both peace and freedom receded.

Far from perfecting more powerful instruments of intimidation and tyranny, the Eisenhower administration seemed instead to apply Friedrich Engels’ doctrine of “the withering away of the state.” Far from facilitating the spread of McCarthyism, the Republican President helped put the skids under it and slowly curtailed its excesses—first by ending the Korean War, then by letting the Wisconsin Senator have enough rope to overreach himself. Neither Thomas Mann nor Bertrand Russell realized that Ike’s election indirectly spared the United States from an even worse outbreak of political primitivism. Indeed, had the Chicago Tribune’s “Dewey Defeats Truman” headline been accurate that morning of Nov. 3, 1948, a Republican inauguration might have automatically detoxified the body politic of its accumulating rancors, and a party system ordinarily characterized by compromise and forbearance might not have been rocked by a summation of the New Deal-Fair Deal continuum as “twenty years of treason.”

No facsimile of European fascism erupted, nor did the Cold War trigger anything more apocalyptic and devastating. In 1948 the National Security Council had formulated certain strategic principles in Operation Dropshot, claiming that “peaceful coexistence. . .is an illusion and an impossibility,” and that “spontaneous collaboration between individuals in the [two systems] is evil and cannot contribute to human progress.” But such principles could hardly have been consistently applied. That year the Truman administration had also wisely spurned Lord Russell’s own unpacific recommendation to threaten a preventive nuclear attack if the Soviet Union did not agree to Western proposals for world government. During the Korean War the terrible loss of life that included 54,246 Americans must be tabulated; but that conflagration was localized, in part because no one in the Truman administration took seriously Senator McCarthy’s insistence upon marching to Moscow and Peking, if necessary, to open their files and discover the motives behind the Chinese intervention in the Korean War. (It was pointed out at the time in a book on General Douglas MacArthur that McCarthy’s demand would have produced “the costliest quest for primary sources ever undertaken.”) That quest was sensibly avoided. In Europe there was suffering, repression—and bloodshed, as Soviet tanks rumbled through the streets of East Berlin and Budapest and later Prague. But because neither the United States nor its NATO allies intervened in Eastern or Central Europe, the Warsaw Pact became the only alliance in history whose military campaigns were conducted entirely against its own members. Eisenhower and his Secretary of State bore some responsibility for encouraging the doomed revolts of 1953 and 1956, but their restraint avoided, at a tragic price, the world war that Lord Russell and others had anticipated.

II

Yet even if the worst auguries of the immediate postwar era were not realized, the domestic consequences of the Cold War were bad enough. Very few would now disagree with the drift of ex-Secretary of the Interior Harold L. Ickes’ 1950 opinion of McCarthyism—“a putrescent and scabious object that is obnoxious to the senses of sight, smell and hearing—a thing obscene and loathsome, and not to be touched, except with sterilized fire tongs.” That is also why revisionist portrayals of the 34th President have by now probably gone too far. The dress code of too many historians has included the “I Like Ike” buttons festooned on their lapels. Such scholars risk evading the paradox of the 1950’s, which is that civil liberties were infringed to preserve freedom in general. While democracy in the abstract was exalted, while “the West” was elaborately defended, particular rights were enfeebled or ignored, with the acquiescence of the Eisenhower administration. A libertarian lament for what Americans did to one another in the Ike Age does not require nostalgia for an earlier era; for the nation’s past does not radiate with any golden age of freedom of expression, when the protections of the Bill of Rights were comprehensively and satisfactorily ensured.

But during the Cold War freedoms were fettered and even dignity was trashed in a callow and indecent way. The Veterans Administration denied disability pensions to anyone convicted under the Smith Act, even veterans cited for heroism in the Second World War; the ill, aged, unemployed Dashiell Hammett was kept under FBI surveillance even after he got out of Federal prison; the death sentence of Ethel Rosenberg was not commuted because, President Eisenhower reasoned, the Soviets would then elect to recruit female spies. Communist Party leaders feared that they themselves would be thrown into concentration camps, as principal victims of an American fascism. J. Edgar Hoover did his best to prove the cliché that even paranoids can have real enemies. The Red Scare struck the bar, too. According to the most careful historian of this “unequal justice,” the descendants of Alexis de Tocqueville’s American aristocrats “mobilized an array of professional weapons: contempt citations, disbarment, exclusion, and subtler forms of coercive socialization” against fellow attorneys who were deemed too leftist. The United States Information Agency had a rule that prohibited mentioning—much less quoting from—Marx, Engels, Lenin or Stalin, which posed editorial difficulties for the Agency’s noteworthy scholarly magazine, Problems of Communism. (The rule was ignored.) Paul Robeson’s name was omitted from the official list of All-American college football players of 1917 and 1918 (retroactively reducing those squads to ten men) and from the 1955 edition of Famous Negro Music Makers.

The impact of the Cold War was too mixed, however, to merit the unqualified hostility of historians. Consider, since our ground time here will be brief, two developments involving the First and Fourteenth Amendments. At least a couple of hundred college teachers lost their jobs for political reasons, even as literary historian Granville Hicks was claiming that Communists were usually too incompetent to teach anyway—a deterministic criterion that would earlier have excluded Hicks himself. Other radicals, like the historian William Appleman Williams, were subjected to the harassment of the Internal Revenue Service as well as the House Committee on Un-American Activities, but managed with pluck to survive. When ideology was supposed to be about as dead as chivalry, liberalism was beleaguered and radicalism so rare that the campus ideal of a marketplace of diverse ideas shriveled. In such an academic ambience it was natural for ex-president Charles Seymour to boast that “the last two gifts I got for Yale in my administration were half a million for the teaching of religion. . .and half a million for American studies. . . . More power to this movement to strengthen Religion and Americanism.” When MIT’s Paul Samuelson, whose professional prestige was unsurpassed, could expel Marx from the primary tradition of economic thought as “a minor post-Ricardian,” it was not surprising that so many other academicians preferred to lay off the hard stuff. Though an alumnus of Frankfurt’s Institute for Social Research, Brandeis University’s Herbert Marcuse ignored Marx entirely in his major foray in cultural criticism in this era, Eros and Civilization (1955), not even giving the radical tradition from which he had sprung a decent burial.

The ideal of academic freedom was nevertheless given Constitutional protection during this very era. In 1952 two former law school professors, Justices Felix Frankfurter and William O. Douglas, articulated the principle of free inquiry in their concurring opinion in Wieman v. Updegraff, which invalidated a statute requiring a loyalty oath to be administered to Oklahoma state employees (including faculty in public colleges). “Teachers. . .must be exemplars of open-mindedness and free inquiry. They cannot carry out their noble task if the conditions for the practice of a responsible and critical mind are denied to them,” Frankfurter proclaimed; and Douglas agreed that for teachers to swear that they were neither subversives nor members of any Communist front constituted such a denial. Five years later the principle of academic freedom was confirmed by the overwhelming majority of the Court in Sweezy v. New Hampshire, after Paul Sweezy, a distinguished lecturer and economist (and, it so happened, a non-Communist member of the Progressive Party), had refused to answer a state legislature’s questions about his “opinions and beliefs.”

The impact of the Cold War on civil rights was also rather mixed. Because the struggle for black equality included Communist participation, white supremacists had a convenient way to smear the movement. The FBI’s notorious campaign to torpedo the effectiveness of Martin Luther King, Jr. has now been fully ventilated; and the bureau’s success can be partly gauged by the gratuitous malice of one of its own ex-informants, President Reagan, who in 1983 alluded to transcripts of phone taps (sealed until 2018) by throwing the following stink bomb: “We’ll know [if King was a Communist] in about 35 years, won’t we?” After 1954 Southern whites would have mounted a virulent assault on integrationists anyway, even had Communists absented themselves from the freedom struggle; but Cold War tensions made segregationist demagoguery a bit more truculent. To condemn “Dr. Karl Gunnar Myrdal,” that sinister “Red psychologist,” because his American Dilemma was put on the Supreme Court’s reading list was simply to hurl a chunk of raw meat at racist constituents who were quite carnivorous anyway; such slanders were rather marginal to the battle against civil rights.

One healthy effect of the Cold War was the growing realization that segregation gave communism a trump card in its competition for the allegiance of peoples of color. “The world is white no longer,” James Baldwin had proclaimed, “and it will never be white again”—a demographic datum that required a more serious look at racial injustice. In 1955, when a 14-year-old Chicagoan was murdered in Mississippi for allegedly wolf-whistling at a white woman and an all-white jury quickly acquitted the two killers, the federal government refused to intervene; but it could not avoid the nasty consequences that such cases as Emmett Till’s had in reducing U.S. standing in the Third World. It would be nice to know when politicians do the right thing for the right reasons, but the wrong reasons remain a fall-back position.

III

Cultural expression was thwarted and distorted during the domestic Cold War as well. Visiting England in 1956, Premier Nikita S. Khrushchev told the exiled Charlie Chaplin that he was “a genius” and announced: “They repudiate you—but we honor you.” Visiting the United States in 1959, Khrushchev was criticized for jamming the Voice of America, but thundered back: “Your great Negro singer Paul Robeson was denied the right to go abroad for some five to seven years. Why was his voice jammed, his concerts cancelled and invitations withdrawn?” Fear so pervaded Hollywood that Charles Buchinsky, a Pennsylvania coal miner’s son, had acted in 11 movies before seeing the advantage of changing his surname—which supposedly smacked of somewhere disturbingly beyond the Urals—to Bronson. Cincinnati’s first baseman Ted Kluszewski kept his name; but his team did not, switching from Reds to Redlegs. The blacklist in the entertainment industry was once an act of muzzling that dare not speak its name; by now a sagging shelf of books, plus articles and films, has made such political firings and exclusions an absorbing and exasperating saga of eggheads versus fatheads.

Because the FBI scrutinized so many artists and intellectuals, its files have become key reference works of the cultural Cold War (even if they might be classified under Fiction). Others were not allowed to visit America—like Graham Greene; some were not allowed to stay—like Arthur Koestler, whose Darkness at Noon (1941), plus his chapter in The God That Failed (1950) and some essays in The Yogi and the Commissar (1945), defy improvement as testimony to the horror of Communist despotism. Yet despite a Congressional resolution, Koestler’s request for permanent residency status was denied, because he was a former Communist. The obtuseness displayed in the political monitoring of culture shows how little point there is in studying American history without a congenital appreciation of folly. HUAC permitted such experts on communism as Adolphe Menjou and the mother of Ginger Rogers to sound off at taxpayers’ expense, while one Committee member had to ask Arthur Miller to identify Ezra Pound. The congressman was reassured that, though Pound was “one of the great poets of this century,” the former supporter of Italian Fascism was also “an anti-Communist.” In the Personnel Security Board’s 1954 report on the physicist J. Robert Oppenheimer, one of the greatest French writers of the century shows up as “a certain Dr. Malraux”; and the conspiratorial anti-Americanism of Europe’s leading Existentialist, recorded in an FBI memorandum, provoked Hoover to instruct his agents: “Find out who Sartre is.” No wonder then that Chaplin, who was not a Communist, once remarked that he would respond to a HUAC summons by arriving in his tramp outfit. Since the inquisitors never subpoenaed him, Chaplin instead had his protagonist in A King in New York ridicule the Committee by squirting its members with a fire hose. Released abroad in 1957, this bitter comedy did not premiere in the United States until 1976.

In the damage assessment of the domestic Cold War, the role of the Communist Party itself cannot be ignored. The government conspicuously abused its own power; but it is reasonable in this case to blame the victim, too, as the author of its own destruction. The party was of course much weaker than its persecutors and antagonists, but its relative impotence should not exempt it from historical and moral scrutiny. The tyranny of the Soviet Union was the original version; the CPUSA was therefore the dubbed or subtitled version—and was far less a threat to national security than to the fabric of humane values. What it worked for was monstrous, identifying with the power of a totalitarian state that true democrats naturally found abhorrent. If patriots in the 1950’s minimized or neglected the wrongs of their own society, complacently pretending that it was a field of dreams, the killing fields of the USSR left the American Communists indifferent or sometimes brazenly apologetic. If the politicians and publicists of the 1950’s reduced genuine friction and stifled dissent under the rubric of consensus, as though echoing Jefferson’s first inaugural address (“We are all Republicans, we are all Federalists”), they could be contrasted with the Soviet Premier who, even a year after his “secret speech” at the 20th Party Congress, proclaimed: “When it is a question of fighting against imperialism, we. . .are all Stalinists.” Thus the American comrades, to quote ex-President Nixon with reference to other “enemies,” “gave them a sword”; and the historian needs to operate on direct and alternating currents of criticism.

Though scholars continue to disagree in allocating and assessing responsibility for the Red Scare, its evaporation by the early 1960’s now seems fairly evident. When the big chill ended cannot be measured with precision and is open to interpretation. But when the “spirit of Camp David” emerged from Khrushchev’s 1959 visit with Eisenhower, when the Supreme Court began to hamstring the investigating committees and the inquisitors were no longer so grand, when the credits on films like Exodus and Spartacus—written by the blacklisted ex-Communist Dalton Trumbo—rolled in 1960, when the prison sentence of Communist functionary Junius Scales was commuted without the condition of his turning informer in 1962, when a beloved young president was shockingly assassinated by a former defector to the Soviet Union without inciting vigilantism, the powers of fanaticism were shown to be exhausted. Americans had gained a greater sense of proportion and, without opposing an interventionist and ardently anti-Communist foreign policy, were willing to sign the death certificate of the Red Scare at home. Autopsy notes reveal that the dominant impulses of American life were being channeled elsewhere. Those proclivities were not ideological but pecuniary, not public but private, not counter-subversive but consumption-oriented. In a sense the citizenry had been asked to choose between defending the ramparts of democracy, which required self-sacrifice, and celebrating the fruits of capitalism, which required self-gratification. In this conflict between duty and pleasure, between witch-hunts and tail fins, one need not be a member of Mensa to predict the winner.

To a considerable degree, the Cold War priorities of what Eisenhower called the “military-industrial complex” also enabled such conflicts to be finessed. The means of production were controlled by the same hands as the means of destruction; a booming economy not only provided guns and butter but managed to make the needs of warriors and consumers compatible. The push buttons that were designed to make housework easier came from the same laboratories as the push buttons for guided missiles. The brilliant inventor of the Polaroid camera, Edwin Land, was among the key intelligence consultants for the U-2 flights over Soviet territory. While working for Mattel, Inc., Jack Ryan helped design the most popular doll in history, Barbie, a fetish of commodities from the moment of her birth in 1958. While working for Raytheon, Ryan also helped develop the Hawk and Sparrow III missile systems. A descendant of Ralph Waldo Emerson, the versatile R. Buckminster Fuller applied systems theory to architecture by constructing the first geodesic dome in 1948; his best customer was the Pentagon. When Richard Nixon and Khrushchev came under one roof in 1959 to ignite “the kitchen debate” over the relative merits of washing machines and rockets, companies like General Electric, Westinghouse, and Goodyear were lavishing much of their advertising budgets on promoting the necessity and efficiency of their military weaponry. The secretary of defense at the time of the debate was Neil McElroy, who ran the world’s largest planned economy outside of the Soviet Union’s and had previously served as the CEO of Procter & Gamble. Yet even in 1957 there emerged a sign not only of the fallibility of private enterprise but also of an alternative route to military power and even domination of the heavens: the rise of Sputnik and the fall of the Edsel occurred almost simultaneously. But otherwise few phenomena of the 1950’s seemed more self-evident than the vitality of American capitalism, and such prosperity could not easily be squared with the demands of incessant political vigilance.

IV

Nor, in any event, do many Americans make good haters. They prefer reconciliation (or amnesia), and are usually ready to let the losers keep their horses for the spring plowing. No grudges seem to have hampered Lillian Hellman, an apologist for Stalinism who, after outmaneuvering HUAC in 1952, eventually joined the editorial board of The American Scholar (1973—78). The playwright received a standing ovation at the 1978 Oscar ceremonies after having been glamorously portrayed in Julia by Jane Fonda—all without owning up to her pretty-in-pink past any more than Martin Heidegger had felt obliged to explain his pro-Nazism. As a later Communist Party dissident explained, in refusing to talk to a reporter: “We don’t want to air our dirty Lenin in public.” Perhaps “the torment of secrecy” was a legacy of shame from the 1950’s that both sides—from bureaucrats and blacklisters to Communists and their sympathizers—shared. Long after any legal or political or economic penalties would have been imposed, some former “progressives” continued to deny or evade the implications of the cause they served, as the journalist Carl Bernstein discovered in confronting his own parents in Loyalties (1989). Such absence of candor not only perpetuated the postwar behavior patterns that aroused such suspicion in the first place, but also violated Bob Dylan’s injunction in “Absolutely Sweet Marie” on Blonde on Blonde (1966): “To live outside the law, you must be honest.”

When bold, open, and direct methods of resisting authority were used in the 1960’s, the residual forces of political orthodoxy suddenly looked anachronistic, as pacifists, civil rights activists, and other non-Communist radicals discredited the passions of the previous decade. When HUAC foolishly subpoenaed Jerry Rubin and Abbie Hoffman in 1966, the two proto-Yippies put on a clown show that topped the committee’s and robbed it of its power to intimidate. Later, when another whimsical antiwar activist, Father Daniel Berrigan, managed to play rope-a-dope with the FBI for apparently as long as he wanted to be a fugitive, the bureau that had served as the mission control of the domestic Cold War was stripped of its aura of sanctity.

No self-respecting scholars can ever conclude their enterprise without discerning the need for further research, and this tradition requires at least a modest proposal for anyone curious about the period: to consider the Cold War not only globally (as a geopolitical contest) but also locally. How did the anti-Communist fears of the 1950’s affect American communities? What were the consequences of the Red Scare for ordinary American life? Were academic communities, as Lord Russell claimed, atypical “islands of enlightenment” in this era? Pick a specific town, preferably an idyllic Norman Rockwell community—say, Stockbridge, Massachusetts, the village in the Berkshire Mountains which the illustrator himself chose when he moved from Vermont in 1953 (a couple of decades ahead of the exiled Aleksandr Solzhenitsyn). The radiance of Roosevelt’s Four Freedoms had been indelibly presented in the illustrator’s version of the purpose of World War II, with presumably universal ideals applied to an exclusively national setting, even if freedom of speech and freedom from fear were not quite as secure as they should have been a decade after that struggle. Rockwell’s Saturday Evening Post cover of May 26, 1945, depicting the young veteran’s return home to his family and neighbors and bashful sweetheart, suggested how domestic the pursuit of happiness was expected to be.

Stockbridge itself was not protected from the pressures of the Cold War, however, since its Austen Riggs Center was the haven where Erik H. Erikson worked after 1950, upon refusing to sign the University of California loyalty oath at Berkeley. The psychoanalyst had never been a Communist, though he had been a vulnerable non-Aryan refugee from Nazism; and Erikson’s principled willingness to risk his job and his security was not only a sign of his courage: the ease with which he found a home, and soon fame, also validates some of the benevolence and decency of “the American way of life” that Rockwell painted. Another resident of Stockbridge (though part-time) was Reinhold Niebuhr, an ideologist of the Cold War who nevertheless generally opposed the sacrifice of civil liberty. Perhaps no American intellectual was more influential in the postwar era, whether chairing the advisory committee of the Department of State’s Policy Planning Staff (under George F. Kennan), or explaining how communism itself represented the perverted offspring of the very Enlightenment which was the source of America’s own ideals. Practicing the humility that he preached, Niebuhr managed to reconcile anticommunism (especially abroad) with opposition to anticommunism (in its conservative and reactionary versions) at home, and conferred upon ADA liberalism a philosophical profundity.

Stockbridge was hardly typical, but then no community can be; nor could its three famous residents be immune from the sort of changes that were percolating just below the surface in the 1950’s, from the frustrated impulses that could be quelled for only a few more years. In the following decade, the sunny-side-up mood darkened; and the innocence that was so central to the national faith was ruptured. Only four of Rockwell’s 317 covers for the Saturday Evening Post, from which he parted in 1963, had depicted any blacks at all. A year later the Saturday Evening Post was excerpting what must have seemed an artifact from another world, The Autobiography of Malcolm X; and soon thereafter Rockwell would paint for Look magazine a grim depiction of murdered civil rights workers. Suddenly “the old swimming hole was polluted,” one curator observed. So rapidly had the velocity of history accelerated that Rockwell even did a cover for the May 1967 issue of Ramparts, a radical monthly, in which he portrayed the fierce philosopher who symbolized the opposition to the arms race and the Vietnam war that the Cold War itself had sanctioned: Bertrand Russell. It is interesting that both Niebuhr and Erikson were fascinated by the example of Gandhi, whose nonviolent resistance Moral Man and Immoral Society (1932) had already commended as the best method for American blacks to adopt in their quest for justice. Erikson would win a Pulitzer Prize for his study of Gandhi’s Truth (1969), in which he developed the notion of “pseudospeciation” as an indirect way of criticizing the intervention in Vietnam, which Niebuhr even more strongly condemned. In 1953 the psychoanalyst had discussed “totality and wholeness” at a major academic conference in Boston on totalitarianism. Eighteen years later he would be “rapping” with the Black Panthers’ Huey P. Newton, as the very cohesiveness of American society seemed imperiled, and its dream of escaping the complicity of history was shattered.

It was clear by then to many intellectuals that the nation’s faith in its own righteousness would have to be surrendered. The Cold War had been used to justify not only the restriction and violation of civil liberties, but also the subjugation of culture to politics—as in the revelation of CIA subsidies to the Congress for Cultural Freedom and Encounter magazine. Ramparts itself had helped not only to expose such corruption but also more generally to trace the nemesis of imperial power—how enmeshed the American polity and its leaders had gotten in the moral quicksand into which other countries had already sunk. Historical “necessity” became the mother of intervention, which had dangerous consequences. Such a foreign policy was a dramatic break in the continuity of American statecraft, which had once confined the United States to serving only as a moral example, wishing others well. This vision, which John Quincy Adams had so eloquently championed more than a century earlier, was discredited after the collapse of collective security to resist totalitarianism in the 1930’s had caused a world war. Those who had lived in the Age of Jackson could scarcely have foreseen the Age of “Jacson”—the code name of the GPU political assassin who planted an ice-axe in Leon Trotsky’s skull and was awarded, at Stalin’s command, the Order of Hero of the Soviet Union.

Such were the methods that the Cold War was intended to combat, and the gentleman’s prim code manifestly belonged to an earlier era rather than to the patriotic consensus that had motivated both the Rev. William Sloane Coffin (Yale ’49) and William F. Buckley (Yale ’50), both earlier tapped for Skull and Bones, to join the CIA, an organization dominated by men of pedigree and social connections. The change in the political conduct of gentlemen was so sharp, the global struggle seemed so durable, that it will take time to adjust to the shock to the system. Now that communism in the Soviet Union has disappeared (and even the USIA’s Problems of Communism is defunct), the seriousness of the Soviet diplomatic and ideological challenge may not be believed, and even its actuality impugned or represented as surreal—just as triumphalism and national pride may obscure the cost to liberal and civic ideals that the domestic Cold War inflicted. It is therefore up to historians—and to other thoughtful citizens—to see the public culture of the Red Scare bifocally, and to ponder the question that “Rabbit” Angstrom poses. That question is still at the center of the national destiny and identity.
