
The Great War and American Memory


ISSUE: Winter 2003

Academic historians have a penchant for tagging simple things with fancy titles. One example is interviewing or seeking out recollections by participants in events. Right after World War I such journalist-biographers as Ray Stannard Baker on Woodrow Wilson and Burton J. Hendrick on Walter Hines Page did those things without thinking twice about them. Twenty years later, however, when academics belatedly discovered the value of this kind of evidence, they dubbed their practice “oral history.” There is also the matter of posing alternatives to what actually happened. Historians cannot avoid doing it, and in fact they do it all the time. But among academics it is customary to disdain such speculation as “iffy history”—customary, that is, unless the practice goes under the fashionable rubric of “counterfactualism.”

Another favorite academic term is “discursive,” which usually means rambling or meandering. Call it what you will, a discursive approach is the best way to take up the subject of what World War I has meant in American memory. One bit of discursiveness is to pose the question that Shakespeare has Juliet ask, “What’s in a name?” What lies behind the different names that the British and Americans give to the cataclysmic conflict that raged from 1914 to 1918? The British called it the “Great War,” and they have persisted in using that name even after a second conflict of vaster scope and devastation marked this earlier one as the first in a series. That name never caught on in the United States. Instead, Americans have usually called this conflict the “World War,” and they did so even before the advent of its successor. The term originally arose in Germany to refer to the means by which that nation meant to achieve “world power”—Weltkrieg as the path to Weltmacht. In the United States, Theodore Roosevelt used the term in 1915, but it really came into use in 1919 and 1920, when German writings about the war were translated and published here.

The difference in usage is easy to understand. Europeans may disagree, but American intervention was what really transformed the 1914-1918 conflict into a true world war. This was not just a matter of expanded geographical scope. Rather, both American entry into the war in April 1917 and the ideological justification that President Woodrow Wilson gave for this action changed the stakes of the conflict. The Bolshevik Revolution six months later further altered those ideological dimensions. True, before 1917 both sides had tried to make mischief for their adversaries by fomenting nationalist revolts among the other side’s subject peoples or encouraging radical dissent in their homelands and borderlands. In 1917, two German ploys—the attempt to entice Mexico into the war against the United States with the Zimmermann telegram and the repatriation of Lenin to Russia by means of a sealed train—would become the most famous examples of such strategic and ideological guerrilla raiding. But it was really American entry, together with Wilson’s war aims and peace program, that began to make this war something more than a traditional European national-dynastic-imperial conflict carried out by modern technological means.

Another bit of discursiveness is blatant plagiarism. All but one word of the title of this essay are openly and egregiously pilfered from Paul Fussell’s The Great War and Modern Memory. To anyone interested in World War I or military history or historical memory, this book comes as nothing short of an epiphany. When I first read it, I felt the way John Keats did when he first looked into Chapman’s translation of Homer:

Then felt I like some watcher of the skies
  When a new planet swims into his ken;
Or like stout Cortez when with eagle eyes
  He star’d at the Pacific— . . .
Silent, upon a peak in Darien.

Just as Chapman enabled Keats to overcome his lack of knowledge of Greek, Fussell enabled me to see World War I, and how it has resonated through people’s consciousness ever since, in ways that I could not have done before. In all honesty, I have to confess that Fussell’s lengthy treatments of British poets, fascinating as they are, did not capture my fancy as much as other things in the book. For example, there was his observation about how disconcerting it was for the Tommies of this war to go back and forth between the warmth and comfort of home and the rigors and horrors of the trenches within a few hours—which he contrasted with the long separations that American GI’s of World War II experienced during their overseas deployments. Another fascinating observation of Fussell’s is how literate the British soldiers of World War I were and how many of them carried The Oxford Book of English Verse in their kits. This was the moment of highest literacy in history in two ways. First, cheap printing and mass public education had spread the ability to read broadly across the Western world; second, other media, such as radio and motion pictures, had not yet undermined the primacy of print in conveying information, opinion, and entertainment.

My favorite of Fussell’s insights is his recounting of how in 1917 British mining troops dug under the German lines in Belgium and filled seven tunnels with thousands of tons of explosives. When detonated, those explosives produced the loudest human-made sound up to that time, audible even across the Channel in England. The ensuing havoc and confusion almost enabled the British to break through the German lines. Only their own disorganization and lack of initiative prevented them from exploiting the breakthrough. Perhaps that failure also sprang from what John Keegan has more recently emphasized—the lack of reliable “real time” communications between commanders and front-line troops. Something else went wrong, too. The explosives in only five of the seven tunnels went off. Of the remaining two, one blew up in the 1950’s, while the other remains buried and unexploded to this day. Taken together, the massiveness of the operation, the force of the technology, the persistent ineptitude of the commanders, and the long-running aftereffects seemed to me to furnish a compelling metaphor for the meaning of this war, called by whichever name.

What I wish to do is bring this war home to America. Specifically, I would like to add a few further thoughts about “what’s in a name” by offering a reflection on why Americans do not call it the Great War. I would also like to try to connect this war to American memory. I have no pretensions to being anywhere near the literary scholar that Fussell is, so I shall not even try to relate this war to American literature. Others have done that well, especially with such writers as Ernest Hemingway and John Dos Passos. I have nothing to add to what those scholars have written, except to offer one observation. Long ago, Malcolm Cowley and others noted that the members of the famous “lost generation” found disillusionment on the Western Front mainly because they were looking for it already. Far be it from me to denigrate anyone’s experience in war, but I do think that, especially in the case of Hemingway, there is a disparity between cause and effect. He was wounded shortly after his arrival in Italy and spent the rest of the war in hospitals. Neither he nor the other American writers went through the same experiences of trench warfare as such British writers as Wilfred Owen and Siegfried Sassoon. I think that disparity is emblematic of the differences between the European and American experiences and memories of this war.

II

It is worth considering in general what the American experience and contribution were in this war. First of all, the United States entered the war later than any other major belligerent and was in the war the shortest time. Barely 19 months elapsed between intervention and the Armistice. Furthermore, except for token units, no American ground forces saw action on the Western Front before early 1918, and the vast majority of them were in combat only during the last three months of the war. Likewise, in one sense, those forces were not absolutely essential to winning the war. The critical moment had come in the spring of 1918, when the British and French stopped the massive German offensive before any appreciable numbers of Americans could reach the front lines. After that, the Germans had no chance of winning.

This observation is not meant to denigrate the American contribution to the Allied victory. In several respects, this country’s intervention was absolutely vital to the survival of the Allies. In 1917, America’s being in the war forestalled a disastrous financial collapse by the British, which would have curtailed or might even have cut off their essential overseas supplies of food, munitions, and other materiel. Likewise, the influence of the American admiral William S. Sims in favor of the convoy system, together with the availability of American naval vessels for convoy duty, blunted the effectiveness of the Germans’ all-out submarine campaign. In 1918 perhaps the most important American contribution was an imponderable one. Who can argue against the proposition that knowing that “the Yanks are coming” played a huge part in nerving the British and French to hold out in their last-gasp effort against the German offensive? Moreover, vital as their feat was, all that the British and French succeeded in doing in the spring of 1918 was not to lose the war. They had no strength to turn the tide. It took the vast, growing numbers of doughboys of the American Expeditionary Force, the “AEF,” to break the stalemate on the Western Front and achieve victory.

A bit of “counterfactualism” is in order here. Let me pose the question of what the war might have looked like if it had lasted longer. The Allied war plans called for a crossing of the Rhine and an invasion of Germany early in 1919, with a timetable for total victory by summer. The major role in that last planned phase of the war necessarily fell upon the AEF, because it was the only force large enough to pull off such offensive operations. In addition, that last phase of the war might have looked different in weaponry from the previous four years. The British had finally succeeded in building more reliable tanks, and they were learning how to deploy them effectively in combat. So were the Americans. Such bright young officers as Major Dwight D. Eisenhower never made it to Europe because they were still stateside, training in tank warfare. Similarly, in the air, planes with more powerful engines were coming into production, capable of carrying much larger payloads of bombs over much longer cruising ranges.

It is not hard to surmise what that longer war would have looked like. It would have looked much more like World War II. In contrast to the previous four years of stalemate in the trenches, this aborted later phase would have been a war of movement. Perhaps the tanks would not yet have spearheaded a blitzkrieg, but they would almost certainly have breached enemy lines and permitted the Allied forces to keep advancing. Also aiding those advances would have been larger-scale air campaigns, with tactical bombing in support of the ground operations. Doughboys would have moved into the German heartland, and their commander, General John J. Pershing, might have realized his fondest wish of leading a victory parade in Berlin down Unter den Linden. Politically, “Black Jack” Pershing’s two worst breaks were the tragic death of his charming and savvy wife in 1915 and the war’s sudden ending before he could lead the conquest of Germany. By itself, that achievement might have been enough to overcome his political deficiencies and make him president. As it was, World War I would join Korea and Vietnam as the only American wars not to propel a military commander or hero into the White House. Instead, this war would spawn a civilian hero president in the person of Herbert Hoover.

The brevity of participation and the unfulfilled military ambitions of World War I go a long way toward explaining why Americans have never called it the Great War. It was not just that the term “world war” fit American notions of what our role in the conflict had meant. Rather, it was the inescapable circumstance that in certain critical respects this was not a great war for the United States. This conflict was destined to share the fate of all but two American wars: with only the Civil War and World War II excepted, it would become another of America’s “forgotten wars.” In many ways, such a fate is undeserved. World War I witnessed the first rapid, full-scale mobilization of manpower and the industrial economy for a major conflict. The raising, training, and equipping of five million troops and the dispatch of two million of them overseas within less than a year and a half were remarkable feats. This experience paved the way for fighting the rest of the wars of the 20th century. But two incontrovertible facts remain: not that many of the American forces saw action, and their role in the field did not appear to be indisputably decisive. The contrast with World War II is striking, and with that second war coming soon after this one, it is not hard to see why this one fell into the shadows in American memory.

Still, those shadows did not fall right away, nor did they fall with equal darkness for everyone in the United States. For American military officers, memories of World War I shaped thought and conduct in important ways. With the significant exceptions of Eisenhower and Omar Bradley, all the leading commanders of World War II had gone overseas with the AEF. George Patton and Douglas MacArthur both saw action on the Western Front and displayed the more and less attractive traits of character that marked their careers a quarter century later. Billy Mitchell developed his ideas about air power out of his experiences in France. But the most important formative experience for later command belonged to George Marshall. One of the strengths of John S. D. Eisenhower’s recent history of the AEF, Yanks, is its recounting of how Marshall’s different assignments in 1917 and 1918 prepared him so well for his role in World War II. Yanks could well have been subtitled “The Education of George C. Marshall.”

Before those officers’ and other veterans’ experiences in World War I could affect their conduct in another war, two decades intervened in which memories of this war were strong and affected many aspects of American life. Looking at these other aftereffects, what is most striking is the similarity rather than the difference between European and American behavior. This similarity surfaced almost at once after the Armistice. Despite his differences in rhetoric and ideology with the other Allied leaders, Wilson displayed the same divisiveness about peacemaking that they did. Looking back at the Paris peace conference, one is amazed that Wilson and his cohorts were able to put together a settlement at all. He wanted and got the League of Nations. He also wanted relatively non-punitive treatment of the defeated powers, and critics are still disputing how much or how little of that objective he attained. Otherwise, the “Big Four”—Wilson, David Lloyd George, Georges Clemenceau, and Vittorio Orlando—clashed repeatedly over territorial claims, reparations, colonial adjustments, and the nationalist aspirations of formerly subject peoples. Their divisiveness—not the specific terms of the Treaty of Versailles—was what doomed it to failure. With so little common agreement in the face of the defeated foe, it was no wonder that the victors lacked the will to maintain this settlement. The United States was the first to defect, through its failure to ratify the treaty and join the League.

III

The domestic controversy that kept the United States outside the Versailles settlement—better known as the “League fight”—hinged in part on memories of the conflict that had just ended. Wilson maintained that the United States must become a fully participating member of the League and guarantor of the peace treaty in order to prevent another world war of even larger scale and destructiveness—which he believed was a grave danger. His critics and opponents not only recoiled from what they regarded as sacrifices of sovereignty and unwise or unnecessary commitments under Wilson’s plan, but many of them also scoffed at the likelihood of the defeated powers rising from the ashes. Only a small minority among the participants in the League fight either rejected the notion of some kind of American interest in ensuring the victory just won or criticized the settlement as too harsh on the vanquished. Imperfect as the measures of public opinion then were, all indicators showed that very few people held to isolationist views and rejected overseas commitments out of hand. In short, there was no “retreat into isolation” during the League fight, and almost no one expressed disillusionment with the war or its outcome. Those things came later.

For the first decade after the war, neither isolationist sentiment nor disaffection from the war gained much ground. Isolationism remained the faith of a small sect of politicians and intellectuals, nearly all of whom were ranged on the left side of the political spectrum. During the 1920’s this group played an interesting and often constructive role in American foreign policy. Under the titular leadership of Senator William E. Borah, who became chairman of the Foreign Relations Committee in 1924, this group promoted disarmament, non-intervention in the Western Hemisphere, anti-colonialist sympathies elsewhere, and non-coercive alternatives to collective security, especially the outlawry of war, which culminated in the Kellogg-Briand Pact of 1928. Under the Republican administrations of that decade, those policies usually coexisted and sometimes clashed with a resurrection of the party’s pre-war predilection for discretionary involvement in international power politics. A full-scale confrontation between the two approaches did not occur until the Manchurian crisis of 1931, and that belonged to the breakdown of international order of the next decade, not the comparative international calm of the 1920’s.

In various measures, disaffection from World War I characterized the isolationist group of the 1920’s. Some of its leading lights, such as Senators Robert M. La Follette and George W. Norris, had opposed intervention in 1917 and never repented their stands. Others, such as Borah and their Senate colleague Hiram Johnson, quietly repudiated their earlier unenthusiastic support of the war. Newer recruits to the group also questioned the wisdom of intervention. So did a number of historians and other writers who called themselves “revisionists” and expressed doubts both about the moral differences between the two sides and about American motives for entering the war. But it would be wrong to exaggerate the strength or prominence of these views. By and large, the war remained a non-controversial memory, featuring rituals of rhetorical gratitude to the veterans and little concern about any repetition of what had happened before.

The world turned upside down in the 1930’s, and these attitudes went topsy-turvy, too. With the renewal of Japanese expansion in Asia and the advent of Hitler in Germany, it was clear that international order was breaking down. Mussolini’s conquest of Ethiopia and the outbreak of the Spanish Civil War soon provided further proof. Earlier it had been easy to be casual about world affairs and memories of the war, but now all kinds of people, high and low, began to take these matters seriously and to take stands in ways that they had not done before. The most striking aspects of this sea change of opinion were its breadth, strength, and near unanimity. Among the three principal victors of the world war, Britain, France, and the United States, sentiment among politicians and the public turned rapidly and decisively in an anti-interventionist direction, with corresponding expressions of disillusionment with the war. This swing had different names on the two sides of the Atlantic. In Europe it came to be called appeasement; in the United States it was called isolationism. There were national peculiarities to expressions of these attitudes, but at bottom they were strikingly similar everywhere.

Why this massive shift in attitudes occurred is an interesting question. Obviously, on one level it was a response to deteriorating, more dangerous international conditions. But that fact does not sufficiently explain either the intensity or the root causes of the change. Sooner or later, in examining this question, all roads lead to the Depression. It is tempting, and not terribly facile, to say that everything about this turn of affairs stemmed from the Great Depression. Unquestionably, economic woes in Japan and Germany paved the way for the rise of aggressive expansionism in both countries. Likewise, widespread misery in Britain, France, and the United States fed a climate of fear and defeatism, exacerbated by a generalized sense of insecurity. In America, the special circumstances of the discrediting of big business and the Republican party and the corresponding rise of the political left enhanced the attractiveness of the isolationist and disillusionist views earlier preached by La Follette, Norris, Borah, and Johnson. La Follette had died in 1925, but one of his sons took his place in the Senate, while the other became governor of Wisconsin. The others were still around, and they attracted new recruits in such senators as Gerald Nye and Burton K. Wheeler. In 1934 and ’35, this group found a major platform for its views when Nye chaired a special Senate committee that investigated the alleged role of the munitions industry in getting the United States into the world war. The Nye Committee produced little hard evidence of any such role, but it generated sensational headlines and aired revisionist and disillusionist views to a wide audience.

In America, the upshot of this shift in attitudes, and a reflection of the influence of the Nye Committee, was an upsurge in isolationist sentiment that found its way into law and policy. Starting with the Senate’s surprise rejection of membership in the World Court in 1935—membership that Republican presidents had pushed for and that the Democratic president, Franklin Roosevelt, now supported—both houses of Congress swung overwhelmingly isolationist. In 1935, ’36, and ’37, Congress passed successive Neutrality Acts by lopsided margins. Those laws forbade American ships to sail into war zones or ports of belligerent nations, citizens to travel on merchant vessels belonging to belligerents, banks to lend money to nations at war, and manufacturers to sell armaments or other specified war-related products to warring countries. All of those measures sprang from ideas that anti-interventionists had first advanced during World War I as ways to keep the country out, and they all reflected arguments that revisionists had made since then to explain why the United States had wrongly gone to war. One of the few newspapers to criticize the Neutrality Acts scoffed at one of them as “an act to keep the United States out of the war of 1914-1918.” This was clearly a case not of fighting the last war but of something even more common: trying to stay out of that war.

Nor did the isolationist upsurge end with the passage of laws on Capitol Hill. One other idea first put forward to keep the country out of the world war came within hailing distance of becoming an amendment to the Constitution. This was a proposal to require a popular referendum in order to declare war, except in case of attack on the continental United States. Perennially offered by Representative Louis Ludlow of Indiana, this amendment languished in the House Rules Committee until December 1937. Then, in response to the Japanese attack on the U.S.S. Panay in China, a majority of the members of the House signed a discharge petition and brought the Ludlow amendment to the floor for a vote in January 1938. Thanks to lobbying and arm-twisting by FDR, the amendment failed, 188 to 209, but so many votes cast for such a drastic measure attested to the strength of isolationist sentiment. The president’s opposition was a belated move on his part. In the 1936 campaign he had given the only foreign policy speech by either candidate. Roosevelt had then declared, “We are not isolationists except insofar as we seek to isolate ourselves completely from war.” A true-blue isolationist like Borah or Nye would not have said it any differently. Another manifestation of such sentiment was the college student organization called the Veterans of Future Wars, which called for pensions to be paid at once to young men in advance of their service in upcoming conflicts. This was a tongue-in-cheek expression of the same sentiment that had led British students to sign the Oxford Oath, under which they pledged not to fight for king and country.

Memories of the world war entered directly into this sentiment. The year 1937 marked the 20th anniversary of American intervention. On Capitol Hill, members of Congress observed the occasion by honoring their colleagues who had voted against the declaration of war. Coming in for special praise was Senator Norris. By this time, new-style public opinion polling had come into use, and in April 1937 a Gallup poll found that 70 percent of respondents believed that it had been a mistake to enter World War I. Clearly, then, broad agreement, verging on a consensus, existed among the public and their political representatives. “No more foreign wars” was the watchword of the day, presumed to be the beginning and the end of foreign policy wisdom. The closest things since then have been the “no more Munichs” slogan of the post-World War II era and the “no more Vietnams” sentiment of the 1970’s. In all these cases, when nearly everybody agreed on what should be the lodestar of foreign policy, not much hard thinking was getting done.

It might be tempting to close an examination of the memories of this world war by noting those manifestations of attitudes. After all, within a few years, World War II would cast its deep, eclipsing shadow over what had gone before, and that second war would become the reference point for memory and supposed lessons. But it would be wrong to stop at this point, for one simple reason. What happened to the “no more foreign wars” attitude? It did not last. Within two or three years of that 20th anniversary of intervention, those attitudes would no longer command the allegiance of a majority, at least not without serious reservations. Within five years, they would be in near total eclipse. Clearly, something more was at work in the late 1930’s than just disillusionment with the world war and isolationist sentiment.

IV

Looking back with the inestimable benefit of hindsight, what seems remarkable is not how strong and durable the upsurge of anti-interventionist sentiment of the mid-1930’s was. Just the opposite. It is remarkable how relatively weak and transient it proved to be. True, that weakness was not at all apparent at the time. Isolationist convictions remained strong in some quarters right down to the attack on Pearl Harbor. The isolationist superstar of the time, Charles Lindbergh, was preparing to address a big rally sponsored by the premier anti-interventionist organization, the America First Committee, that very night. Yet from the outbreak of war in Europe in September 1939, the Roosevelt administration pursued a settled policy of weakening the Neutrality Acts and giving aid to the Allies in this new global conflict. To the frustration of his more straightforwardly interventionist supporters and aides, FDR moved cautiously, often downright deviously, but he moved steadily to make a sham of neutrality and to put the United States in a state of quasi-belligerency that lacked only an outright declaration of war. Some of this he did by executive action, but many of his actions had to go through Congress, where he never lost a vote. Who, in 1937, could have predicted that this would happen?

No historian has ever given a really satisfactory answer to why that isolationist flood tide of the mid-1930’s ebbed so fast and so far. The fault lies partly in a failure to recognize that there is an important question here to be answered. One way to answer it may be to hark back to the experience and the memory of the earlier war. What I have studied about that time leads me to believe that, incomplete and unsatisfactory as much of what happened in 1917 and ’18 seemed to many people, those events had a profound and lasting effect on Americans.

Another unpredictable, puzzling development also hints at such effects. Diminished though they were in strength and appeal, the isolationists did not fade away in 1940 and ’41. During 1941 congressional opposition grew against the Roosevelt administration’s moves to raise the pitch of quasi-belligerency. The one-vote squeaker on the extension of the draft act in the House in August, and the October and November votes there on neutrality revision, showed that resistance to intervention was still strong. So did the public opinion polls, which revealed something like schizophrenia toward the war. On the question of whether Britain’s defeat would pose a security threat to the United States, around 80 percent answered yes. Then, on the follow-up question of whether, in that event, the United States should enter the war, roughly the same percentage answered no. Given the lasting heat of the debate in 1940 and ’41 and the persistence of anti-interventionist sentiment, who would have predicted that all debate and doubt about the war would evaporate so completely after Pearl Harbor? But that is exactly what happened. Former isolationists either lapsed into silence or climbed aboard the war bandwagon with varying degrees of enthusiasm.

Of course, much of the post-Pearl Harbor rallying stemmed from the sheer fact of the Japanese attack. Some of it also reflected the moral clarity of the conflict. Even before Pearl Harbor, isolationists and anti-interventionists had labored under the moral burden of seeming to give aid and comfort to the world’s bad guys. Something else was at work, too. Those fragmentary and imperfect measures of American opinion 20 years earlier had not lied. The earlier war does seem to have instilled a widespread conviction that the United States was inextricably involved in international politics and had an inescapable obligation to maintain peace and resist aggression. All the disillusionment and isolationist flowering of the 1930’s had never quite erased those convictions. It is not possible to explain either the rapid ebbing of isolationism at the end of the ‘30’s or its total eclipse during World War II without harking back to the lingering and powerful aftereffects of World War I.

That second war was America’s “great war” in all the ways that its predecessor had not been. American participation lasted over twice as long. American mobilization and contributions were even more massive than before, and this time they were indisputably decisive in winning the Allied victory. But Americans fought their 20th-century “great war” as the sequel and successor to World War I. This was true not only of the military experience gained and the lessons learned in the earlier conflict. It was also a matter of how Americans justified the war to themselves. The public came out of World War II with an enduring commitment to collective security, and they sold themselves on this proposition as a matter of rectifying the errors of the previous war’s aftermath. World War II witnessed a posthumous apotheosis of Woodrow Wilson, whose reputation had practically fallen into the gutter in the ’30’s. Likewise, both the United Nations and the more general commitment to international involvement came clothed in the garb of belatedly heeding the warnings of Wilson as a prophet who had tried to save the world from another cataclysm.

World War I was never the “great war” for Americans the way it was for Britons and other Europeans. It did not loom so large at the time, and it does not exert the same hold on American memory. Beyond question, the Civil War has stood the test of time as such a great war. World War II may also have attained that status. The current nostalgia for and popularity of World War II owe a lot to the huge numbers of its veterans—the largest such veteran population in American history—and their children, who grew up in the shadow of their fathers’ and mothers’ experiences. But the size and decisiveness of World War II seem likely to entrench it firmly in American memory for generations to come. World War I does not and will not enjoy that degree of fascination—which is too bad. If it was not a great war for America, it was the precursor and the shaper of the memory that made the later U.S. role in the great war of the 20th century possible. For that reason alone, World War I should not be put away on the shelf of America’s forgotten wars.
