
Up for Grabs


ISSUE: Summer 1997

A child of immigrants, I like to think that the muddled English of my parents’ generation was an exercise in patriotism, its imprecision testimony to democracy’s premise that one man’s opinion is as good as another’s. Take prepositions, for example. In the 1940’s, a visitor to our Bronx apartment who consented to the obligatory cup of coffee would invariably be asked, “What would you like to it?” “You mean with it?” I would interject.

“Have it your way,” my mother would respond. That lost world where prepositions were up for grabs is vividly evoked in the movie, Avalon, a charming immigrant saga whose most memorable scene depicts an older brother, aghast that the family’s Thanksgiving dinner had commenced before his arrival, blurting out, “You cut the turkey without a brother.”

Such usage tends to verify Yeats’ grim foreboding about the center not holding. How can it when we no longer know with what preposition we are to center the center? Must we, as I was once instructed and now instruct, center on? Or may we, as my students insist, center around?

A certain predilection for ambiguity was implicit in the melting pot trope with which generations of Americans like me were raised. Not that we were spared withering examples of the need for clarity and precision, as demonstrated by my Hebrew school teacher, who, lighting up yet another Chesterfield, endured the impertinence of my friend, Melvin, pointing to a sign above the blackboard that warned in bold letters, “NO SMOKING.” The rumpled Mr. Juvenal nodded and said, to the accompaniment of a stream of smoke, “Very smart, wise guy. Does it say POSITIVELY NO SMOKING?”

Such fastidiousness eluded my Uncle Nathan, who was into nuance of a sort. An immigrant himself, he was more often than not unemployed, dependent on periodic handouts from his younger brother, my Uncle Moe. Now and then Nathan provided an accounting of monies received, as when we were on our way to his mother’s, my grandmother’s, funeral some 37 years ago. Between reprises of “Dixie” that he whistled absent-mindedly, Nathan disclosed how he had spent Moe’s latest subsidy. “I bought myself a suit, a coat, and assorted other utensils,” he announced. Perhaps Nathan meant other assorted utensils. But to question Nathan’s meaning was to invite his customary riposte, “I meant what I said.”

Uncle Nathan’s story came to mind when another gatherer of utensils, an anthropologist at my university, questioned some comments I had made at a recent symposium in which a fellow historian and a philosopher were considering the merits of Cultural Relativism vs. Cultural Imperialism. I had, innocently I thought, asked whether the juxtaposition of these concepts implied a denigration of the Western tradition. Imperialism, after all, amounts to what some political theorists, in characteristically graceful language, call a “negative evaluative-descriptive” term. One need only listen to the litany of abuses attributed to imperialism to recognize the moral trough into which one is cast by even questioning its unqualified malevolence. The abolition of suttee and measures against the slave trade are dismissed as incidental, self-serving expedients in an irrepressible career of oppression and victimization.

The philosopher impressed me with his intellectual rigor, as philosophers are apt to do. Nathan would undoubtedly have admired his presentation. “This guy is some speaker,” he would have said, an encomium usually reserved for only the most grandiloquent of rabbis on the highest of holy days. Relativism and imperialism, the philosopher argued, were not the only or most meaningful alternatives. Indeed, there was an ambiguity to the former that made its application in a philosophical sense highly controversial. To my suggestion that relativism has primarily positive associations in today’s cultural climate, the philosopher insisted that such was not the case in philosophy. The crucial distinction, rather, was between relativism and absolutism, cultural imperialism undoubtedly being a form of the latter—a subtlety that summoned up the image of the linguist listening patiently to a lecture about the abundance of positive negatives and the paucity of negative positives, only to mutter gravely, “Yeah, yeah.” Perhaps it was my obduracy in insisting on an appreciation of nuance that led the anthropologist to urge that I not be so sensitive to criticism of Western civilization.

Were they still alive, Nathan and Moe would both be astonished to learn that their nephew has been designated a Keeper of the Western Intellectual Flame. It astonishes their nephew as well. Everything, after all, is relative—another proverb of my youth—and surely I know, as well as the next person, that everyone is entitled to his own opinion, whether about the selection of a preposition or, vide Nathan, all that may properly be subsumed under the rubric “utensils.” Yet, entitled or not, I confess, my ingrained sense of tolerance is occasionally strained by the primitive egalitarianism that I so often encounter among my own students.

That Mussolini made the trains run on time may no longer suffice as a counterweight in the balance sheet of fascism. Still, there are two sides to every story, as my students are forever reminding me. When I discuss Hermann Hesse’s Siddhartha in my annual effort to convey to students in my World Civilizations class an “Eastern” worldview, I ask about the representation of Kamala, the seductive temptress. “She’s a prostitute,” one student recently said dismissively. To which another responded, “That doesn’t mean she’s not a nice person.”

Of course not. It all depends on your perspective. Indeed, informed opinions abound among intellectuals engaged in a variety of disciplines and endowed with varying talents. Take my line of work, the history trade. In a graduate seminar some years ago, one of my less accomplished students wrote a paper about the late A.J.P. Taylor, whose self-congratulatory memoir exaggerates both his socialism and his patriotism. Accordingly, the student decided that Taylor was a National Socialist. When I suggested that such a sobriquet specifically applied to Hitler’s Germany, he responded, “That’s your opinion.” So it was. Still, whatever controversy Taylor’s Origins of the Second World War stirred when it was published in 1961, to call him a National Socialist reflects an appalling ignorance of a diabolical 20th-century ideology. After further protests by the student, I insisted that he was simply wrong, that A.J.P. Taylor was not a National Socialist and that it was not a matter of opinion. All of which succeeded only in costing me points in the annual evaluation, where students are asked to rate the faculty on a long list of desirable qualities, including, “Does the instructor encourage students to express opinions?”

Perhaps I make too much of such minor episodes for a man trained to think that prepositions are up for grabs and that there are two sides to every story. And yet the ambiguity about prepositions in the 1940’s did not extend to matters of good and evil, right and wrong. As Pat Buchanan likes to say of the good old days, “There was right and there was wrong, and we knew it.” We knew Hitler was evil and Roosevelt was good, the Ten Commandments good, slavery evil. John Charmley has recently come down hard on Churchill for sacrificing the empire rather than making a deal with the Nazis, an interpretation that makes one despair of the illusion that young Oxford dons are even remotely connected to the real world.

That Pat Buchanan’s own household thought Franco and Joe McCarthy good hardly vitiates my argument. After all, the Buchanans, like the Kriegels, were entitled to their own opinion. Indeed, the politics of my childhood facilitated both an appreciation of Realpolitik and a spirit of accommodation, and I religiously kept my end of a silent bargain that had me applauding Pius XII on the Movietone News as long as my Irish and Italian friends clapped for Roosevelt. But look at FDR now. They say he didn’t do enough to save the Jews. For that matter, look at Pius XII, whose Mariology indelibly stamped my childhood impressions of Catholicism, and who apparently did nothing at all. When George Schuster, an eminent Catholic layman and one of the founders of Commonweal, reviewed Rolf Hochhuth’s “The Deputy,” he was aghast that the Pope, “the courteous and kindly man who helped whomever he could, regardless of whether they belonged to his faith or another,” could be represented as morally culpable.

II

For me things began to fall apart with Glazer and Moynihan’s Beyond the Melting Pot, one of the last readable books to be published in the social sciences. We never melted, I learned, a sentiment now so common that I discover, in a book written almost four decades later, that all humor is ethnic. Now that the melting pot has been succeeded by unbridled ethnicity, however, the pleasing illusion that we once shared a common heritage is shattered, succeeded by the urgency to accommodate the sensibilities of all the deprived. This is not an altogether novel indulgence. It was evident during my childhood in the exorbitant claim by my Irish parochial school friends, who insisted that it had been St. Brendan who discovered America, as well as by Jewish antiquarians, who decided that Chaim Solomon had financed the American Revolution, the latter an historiographical exaggeration since supplanted by the African-American invention that independence would not have been won without Crispus Attucks. (To be sure, the retrospective self-congratulation of “mainstream” America is equally simplistic and self-serving. A New Yorker cartoon some decades back rendered the line from Emma Lazarus inscribed on the Statue of Liberty—“the wretched refuse of your teeming shores”—as “Welcome, garbage.”)

Still, chauvinistic indulgences aside, we were prepared to distinguish between fact and fiction, and while grade school history may well have been the stuff of fantasy, we deferred, however grudgingly, to those in a position to know. We accepted that there was such a thing as truth and were ready, even eager, for it, a disposition that explains my Uncle Moe’s rhetorical habit of inviting correction by anyone in a position to rebut him: “Am I wrong? Tell me if I’m wrong.”

Veracity was to some extent a function of consensus, but a consensus among the informed, the expert, the trained, an educated elite that we aspired to join. The overarching interpretation of American history for my generation was what is now dubbed the “consensus school,” which contained conflict within a broader vision that focused on a pattern of collective harmony and liberal improvement. The consensus was compatible with the melting pot interpretation, as well as with Horace Kallen’s later and more refined concept, “cultural pluralism.” It is hardly surprising to discover, in Peter Novick’s That Noble Dream: The ‘Objectivity Question’ and the American Historical Profession, that many leading scholars of the consensus school were of a Jewish background. As Arthur Schlesinger, Jr. writes of earlier immigrant generations and their children, “they expected to become Americans” and to partake of a common culture.

Nor is it coincidental that the rage for ethnicity and particularism should have been accompanied by an abandonment of “objectivity,” and what some fear is the imminent disintegration of the social cohesion that supposedly bonded Americans of an earlier generation. Those scholars who lament the end of consensus, like Schlesinger, also retain a commitment to the no longer axiomatic notion of objectivity, signifying not so much agreement on scholarly interpretation as on such fundamental matters as what constitutes proper investigation, the rules of evidence, and an adherence to “the facts.”

That “the facts” are now encapsulated in quotation marks attests to the uncertain nature of the historian’s trade. It is, surely, a crude view of history to subscribe to what one of my colleagues called the “Dragnet” paradigm (after the detective Joe Friday in the 50’s television series, whose signature was the line, “Just give me the facts, Ma’am”). That the facts we select may tend to corroborate our predilections is an elementary premise of introductory historiography courses. Moreover, one may be so conditioned ideologically as to deny any facts that tend to qualify, much less refute, one’s inherited wisdom. Mahmoud Abdul Rauf of the Denver Nuggets, for example, confidently explained that he refused to participate in the ritual of standing for the national anthem before basketball games because the flag was a symbol of tyranny and oppression. Apparently he could not, as President Nixon used to say, square that with his conscience. “I don’t think you can argue the facts,” he said. “You can’t be for God and oppression.”

It may be an enormous leap of sophistication from Mahmoud Abdul Rauf to the brilliant historian, Carl Schorske, one of the participants in a celebrated controversy more than a decade ago about a young scholar’s work on the alleged support of big business for the Nazis. Though the young scholar was raked over the coals by critics for sloppiness in research, including the misuse of evidence, his work was defended by a number of eminent scholars, among them Schorske, who accused his critics of excessive reliance on “facticity.” Gertrude Himmelfarb, a staunch defender of the old order, considers “facticity” a peculiar charge of “postmodern” historians, who have created an intellectual atmosphere that intentionally encourages a disregard of “the coercive ideas of truth and reality.”

What is the proverbial intelligent lay reader to make of such disputes, when even the professional historian is himself so unsettled by new methods almost guaranteed to leave any traditionalist gasping for air? Take Simon Schama’s Dead Certainties, a title belied by the very possibilities the book explores. Schama deliberately blends fact with fiction, so that the reader is unable to distinguish them. No doubt this device is intended to serve a higher truth, though Schama himself has wondered aloud whether it is historical truth.

My reluctance to embrace the new may well be chalked up to the crotchetiness that accumulates with age. If Harold Rosenberg’s celebrated description of modernism as “the tradition of the new” applies, then I am as much an antimodernist as an antipostmodernist. “Historians,” writes Gordon Wood, “are usually the last to know about current fashions,” a prescient remark which tends to dramatize the contrast between the old and the new. Accordingly, when a young colleague recently delivered a paper to our university’s Humanities Center in a program about “The Gendered Body,” he noted the generational gap that exists among scholars. Indeed, a historian’s participation in a program called the Gendered Body is itself a stunning commentary on the changes in the profession, even if the changes have proceeded at a less menacing rate than alarmists fear. After all, one of my aging colleagues was astonished almost a quarter century ago, when he noticed a paper to be given at the American Historical Association’s annual conference on “Buggery in the British Navy During the Napoleonic Era.” Did the designated period suggest a craze of the time, or was there a store of papers suddenly available that, willy-nilly, happened to lay buggery bare for that era?

III

My colleague need no longer worry about chronology, since history may no longer be time-bound. Several decades ago, Le Roy Ladurie advertised a history that does not move. It now apparently moves again, though not necessarily in linear progression. All of which has me in a quandary when I am inevitably asked by college freshmen awaiting their first history examination, “Do we have to know dates?” I would settle for sequence, which could avoid the kind of muddle that I recently came across on a freshman World history exam: “Thales of Miletus established the Milesian School in Athens after Alexander the Great conquered the city.” It may not be crucial, after all, that a student know that Thales of Miletus preceded both the “Golden Age” of 5th-century Athens and Alexander the Great’s conquests of the following century (albeit not of Athens).

To be sure, there is some validity to the charge that older, more traditional scholars are ill-equipped to make the fine distinctions necessary to prevent even the most sophisticated among them from indiscriminately characterizing all postmodern scholars as subversives. Dominick LaCapra, for example, takes Gertrude Himmelfarb to task for precisely such a transgression. On the other hand, some among the postmodern generation indulge in a corresponding distortion when they lump traditional historians under a simplistic and anachronistic rubric of old-fashioned scholars deluded by a discredited adherence to positivistic certitude, to which no one any longer subscribes. As Yogi Berra, recently awarded an honorary degree by Montclair State University, is reputed to have said of a fashionable New York restaurant, “No one goes there anymore. It’s too crowded.” One is hardly likely to encounter a contemporary historian like the 19th-century French scholar, Fustel de Coulanges, who, responding to the applause of students as he entered a lecture hall, exhorted them, “Do not applaud. It is not I who speak, but history which speaks through me.” An academic historian who now hears an ovation may more reasonably surmise that his lecture has finally concluded rather than just begun.

My gendered-body colleague thinks that while we may not admit to so naive a claim as Fustel de Coulanges, traditional scholars in effect subscribe to such a notion, as evident in the wistful ideal of “disinterestedness.” An imaginative device to which we pathetically cling, disinterestedness, we are advised, is unattainable. There is no such possibility in the world of the “linguistic turn,” where the “truth claims” of his/her “belief system” determine the “cultural worker’s” “subject position.”

If one subject position’s belief system is as good as another’s, we must indeed settle for truth claims rather than truth, regardless of the evidence. Indeed, evidence is no longer a matter of much magnitude. As Richard Price observes in a recent reflection on historiography and narrative, postmodernist history betrays a “lightness of connection between theory and sources.” On the other hand, a prominent scholar of the British working class, Patrick Joyce, defends theory’s “decades-long challenge to received ways of historical thinking,” and questions “a complacent faith in empirical research”—shades of the excessive reliance on “facticity” so derided by Schorske.

The “resistance to theory” was noted by one of theory’s stars, the critic Paul de Man, whose own collaborationism with the Nazis, belatedly revealed after his death, provoked some unkind critics to dwell on de Man’s resistance to the Resistance. In their characteristically late encounter with postmodernism, historians may well prove a more formidable obstacle to the ascendancy of theory, given their archaic affinity for chronology and “facticity.” These antiquated proclivities are inherent in the enterprise of that traditional historical research which dwells on the reconstruction of events, the better to assess their significance. Thus, Harold Parker’s Three Napoleonic Battles contains a footnote that, I am persuaded, was designed as a teaching device for graduate students. Discussing the battle of Waterloo, Parker estimates that General Grouchy was 12 to 14 miles from the battlefield at 11 o’clock on the morning of June 18th. Considering the equivalent of French “leagues,” the position of the combatants and the probable rate of speed from one country crossroad to another, he concludes that Napoleon “probably had a rough notion of the distance Grouchy had to cover.” Similarly, Peter Laslett discovered, in a close reading of a classic that had been studied for centuries, that John Locke’s Second Treatise on Government must have been written during the Exclusion Controversy between 1679 and 1681, rather than as a justification of the Glorious Revolution in 1688.

That the researcher experiences a rush at the discovery of such trivia may be difficult for the theoretician to understand. In my first article published some 30 years ago, I established that Henry Hobhouse the Tory, rather than John Cam Hobhouse the Whig, circulated an inaccurate report about the composition of a committee among the parliamentary opposition, whose members determined to defeat the existing government appointed by King William IV after his dismissal of the Whigs in 1834. Such trivia—“mere events” to the Annales School—must conjure up the obscurantism of Mr. Casaubon to the spinners of grand theory. What difference does it make whether Napoleon thought Grouchy could come to his rescue, when John Locke crafted the Second Treatise, or whether it was John Cam Hobhouse or Henry Hobhouse who circulated the inaccurate report about the composition of a parliamentary committee in opposition?

The point of much contemporary theory, as distilled by John Toews, is “that we have no access, even potentially, to an unmediated world of objective things and processes that might serve as the ground and limit of our claims to knowledge of nature or to any transhistorical or transcendent subjectivity that might ground our interpretation of meaning.” Or, as my colleagues say, “Whom can tell?” Whom, indeed! One yearns for a latter day Dr. Johnson who, informed of Bishop Berkeley’s conclusion that there was no world of matter, kicked the stone and remarked, “I refute it, thus.”

Like my Uncle Nathan, Samuel Johnson meant what he said. No wonder I am a Keeper of the Flame. I keep to the tradition of Nathan, Moe, and Matthew Arnold, who is invariably brought up by proponents of theory as the exemplar of an outmoded, immutable objectivism. To rescue “the best that has been thought and said,” after all, is a futile enterprise when there is no best and thought is itself suspect. That Arnold in his own time sought to expand the sphere of culture beyond England to Europe hardly commends him to a multicultural generation who assume the conceit of the West. It was Arnold, moreover, who elevated the “disinterested” critic to cultural prominence. That the critic, as Lionel Trilling emphasized, was to interpret “the best that has been thought and said” as a standard of measure for his own time and place hardly mitigates his arrogance.

The modern historian, as Gertrude Himmelfarb points out, necessarily assumes a relativistic posture as a condition of his investigation. But that is distinguishable from the postmodern historian’s “absolute relativism,” wherein the impossibility of certitude signifies the worthlessness of aspiring to “objectivity.” It is understood that the historian’s interpretations are necessarily tentative, but the lack of absolute finality is hardly an indication of futility.

Take, for example, a recurrent controversy in modern British history—whether the aristocracy was “open” to new families from the 16th through the 19th centuries. In a characteristically ambitious work, An Open Elite?, Lawrence Stone and Jeanne Fawtier Stone challenge the traditional interpretation, the question mark in the title signifying their skepticism. Another husband and wife historical team, David and Eileen Spring, initiated a debate about the Stones’ findings that ranged over several journals for two years, by questioning the Stones’ interpretation of their own data. At a recent historical conference I attended, the glimpse of a participant’s forthcoming work tended to restore the traditional interpretation of the aristocracy’s openness.

The extent to which the English aristocracy was open hardly engaged Nathan and Moe. Nonetheless they would have been curious had the question been broached, if for no other reason than the opportunity to peruse the evidence and interpret the statistics. Having distinguished himself as an outstanding handicapper when armed with the scratch sheet of The New York Daily Mirror, Uncle Moe would have been at home with the figures, while Nathan, more taker than giver, less adept perhaps, was known occasionally to play the numbers, when his pockets were brimming with Moe’s largesse. Sporadic participation in the numbers racket, after all, was part of the decent drapery of life. It was an immigrant’s exercise of occasional conformity, participation in a ritual that confirmed one’s fellowship in a new community of brethren. Still, the democracy of the New World provided some detachment for a disinterested study of the Old, and the question of an open aristocracy was worth pursuing if only for the chase. No doubt, considering the evidence, Uncle Nathan would have concluded, “By me, that’s open!” And he would have meant what he said.
