
Play It Again, Sam


ISSUE: Spring 2001


The BBC recently invited views on the greatest Englishman of the millennium. A short list was made, then there was a phone-in, and the race was eventually won by William Shakespeare, no less, with Winston Churchill in second place.

Neither of them went to a university, as it happens, and the three Cambridge candidates—Oliver Cromwell, Isaac Newton, and Charles Darwin—admittedly did not poll well. Nor did William Caxton, the first printer. But then there were no Oxford men at all, so perhaps Cambridge should not repine. One natural conclusion to draw here, it seems fair to say, is that the English have had a good millennium.

Not everyone was pleased, however, at Shakespeare’s triumph, and the announcement brought Lewis Wolpert, a London biologist, to the microphone, sounding disappointed. Shakespeare was good, he said, and he put things well. But Darwin had transformed the way the human race looked at itself and at every other species—and that, he felt, was better.

The suggestion that Shakespeare did not change the way mankind looks at anything is an arresting one, even if you accept—as in charity you must—that the biologist still thought he was a great man. It is not a judgment to dismiss, and I am happy to accept it in the spirit in which it was meant. No answer to say that Shakespeare has changed a good many lives, including mine. In fact he has been changing it hour by hour for more than half a century ever since, as a schoolboy in wartime Australia, I saw Laurence Olivier play Henry V in a local cinema, stayed on to see it a second time around and then a third, and went straight home to read it at a single sitting, hardly less excited by the bits Olivier had left out than by what he had kept in. Then a ship to England, a degree, and more than 30 years teaching Shakespeare (and other authors) at Cambridge. That is not the stuff out of which autobiographies are made. There is nothing interesting about getting what you are after, though it may be a bit unusual. Be careful what you want when young, Noël Coward used to say. You may get it in middle age.

The biologist was still right, in his challenging way. Changing yourself is not changing the world or how mankind sees the world. And yet the radio public did not err, and Shakespeare is somehow more than Darwin. No waking hour of my life for more than half a century has passed without thinking of something he wrote, and it is only under severe self-restraint that I abstain from quoting him in conversation. No doubt there is an argument for calling all that trivial, but then it is bound to be a trivial argument. You might easily admit that Shakespeare never told you anything you did not already know and that almost all his plots are borrowed; so a claim to originality cannot easily be made, or made at all. But it does not need to be. Shakespeare is still more than Churchill, who saved his country, or Newton, who discovered the secrets of the cosmos, or Darwin who, after a youthful visit to the South Seas, displaced the biblical story of creation with a theory of natural selection. He is still more than they.

There is something worrisome about that admission, all the same, and what is true about it is even more worrying than what is false. The falsities, in any case, are mostly misunderstandings. There are perfectly plain senses in which Shakespeare tells you what you did not know, though they are usually rather uninteresting. In the history plays he tells you quite a lot about the Hundred Years War, and much of it is true, even if Europeans did not in fact use gunpowder as early as 1415. But then, for what it is worth, I could have learned about the Battle of Agincourt anyhow. I could have learned about the cult of honor, for that matter, without hearing Olivier say:

But if it be a sin to covet honor
I am the most offending soul alive.

You do not have to read books at all, or see films, to know that there are people like that, though whether it is a good idea to be like that is another matter. But what fascinated, and still fascinates, is the way he puts things. Until you have heard one of his plays you do not know that concision can be so concise, clarity so clear, grandeur so great. I have no desire that life should be like that. You can easily have enough of life. It is because life, or rather living, is not like that, that I want him.

 

That seems to be a truism more easily admitted in music than in words. When Ingrid Bergman tells Sam, as he sits at a bar piano in Casablanca, to play it and then sing it—she does not actually say “again”—nobody bothers to ask why, when she knows it already, she should want to hear it at all. That would be to miss the point. It is because she knows it that she wants it, and the same goes for reading Homer’s Iliad or a novel by Dickens. Literature, as Ezra Pound used to say—and it does deserve repetition—is news that stays news. Shakespeare stays news. In fact there is said to be a movie about how he, or some one of the same name wearing tights, once fell in love.

The question is not whether but why, and the supreme value of repetition and reminder is a point that in recent years the critical tradition has obstinately missed. A recent editor of Hamlet in the New Arden series, for example, has altered “What’s Hecuba to him or he to Hecuba” to “what’s Hecuba to him or he to her” on the impressively crass ground that the repeated name can only be an actor’s over-emphasis. That shows how scholarship can rot the brain. Or again, one critical theorist, her admirable mind sadly deformed by formalism, once earnestly explained to me that it cannot possibly matter what a poem is about, because great poetry only tells you things you already know, like “All men are mortal” or “I love you.” That is a breath-taking misunderstanding, and you would have to be a critical theorist to believe it. But then the role of repetition is simply not part of the critical dialogue—playing it again—and Hamlet’s question “What’s Hecuba to him?” is still a good question to ask. Why, after all, should an actor weep as he recites the familiar griefs of a long-dead queen of Troy? And why does his audience? What is Hecuba to anyone? It would be an exaggeration to say that no one has recently tried to answer such questions, and the answers, taken together, are called humanism. But somehow, towards the end of the millennium, they have mostly been forgotten or laid aside.

II


Humanism is the doctrine—ancient, medieval, and modern—that mankind is fundamentally the same in all climates and in all ages, that what it shares as a species is vastly more important than what divides it, whether class, gender, or race. “Hath not a Jew eyes?” as Shylock puts it unforgettably. That explains why we need to be told, and told and told again, what is already known—confirming, as human creatures, a common membership of mankind. As Samuel Johnson memorably said in the Rambler (March 24, 1750), “men more frequently require to be reminded than informed.” It is a principle that holds as true in daily life as in the arts. “Don’t forget to put the cat out” can be a useful thing to say, even if you put it out every night. Or again, a wife might helpfully say to her husband, as they discuss their son: “He’s only 17, you know.” It is not that he does not know it, or that she imagines he does not. It only means that he needs to be reminded. “A thought is often original,” said Oliver Wendell Holmes, “though you have uttered it a hundred times.”

He meant, of course, apposite and helpful, and his use of the word original can take one into deep waters. It is so easy to assume that only the original can be good. In academic life, especially, originality is hugely over-valued. C.S. Lewis used to be amused by the complaint that his Narnia stories are mostly borrowed, since his chief motive in writing them had been to ensure that ancient stories are not forgotten by the young. Sam the pianist, in Casablanca, has nothing of his own to sing; but he sings it well in a club where the boss has forbidden him to sing it at all, and it gets the boss out of the inside room. If the song had been original, it would have got him, and the lovers, absolutely nowhere. Those in academe, similarly, who have nothing to say—nothing of their own, that is, nothing original—are often disparaged or despised. But why should they be? Try imagining an academic department in which everyone is original all the time, or even most of the time, and you will have imagined a bear garden. Nothing would get done. Nothing would be taught, since knowledge depends on a body of familiar and assured belief like simple arithmetic. I remember Isaiah Berlin remarking of a colleague that, though he was good at explaining things, he never had an original philosophical idea in his life. Presumably he meant an original idea that was true, or at least plausible to the extent of being worth discussing, since he must have been aware that anybody can be original and wrong. But if people need more frequently to be reminded than informed, as Johnson said, unoriginality can be a good idea; and in moral philosophy an original idea is very unlikely to be a true one. So the responsible moralist is bound to repetition, like Prometheus to his rock. Some one originally discovered that slavery is wrong, just as some one once invented the wheel. Neither can now be honored, since their names are unknown. But can you even imagine, in the present age, having a moral idea as large as anti-slavery that was at once new and true?

Odd, then, that thought should be confused with originality. Albert Einstein, it is said, was once asked what you should do if you have an idea, and he replied endearingly: “I don’t know, I only ever had one.” He meant one that was his own. Holmes was no doubt in the same confusion when he called an oft-repeated thought original. Shakespeare had thoughts in abundance, in that greatly underrated sense. They were not original, and they did not need to be. Shylock tells his enemies what they badly need to hear, even if they know it already. When Isabella begs Angelo in Measure for Measure to show pity for her condemned brother, he is saying nothing she does not already know when he replies:

I show it most of all when I show justice.
For then I pity those I do not know.

Laws, in short, defend the weak. That is what they are for. But it can be useful, if you are a loving sister with an erring brother, to be reminded of it.

 

Testing the limits of humanism, there are some bolder points to be made here. Plato dazzlingly extended the principle of knowledge-as-remembrance when, in a dialogue called Meno, he showed Socrates convincing an ignorant slave that he understands Pythagoras’ theorem about triangles and has always understood it. In other words, all knowledge is remembrance of a previous life. That is no doubt carrying humanism too far, and certainly farther than it needs to be carried, since there is a world of difference between seeing the elements of a problem and seeing the conclusions to be drawn. The ignorant slave was not wrong if he thought he had learned something from talking to Socrates. Johnson carried that principle too far, too, in a farcical way, when he told Boswell that because he wore his boots indifferently on one foot or the other, anybody could. Of course people are individual as well as similar, and it was Johnson who was the oddball here. Some know more geometry than others, some wear their clothes differently. Humanism can be over-done. There is still something instructive about such exaggerations. They test the limits of the humanistic case, and it is by observing those limits, and contemplating the wide expanses of nonsense that lie beyond them, that you learn to love the rich pastures that lie within. A commonplace is common for good reason.

III


The chief enemies of humanism in the last century—the 20th— were class and race; the chief enemy in the present age has been multiculturalism, which has added gender and a good deal else, along with a ready contempt for anything that smacks of eternal values or an established canon of masterpieces. Shakespeare, as a Dead White European Male, is central to that tradition, which was recently championed by Harold Bloom in The Western Canon, so he was perhaps lucky (all things considered) to be voted the Millennium Englishman. Or perhaps his victory suggests that most people have not swallowed multiculturalism. Good news, if true. All of which might help to confirm an encouraging surmise: that radical skepticism in morality and the arts is not widely accepted; that we do need to be told what we already know, and told it again and again. What is more, we know that we do. There is no need to fire the canon. Play it again, Sam.

Some years ago Iris Murdoch, in The Sovereignty of Good, remarked that most people, whatever they are told, do not accept that values are something individuals and communities impute to their own moral choices in life, or to stories they read or see. Common observation suggests they believe that values are there and that they can get them right or wrong. You can choose the wrong book or the right one, the right (or wrong) life-companion. It is philosophers and theorists, not the ignorant, who contest and deny all that. De-education is rife, in short, and teaching can make you worse. It is so easy, as she says, to be corrupted by bad philosophy. There speaks a professional philosopher who sadly watched generations of students being endangered by an academic training and who remembers the follies of her own youth. The case against the canon is not something students think of for themselves. They have been told that values are socially conditioned. They have been actively encouraged to believe that judgments, to be true, need stated and agreed criteria. Classes and seminars have urged them to accept that preferences are never more than matters of opinion. The thought of it all, unless you are of a hardihood beyond the ordinary, is enough to make you pale.

IV


Hecuba was a queen of Troy, a city which fell, after a long siege by the Greeks, betrayed by a wooden horse. The wooden horse of anti-humanism—gender studies, black studies, gay studies and the like—is already within the gates. In the lives of nations, anti-humanism is matched by a cult of separatism: to be Slovak is not to be Czechoslovak, Québécois to be Canadian, Kurd to be Turkish or Kosovan to be Yugoslav. No one denies there are differences. It is the claim that such differences are crucial that is new in its vehemence and scope. It affects little things as well as big, and it has its whimsical side. Some years ago the smallest county in England, Rutlandshire, was incorporated for administrative convenience into neighboring Leicestershire. All that sounds uninteresting, and is. But mark the sequel. The spirit of Rutland has asserted itself. Is the hegemony of Leicester, a good half hour away, to be endured? Evidently not. At all events Rutland lives again, and some sort of honor has been satisfied. Not that anybody for a moment can tell you what it was all about.

The spirit of Rutland haunts academic life. How do you know that non-whites see the world as whites do? How, then, can whites determine the grades of non-whites, or men of women, or straights of gays? To tell the truth, we have been here before. In the 1930’s a Nazi apologist called Jakob Hommes solemnly announced that even an understanding of the two-times table differed according to skin-color. So if you happen to be black, the proposition that 2 plus 2 equals 4 is only to be understood negroidly—negerisch. Like a lot of Nazi ideology, that is an echo of 19th-century Marxism. In his Notes to Anti-Dühring Friedrich Engels remarked that though, among whites, mathematical axioms may seem self-evident even to children, a bushman or Australian aborigine might find them difficult or impossible. Marxism is nowadays associated with severely economic interpretations, but that is a misunderstanding. Though economics governs history, Engels explained in a letter of January 1894, “race is itself an economic factor.” So Marxian economics was racist, and proud of it. So is a lot of educational theory, and the forgotten truth about multiculturalism is that it has totalitarian roots. Hommes spoke for Adolf Hitler. Indeed the Left has long been racist. Marx and Engels publicly advocated genocide, Hitler acknowledged his debt, and the common humanity of man was denied by the great dictators long before most who are now living were born.

V


It may be possible, though risky, to venture an explanation for the rebirth of multiculturalism, or the New Racism, in recent years. The most probable explanation, I suggest, and the least mentioned, is the fear of silence, and here the chief sources are French. The death of Marxism in the 1970’s—Marx est mort—was followed in Paris by the fast fading of La Nouvelle Critique. Lines of students who had not read a line of Marx chanted the slogans of the workers’ revolution, then took cheerfully to dogmatic skepticism. You might not have had much to say. But at least you had something to shout.

What is there to shout about now? Capitalism did not collapse, the United States left Vietnam, and an awkward silence had to be filled. As nature abhors a vacuum, so does an intelligentsia abhor a silence. In a world where papers have to be written, books and articles published, line-shooting is all but inevitable. Most intellectual silliness arises from a need to say something—anything at all. In La Révolution Introuvable, on the events in Paris in 1968, Raymond Aron acutely remarked that modern teaching practice lays an absolute requirement on the young to have opinions. Hence the trouble in the streets, the rock-throwing on the boulevards.

Since the 1950’s the adolescent has been not merely allowed to express opinions on world affairs. He has been forced to do it, as an academic requirement. It may seem absurd to the middle-aged, Aron argued, that a teenager should imagine he had the answers to the woes of the world. But what would you expect? Around the age of 14, at school, he was made to write a paper on Racine’s concept of love before he knew what love was. The Cold War called on him to take sides whenever he saw a headline in a newspaper. Of course he reaches for the nearest answer—usually the word of a friend—and that word can convince him that capitalist society, as Marx predicted, is doomed to collapse through a class war. He does not know how long ago Marx predicted it, or whether there have been any class wars since the 1840’s, and can be amazed when you tell him. I have seen educated people astounded to learn that Marx lived in the 19th century, died as long ago as 1883, and was born a year earlier than Queen Victoria. It is a wonder Lytton Strachey did not write his life. It shows you how long it takes news to sink in when you reflect that the slogan Marx est mort hit the boulevards in the 1970’s; and in 1968, when the Sorbonne revolted, the first volume of Das Kapital was already a hundred years old and more.

With the collapse of the New Left, in the 1970’s, and then of the New Right, there was a sudden need for something—anything—to say. It was the worst of awkward silences, and multiculturalism arose out of that sudden need. The end of socialism, as Richard Rorty has candidly called it, writing as an ex-Marxist, also means (as he forbore to say) the end of anti-socialism. What use refuting what nobody any longer believes? The cure for silence was the sudden invention of a clutch of new causes, or old ones revamped. Feminism, in the 1970’s, dragged the rhetoric of oppression out of workers’ revolution into issues of gender and community; multiculturalism into issues of race, at a moment when racism was thought to have been forever discredited. As in earlier days, the impulse was less conviction than the kind of hunger Aron identified: an absolute need to be seen to have an opinion, a profound fear of silence. It is a long time, in classes or on chat-shows, since anyone was heard to remark that he did not know enough to offer a view.

T. S. Eliot thought there was not enough silence, and the point has not lost its force. Perhaps people should be encouraged, for a while, not to have opinions—at least not about everything. Perhaps, in the new millennium, we should look more closely and lovingly, as Shakespeare did, to the wisdom of the ages. Perhaps this is a time for hesitation and withdrawal—a time to shut up, think harder, look longer. Or as the lovely lady almost said in Casablanca, to play it again.
