In the years following the great American technological achievement of landing men on the moon before the Russians did, engineering here became not a booming but a depressed field. When the morality of majoring in engineering was not questioned, the sanity of doing so was. By the mid-1970’s engineering enrollments were down by a third from their heyday in the shadow of Sputnik, and still not all graduates could find jobs. Now, only a decade later, an about-face has occurred on college campuses. Soaring engineering enrollments are causing overcrowding in long-neglected laboratories; and although there is a shortage of engineering faculty at a time when the few open positions in the humanities attract hundreds of applicants apiece, nonengineering students seem no longer to look down on the engineers. It is not simply a matter of engineering majors getting more and higher-paying job offers on graduation, for the engineering job market is still a very volatile one. Rather it seems to be a perception that our whole culture is changing, and that understanding technology is indispensable to coping with the future. And who should be in a better position for understanding technology than the engineer? Thus, in addition to teaching overcrowded classes of engineering majors, engineering faculty are being asked more and more to teach engineering and technology to liberal arts students.
For a long time, engineering students were stereotyped as social misfits with slide rules hanging from their belts who were ignorant of or insensitive to the traditional culture of the humanities. Today, however, the humanities student is being challenged because he or she is technologically illiterate or innumerate. While no engineering student can graduate from an accredited school without taking approximately one-eighth of his or her courses in the humanities and social sciences, many a history or English major earns a degree without nearly so much mathematics or science, let alone engineering. Thus a curious change of perception has occurred as to who is and who is not liberally educated in an increasingly technological age.
What seems to have triggered the change of perception is the personal computer. This object of the highest technology is held out as the sine qua non of the 21st century, and no one—engineer, scientist, or humanist—wants to be left out of the New Year’s Eve party in 1999. Scientists and engineers are perceived as knowing all about computers, which interminable advertisements tell us are now not only for calculating numbers but also for processing words. And since words have for so long been the domain of the humanists, the humanists have taken note. This is a wonderful development, for the computer is indeed an excellent machine to provide a bridge between the two cultures. Whether it be used for numbers or words, the personal computer tempts its user to cross over to the other culture. It is like using a dictionary: you go to it for one purpose, but you browse about and cannot help but learn something else. So the engineer and scientist who have gone to the computer for their calculations have found that it is a wonderfully patient writing companion with whom they can explore the English language. And the humanist who has gone to the computer as a word processor has found it necessary to learn a bit about the science and technology of the device, lest he or she lose the words entrusted to its memory. Thus one of the unforeseen contributions of the personal computer may be to get the two cultures to come to this bridge from their separate sides and to meet at the middle. Such a meeting has been a long time coming, and in the meantime engineering and technology have emerged out of the shadow of science, with which they have so often been confused.
The public perception of science (and technology as applied science) was radically altered by the development of the atomic bomb, which showed not only how powerful could be the fruits of science but also how clandestinely scientists (and nameless engineers) could work toward revolutionary ends. If secret cities, laboratories, and factories could be built, if basic physical theories could be developed into pieces of hardware that could obliterate tens of thousands in a single flash, if millions upon millions of dollars could be spent, and if tons upon tons of silver from the U.S. Treasury could be diverted to be used in giant magnets designed to separate isotopes of uranium in an effort dubbed the Manhattan Project, then what else might scientists and a government do without consulting the governed?
By the 1950’s there was an international paranoia that abuses tolerated in one government during times of war would be extended by others to times of apparent peace. Even before Sputnik was launched, there was considerable concern in the West that the Soviets were training more applied scientists. The federal research and development budget was growing by unprecedented amounts, and science was being accepted more and more not only as a worthwhile endeavor but also as a seedbed from which would spring the fruits known as technology. Although scientists always desired free rein and argued for the purity of the research they conducted without regard for its consequences, the reality of the atomic bomb, and its unnamed and unimagined but imaginable successors, was lost on no one. The war had made urgent an issue brewing since the earliest days of the Industrial Revolution: how to integrate the newer scientific-technological knowledge with the traditional humanistic culture?
James B. Conant, president of Harvard at the time, excused himself from explicitly discussing the bomb in a series of lectures he delivered in 1946. Instead of worrying for the moment about that one manifestation of scientific and technological development, he dealt with the more general problem of scientific literacy. He summarized his argument as follows:
. . . we need a widespread understanding of science in this country, for only thus can science be assimilated into our secular cultural pattern. When that has been achieved, we shall be one step nearer the goal which we now desire so earnestly, a unified, coherent culture suitable for our American democracy in this new age of machines and experts.

In spite of his mentioning only science, Conant’s reference to machines suggests that he also had technological literacy in mind, though in the spirit of the times of “applied” science, scientific literacy must have been assumed sufficient to complete an education. This view was to remain for some time.
In a 1955 speech before the British Association for the Advancement of Science, Jacob Bronowski looked ahead to the present, saying, “It is certain that the educated man of 1984 will speak the language of science.” Bronowski went on to plead for widespread scientific education as the antidote for a society in which scientists rule the scientifically illiterate. Science, Bronowski argued, is a part of culture and thus should be a part of the education of those who might be in power, just as scientists should have as part of their education the rudiments of literate, humanistic culture. The problem of balancing the sciences and the humanities in the ideal curriculum was reawakened.
When Bronowski called for science as part of the education of a cultured person, he was merely raising anew the debate that Thomas Huxley and Matthew Arnold engaged in during the 19th century. Two centuries after the Industrial Revolution, it was time for the issue to rise again, especially since scientists were not only claiming knowledge of the innermost workings of the mind and of the atom but also were seeking larger and larger federal-government appropriations. Artists and humanists had certainly never known that kind of public financial support, even when they worked on their federally funded New Deal projects.
While Bronowski called for science to be accepted as a legitimate part of culture, he largely spoke to scientists and told them what they were predisposed to agree with, if not fundamentally at least territorially. He called for the educational process to include such specifics as statistical methods, statistical mechanics, atomic physics and chemistry, the scientific method, and even a small piece of personal scientific research for every boy and girl. But Bronowski’s speech, even though entitled “The Educated Man in 1984,” made little impact outside the scientific establishment. Although Bronowski distinguished the newer, scientific culture from the more traditional humanistic one, his speech was largely ideological and abstract. The speech’s memorable anecdote concerned Winston Churchill’s directive in 1945 to launch the British effort to make an atomic bomb. According to Bronowski, Churchill confessed to being personally quite content with existing conventional explosives but could not ignore the technological counsel that the new weapons would make the conventional virtually obsolete. Bronowski’s point was to emphasize the importance of technologically informed decisions, which are possible in a democracy, he argued, but not in a dictatorship, where the contentment of the leader could jeopardize the welfare of the citizens.
About a year after Bronowski’s appeal for science as part of culture, an essay by C. P. Snow appeared in the 1956 Autumn Books Supplement of The New Statesman and Nation. This essay began on much the same premise as Bronowski’s, that science was somehow just not quite considered a part of culture, but while Bronowski emphasized sound educational reform, Snow related gossipy anecdotes and dropped names freely. Furthermore, he did not so much spell out concrete proposals for incorporating science into culture as he did point out and lament the differences between the scientific and the literary cultures. Snow entitled his essay “The Two Cultures.”
There was remarkably little reaction to Snow’s argument in the correspondence department of the New Statesman. There were a few letters, but nowhere near the volume of mail elicited by a report, published only a month before Snow’s essay, calling for more students to study science instead of the humanities. The study, entitled “New Minds for the New World,” raised the specter of the Soviet threat and documented the paucity of pure and applied scientists being turned out by Britain as compared to the Soviet Union. The report called for more young people to study science as it confessed: “Our present culture is, of course, unscientific. It is also in part anti-scientific.”
Though Jacob Bronowski and C. P. Snow were lobbying on both sides of the Atlantic for the intellectual legitimization of science, humanists apparently did not feel particularly threatened in their dominant cultural position. F. R. Leavis, the prominent Cambridge literary critic, was being criticized for his lecture, “Literature in My Time,” in which he declared that, apart from D. H. Lawrence, there was no literature in his time. This prompted the novelist J. B. Priestley to write some scathing “Thoughts on Dr. Leavis” for the pages of the same volume of the New Statesman that contained Snow’s claim to cultural status for all the science of the time. Unlike the few quiet letters responding to the idea of “two cultures,” however, the Priestley-Leavis debate generated volumes of mail that went on for months in the letters column of the magazine.
Yet, not until Snow expanded his slight essay into the Rede Lecture, published as The Two Cultures and the Scientific Revolution in 1959, did anyone really seem to take notice of the idea Bronowski articulated in 1955 and Snow christened in 1956. Leavis had ignored Snow’s earlier essay in the New Statesman but took exception to the attention given to a new scientific culture, especially if it were at the expense of the traditional literary and historical culture. He delivered a terribly ad hominem lecture, entitled “Two Cultures? The Significance of C. P. Snow,” which was subsequently published in The Spectator. Leavis found Snow “intellectually as undistinguished as it is possible to be,” and “in fact portentously ignorant.” The Two Cultures, Leavis said, “exhibits an utter lack of intellectual distinction and an embarrassing vulgarity of style.”
What had incited Leavis in the few years between Snow’s modest essay and the Rede Lecture? Well, when the Russians launched Sputnik, all the pedagogical warnings about the Soviet edge in technologists and all the theoretical musings about science being a new part of culture were suddenly matters of political and ideological life and death. The artificial moon beeping down at the polarized Earth made the issue of who was and who was not a great contemporary novelist seem among the most trivial of pursuits.
C. P. Snow’s “two cultures” are sometimes envisioned as separate islands populated by those conversant in either Shakespeare or thermodynamics but not in both. Scientists, sharing a common “method,” were seen by Snow to have “the future in their bones”; the humanists, represented principally by the literary intellectuals, were “natural Luddites” who had “never tried, wanted, or been able to understand the industrial revolution, much less accept it.” And this distinction was crucial to Snow’s message, for he saw science and technology as not only the principal ingredients of 20th-century material wealth but also the necessary ingredients for the development of the Third World. Asian and African countries were ready for their own industrial revolutions, and Snow described as “suicidal and technologically illiterate” those products of the old humanistic culture who thought the underdeveloped countries should be content to creep up to the standards of the industrialized countries in a matter of centuries.
The issue Snow delineated was that of the informed use and dissemination of technology, and he echoed Bronowski’s call for educational reform. Yet even as Snow was delivering his lecture on the two cultures, engineering students were required to fill the equivalent of one full semester of their four-year curriculum with social science and humanities electives. Since Snow’s Rede Lecture we have had: the rise and fall of federal research and development budgets; the rise and fall and rise again of engineering enrollments; and the infusion of foundation grants into colleges and universities to bring even more humanities into science and engineering education. These grants have spawned numerous programs, generically termed Science, Technology, and Society programs, which have formalized and institutionalized interdisciplinary activities of faculty and students. If not their intent, at least their effect has been to humanize scientists and engineers rather than to inform humanists about scientific method and culture. Only very recently has there been a strenuous effort to introduce science and technology into the traditional liberal arts curriculum.
As C. P. Snow acknowledged even in his Rede Lecture, to divide the educated world into two distinct cultures is at best arbitrary. Social scientists presented a special difficulty to Snow’s dichotomy, and in his “second look” he admitted that a “third culture” was on the way. Snow believed this third culture to comprise a body of intellectual opinion he saw then emerging—from such fields as sociology, political science, economics, and psychology—”concerned with how human beings are living or have lived.” But some other nontraditional disciplines, which could presumably also provide a bridge between Snow’s two cultures, seem sometimes to be without firm anchorages in either one. And signs of weakness do not bode well for the success of bridges.
A recently published book, Global Stakes: The Future of High Technology in America, states that its purpose is “to pull together in one place the facts and figures, arguments and opinions, that affect the future of high technology in America. . . .” The book’s title page identifies three authors writing “with” a fourth, and a back page “about the authors” identifies them as international consultants on policy. What makes the book relevant to the discussion here of C. P. Snow’s pivotal lecture is its summary of the two cultures debate in a chapter entitled “The New Education.” Not only is Snow identified as an “English philosopher,” but also his allusion to Shakespeare and thermodynamics is related with a reference to “Newton’s Second Law of Thermodynamics.” Now even those literary intellectuals who disagree with Leavis about whether Snow was a novelist would certainly not dub him a “philosopher,” and no scientist has ever confused the Second Law of Thermodynamics with Newton’s Second Law of Motion. This is an amazing pair of gaffes to have escaped four authors and, one should think, at least one editor, suspended somewhere between cultures of which they seem possibly never to have been a part.
For all their presumed faults, the allegedly incompatible spokesmen for each side of the familiar two cultures are generally, if nothing else, sticklers for detail. That is to the good, for when the details get fuzzy, the thinking gets fuzzy. Indeed, it might be argued that the world, the Third World included, might better be served by a pair of separate but equal cultures of careful specialists than by a gaggle of generalists who cannot go beyond a sloppy explication of the issues. When the times demand it, the humanists can sit down with the scientists, whether or not there are members of a third culture present, and meet on the common ground of appreciation for discipline. For in the end it is discipline, the rigorous attention to the detail of one’s work, that distinguishes the scholar from the dilettante. That is not to say that one cannot reach beyond his discipline, only that one must do it with the same care and rigor with which he pursues his first discipline.
When the Alfred P. Sloan Foundation published its recent occasional paper on “The New Liberal Arts,” it reopened the question of acculturation. Some 25 years after Jacob Bronowski and C. P. Snow pleaded for science and technology as parts of culture, a foundation was finally willing to make a financial endorsement of the idea. It is time, argued Sloan spokesman Stephen White, not only for the engineer and scientist but also for the general liberal arts student to understand the computer, applied mathematics, and engineering, since the future is clearly and inextricably involved with such subjects. Exactly how to implement such a radical change in the traditional liberal arts curriculum is another matter, however, and colleges and universities have been invited to search their souls and budgets over the matter in proposals to the Sloan Foundation.
While applied mathematics and engineering are not new and may remain esoteric and foreign disciplines to the liberal arts student (and faculty) for some time to come, computers have quickly become ubiquitous and thus hard to ignore. And the benign presence of the computer in homes, schools, and businesses everywhere may do more to bring about the integration of technology into the mainstream of culture than all Bronowski’s and Snow’s pleas, plus all the well-intended efforts of government agencies and private philanthropy. In particular the perfectly named personal computer, the instrument to embrace at once the human value of individuality and the highest technology of the late 20th century, has the capability to bridge the gap between the two cultures—and with plenty of clearance for the great commerce of all the other cultures to continue unobstructed.
The personal computer displaces at the same time the typewriter of the humanist and the calculator of the engineer, yet it can provide a common ground for mutual understanding. Whether used as a word processor or a number cruncher, the microcomputer is a robot shuttle diplomat. It moves freely back and forth between the cultures, alternately speaking the language of the writers of prose and of equations. And in its keyboard the personal computer unites the objects of literacy and numeracy in a way heretofore unrealized.
The classic typewriter keyboard, with its QWERTY arrangement of keys (said to be laid out deliberately to impede their too rapid depression so that the original mechanical contrivance would not be forever getting tongue-tied), was never hospitable to the ten numbers, which always seemed out of reach and never quite in the repertoire of the touch typist. Though they appeared in strict order, so unlike the jumble of letters, the numbers were struck with the eye, and not with the finger alone. Equations were a nightmare to type, and many an old keyboard did not even have an equals sign. More than one model expected the user to employ a lowercase l for the numeral 1, while the capital O and cipher, though on different keys, were indistinguishable. Such inconsistencies were adapted to in much the same way that cotton ribbons were tolerated.
The advent of the computer brought with it the necessity at once to combine the alphabet with the numbers and to separate them more distinctly. Computers did not read marks on paper, but holes in tapes or cards. Each letter and number had its own combination of holes and not holes, of zeroes and ones, of ons and offs, for in the final analysis that is all the electronic brain is capable of distinguishing—the openness or closedness of switches, the yeses or noes of its logic. The typewriter’s little l could no longer be ambiguous, and the big O and the zero had to be distinguished. The number 1 was added, and the letter O was distinguished from the number 0 the way a sergeant is separated from a private—by a stripe.
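That each character carries its own unambiguous pattern of ones and zeroes can be seen directly in the character codes of a modern machine. The following is a small illustration in Python, using the ASCII code rather than the punched tapes and cards of the early computers:

```python
# In ASCII, every character has its own seven-bit pattern, so the
# letter O and the digit 0 (or the letter l and the digit 1) can
# never be confused the way they were on a typewriter.
for ch in "O0l1":
    print(ch, format(ord(ch), "07b"))
# O 1001111
# 0 0110000
# l 1101100
# 1 0110001
```

The bit patterns differ in several places, so to the machine the confusable pairs are as distinct as any two letters of the alphabet.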
If the typewriter favored the humanist, the modern pocket calculator favors the numerically inclined. Its keypad puts the ten digits within easy reach of one set of fingers, and in a tight numerical order. Human calculators learn this keypad as secretaries learn the typewriter keyboard:
7 8 9
4 5 6
1 2 3

with some variations in the placing of the zero and the decimal point on the bottom line. This is a very sensible arrangement of the keys, with the digits ascending in order from the zero: one’s finger must reach higher for a higher number.
The telephone dial, like the typewriter keyboard, was another technological artifact that confused the alpha and numeric symbols. It was not the letter O but the digit zero that stood for “Operator.” Furthermore, the rotary dial was a classic example of innumeracy that, like the QWERTY keyboard, puts 0 after 9 instead of before 1, where it properly belongs. And a curious dilemma appears to have developed with the touch-tone telephone number pad. Meant to allow us to enter our charge and bank account numbers, so we might pay and save by phone, the touch-tone number layout retained the vestiges of the alphabetic telephone exchanges in the letters associated with the number holes on the old rotary dial. To arrange the telephone buttons in the same numerical order as the calculator would have introduced a very peculiar arrangement for the letters of the alphabet. When in the future the groups of three letters associated with most of the number keys are dropped from the telephone buttons the way exchange names have by and large already been dropped, the numerate society may make a trivia question out of the numerology of telephone buttons as they are arranged today:
1 2 3
4 5 6
7 8 9
* 0 #
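The disagreement between the two layouts can be made concrete. In this small Python sketch, each pad is written out row by row as it appears to the user, and the digits are then read off in that order:

```python
# The calculator pad and the telephone pad, each read row by row
# from top to bottom, as the eye and finger encounter them.
calculator = [[7, 8, 9], [4, 5, 6], [1, 2, 3]]
telephone  = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

calc_order  = [d for row in calculator for d in row]
phone_order = [d for row in telephone for d in row]
print(calc_order)   # [7, 8, 9, 4, 5, 6, 1, 2, 3]
print(phone_order)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The same nine digits, but in opposite orders: the calculator ascends from the bottom row, the telephone descends from the top, and the touch typist of numbers must learn two contradictory habits.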
This disagreement in the arrangement of calculator and telephone number buttons is as unfortunate as if a major typewriter manufacturer departed from the QWERTY keyboard arrangement. Only certain keyboards intended for the use of children, who are expected to hunt and peck at the letters and numbers, are arranged in strict alphabetic and numeric order. What large computer manufacturers have adopted is a dual keyboard. Not only is the conventional QWERTY keyboard there for the humanist, but also the number pad with the digits in calculator order is given a separate place off to the right of the “typewriter” section, allowing masses of numerical data to be entered by the touch method. And just as humanists and technologists have in the past ignored the other culture, so word processors and number processors may now ignore the part of the keyboard that is not their immediate concern.
The computer and its keyboard are thus an excellent metaphor for the two cultures problem, and indeed for the problem of knowledge and disciplines generally. By containing the 26 letters of the alphabet and the 10 digits, the keyboard presents to the user the potential to express and explore any literate or numerate idea. At the same time, however, the redundancy and duality of the keyboard enables one to ignore whole segments of the machinery of thought and expression, and it is possible for one to reject completely the access he has to an entire half of the electronic brain.
Yet the computer and its keyboard provide not only a metaphor but also a means for 20th-century man to overcome the two cultures problem in ways that Bronowski in his eloquence and Snow with his anecdotes could not have foreseen. While the computer may be considered expensive by some standards (notably, the cents sign is perhaps the only symbol from the old typewriter not on the keyboard of the most popular personal computers), it is in fact already within the purchasing grasp of the great middle class and promises to become less and less costly in the same evolutionary way that the pocket calculator has. Thus we can realistically look ahead to a time before the turn of the century when the computer will be as familiar an appliance in the home as the television set.
Because of its essential duality, the home computer dissolves the distinctions between the two cultures. The humanist who wants to word process may choose to ignore the number keys, but he cannot ignore certain technical details: the capacity of his machine’s memory, what floppy disks it uses, and the technical procedures to save and retrieve his words. If he wants to print out his manuscript, he must find a compatible printer, and if he wants to do things right he may even read the manuals and user’s guides (poorly written and incomprehensible as they may seem to be). By involving himself with this technology, for it is that if nothing else, the humanist begins to appreciate the other culture with neither awe nor disdain. He begins to understand that technology is human, that is, it has limitations.
No one can use a computer without coming to the realization that the human race has nothing to worry about. While the capabilities of the computer may seem limitless at first blush, the writer who endures to type in more than a short manuscript soon realizes how limited the machine’s memory really is. While it may be long, it is not deep. A floppy disk cannot hold a novel, nor can the computer plot it. What it can do is manipulate the words and paragraphs in a terribly obedient way, so that the writer need not be intimidated by the task of repeatedly retyping a whole manuscript, even the long parts that he does not want to change. The computer can be called upon to check spelling, but the writer soon learns that electronic spelling checks do not catch the mistyped or misused principal for principle, for both agree with the dictionary they are checked against. The nontechnologist who uses a computer soon learns how silly are the fears that this device can ever replace man.
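The limitation of the spelling check is easy to see once one knows how such a program works: it merely looks each word up in a stored list. A minimal sketch in Python, with a hypothetical word list standing in for the dictionary, shows why principal misused for principle sails through:

```python
# A dictionary-based spelling check in miniature: a word is "correct"
# if it appears in the stored word list, whatever the sentence means.
# The word list here is hypothetical, standing in for a real dictionary.
DICTIONARY = {"the", "writer", "learned", "his", "first",
              "principle", "principal"}

def misspelled(text):
    """Return the words in text that are absent from the dictionary."""
    return [w for w in text.lower().split() if w not in DICTIONARY]

print(misspelled("the writer learned his first principal"))  # []
print(misspelled("the writer lerned his first principle"))   # ['lerned']
```

The genuine typographical error is caught, but the misused word is not, for both spellings agree with the list they are checked against. Judging which word the sentence calls for remains the writer’s job.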
The technologist, on the other hand, who might have for so long eschewed typewriters as the tools of workers in words and not in equations, cannot help but be drawn to the QWERTY hardware and the wordy software that enable him to hone his thoughts into sentences that can be considered humanistic. The tireless video screen will show draft after draft of his latest paper without calling for a halt the way secretaries are wont to do. With the word processor in his computer’s memory, the engineer or scientist can seek the elegance in his prose that he seeks in his equations. And with practice at the computer terminal, he can get it.
When and if the technologist masters the English language, he will come to appreciate that the humanist’s traditional tool demands as much precision in its proper use as do the tools of science and technology. For all the narrow-minded conventional wisdom, there is more latitude in employing the mathematical language of calculus or a computer language such as Basic than is commonly allowed. True, there are rules that must not be violated if one is to achieve valid results, but so are there rules of grammar that one must not violate if the humanistic argument for employing the English language is to be taken seriously. If the computer will not accept a command containing a syntactical error, neither will the referees of a scholarly journal be inclined to accept a paper in history, say, if it is filled with spelling and grammatical errors. Anyone who admits the importance, or at least the desirability, of literacy should recognize the equal importance, or desirability, of numeracy—and vice versa. Discipline demands discipline.
Technology is here to stay. Its elements are as much a part of our culture as those of the traditional humanistic disciplines. The modes of thought of engineers and scientists, the languages of the physical, biological, and computer sciences, and the muscular and mental disposition to operate a personal computer as naturally as one operates a motor vehicle must become essential to the educated man or woman in the closing years of the 20th century. Without these, we cannot deal confidently with such issues as nuclear power and nuclear weapons, genetic engineering, acid rain, telecommunications, energy, or even budget deficits and proposals to eliminate them.
Jacob Bronowski’s expectations for the educated person of the mid-1980’s have not yet been realized, but at least they are now more than one person’s hopes. The computer, though not even mentioned by either Bronowski or Snow a quarter-century ago, may indeed prove to be the mechanism by which we will both broaden the concept of tradition and bridge the gulf between the traditionally antipathetic cultures. The new intellectual must recognize that keeping up with advances in science and technology is imperative for those who seek to march in the avant-garde of tomorrow.