To a large extent, the history of ideas is the history of words. Consider, then, these two: “individualism” and “self.” Can we imagine what our present-day discourse would sound like without these two nouns? We can’t live without them. But at the same time, we can hardly stand to live with them. The duel between individualistic “rights talk” and communitarian “responsibility talk” used to be a roughly even match. But today it is only rarely a contest, for the former nearly always wins, hands down. And, just as in so many of the professional sports of our day, those few contests that do occur tend to be fought out between near-freakish exaggerations of the human form, players whose bulky or etiolated physiognomy bears little relationship to the dimensions of ordinary human life. We are not comfortable with the prospect of an entirely rootless, atomistic, and self-seeking culture. But the alternatives seem to appeal even less; and so we sleepwalk onward.
All the more reason, then, to remind ourselves that “individualism” is actually a relatively new addition to the lexicon of Western thought—and that our concept of the “self,” particularly in its reified, psychological sense, is even more so. That is not to deny that both words have long and distinguished pedigrees, informed by rich antecedents and fertile anticipations. They did not acquire their power overnight. Indeed, a deeply rooted belief in the dignity and infinite worth of the individual person has always been a mark, and a mainstay, of what we imprecisely call Western civilization. But being rooted in the past is different from understanding it, or drawing upon it rightly. Sometimes a long look backward may be needed, if only to assess where one has arrived, not to mention where one ought to be going.
One does not need to look very hard to find a host of antecedents. Elements of the West’s foundational emphasis on the individual can be detected as far back as classical antiquity, particularly in the Greek discovery of “philosophy” as a distinctive form of free rational inquiry, and in the Greco-Roman stress upon the need for virtuous individual citizens, if one is to sustain a healthy republican political order. Other elements appeared later, particularly in the intensely self-directed and self-oriented moral discipline of Hellenistic-era Epicureanism and Stoicism. And perhaps even more importantly, the traditions and institutions arising out of Biblical monotheism placed heavy emphasis upon the infinite value, personal agency, and moral accountability of the individual person. That emphasis reached a pinnacle of sorts in Western Christianity, which brilliantly incorporated the divergent legacies of Athens and Jerusalem into a single universalized faith, one that enabled every individual man and woman, irrespective of tribe or nation, language or culture, to come into a full and saving relationship with the Deity.
Nearly all the most influential spokesmen for Western Christianity, whatever their differences on contested questions of faith or morals, served to reinforce this belief in the central importance of the individual human person. The Confessions of Saint Augustine bespoke the importance of the individual precisely through the gripping tale of individual conversion it rendered. The 15th-century Renaissance scholar Pico della Mirandola offered an effusive Oration on the Dignity of Man as a ringing assertion of the individual human being’s profound moral freedom. To be sure, Protestant reformers, particularly those of Calvinist hue, took a far dimmer view of unredeemed human nature, and a far more robust view of the doctrine of original sin. And yet, with their belief in the priesthood of all believers, which radically deemphasized the efficacy of the institutional Church, and their insistence that salvation was to be gained only through an uncoerced individual confession of faith in Jesus Christ as the redeemer of a sinful world, the advocates of a reformed Christian faith intensified the emphasis upon the conscience and choices of the individual believer. That such changes lent support to the key economic developments of the age, particularly as reflected in the rise of commerce and capitalist enterprise, only made the reformed faith all the more likely to succeed.
None of these expressions of belief, explicit and implicit, in the individual amounted to anything approaching what we mean by modern individualism, however, and it is important to grasp why. Such freedom as the premodern individual enjoyed, particularly since the advent of Christianity, was always constrained by one or both of two beliefs: belief in the existence of an objective moral order, which could not be violated with impunity by antinomian rebels and enthusiasts; and belief in the inherent frailty of human nature, which indicated that virtue could not be produced in social isolation. Although nearly all influential Western thinkers before the dawn of modernity had conceded the signal importance of the individual, none employed the term “individualism” to express that belief.
Instead, “individualism,” like many of our most useful terms, began life as a term of abuse. It first appeared in French, in the discourse of one of the fiercest opponents of the French Revolution and modernity. The 19th-century French archconservative Joseph de Maistre devised the label of “individualism” to describe much of what he found horrifying about the Revolution: its overturning of established social hierarchies and dissolution of traditional social bonds, in favor of an atomizing and leveling doctrine of individual natural rights, which freed each individual to be his or her own moral arbiter. Maistre’s “individualism” was not an affirmation of personal dignity, but a nightmare of egotism and moral anarchy run riot.
A few years later, the French writer Alexis de Tocqueville, in his classic study Democracy in America (1835–1840), also employed the term “individualism” in a manner that was subtler, if no less critical. Individualism was, he argued, a characteristic pathology of “democratic” societies, i.e., societies which lacked any legally recognized distinctions of rank and station among their members. Although the American nation was but a few decades into its history, Tocqueville already found individualism to be one of its defining characteristics—and therefore, he thought, a characteristic of all modernity, since America represented the avant-garde of modern history, the first “great republic.”
But Tocqueville’s complaint was different from Maistre’s. Egotism, he thought, was an emotional disorder, a “passionate and exaggerated” self-love, of a sort one could find throughout human history. The selfish ye shall always have with you. But individualism was something else. It was a self-conscious social philosophy, “a mature and calm feeling, which disposes each member of the community to sever himself from the mass of his fellows and to draw apart with his family and friends, so that . . . he willingly leaves society at large to itself.” For Tocqueville, individualism was not merely a self-indulgent form of social atomism, or a sustained hissy fit, but something quite new: a conscious and calculated withdrawal from the responsibilities of citizenship and public life. For Tocqueville—who was, unlike Maistre, a qualified friend of democracy—there was no greater threat to the new order than this tendency toward privatism.
So “individualism” began its life as a critical term. Now and then, particularly when it is employed by social critics, one will see indications that it has not entirely lost this sense. And yet the verdict of these two French critics seems strikingly at odds with the self-conception of most Americans, who after all had no experience of feudal, aristocratic, monarchical, and other premodern political institutions, and who are likely to see individualism, in one form or another, as a wholly positive thing, the key ingredient in what it means to be American. Such a view is, of course, a bit too simple. It presumes that American history is nothing more than the story of an unfolding liberal tradition, a tale that can be encapsulated in the classic expression of individual natural rights embodied in the Declaration of Independence. Such a view ignores the profound influence of religious, republican, radical communitarian, socialist, feminist, and other non-liberal elements in our national saga, including the most illiberal institution of all, chattel slavery. What national commitment to individualistic values we now possess has certainly evolved over time, and there have always been countercurrents challenging the mainstream.
Nor, for that matter, is it always easy to know what is meant by the term “individualism.” It is used, albeit legitimately, in a bewildering variety of ways. It may refer to the self-interested disposition of mind that Tocqueville described, or to the passionate egotism Maistre deplored, or to the self-reliant frontiersman or self-made small businessman praised in American popular lore. “Individualism” may refer to an understanding of the relationship between the individual and society or state, wherein the liberty and dignity of the former is to be protected against the controlling pressures of the latter. More radically, it may point toward a philosophy of the state or society in which all political and social groups are viewed as mere aggregations of otherwise self-sufficient individuals, whose social bonds are entirely governed by consensual contract. Even more radically, it may point toward the increasingly popular view that, to the maximum degree possible, the individual should be regarded as an entirely morally autonomous creature—accountable to no person and no putative “higher law,” armed with a quiver of imprescriptible rights, protected by a zone of inviolable privacy, and left free to “grow” and “develop” as the promptings of the romantic self dictate. All these meanings of “individualism” have in common a presumption of the inherent worth of the individual person. But they diverge in dramatic ways.
And yet, qualifiers aside, there can be little doubt that the dominant American tradition in our own day is one of endorsing the highest possible degree of individual liberty and self-development in political, religious, social, and economic affairs. American history is a record of the defeat or erosion of most competing ideas. Whether the realities of American political, social, and economic life actually reflect such an advanced level of individualism on the level of behavior is, of course, another question, to which I will turn in a moment. But it seems clear that the highly pejorative connotation that “individualism” had at the time of its origins has never taken deep hold in Anglo-American discourse.
If anything, the language of individual rights, and the tendency to regard individual men and women as self-contained, contract-making, utility-maximizing, and values-creating actors, who accept only those duties and obligations they elect to accept, grew steadily more powerful and pervasive in the latter part of the 20th century. The recourse to individual rights, whether expressed as legal rights, voting rights, expressive rights, reproductive rights, sexual rights, membership rights, or consumer rights, has become the near-invincible trump card in most debates regarding public policy, and it is only in the rarest instances (such as the provision of preferential treatment for members of groups that have been subjected to past legal or social discrimination—a policy that has always been controversial) that this trump has been effectively challenged. The fundamental commitment to what Whitman called the “solitary self” has never been stronger.
Again, it is important to remember that this has not always been the state of affairs in America. Much of the best scholarship in colonial and early national history in recent years has reminded us of just this fact. Political scientist Barry Alan Shain’s book The Myth of American Individualism argues powerfully that it was not Enlightened liberalism, but a very constrained form of communitarian Reformed Protestantism, that best represented the dominant social and political outlook of early America. The political theorist Michael Sandel, one of the most influential communitarian critics of rights-based liberalism, has recently argued that until the 20th century, America’s public philosophy was largely based on the “republican” assumption that the polity had a formative, prescriptive, “soulcraft” function to perform in matters of the economy, the family, church-state relations, personal morality, free speech, constitutional law, privacy, productive labor, and consumption. That assumption, observes Sandel, has been so completely undone by the individualistic liberalism of our own day that we have forgotten it was ever there.
Yet even Sandel would concede that in the end it was the expansive, mid-19th-century voices of men like Ralph Waldo Emerson and Walt Whitman, romantic American nationalists and prophets of the unconstrained self, that have had the better end of the debate, sounding the liberatory yawp that has resounded over the rooftops and resonated through the streets of the American imagination, even unto the present day. It was Emerson who declared famously that a society is a “conspiracy against the manhood of every one of its members,” and that “nothing is at last sacred but the integrity of your own mind.” And it was Whitman who declared that “the Great Idea” was “the idea of perfect and free individuals,” and that “nothing, not God, is greater to one than one’s-self is.” And although both men would live long enough to be deeply disillusioned by the crass economic opportunism and material acquisitiveness that took hold of American society in the post-Civil War years, one could hardly deny that such driving, self-interested ambition was a logical corollary to the spirit of unrestrained self-development. So, too, was the unforgettable image of Mark Twain’s Huckleberry Finn, the semi-noble, semi-savage boy “lighting out for the territory,” rather than face any more of the pinched and morally questionable rigors of “sivilization.”
As the example of Huck Finn suggests, American thought and expression have always been rich with figures of heroic individuality—and correspondingly poor in convincing and binding representations of community or social obligation. Whether one considers our accounts of the great colonial religious controversies, such as those involving rebels Roger Williams and Anne Hutchinson, or the moral fables embedded in our popular culture, such as that offered in the movies One Flew Over the Cuckoo’s Nest, Dead Poets Society, and Fiddler on the Roof, we seem to have a boundless appetite for fables of personal liberation. We are almost invariably asked to side with the put-upon individual, cast as an unjustly thwarted soul yearning to breathe free, and we are instructed to hiss at the figures of social or political authority, the John Winthrops and Nurse Ratcheds of life, whose efforts to sustain order establish them instead as monsters and enemies of humanity.
There have, however, been a few notable efforts to present a counterexample to this pervasive celebration of individuality. The immense human suffering and dislocation wrought by 19th-century industrialization led to a rash of Utopian novels, perhaps best exemplified by Edward Bellamy’s fabulously best-selling 1888 fantasy Looking Backward, an effort to imagine a perfected postindustrial Boston, reconstituted as a socialist cooperative commonwealth in the year 2000. Bellamy openly reviled individualism, proposing in its place a post-Christian “religion of solidarity,” which would radically deemphasize the self, and instead emphasize social bonds over individual liberty (and traditional Christian doctrine). The popularity of Bellamy’s book showed that there was a market hungry for such ideas, and many of the most “progressive” forces of the day, whether one thinks of the cooperation-minded Knights of Labor, the theological advocates of a modernist “social gospel,” or such Progressive reformers as Herbert Croly, Jane Addams, and John Dewey, unreservedly admired and emulated its spirit.
The Progressive movement itself advanced, at least in some of its manifestations, a new corporate ideal, which sought to downplay individualism and instead to defend and preserve “the public interest,” in the face of industrial capital’s depredations. In the hands of a sophisticated thinker like Dewey, and some of his followers, a case was made that the values of community and individuality, far from being in opposition, are mutually supporting and mutually sustaining, particularly in an age dominated by large industrial combinations, immense asymmetries of wealth and power, and vast impersonal networks of communication. It was pointless, in their view, to attempt to restore the small-scale community of days past. The forces of economic and social modernity had rendered such community, with its personal bonds and face-to-face business transactions, obsolete. The task ahead was the creation of something new, which Dewey called “The Great Community,” a systematically reconstituted social order that, it was hoped, would adapt the best features of the old community forms to the inexorable realities of the new economy and society.
In retrospect, though, the new corporate ideal seems never to have had a fighting chance. Historians have patiently documented a thousand ways in which American life in the 20th century has in fact become more corporate, more organized, more standardized. But Americans’ self-conception has never quite followed suit. Perhaps doing so would have cut too much against the American grain. To be sure, the privations of the Great Depression gave the values of community and solidarity a temporary boost in American social thought, as the historian Richard Pells has convincingly argued. But even Franklin Roosevelt’s New Deal, riven as it was by pragmatic accommodations and intellectual inconsistencies, paid such values little more than lip service.
The decisive blow, however, was administered by the rise of the totalitarian regimes of Europe, whose terrifying success in suppressing the individual for the sake of the collectivity threw all corporate ideals into doubt and disrepute, from which they have yet to recover. The concerns generated thereby decisively shaped both the liberalism and conservatism of the postwar years. Libertarians like Ludwig von Mises and Friedrich Hayek, liberals like David Riesman, Lionel Trilling, and Reinhold Niebuhr, even conservatives like Robert Nisbet and Russell Kirk—all paid their disrespects to the Leviathan state, and thereby called into question the efficacy of any modern corporate ideal. Instead, the social and political thought of postwar America seemed to be dedicated to a different ideal: the guardianship of the self.
There were examples galore of this neo-individualist turn. Riesman’s The Lonely Crowd warned against the conformism of “other-direction” in the American personality, and William Whyte’s The Organization Man deplored the predominance of a “social ethic” in America’s white-collar classes. Ayn Rand’s fierce pop-Nietzschean novels celebrated the autonomy of the individual creative genius, and reviled the dullness of hoi polloi. Neo-Freudian psychology concerned itself with the problems of the ego, and such leading psychological theorists as C.G. Jung and Erik Erikson focused obsessively on the problem of individuation. Even the emergence of a New Left movement in the early 1960s, which purported to rebel against the complacency of its liberal forebears, did little to alter this trend, since the movement’s communitarian tendencies were no match for its commitment to a radical, near-anarchic standard of behavioral and expressive liberty, in speech, dress, sexuality, drug use, and so on. As such, it provides a textbook illustration of the difficulty entailed in pursuing the politics of progressive reform while remaining programmatically suspicious of any and all sources of authority and value outside the self.
This difficulty represents a serious obstacle not only to radicalism, but to the reform aspirations of both liberalism and conservatism in contemporary times, since each of these ideological camps contains within itself anarcho-libertarian elements that, while undeniably popular, work against the establishment and sustenance of communal values, and thereby undermine the very idea of a stable public interest. For conservatives, the principal such obstacle stems from an ideological commitment to economic liberty; for liberals, it arises out of an equally rigid commitment to moral and expressive liberty. In crucial ways, both ideological camps have in common an unwillingness to accept the need for an authority, a tradition, an institutional nexus that is capable of superseding individual liberty in the name of social cohesion and the public interest.
In the age of modernity and postmodernity, then, the self has become the chief source of moral value. The term “self,” which has had an amorphous history, has nevertheless evolved into something crucially different from the term “soul.” It is a psychological term, largely stripped of metaphysical implications. The “self” is understood as the seat of personal identity, source of mental cohesiveness and psychological integrity—the vanishing point, as it were, where all lines of psychological energy converge in the life of a “healthy” and “integrated” individual. The word “soul” maintains a link to the transcendent realm, and is suggestive of an imperishable essence distinct from the bodily state. But the “self” is immanent, secular, worldly, transitory, adaptive, pragmatic. “Soul” is a word that rarely crosses the lips of modern thinkers, unless they employ it in a deliberately rhetorical or fanciful way, as do the neo-Jungian advocates of “soul-making,” or the authors of books proclaiming the “lost soul” of this or that wayward entity. “Souls” are judged by the vanished God of faith; “selves” by the all-too-present God of health. By way of compensation, though, there remains a lingering ambiguity about “self”—some residue of the romantic “authentic” self always lurking in the corners of psychotherapeutic discourse, promising a kind of God-experience, though one that is free of creedal, dogmatic constraints. The “authentic” self may well be drawn toward “spirituality,” but not toward conventional organized “religion.”
There are complications inherent in such a strictly psychological approach to the self. As the communitarian thinker Charles Taylor has argued in his magisterial study Sources of the Self, personal integrity inevitably rests on a moral foundation, on a set of prerational moral presuppositions. In other words, a moral disposition toward one’s world, and a prior assent to certain moral criteria, are the preconditions of there being any psychological order and consistency at all in a human personality. If Taylor is right, then health is built upon morality, rather than the reverse. And moral judgment is often precisely what we are engaged in when we talk about the self. As Joseph Davis has pointed out, therapies for survivors of childhood abuse, which have become the principal model of psychological “recovery” in our time, are openly geared toward the construction of new life-narratives, ones that serve to overturn the disabling effects of the abused individuals’ life experiences. The cool pastel language of “narrativity” is intended to give an aura of moral neutrality to the process. But that is pure deception, thinly veiling what is actually going on. Far from being morally neutral, such “redescriptions” are in fact meant to reallocate moral praise and blame in the client’s world, in entirely new and “healthier” ways. Most often, they do this by recasting the client as a “victim,” and the abuser as a moral transgressor. The explicit language of sin may have been banished, but not the lingering concept of moral responsibility. No less than the born-again evangelical, such individuals are engaged in rewriting the story, and the meaning, of their lives.
But now the question arises: does it not matter tremendously whether or not these new narratives are true, or can be sincerely believed to be true? Can such therapies succeed if they are self-consciously regarded as nothing more than the construction of empowering narratives? Can morality be effective if it is denied the support of some warrant of truth? Here is a juncture at which the difference between a modernist and postmodernist approach to the self can be readily discerned. A modernist like Freud would have insisted that it is the truth about ourselves that sets us free. A postmodernist would respond that there are multiple truths, and one of the key elements in the achievement of psychological health is the ability to navigate between and among the elements of this multiplicity.
There is no escape, then, from difficult epistemological choices, even in the anti-philosophical precincts of individual psychology, let alone the less buffered realm of public affairs, where one has to be constantly on guard for collective self-deception. One possible solution to the general dilemma of truth-discernment has been sought by scholars such as Richard Rorty, who are promoting the revival of the philosophy known as pragmatism. They tout pragmatism as a method that can arrive at consensual and provisional “truths” that will serve the purposes of the hour, even public purposes, without falling into the pit of nihilism on the one hand, or committing the error of reintroducing “objective truth” on the other. This suggestion has proved fruitful in the academic realm, a realm to which pragmatism would seem to be particularly well suited, since it legitimates the kind of continuous discussion and constant rethinking that characterize academic life at its most engaging. American academics approve of pragmatism, because pragmatism is good for business. How successful such methods can be in actual political and social operation, however, remains to be seen, since they seem to amount, in practice, to little more than a mechanism for producing arbitrary pronouncements, propounded by elites but dressed up in the language of majoritarianism (or majoritarian pronouncements that are dressed up to appear as if they are principled and disinterested dicta). The provisionality of pragmatism, the very feature that makes it so attractive in academic settings, makes it peculiarly unsuited for public life, a realm that needs to draw upon the statement of broad values and large principles to produce a workable consensus for governing. What looks like redescription in the lecture hall looks a whole lot like dissembling or evasion in the public square.
An even bigger problem with a subjectivist moral order is the inherent instability of the self. One of the most powerful themes of postmodernism is its assertion that the modern self cannot bear the weight placed upon it by fragmented modern life, and that in fact, the multiplicity of our world requires us to operate on the basis of multiple selves. Just as in atomic physics, where the unsplittable entity (the a-tomos) turned out to have an unnumbered multitude of particles and subparticles in its makeup, so too, the self has proved to be a complicated and vulnerable entity, as vulnerable as the idea of truth itself. René Descartes had inaugurated modernity with the assertion that the “I” is the most fundamental building block in our apprehension of reality, the still point in a moving world. Now it appears that the self, far from being foundational, is the most protean and variable thing of all. In the postmodern view, the search for “individual integrity” and “authenticity” is outmoded. The postmodern self is not a unitary thing, but an ever-shifting ensemble of social roles—a disorderly venue in which the healthy ego functions less as a commander-in-chief than as a skilled air-traffic controller.
It is hard to know how to respond to this description, or to the phenomenon it describes. Ought one celebrate it, in the manner of such writers as Robert Jay Lifton, Walter Truett Anderson, and Sherry Turkle, who find exciting elements in the postmodern liberation from unitary personality? Or is one obliged to deplore it, in the manner of the late social critic Christopher Lasch, as an effort to “redescribe” mental illness and moral incoherence as a new form of mental health and moral clarity? Or should one merely treat it neutrally, as a provisional account of a new set of psychosocial conditions, to be analyzed and somehow coped with? Whatever one’s answer to those questions, it seems clear that the modernist ideal of the individual has been rendered far more confused and unsteady than ever before.
One element in that ideal that has been made especially untenable is its assumption of the fundamental and interchangeable equality of all selves. This axiom has been challenged in a number of ways. For example, the last three decades of the 20th century saw major legal and jurisprudential struggles over the ways that the doctrine of equality should be applied to questions of group identity. These struggles have been manifested in debates over race, affirmative action, multiculturalism, and the like; but nowhere has the problem become more complex than in the area of gender. As Elizabeth Fox-Genovese has argued, it is no easy matter to determine the extent to which irreducible biological asymmetries—particularly women’s ability to bear children—should influence legal and political structures. Is women’s equality best honored by obliterating all distinctions of sex, across the board, treating men and women interchangeably? Or is it best honored by building certain recognitions of female difference into the law? Surely the sensible, if evasive, answer is: it depends. Fox-Genovese attempts to answer the point by distinguishing between the imposition of a mechanical equality, which would be undesirable for most women, and a more amorphous standard of “equity,” which would take difference into account.
To be sure, Fox-Genovese’s notion of “equity” might be exceedingly hard to define in practice, particularly given the propensity of the law to be an exceptionally blunt instrument. But she points us toward an understanding of individual dignity that, if achieved, would be strikingly different from doctrinaire egalitarianism. In many respects, it would resemble a recovery and adaptation of the Judeo-Christian understanding of the individual person as deriving dignity from his or her intrinsic being, rather than from the greater or lesser degree of freestanding autonomy—what is sometimes called one’s “quality of life”—that he or she can demonstrate. Such a view would stand in the longer Western tradition of individualism, affirming the diversity of legitimate human roles and ranks in society as we find it. At the same time, it would stand in direct competition to the increasingly influential view that the dignity and standing of any individual life is dependent upon the competency of the individual in question. How such differences will be resolved in the future is hard to discern. We can anticipate, too, that our growing knowledge of the biogenetic bases for human psychology and behavior in years to come will have a profound, and equally unpredictable, effect upon our view of the individual, and will therefore exert its own influence on the outcome.
One thing seems clear, however. And that is the need to rescue the idea of individual dignity from its captivity in the realms of individual psychology and postmodernist subjectivity, by returning it to the public realm, where it may be able to find a firmer footing and deeper roots. This would mean reaffirming the core meaning of individualism: its insistence upon the transcendent value of the person. But it would also embrace the core insight of communitarianism: the recognition that the self is made in culture, and the richest forms of individuality can only be achieved in the sustained company of others. And it would build upon Tocqueville’s further insight that it is in the school of public life, and in the embrace and exercise of the title of “citizen,” that the selves of men and women become most meaningfully equal, individuated, and free—not in those fleeting, and often illusory, moments when they escape the constraints of society, and retreat into a zone of privacy, subjectivity, and endlessly reconstructed narratives of the “self.”
In short, the word “citizenship”—a word that the social sciences, including political science, have done so much to disparage in the past century—needs to be restored to its proper place. Equality and individuality—and freedom—take on a different meaning when they are understood in tandem with it. Indeed, citizenship is an artificial construct meant to encompass, and correct for, the imperfections and inequalities inherent in the endowments of nature and the accidents of culture. In addition, it grounds itself in something that the social-scientific view of human nature has sorely neglected: the human ability to initiate, to deliberate, to act, and in so doing to transform the very conditions of action. The proper study of politics takes seriously the human capacity to be a cause, and not merely an effect. More than anything else, Tocqueville feared that the tendency toward individualism would inhibit Americans’ willingness to act in public ways. Accordingly, the most fruitful response to the present-day disintegration of the self may be a movement away from the characteristic preoccupations of modern sociology and psychology, and toward a fresh reconsideration of our political natures, in all their complexity, contingency, and promise. The Western Christian tradition has always taught that the fractured soul is healed not by drawing in upon itself, but by being poured out into relationship with the things outside itself. That insight still has much to commend it.