
The New Ludditism in Literature


PUBLISHED: June 26, 2009


In a recent essay in n+1, Benjamin Kunkel, in a wide-ranging consideration of technology’s effects on contemporary culture and daily life, writes that the internet and its products feel forced upon us. For anyone who goes online daily—and increasingly that is most of us—there is a never-ending barrage of e-mail, articles of note (for their vulgarity or supposed profundity), amusing videos, invitations, profiles, photos, blog posts, news feeds, figurative “gifts,” and the like—and most of it is free, available to be guzzled down with a click. It is nigh impossible to simply dip into the internet; the irony is that if you have any awareness of how to navigate it, this endless stream of content, digital companions, and e-communiqués grows more voluminous and oppressive, its depths cavernous and alluring, rather than simpler and more streamlined.

What does it take to separate us from these omnipresent digital phenomena, and will that separation one day be impossible, when gadgets, screens, and Wi-Fi are everywhere? Even now, the term “going off the grid” is often used as a jesting hypothetical, something done by eccentrics and believers in an impending apocalypse. As a regular feature of electronic social discourse, waiting a day or two to answer an e-mail requires an explanation, if not an apology. “You don’t have a 3G-enabled phone with e-mail?” my friend asked me a few months ago (an eternity, in technological terms). He was joking, of course, but there was also some truth there, a frustrating and niggling feeling that with my once-cutting-edge Motorola, I was somehow missing out. To my irritation, it took a moment to focus, pull back, and realize that no, I didn’t need that.

Kunkel is correct that self-discipline is one of the great casualties of the internet age, but as thinking, independent beings, we only have ourselves to blame, and it is up to each individual to recover what might be lost. Not every technology is inherently neutral—consider Monsanto’s “Terminator” seeds—but our laptops and e-mail clearly are. “No one is stopping you from stopping yourself,” Kunkel writes. “It’s just that many users of digital communications technology can’t stop. An inability to log off is hardly the most destructive habit you could acquire, but it seems unlikely there is any more widespread compulsion among the professional middle-class and their children than lingering online.”

The fear, as Kunkel attests, is that our willpower is inadequate, that, as in Infinite Jest or other visions of death-by-technology (it is no coincidence that many of these scenarios are found in books), we cannot resist our own creations. Technology is our reflecting pool, and each one of us is a potential Narcissus—isn’t a social networking profile or a YouTube video gone viral proof enough? What else to account for my Facebook friend, someone I barely know in fact, who has more than 2,200 pictures of herself online? The victims of this mania are—we are variously told—genuine emotional connection, privacy, attention spans, novel reading, and serious culture.

But not everyone is like this. Not everyone is as interconnected and digitally astute as those described in the previous paragraph, though a recent poll shows that only 14 percent of American adults don’t use a cell phone or the internet. Yet if we don’t consider the demographic, financial, and even geographic elements of the technology gap, we risk succumbing to the solipsism commonly attributed to our culture. After all, many of the elderly or the poor aren’t regular computer users. A kid living in poverty in East L.A. or East Timor may not have access to a cell phone or internet-connected computer, though he might have a library card or a few books (hence some of the recent arguments that widespread adoption of e-readers could impinge on book access for the poor). However, the hopes of the digital age—and those techno-evangelists who abet it, from Steve Jobs’s iPhone to President Obama’s plan to bring broadband to rural communities—are tied up, at least in part, in leveling this technological gap. Perhaps it’s not spoken of much because the proliferation of high technology seems assured; or, on the other hand, we may simply neglect those who don’t have the means to connect. Because if you’re not online today, can you be part of the conversation?

One benefit of fiction reading is supposed to be its remove from the day-to-day. Reading allows us to explore leisurely that which bends toward the eternal—the ideas and truths plumbed and desperately sought by generation after generation of writers. If we don’t have the patience and discipline to read and consider novels, stories, and poetry, it’s we who suffer, because the writers are still out there, doing their work, the younger ones trying to ignore their vibrating phones, e-mail notifications, and g-chats.

Something must endure, and the history books—or whatever we create to contain and record our history—will not, I hope, be concerned with David After Dentist or the latest celebrity nipple slip. That’s not to say that they don’t have their brief moments of (severely) relative importance, but these ephemera are clearly part of pop culture, not culture—full stop. But it also takes time, decades maybe, before we find what truly persists, despite our eagerness to anoint something as great or canonical. The acclaim heaped upon Irène Némirovsky’s Suite Française was in part derived from the fact that she created acutely observed fictions about events nearly simultaneous with their occurrence; but importantly, it took more than sixty years for her work to be discovered (as her daughter refused for so long to read her mother’s notebooks), and her psychological acuity and powers of observation are better appreciated with the benefit of six decades of hindsight. They are made timely precisely because a gap of time exists between these novels’ creation and our reading of them. Decades of World War II scholarship and soul-searching provide the interstitial tissue that heightens our understanding.

Somehow the descriptor “postmodern” is still the de facto term for describing progressive, innovative work, when the term itself was first used more than 130 years ago. We return to it because we have little else that can adequately supplant it, and its vague generality remains useful. Perhaps, also, it is in the very nature of the word to be continually relevant, or merely never outmoded, for it literally means “after modern,” and modernity is a constantly changing edifice, if it even exists in the physical world. This feeling may be reflective of a zeitgeist that now is so quicksilver and protean as to be unable to fairly diagnose itself. We don’t have the time between the writing and the discovery, and so postmodernism endures, stretches out into the future, into an unstable and meaningless infinity.

The artistic movements of the past, whether Surrealism or Romanticism, existed in slower periods, in eras of lesser connectivity. They had time to incubate, in communities of artists living, working, loving, drinking, and dying in close proximity to one another. Letters were thought out and slow in creation. Manifestos were written, and they were often read and taken seriously. Another byproduct of our current condition—and this, too, may change at any moment, with any subsequent innovation or fad—is that our ease of communication allows for wonderful opportunities for collaboration, but it doesn’t breed a tendency towards artistic movements or greater ambitions. We create mash-ups, not epics; parodies, not song cycles; memes, not disciplines.

In the last 30-odd years of literature, we have seen something more akin to trends than broad-based movements. Whether it’s maximalism, hysterical realism (originally coined as a term of derision), dirty realism, or absurdism, no single aesthetic view has managed to dominate the literary culture. This fragmentation is above all a blessing, connoting a tolerance for diversity in American literature. In the last decade, the rise of novelists Michael Chabon and Jonathan Lethem, who vehemently oppose and successfully transcend the artificial ghettos of genre, adds to this great leveling. And so fiction writers as distinct as Don DeLillo, Lorrie Moore, Marilynne Robinson, Cormac McCarthy, Toni Morrison, Aleksandar Hemon, George Saunders, Edward P. Jones, and numerous others like and wholly unlike them have stridden forward to assume prominent places in American letters.

But just when any mode of writing seems viable, when the marketplace, readers, and critics are capable of embracing these exceedingly diverse voices, our technology may threaten our ability to adequately capture the current state of our society. To be sure, our technological progress and the internet’s manifold offspring are awe-inducing, capable of allowing us to communicate in a hundred different ways, many of them useful. But as this tremendous atomization takes place, as blogs and text-messaging and social networks and web video and the synergistic interactions between these media proliferate, the problem arises of how to authentically integrate these modes of communication into fiction without awkwardly drawing attention to the writer’s use of them; how to determine what is essential to the times and what is a troubling distraction.

With these challenges—surely to multiply in the coming years—in mind, will the term “historical fiction” soon encompass anything set before 2009? Or 2015? Or maybe that threshold has passed, and a novel written about the (post)modern world can only seem contemporary if set after 2006, replete with text messages, social networking, and the rest. Is it then possible for one of us to summon the powers of Némirovsky to create an image of the current age that persists beyond a day, a week, or the lifespan of distracting viral internet media? Will more than a handful be able to concentrate long enough to read and discuss it? When Joseph O’Neill, in his novel Netherland, used Google Earth to express how deeply his protagonist misses his family, Dwight Garner described the scene as “closely observed, emotionally racking, un-self-consciously in touch with how we live now.” Will others like him be able to do the same, or will writers of literary fiction ignore the role of technology, relegate it to the sci-fi crowd, and thereby eventually make themselves sadly anachronistic?

A full-throated embrace of our light-speed communication is not all gain. There remains a risk that a society will be created around us that allows no choice but to participate, to be fully connected. Already, there is the creeping pressure to be always plugged in, as Kunkel—who doesn’t own a cell phone—articulates, as do some of the authors whose work he surveys. I felt an uncomfortable sense of release recently when, during two separate weekends, I didn’t use my computer or the internet (and hardly my cell phone). Release because I could immerse myself in books and magazines at my own pace, without any sense of having missed something, but uncomfortable because that release is needed at all and was made possible only because I was traveling.

Will there be a sufficient backlash among some populations—here I propose the unoriginal and slightly pejorative New Ludditism—that we demand something slower, more ordered, and more sustainable? If we can make similar demands of our food production, as the Slow Food ethos advocates, then why not with our communication and culture, or at least certain aspects of it? The choice, once again, may lie with the individual, since this is a time in which even the most fervent grassroots movements quickly lose their way, and techno-skepticism is unlikely to fare any better.

There must be a terminal velocity, be it the end of Moore’s Law or the end of our productivity, to the concomitant diminishment of our attention spans and the progression of our technology. I do wonder though: will we still like ourselves when we get there, and will a healthy literary ecosystem be one of the losses along the way?

8 Comments

Bobby Styles · 14 years ago
An excellent articulation of current phenomena. Thank you, Mr. Silverman. I’m confident that writers will always pick the details (whatever technology those details may be) that feel right to fashion for their creative purposes. What is made of that by readers and publishers is what will be made of that. C’est la vie.
Daniel · 14 years ago
I’m pretty sure you meant “casualties,” not “causalities.” Otherwise, great article!
Waldo Jaquith
That’s my mistake, Daniel, for not catching it. Thanks for that.
Diana · 14 years ago
Mr. Silverman, I enjoyed reading your article, although, to be honest, I think I enjoyed it only because it reaffirmed the opinion I already held of myself as someone who belongs to that “special” category of person who still reads novels and poetry and cares about Art with a capital… you get it. It has become modish (not without reason) to critique mankind’s current seduction by technology, and I feel that your blog post fits nicely within this “What Hath We Wrought” trope. Please don’t get me wrong. I understand that the discipline (or pleasurable pastime) of critiquing is (a) not very new, (b) kind of useful, and (c) probably what distinguishes us from rocks, carrots, and most animals. However, I have not made up my mind about the depth of your analysis in this particular blog post. Maybe the post was not meant to be an “in-depth” (whatever that means) analysis of anything but more an elegant rumination (very back-cover blurb, forgive me) on your own struggle to quit the Interweb and its endless blue links and vertiginous cliffs of information… OK, now I kind of feel like I am copping your writing style. Also, if you’re the type to keep track of these types of things, I have not made any lucid points. Maybe if I take some of your quotes and comment on them I will be able to find a path out of the (beautiful) wilderness of my own writing:

“Something must endure, and the history books—or whatever we create to contain and record our history—will not, I hope, be concerned with David After Dentist or the latest celebrity nipple slip. That’s not to say that they don’t have their brief moments of (severely) relative importance, but these ephemera are clearly part of pop culture, not culture—full stop. But it also takes time, decades maybe, before we find what truly persists, despite our eagerness to anoint something as great or canonical.”

Nobody I know has anointed David After Dentist as “canonical” or “great.” Actually, my father told me it was child abuse after I showed him the video. I don’t think our history books will be concerned with the latest nipple slip; I don’t believe you really think they will be either. Or maybe you do. I guess I don’t really know you. Yes, it takes time to find out what persists. Sometimes critics will even argue about whether something that has persisted deserves to have persisted for so long. Life is funny like that.

“After all, many of the elderly or the poor aren’t regular computer users. A kid living in poverty in East L.A. or East Timor may not have access to a cell phone or internet-connected computer, though he might have a library card or a few books (hence some of the recent arguments that widespread adoption of e-readers could impinge on book access for the poor). However, the hopes of the digital age—and those techno-evangelists who abet it…”

I think cell phones are a relatively cheap technological device. The populations of many of the developing nations I have visited are quite dependent on them. Text messaging is one of the cheapest ways to contact a body. Internet cafes are legion. I believe most libraries in the United States have wireless; it might even be a requirement. The image of a poverty-hobbled child with a love for literature in his heart and a useless library card in his back pocket seems a bit Romantic.

Your use of phrases like “literary ecosystem” and “artificial ghettos of genre” tells me you have read a lot of literary criticism and theory. You might have majored in English or Creative Writing. You also threw “memes” into your essay for good measure. Interesting. Anyway, I liked your piece. Really. I would write more and perhaps make a coherent argument, but I’m being distracted by other things I could do online (cheap ending, but I needed to end my rant!).
Jacob Silverman
Hi, Diana, and thanks for chiming in. Please don’t judge me by my vocabulary. You’re correct that I was an English/Creative Writing major, but I admittedly know very little about theory. I do read some criticism, but it’s the usual suspects: newspaper reviews, VQR, the Atlantic, Harper’s, NYRB, etc. I don’t spend nights reading Irving Howe and H.L. Mencken by lamplight, though I’d probably get something out of it. As for “memes,” I get that from reading Andrew Sullivan’s blog.

I’m really not a partisan one way or another in this debate. I was trying to express my overall ambivalence about technology; there are certainly some aspects of it that I love (I don’t think I could give up my cell phone like Kunkel). As for the library card image, it may be a romantic one, but it’s also true. For one example, check out a recent conversation on Ed Champion’s site, in which several people in the comments talk about growing up poor and how essential libraries and bookmobiles were to their education.

Finally, I use the term ludditism (and I think I was wrong and that luddism is proper) to mean a few things. I think some degree of luddism or skepticism towards technology can be useful because it causes us to stop and think about what we’re doing, to assess whether we like the direction in which things are going. But at the same time, writers of fiction and non-fiction must reckon with these ideas to remain relevant cultural contributors, and that’s why towards the end I mention that concerns about technology shouldn’t be relegated just to sci-fi writers or some other benighted sub-genre. Then again, many sci-fi writers are doing quite well (certainly in TV and film), and the New Yorker, of all places, has actually published some sci-fi/techno-tinged stories in the last year. (Off the top of my head I can think of Jonathan Lethem’s “Lostronaut” story and a recent one by an Israeli writer.)

Best, Jacob
PHM · 14 years ago
I think that as technology grows to encompass everything, it will do more good than harm. If electronic book readers follow the same path as all other technology (becoming less expensive as time goes on), then they will help proliferate novels and magazines (especially off-beat novels and magazines that a small, local bookstore might not have or a local library might not carry). Consider the greatness that could ensue if a library in a small West African nation purchased a handful of eBook readers and attached them to the Gutenberg library rather than buying 200-year-old classics again and again. The library request feature would work the same, wouldn’t it? And moreover, booksellers have always made exceptions for public institutions. Thus I find it very plausible that electronic ink is going to pave the way for more literacy.

It’s not that people aren’t reading; if Joe the Plumber is getting his news from Reuters online instead of the CBS Evening News, then literacy has increased. It’s the kinds of things people are reading that have changed. The overall bent toward non-fiction has been a product of post-9/11 culture and its need to be grounded in fact. I think when the time comes that people are generally ready to read fiction on a massive scale again, they’ll be looking online more than they’ll be looking at Borders.

I also think that society is going to adjust to the changes brought on. The internet has been around for 40 years, remember, so as with all changes, it’s going to take a bit longer for society to be fully ready to deal with the consequences of constant availability.
An NYU Student · 14 years ago
To link to The Atlantic article “Is Google Making Us Stupid?” without admitting that in the current issue there is a feature article tagged “Is Google Actually Making Us Smarter” on the front isn’t cool. And I think that, in their own ways, the articles are both right. Your post was fantastic, by the way. I’ve never heard of VQR, but then again I’m not yet 20. I’ll be sure to check out the rest of it. Thanks for your thoughts.
