If anything distinguishes my generation of American writers, it’s that everyone in my generation became a writer, simply through the act of going online. More words have been written, more words have been read, by my generation than by any other generation in human history. I have to say, as a person who’d always planned on becoming a novelist, as a person who’d always planned on supporting the writing of novels through the writing of nonfiction, I found this daunting. The amount of information and the speed of its dissemination overwhelmed. I’m guessing this was the experience of most Americans born within reach of a midsized untangled extension cord from the year 1980—most Americans who’d grown up with books, only to exchange them for millennial adulthood and screens.
This ever-increasing amount of information coming at us at this ever-increasing speed rendered us unable to adequately attend to our own divided presences, let alone to a world that, though it wasn’t united, was suddenly “global.” Terrorism in Istanbul, hostages in Afghanistan, shark attacks, lethal mold, a sex scandal involving a missing congressional intern, the Giants v. the Broncos (to mention just a few of the “headlines” of September 10, 2001)—we were utterly incapable of absorbing what was happening. Rather, we were only capable of reacting to it: We scrolled through the plenitude, and clicked “like,” and clicked “dislike,” and generally ignored anything we weren’t able to assimilate efficiently. The dangers of our impatience were obvious: no depth. But considerably less obvious were the dangers involved with a mass culture’s rupture into myriad subcultures. Today, our sense of selfhood is undergoing a similar fragmentation. We’re all becoming too disparate, too dissociated—searching for porn one moment, searching for genocide the next—leaving behind stray data that cohere only in the mnemotech of our surveillance.
I began writing nonfiction in the wake of September 11—and was published in print, in hard copy, by newspapers and magazines that would go on to cut pages, wages, and staff, if they didn’t fold altogether. Meanwhile, online was busy revising responsibility for the attacks: Bush II ordered them, Cheney let them happen, the American Deep State colluded with the Israelis, the Israelis colluded with the Saudis. I remember enduring explanations about how it was absolutely unthinkable that an explosion of jet fuel would be able to melt that grade and tonnage of steel so quickly and completely as to cause total collapse. Ergo, the destruction of the WTC had to be a “controlled demolition.” Ergo, the destruction of the WTC had to be “an inside job.” Here, at the start of my nonfiction career, was the first time I encountered this phenomenon—namely, the violence being done to facticity.
In the years since, the ways in which fact has been under attack have been well documented, in the very venues in which fact has been under attack. Newspapers, magazines—by which I mean, of course, their online successors—are full of much more than information that’s true and information that’s false. They’re also full of true accounts of the dissemination of true information, true accounts of the dissemination of false information, false accounts of the dissemination of true information, and, last but not least, my mind-melting favorite, false accounts of the dissemination of false information. The identity, or identities, of the disseminator, or disseminators, of this information changes frequently. The notions of the degree of culpability to be borne by the organizations that merely disseminate the information that has been leaked, or hacked, or faked, or some combination of leaked and faked, or hacked and faked, change frequently too. But digital technology is not at fault. Rather, to blame digital technology is to blame ourselves. The average computer user of good faith who seeks regularly to read the news online now has to exercise the type of critical acumen that scholars of literature have always reserved for the analysis of texts: an intense engagement that seeks out secret meanings, hidden biases, hidden agendas. And what’s more, our fictional average computer user of good faith who seeks regularly to read the news online has to do so even as the news reads him, or her, and modifies itself accordingly.
I live in a land where the natives don’t have to be native and the foreigners don’t have to be foreign; a land where everyone’s always changing their addresses and switching employers, trading in their old names for new names, and altering their sexual preferences, genders, and fortunes; a land whose peoples have no mutual history, or not much; a land whose peoples have no mutual culture, or not much; a land that lacks any common religious or ethnic or racial identity, along with all reliable markers of education and class, and even a unifying language and consistent ethical and moral principles.
This is where you live too, if you also live online: a land that feels virtual, because everything in it has been reimagined to distraction.
In its strictest sense, to be distracted means to be perplexed, confused, bewildered; a distracted person is out of touch with the person they used to be; a person “beside themselves,” who has to be reminded; a person drawn asunder, pushed away, pulled apart, turned aside; a person “depersonalized,” who’s lost their grip, their footing, their mind.
Unlike other popular brands of bonkers (“witless,” “frantic,” “frenzied,” “antic”), distraction isn’t some spontaneous disintegration or unexpected absence of the senses. Instead, it’s a gradual unbalancing, which requires only an initial intoxication and then proceeds to intoxicate itself. That’s not to say it’s a death sentence, however: because, almost uniquely among the mental maladies, distraction can be reversed, which explains why it was the term preferred by doctors for one of the earliest certified forms of temporary insanity, and so why it was the term preferred by lawyers for one of the earliest certified forms of the temporary-insanity defense.
Meanwhile, when applied to the crowd, the epithet is declinist: It describes a state that cannot hold; a state diverted.
I considered listing some statistics regarding how many Americans claim they’re distracted, but while undertaking that research—which I only did because I could do it online—I came into contact not only with how many daily computer users claim they’re distracted, but also with how many American women aged 65 and older, and how many American children who attend public school and have no siblings and reside in nonurban areas, claim to be distracted on weekdays v. weekend nights. I was in the midst of compiling all these numbers into a single comprehensive number, but I must’ve gotten sidetracked, and, anyway, that final sum should be, to quote our Founders, “self-evident”: It’s everyone. It’s 100 percent. And even that figure feels too low by half.
Suffice to say, if you read at the pace of most Americans, which is approximately two hundred words per minute, then you’ve been reading for approximately six minutes by now, though—if you’re like most Americans in another respect—there’s also a roughly 50 percent chance you’ve already taken one break to check your email, and a roughly 75 percent chance you’ve taken two breaks if you’ve been reading on your phone.
We click away, but then we return, but then we click away again. We toggle perpetually between our guilt and guilty pleasures.
But though we might experience distraction as a shuttling, the shuttling compounds. The ailment tends to multiply itself, to mirror, echo, spin off, and sequelize itself, until the best any of us can do is just acknowledge it: We’re spiraling. You ask: “What was I doing?” You ask: “What was I supposed to be doing?” All you can answer is: I’m distracted. It’s hard to go any further into it than that.
Especially because what you should be doing is trying to step back. Not to retreat, but to gain another vantage. Stepping back is never a retreat, if you pursue a problem to its origins. We’ll only recover if we can find out just how and why this problem of distraction—this dim word, this diffuse abstraction—came to so blight us and unravel our brains.
The most venerable of our English words for “losing it” point etymologically to an event that made us that way (“mad”), and to a sad accident that cracked and crushed us (“crazed us”); we have a word that attributes the affliction to the phases of the moon (“lunacy”); we have another that ascribes it to the cycles of the womb (“hysteria”); we have blunt pseudomedical and pseudolegal jargon (“psychopathic,” “sociopathic”), and the inevitable overabundance of mean negations: “insane,” “unhinged,” “demented,” “deranged.” All those words, but especially distraction, suggest some degree of deviation from a communal standard—some loss of a fundamental collective traction, which must immediately be regained.
This was how the Puritans understood the term: denouncing the women at Salem as having been “Distract’d” into witchcraft. George III (not the most stable of men himself) was still censuring “the distracted colonies” on the brink of independence.
In American life, whenever a governmental or religious entity accused a group of distraction, the subtext was that the group (usually of women or minorities) had transgressed a norm or crossed a boundary. If the members didn’t course-correct—because, again, distraction was correctible—the authorities would have to intervene to restore order.
Likewise, whenever an American politician leveled the distraction charge against the country in general, it was typically in the midst of a populist appeal to a base resistant to change: abolition, emancipation, universal suffrage.
“The distracted state of the Union” was traditional campaign rhetoric on the march toward Civil War, as antisecessionist candidates in both North and South threatened a turbulent future that would only get worse, unless Americans returned to the ways of the past, as if the past—pre-1850? pre-1812? pre-Revolution? pre-Columbian?—were this bright shining garden of sanity from which they’d fallen.
Given this history, distraction grows a great deal thornier than it’d been when it merely delineated the condition of slurping delivery noodles while streaming an Amazon show while holding a ringing phone and trying to remember who you’re calling.
The term’s past and present usages have a depressing codependency, however: They’re like two sides of the same dull blank coin.
When our media fill the air with trashy breaking updates, when our elected officials lie, what they’re doing is creating a distraction, so as to command our attention for their profit, or to steer our scrutiny away from the more dire of their crimes.
In turn, when we feel overcome by this assault, when the sheer variety of its indecency has worn us into boredom, we withdraw and distract ourselves.
And so what had once been a technique for subduing the vulnerable is still with us, but now it’s also become the technique by which we subdue our passion and intelligence and keep our vulnerabilities private and intact.
To live in America today is to sit slackjawed at a helpless recline, stuck between the external forces that seek to disempower and control us, and our own internal drives to preserve, protect, and defend our hearts and minds.
In my opinion, there has never been a better time to recall this: the democracy of our distraction.
I’m writing it down here, before I forget.
From Attention: Dispatches from a Land of Distraction by Joshua Cohen, published by Random House, a Penguin Random House company. Copyright © 2018 by Joshua Cohen.
If you like this article, please subscribe to n+1.