
The Twitter Trap

Last week my wife and I told our 13-year-old daughter she could join Facebook. Within a few hours she had accumulated 171 friends, and I felt a little as if I had passed my child a pipe of crystal meth.

I don’t mean to be a spoilsport, and I don’t think I’m a Luddite. I edit a newspaper that has embraced new media with creative, prizewinning gusto. I get that the Web reaches and engages a vast, global audience, that it invites participation and facilitates — up to a point — newsgathering. But before we succumb to digital idolatry, we should consider that innovation often comes at a price. And sometimes I wonder if the price is a piece of ourselves.

Joshua Foer’s engrossing best seller “Moonwalking With Einstein” recalls one colossal example of what we trade for progress. Until the 15th century, people were taught to remember vast quantities of information. Feats of memory that would today qualify you as a freak — the ability to recite entire books — were not unheard of.

Then along came the Mark Zuckerberg of his day, Johannes Gutenberg. As we became accustomed to relying on the printed page, the work of remembering gradually fell into disuse. The capacity to remember prodigiously still exists (as Foer proved by training himself to become a national memory champion), but for most of us it stays parked in the garage.

Sometimes the bargain is worthwhile; I would certainly not give up the pleasures of my library for the ability to recite “Middlemarch.” But Foer’s book reminds us that the cognitive advance of our species is not inexorable.

My father, who was trained in engineering at M.I.T. in the slide-rule era, often lamented the way the pocket calculator, for all its convenience, diminished my generation’s math skills. Many of us have discovered that navigating by G.P.S. has undermined our mastery of city streets and perhaps even impaired our innate sense of direction. Typing pretty much killed penmanship. Twitter and YouTube are nibbling away at our attention spans. And what little memory we had not already surrendered to Gutenberg we have relinquished to Google. Why remember what you can look up in seconds?

Robert Bjork, who studies memory and learning at U.C.L.A., has noticed that even very smart students, conversant in the Excel spreadsheet, don’t pick up patterns in data that would be evident if they had not let the program do so much of the work.

“Unless there is some actual problem solving and decision making, very little learning happens,” Bjork e-mailed me. “We are not recording devices.”

Illustration credit: James Joyce

Foer read that Apple had hired a leading expert in heads-up display — the transparent dashboards used by pilots. He wonders whether this means that Apple is developing an iPhone that would not require the use of fingers on keyboards. Ultimately, Foer imagines, the commands would come straight from your cerebral cortex. (Apple refused to comment.)

“This is the story of the next half-century,” Foer told me, “as we become effectively cyborgs.”

Basically, we are outsourcing our brains to the cloud. The upside is that this frees a lot of gray matter for important pursuits like FarmVille and “Real Housewives.” But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.

The most obvious drawback of social media is that they are aggressive distractions. Unlike the virtual fireplace or that nesting pair of red-tailed hawks we have been live-streaming on nytimes.com, Twitter is not just an ambient presence. It demands attention and response. It is the enemy of contemplation. Every time my TweetDeck shoots a new tweet to my desktop, I experience a little dopamine spritz that takes me away from . . . from . . . wait, what was I saying?

My mistrust of social media is intensified by the ephemeral nature of these communications. They are the epitome of in-one-ear-and-out-the-other, which was my mother’s trope for a failure to connect.

I’m not even sure these new instruments are genuinely “social.” There is something decidedly faux about the camaraderie of Facebook, something illusory about the connectedness of Twitter. Eavesdrop on a conversation as it surges through the digital crowd, and more often than not it is reductive and redundant. Following an argument among the Twits is like listening to preschoolers quarreling: You did! Did not! Did too! Did not!

As a kind of masochistic experiment, the other day I tweeted “#TwitterMakesYouStupid. Discuss.” It produced a few flashes of wit (“Give a little credit to our public schools!”); a couple of earnestly obvious points (“Depends who you follow”); some understandable speculation that my account had been hacked by a troll; a message from my wife (“I don’t know if Twitter makes you stupid, but it’s making you late for dinner. Come home!”); and an awful lot of nyah-nyah-nyah (“Um, wrong.” “Nuh-uh!!”). Almost everyone who had anything profound to say in response to my little provocation chose to say it outside Twitter. In an actual discussion, the marshaling of information is cumulative, complication is acknowledged, sometimes persuasion occurs. In a Twitter discussion, opinions and our tolerance for others’ opinions are stunted. Whether or not Twitter makes you stupid, it certainly makes some smart people sound stupid.

I realize I am inviting blowback from passionate Tweeters, from aging academics who stoke their charisma by overpraising every novelty and from colleagues at The Times who are refining a social-media strategy to expand the reach of our journalism. So let me be clear that Twitter is a brilliant device — a megaphone for promotion, a seine for information, a helpful organizing tool for everything from dog-lover meet-ups to revolutions. It restores serendipity to the flow of information. Though I am not much of a Tweeter and pay little attention to my Facebook account, I love to see something I’ve written neatly bitly’d and shared around the Twittersphere, even when I know — now, for instance — that the verdict of the crowd will be hostile.

The shortcomings of social media would not bother me awfully if I did not suspect that Facebook friendship and Twitter chatter are displacing real rapport and real conversation, just as Gutenberg’s device displaced remembering. The things we may be unlearning, tweet by tweet — complexity, acuity, patience, wisdom, intimacy — are things that matter.

There is a growing library of credible digital Cassandras who have explored what new media are doing to our brains (Nicholas Carr, Jaron Lanier, Gary Small and Gigi Vorgan, William Powers, et al.). My own anxiety is less about the cerebrum than about the soul, and is best summed up not by a neuroscientist but by a novelist. In Meg Wolitzer’s charming new tale, “The Uncoupling,” there is a wistful passage about the high-school cohort my daughter is about to join.

Wolitzer describes them this way: “The generation that had information, but no context. Butter, but no bread. Craving, but no longing.”

Bill Keller is the executive editor of The New York Times.

A version of this article appears in print on Page 11 of the Sunday Magazine with the headline: THE TWITTER TRAP.
