Does Continual Googling Really Make You Stupid? [Excerpt]

Preliminary data suggest that all those tweets, status updates and other digital distractions may actually stave off cognitive decline

Robin Marantz Henig & Samantha Henig

Editor's note: The following is an excerpt from the book, Twentysomething: Why Do Young Adults Seem Stuck?, by Robin Marantz Henig and Samantha Henig (Hudson Street Press, 2012). Copyright © Robin Marantz Henig and Samantha Henig. Robin Henig has written several articles for Scientific American, including "When Does Life Belong to the Living?" and "How Depressed Is That Mouse?".

With all the emails, tweets, chats, and status updates continually vying for brain space, young people these days are slaves to what’s been called “continuous partial attention.” One study of college students found that 84 percent get instant messages, Facebook updates, texts, or other interruptions at least once in any given hour; 19 percent get them at least six times every hour. And for 12 percent, the interruptions occur so often that they’ve lost count.

Those incessant distractions don’t bode well for the brain, wrote journalist Nicholas Carr in a controversial cover story in The Atlantic in 2008, “Is Google Making Us Stupid?” With our attention constantly splintered, he wrote, our brains might be subtly rewired, leading to a younger generation less and less capable of thinking deep thoughts. “What the Net does is shift the emphasis of our intelligence, away from what might be called a meditative or contemplative intelligence and more toward what might be called a utilitarian intelligence,” Carr wrote in an online symposium about his article hosted by the Pew Research Center’s Internet and American Life Project. “The price of zipping among lots of bits of information is a loss of depth in our thinking.”

Defenders of Google say it frees up people’s brains for more important stuff than data entry and retrieval. “Holding in your head information that is easily discoverable on Google will no longer be a sign of intelligence, but a side-show act,” wrote Alex Halavais of the Association of Internet Researchers in that same symposium in response to Carr’s lament. Once your mind is clear of actual facts, goes his argument, you have room for sophisticated analysis and problem-solving. I’m reminded of my brother’s uncanny ability, acquired at fourteen, to recite the first thirty-six digits of pi. It was amusing, but it didn’t make him any better at math than the guy with pi programmed into his TI-89.

Googling has, arguably, made Millennials less able than any previous group of twentysomethings to retain information. Recent research suggests that they use Google as a sort of auxiliary memory. In 2011 a team of psychologists led by Betsy Sparrow of Columbia gave 60 undergrads forty trivia statements (on the order of “an ostrich’s eye is bigger than its brain”) and asked them to type all forty factoids into a computer. Half were told that the file containing these facts would be accessible later; half were told the file would be erased. On a subsequent test of memory, the ones who thought everything would be erased remembered much more. When they believed their document would be saved, Sparrow found, they didn’t bother remembering it; they figured they could always find it (or, as it’s called outside the lab, Google it) when they needed to.

And maybe it’s not just pervasive googling that interferes with memorization; it might be reliance on the computer keyboard itself. Some studies suggest that the best way to retain information is to write it out in longhand, which activates a tactile connection between the words and the brain that might be skipped by typing. Karin Harman James, a neuroscientist at Indiana University, recently asked a group of college students to transcribe a passage in one of three ways: by writing it out in cursive, by writing it out using print, or by typing it. One week later, she brought them back to the lab and asked them to recall as much of the passage as they could. Those who had written it out in cursive—the old-fashioned way, the way that’s hardly even taught in schools anymore—remembered significantly more than either of the other two groups. This no doubt has implications for Millennials’ ability to remember what they write, since even young people who use longhand, which is rare enough, tend to choose printing over script.

It’s not the lack of memorization that bothered Carr, though. His concern, based on intuition rather than data, was the growing inability to focus on a piece of long-form writing in a way that allows the reader to resonate to the “intellectual vibrations” of an author’s words. “In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas,” he wrote. Those opportunities for mental improv are being drowned out, according to Carr, by noisy and distracting “content.”

Carr’s 2008 article appeared before Twitter really took off, before smartphones and constant texting and checking in and googling became second nature, especially among twentysomethings. In just a few more years, distractions would be popping up not only on the laptop at your desk but on your mobile device everywhere—while you walked down the street, waited for a bus, rode the bus, went to the bathroom, stood still, all the quiet places where people used to let their minds wander for a bit and see where the musing led. (Even the shower, where I do some of my best thinking, is being invaded by waterproof iPods and smartphones.)

In 2009, The Atlantic published a rejoinder to Carr by Jamais Cascio, a fellow at the Institute for Ethics and Emerging Technologies. Not only are we not getting stupider, Cascio wrote; we’re getting smarter, as the human brain evolves to take advantage of the hive mind of the web. Focus and attention might be sacrificed because of all the distractions and hyperlinks, he wrote, but they are being replaced by “fluid intelligence—the ability to find meaning in confusion and solve new problems, independent of acquired knowledge.”

Only a handful of neuroscientists have looked directly at the brain in action to see if that’s what’s actually happening. Among them is a team from UCLA that used functional MRI scanning on a group of older adults to visualize electrical activity in their brains while they performed two cognitive tasks: reading a book and searching the Internet.

Led by psychiatrist and neuroscientist Gary Small, the scientists divided 24 subjects, aged fifty-five to seventy-six, into two groups: 12 who were experienced Googlers, and 12 who had never used Google before. In both groups, functional MRI scans showed that reading a book engaged regions in the temporal, parietal, and occipital lobes of the brain that were involved in language, reading, memory, and visual skills. So far, so good.

Next, the subjects were asked to do a Google search on a topic of interest to them, such as “What are the health benefits of chocolate?” While they were searching, their brains showed activation in the same regions that were involved in reading. But in some subjects, additional brain activity was recorded in the frontal pole, anterior temporal region, cingulate, and hippocampus—brain areas involved in decision-making, complex reasoning, memory, and vision. The subjects whose brains got more active while googling were those from the web-savvy group, who were familiar with Google to begin with. The web novices didn’t engage in searching in the same way, and googling never managed to get their brains in gear.

“A simple, everyday task like searching the Web appears to enhance brain circuitry,” Small said. People in the web-savvy group were using those circuits during the functional MRI scans because they already had them available to use, having strengthened them during previous episodes of googling. Small and his colleagues were mostly interested in web searching as a way to stave off cognitive declines in old age—their paper was published in the American Journal of Geriatric Psychiatry—but it’s also possible to read these results as suggesting something about brain changes in the hyperlinked young. It was just a small pilot study, with only a dozen people in the Internet-literate group. But it might be a useful counterbalance to the conventional wisdom that Millennials have lost the ability to deal with anything more complex than screen-size bursts and 140-character thoughts. Google, it seems, might be doing something different to the brains of digital natives, creating a new set of neural connections and engaging young brains in an unprecedented way. With their brains thus wired, Millennials might be using the web as a vehicle for sophisticated thinking and higher-order cognition. And they might be even more mentally engaged while online than their elders are while reading a book.
