TOTAL RECALL?: The advent of the Internet and near-ubiquitous information at our fingertips makes it less critical for us to commit items to memory. Using the Internet as a mental crutch is not necessarily a bad thing, according to researchers. Image courtesy of Alex Hinds, via iStockphoto.com
Has the Internet dumbed down society or simply become an external storage unit that enhances the human brain's memory capacity? With Google, Internet Movie Database and Wikipedia at our beck and call via smart phones, tablets and laptops, the once essential function of committing facts to memory has become little more than a flashback to flash cards. This shift is not necessarily a bad thing, nor is it irreversible, according to a team of researchers whose study on search engines and learning appears in the July 15 issue of Science.
Led by Columbia University psychologist Betsy Sparrow, the researchers conducted a series of experiments whose results suggest that when people are faced with difficult questions, they are likely to think that the Internet will help them find the answers. In fact, those who expect to be able to search for answers to difficult questions online are less likely to commit the information to memory. People tend to memorize answers only if they believe that doing so is the only way they will have access to that information in the future. Regardless of whether they remember the facts themselves, however, people tend to recall the Web sites that hold the answers they seek.
In this way, the Internet has become a primary form of external or "transactive" memory (a term coined by Sparrow's one-time academic advisor, social psychologist Daniel Wegner), where information is stored collectively outside the brain. This is not so different from the pre-Internet past, when people relied on books, libraries and one another—such as using a "lifeline" on the game show Who Wants to be a Millionaire?—for information. Now, however, besides oral and printed sources of information, a lion's share of our collective and institutional knowledge bases reside online and in data storage.
The idea for Sparrow's research sprang from a common occurrence in many homes—a few years ago she was watching a movie with her husband and saw an actress whose face she could not quite place. Using Internet Movie Database on her laptop, she quickly discovered that the actress was Angela Lansbury (debuting in 1944's Gaslight), who went on to star in dozens of movies and the popular Murder, She Wrote TV show of the 1980s and '90s.
What would Sparrow's alternatives have been if the Internet never existed? Most likely, if she could not eventually come up with the answer herself, she would have asked a friend or family member for help. Another option would have been to consult a cinema reference book. Or she would have simply had to live with that nagging curiosity and moved on.
The situation with the Internet is in many ways not all that different from the way it has always been, Sparrow says. "It's different in the sense that information is much more available than it was," she says. "In the past you would have to go through the filing system in your brain, maybe with the help of someone else to try to remember."
Some people are troubled that information gleaned online plays too large a role in their fact-access process, yet this reliance on external memory seems to bother them less if the information resides in the brain of another person. "It's not as salient to people that we do this with other people, but it's obvious to them that we do this with the Internet," Sparrow says.
Besides, memorization is overrated, according to Sparrow. "Obviously we need some baseline skill in memorizing things, but I personally have never seen all that much intellectual value in memorizing things," she says, adding that it is far more important to understand information on a conceptual level. As an instructor, she has seen how some students struggle with cognition related to the things she teaches, whereas they would do much better if they simply had to memorize a bunch of answers. "Memorizing is the easier thing to do, which is why students do it," she says.
Sparrow continues to research how learning is affected when instructors remove the expectation of memorization. "Will students better be able to learn focusing on conceptualizing and understanding information rather than simply remembering it?" she asks. "More likely, if a person does not think the information will be available later, they will try to memorize it, often at the expense of understanding the concepts."
And if our gadgets were to fail due to a planet-wide electromagnetic pulse tomorrow, we would still be all right. People may rely on their mobile phones to remember friends' and family members' phone numbers, for example, but the part of the brain responsible for such memorization has not atrophied, she says. "It's not like we've lost the ability to do it."
John Suler, a psychology professor at Rider University in Lawrenceville, N.J., and author of the online book The Psychology of Cyberspace, agrees. "I suspect that we're still going to remember information that's important to us, while relying on the Internet to verify what we think might be true or have forgotten, and to provide new information to which we were never exposed," he says.
Perhaps the more pressing issue is whether people will develop the ability to scrutinize online information. "If you look long and hard enough, you will find a Web site that validates almost anything you might want to believe, whether it's true or not," Suler says. "It's also clear that cyberspace is filled with differences of opinion, contradictory 'facts,' and the propagation of information from one site to another that gives the illusion of consensual validation."
In this respect, the Internet is just like any other memory system—the need for critical thinking does not diminish, regardless of where the information is stored.