A recurring red herring in the current presidential campaign is the authenticity of President Barack Obama's birth certificate. Although the president has made this document public, and records of his 1961 birth in Honolulu have been corroborated by newspaper announcements, a vocal segment of the population continues to insist that Obama's birth certificate proving U.S. citizenship is a fraud, making him legally ineligible to be president. A 2011 Politico survey found that a majority of Republican primary voters shared this clearly false belief.
Scientific issues can be just as vulnerable to misinformation campaigns. Plenty of people still believe that vaccines cause autism and that human-caused climate change is a hoax. Science has thoroughly debunked these myths, yet the misinformation persists in the face of overwhelming evidence. Worse, a paper published on September 18 in Psychological Science in the Public Interest (PSPI) finds that straightforward efforts to correct the record frequently backfire, strengthening the very myths they aim to dispel.
"You have to be careful when you correct misinformation that you don't inadvertently strengthen it," says Stephan Lewandowsky, a psychologist at the University of Western Australia in Perth and one of the paper's authors. "If the issues go to the heart of people's deeply held world views, they become more entrenched in their opinions if you try to update their thinking."
Psychologists call this reaction belief perseverance: maintaining your original opinions in the face of overwhelming data that contradicts your beliefs. Everyone does it, but we are especially vulnerable when invalidated beliefs form a key part of how we narrate our lives. Researchers have found that stereotypes, religious faiths and even our self-concept are especially vulnerable to belief perseverance. A 2008 study in the Journal of Experimental Social Psychology found that people are more likely to continue believing incorrect information if it enhances their self-image. For example, a woman who has become known in her community for asserting that vaccines cause autism might build her self-identity around helping other parents avoid vaccination. Admitting that the original study linking autism to the MMR (measles–mumps–rubella) vaccine was ultimately deemed fraudulent would diminish that self-concept.
In this circumstance, it is easier to continue believing that autism and vaccines are linked, according to Dartmouth College political science researcher Brendan Nyhan. "It's threatening to admit that you're wrong," he says. "It's threatening to your self-concept and your worldview." It's why, Nyhan says, so many examples of misinformation are from issues that dramatically affect our lives and how we live.
Ironically, these issues are also the hardest to counteract. Part of the problem, researchers have found, is how people determine whether a particular statement is true. We are more likely to believe a statement if it confirms our preexisting beliefs, a phenomenon known as confirmation bias. Accepting a statement also requires less cognitive effort than rejecting it. Even superficial features can affect acceptance: studies have found that the way a statement is printed or spoken, down to the typeface or the speaker's accent, can make it seem more or less believable. Misinformation is a human problem, not a liberal or conservative one, Nyhan says.
The ongoing diversification of news sources and the ever-faster news cycle make misinformation even more likely to travel and be amplified. Today, publishing news is as simple as clicking "send." This, combined with people's tendency to seek out information that confirms their beliefs, magnifies the effects of misinformation. Nyhan says that although a healthy dose of skepticism doesn't hurt while reading news stories, the onus to prevent misinformation should fall on political pundits and journalists rather than readers. "If we all had to research every factual claim we were exposed to, we'd do nothing else," Nyhan says. "We have to address the supply side of misinformation, not just the demand side."