
Diss Information: Is There a Way to Stop Popular Falsehoods from Morphing into "Facts"?

False information is pervasive and difficult to eradicate, but scientists are developing new strategies such as "de-biasing," a method that focuses on facts, to help spread the truth
[Image: President Obama's certificate of live birth. Credit: Flickr/Talk Radio News Service]

A recurring red herring in the current presidential campaign is the authenticity of President Barack Obama's birth certificate. Although the president has made the document public, and records of his 1961 birth in Honolulu have been corroborated by contemporaneous newspaper announcements, a vocal segment of the population continues to insist that the birth certificate proving Obama's U.S. citizenship is a fraud, and that he is therefore legally ineligible to be president. A Politico survey found that a majority of likely Republican primary voters in 2011 shared this clearly false belief.

Scientific issues can be just as vulnerable to misinformation campaigns. Plenty of people still believe that vaccines cause autism and that human-caused climate change is a hoax. Science has thoroughly debunked these myths, yet the misinformation persists in the face of overwhelming evidence. Worse, straightforward attempts to correct the record can backfire: a paper published on September 18 in Psychological Science in the Public Interest (PSPI) found that efforts to fight misinformation frequently have the opposite effect.

"You have to be careful when you correct misinformation that you don't inadvertently strengthen it," says Stephan Lewandowsky, a psychologist at the University of Western Australia in Perth and one of the paper's authors. "If the issues go to the heart of people's deeply held world views, they become more entrenched in their opinions if you try to update their thinking."

Psychologists call this reaction belief perseverance: maintaining your original opinions in the face of overwhelming data that contradicts them. Everyone does it, but we are especially vulnerable when the invalidated beliefs form a key part of how we narrate our lives. Researchers have found that stereotypes, religious faiths and even our self-concept are especially prone to belief perseverance. A 2008 study in the Journal of Experimental Social Psychology found that people are more likely to continue believing incorrect information if it makes them look good (enhances their self-image). For example, an individual who has become known in her community for claiming that vaccines cause autism might build her self-identity around helping other parents prevent autism by avoiding vaccination. Admitting that the original study linking the MMR (measles–mumps–rubella) vaccine to autism was ultimately deemed fraudulent would make her look bad (diminish her self-concept).

In this circumstance, it is easier to continue believing that autism and vaccines are linked, according to Dartmouth College political science researcher Brendan Nyhan. "It's threatening to admit that you're wrong," he says. "It's threatening to your self-concept and your worldview." It's why, Nyhan says, so many examples of misinformation are from issues that dramatically affect our lives and how we live.

Ironically, these issues are also the hardest to counteract. Part of the problem, researchers have found, lies in how people determine whether a particular statement is true. We are more likely to believe a statement if it confirms our preexisting beliefs, a phenomenon known as confirmation bias. Accepting a statement also requires less cognitive effort than rejecting it. Even superficial cues can affect acceptance: studies have found that the way a statement is printed or voiced (or even the speaker's accent) can make it more believable. Misinformation is a human problem, not a liberal or conservative one, Nyhan says.

Misinformation is also more likely to travel, and to be amplified, thanks to the ongoing diversification of news sources and an ever faster news cycle. Today, publishing news is as simple as clicking "send." This, combined with people's tendency to seek out information that confirms their beliefs, magnifies the effects of misinformation. Nyhan says that although a good dose of skepticism doesn't hurt while reading news stories, the onus to prevent misinformation should fall on political pundits and journalists rather than readers. "If we all had to research every factual claim we were exposed to, we'd do nothing else," Nyhan says. "We have to address the supply side of misinformation, not just the demand side."

Correcting misinformation, however, isn't as simple as presenting people with true facts. When people read views from the other side, they generate counterarguments that support their initial viewpoint, bolstering their belief in the misinformation. Retracting information does not appear to be very effective either: two papers Lewandowsky and colleagues published in 2011 showed that a retraction, at best, halved the number of people who believed a piece of misinformation.

Combating misinformation has proved to be especially difficult in certain scientific areas such as climate science. Despite countless findings to the contrary, a large portion of the population doesn't believe that scientists agree on the existence of human-caused climate change, which affects their willingness to seek a solution to the problem, according to a 2011 study in Nature Climate Change. (Scientific American is part of Nature Publishing Group.)

"Misinformation is inhibiting public engagement in climate change in a major way," says Edward Maibach, director of the Center for Climate Change Communication at George Mason University and author of the Nature article, as well as a commentary that accompanied the recent article in PSPI by Lewandowsky and colleagues. Although virtually all climate scientists agree that human actions are changing the climate and that immediate action must be taken, roughly 60 percent of Americans believe that no scientific consensus on climate change exists.

"This is not a random event," Maibach says. Rather, it is the result of a concerted effort by a small number of politicians and industry leaders to instill doubt in the public. They repeat the message that climate scientists don't agree that global warming is real, is caused by people or is harmful. Thus, the message concludes, it would be premature for the government to take action and increase regulations.

To counter this effort, Maibach and others are using the same strategies employed by climate change deniers. They are gathering a group of trusted experts on climate and encouraging them to repeat simple, basic messages. It's difficult for many scientists, who feel that such simple explanations are dumbing down the science or portraying it inaccurately. And researchers have been trained to focus on the newest research, Maibach notes, which can make it difficult to get them to restate older information. Another way to combat misinformation is to create a compelling narrative that incorporates the correct information, and focuses on the facts rather than dispelling myths—a technique called "de-biasing."

Although campaigns to counteract misinformation can be difficult to execute, they can be remarkably effective if done correctly. A 2009 study found that an anti-prejudice campaign in Rwanda aired on the country's radio stations successfully altered people's perceptions of social norms and behaviors in the aftermath of the 1994 tribally based genocide of an estimated 800,000 minority Tutsi. Perhaps the most successful de-biasing campaign, Maibach notes, is the current near-universal agreement that tobacco smoking is addictive and can cause cancer. In the 1950s smoking was considered a largely safe lifestyle choice—so safe that it was allowed almost everywhere and physicians appeared in ads to promote it. The tobacco industry carried out a misinformation campaign for decades, reassuring smokers that it was okay to light up. Over time opinions began to shift as overwhelming evidence of ill effects was made public by more and more scientists and health administrators.

The most effective way to fight misinformation, ultimately, is to focus on people's behaviors, Lewandowsky says. Changing behaviors will foster new attitudes and beliefs.
