Conspiracy Theories Can Be Undermined with These Strategies, New Analysis Shows

A new review finds that only some methods to counteract conspiracy beliefs are effective. Here’s what works and what doesn’t

When someone falls down a conspiracy rabbit hole, there are very few proved ways to pull them out, according to a new analysis.

The study is a review of research on attempts to counteract conspiratorial thinking, and it finds that common strategies that involve counterarguments and fact-checking largely fail to change people’s beliefs. The most promising ways to combat conspiratorial thinking seem to involve prevention, either warning people ahead of time about a particular conspiracy theory or explicitly teaching them how to spot shoddy evidence.

“We’re still in the early days, unfortunately, of having a silver bullet that will tackle misinformation as a whole,” says Cian O’Mahony, a doctoral student in psychology at University College Cork in Ireland, who led the study. It was published today in the journal PLOS ONE.


Counteracting conspiracy beliefs is important because beliefs in conspiracies can encourage people to act in harmful ways, says Kathleen Hall Jamieson, a professor of communication and director of the Annenberg Public Policy Center at the University of Pennsylvania, who was not involved in the new review. The people who stormed the U.S. Capitol on January 6, 2021, believed the 2020 presidential election had been stolen, for example. And believers in COVID vaccine conspiracies put themselves at risk from the disease by refusing to get vaccinated. But the field is so young that trying to compare individual studies is fraught, Jamieson says.

“There are so many different definitions and specifications of what is a conspiracy belief and a conspiracy mindset that it’s very difficult to aggregate this data in any way that permits generalization,” she says. The comparisons in the new review are a suggestive starting point, Jamieson adds, but shouldn’t be seen as the last word on conspiracy interventions.

Studies often blur the lines between conspiracy theory, disinformation and misinformation, O’Mahony says. Misinformation is simply inaccurate information, while disinformation is deliberately misleading. Conspiracy beliefs, as O’Mahony and his colleagues define them, are beliefs that malicious actors are engaged in a secret plot that explains an important event. Such beliefs are not necessarily false—real conspiracies do happen—but erroneous conspiracy theories abound, from the idea that the moon landing was faked to the notion that COVID vaccines are causing mass death that authorities are covering up.

O’Mahony and his colleagues focused on studies that targeted conspiracy beliefs, not misinformation or disinformation. They found 24 studies in 13 papers. Most of the studies were conducted in the U.S. or U.K., though several took place elsewhere, and most used online samples that often drew participants from multiple countries. In nearly two thirds of the studies, the researchers attempted to change people’s general willingness to believe conspiracies; the rest addressed individual beliefs.

The interventions fell into a few categories. One consisted of priming studies, which used an unrelated task to shift someone’s mindset. For instance, participants might be asked to read a passage in a hard-to-read font, which requires more effort to take in the information and prompts an analytic mindset. Researchers might then present the participants with a conspiracy theory to see whether the task decreases belief. These studies suggested that priming mostly worked, but it generally had only small effects. A second approach, countering conspiracy theories with factual arguments, also showed only very small to small effects. The least effective arguments involved appealing to a believer’s sense of empathy or mocking them for their beliefs.

More promising were inoculation studies, which warned people ahead of time that they might see a conspiracy theory and gave them an argument against it. These had medium to large impacts on decreasing conspiracy belief. Inoculation can backfire, however. One study found that if a conspiracy peddler warned against an inoculation approach, that inoculation would no longer work. Politicians use this “inoculate against inoculation” strategy in the real world, says Joseph Uscinski, a political scientist at the University of Miami and co-author of American Conspiracy Theories (Oxford University Press, 2014), who was not involved in that study or the new analysis. For example, Florida governor Ron DeSantis has taken to accompanying his arguments that teachers are indoctrinating students with a “woke” agenda with phrases such as “Anyone that tells you it’s not happening is lying to you.”

Another challenge is finding out whether inoculation or any other strategy works in the long run, says Karen Douglas, a social psychologist at the University of Kent in England, who was not involved in the new review. For many studies that examine this method, scientists measure immediate effects but don’t follow up over days, weeks or months. “For something to be effective in dealing with the bigger problem,” Douglas says, “we need to know that the interventions will last over time.”

The best opportunity to avoid conspiratorial thinking may be the most labor-intensive. In the new analysis, one of the largest effects came from a study that involved a three-month university class aimed at distinguishing science from pseudoscience. For the study, three instructors taught students critical thinking skills needed to understand common human mistakes of perception and logic. The result was a reduction in conspiracy beliefs. “This was a singular study, but it did highlight teaching these skills explicitly,” O’Mahony says.

If it’s hard to change entrenched conspiracy beliefs, the silver lining is that it’s also hard to make people believe in conspiracies, contrary to popular conception, Uscinski says. In 2022 he and his colleagues published research in PLOS ONE that found no evidence that conspiracy beliefs are growing, despite their visibility on social media. Changing entrenched beliefs of any kind is challenging, Uscinski says, especially if those beliefs are closely tied to someone’s worldview. “Sometimes people pick the beliefs that they want, and they do what they want because of who they are,” he adds.

For individuals interested in challenging conspiracy thinking, the authors of the new review provide some tips:

  1. Don’t appeal to emotion. The research suggests that emotional strategies don’t work to budge belief.

  2. Don’t get sucked into factual arguments. Debates over the facts of a conspiracy theory or the consequences of believing in a particular conspiracy also fail to make much difference, the authors found.

  3. Focus on prevention. The best strategies seem to involve helping people recognize unreliable information and untrustworthy sources before they’re exposed to a specific belief.

  4. Support education and analysis. Putting people into an analytic mindset and explicitly teaching them how to evaluate information appears most protective against conspiracy rabbit holes.
