To win the White House, candidates in the presidential race will need to change minds. Bernie Sanders may try converting Hillary Clinton’s superdelegates to gain the Democratic nomination. If frontrunner Clinton gets it, she and Republican Donald Trump will need to win over reluctant voters who supported their competitors.

And to change opinions, candidates will have to contend with neurobiology. Scientists say there’s a tension in the brain between responding to new information and resisting overwhelming amounts of conflicting data—and the latter can prevent opinion change. Changing someone’s mind depends on psychological methods tailored to the type of belief in question, research suggests. “There’s not much convincing people,” even when the beliefs in question are demonstrably false, says psychiatrist Philip Corlett of Yale University School of Medicine.

Neuroimaging studies show that opinion change plays out at least partly at the neurological level. Depending on why people change their opinions, different parts of the brain light up in functional magnetic resonance imaging (fMRI) scans. For example, people who change their minds to conform to social pressure show activity in the posterior medial frontal cortex, an area of the brain associated with reinforcement learning. People who change their minds as a result of targeted persuasion, however, show higher activation in a more frontal part of the brain involved in self-reference. Keise Izuma, a neuroscientist at the University of York in England, summarized these findings in a review in Current Opinion in Neurobiology.

The neural hardware is there, so why is changing minds so difficult? One reason is that once the brain figures out a way of processing information into a narrative that makes sense, it resists change. Yale’s Corlett studies what happens in the brain when people hold on to preexisting beliefs, even in the face of new, conflicting information. In particular, Corlett investigates the brain activity of people with clinical delusions—such as the belief that government agencies are following them. By studying people as they performed mundane tasks in scanners, Corlett found that patients with delusions tended to use brain circuitry that other people reserve for surprising events.

When this circuitry works overtime, people pay attention to run-of-the-mill details they would otherwise ignore, potentially flooding the brain with conflicting information. Corlett collaborated with psychiatrist Sarah Fineberg, also of Yale, on a paper in the February Cognitive Neuropsychiatry, proposing that delusions may be adaptations for reconciling a flood of conflicting information. “Humans are really averse to any uncertainty in their world,” Corlett says. Although he emphasizes that delusions are harmful in the long run, in the short term they can help people piece together disruptive information into a narrative that makes sense.

False beliefs in general might be similarly adaptive. “We all have some of them,” Corlett says. “I think I’m fitter and healthier and probably more attractive than I actually am because if I really appreciated how things actually are, I might not do anything at all—I’d be too depressed.” People might be wired to cling to some beliefs even in the face of conflicting information. For example, even if Trump significantly alters his message and policies as the Republican nominee, voters who once supported John Kasich or Ted Cruz might have difficulty altering their perceptions of the candidate.

Despite this resistance to change, the brain is sometimes capable of incorporating new information and revising opinions, according to neuroscientist Michael Shadlen of Columbia University. He found that people often make decisions before they have fully processed the available information—and after their brains play catch-up, they occasionally do change their minds. “The little secret of the brain is that it doesn’t work like Google,” Shadlen says. The brain must make a trade-off between accuracy and speed.

Shadlen and a team of researchers reported this trade-off recently in eLife. The team asked participants to judge the net direction of dots moving largely at random on a screen, encouraging them to act quickly but not at the expense of accuracy. Even after participants made their decisions and the dots disappeared from the screen, their confidence in those decisions continued to shift, suggesting that the brain had not yet finished processing information at the time of the choice. As the participants continued to process, they sometimes changed their minds. “The same principle would probably apply to more complicated decisions like politics, provided you keep your mind open,” Shadlen says.

But different types of political beliefs call for different approaches to conversion. Psychologist Steven Sloman of Brown University led a study asking people to explain in detail policies about which they felt strongly, such as sanctions on Iran or a single-payer health care system. After people struggled to explain the mechanisms behind the policies they supported or opposed, they reported more moderate views and reduced their donations to relevant advocacy groups.

But Sloman (whose forthcoming book School of Thought covers the importance of community in shaping opinion) stressed that this tactic does not work with beliefs strongly tied to values and community, such as positions on abortion or gun rights. Instead, a recent Science study suggests that changing some value-based opinions, including anti-transgender sentiment, may hinge on direct, one-on-one conversations that help people relate to the group in question.

That means candidates may need to employ different techniques to win over different delegates, depending on the types of beliefs each delegate values most. And voters, scientists say, should keep their minds open, in the interest of accuracy. “We overestimate our own understanding,” Sloman says. “People know much less about stuff than they think they do.”