Acceptance of science has become increasingly polarized in the United States. Indeed, a recent Pew poll shows substantial and growing public disagreement about basic scientific facts, including human evolution, the safety of vaccines and whether human-caused climate change is real and happening. What is driving this divide?
People often interpret the same information very differently. As psychologists, we are more than familiar with the finding that our brains selectively attend to, process and recall information. One consequence of this is “confirmation bias,” a strong tendency to automatically favor information that supports our prior expectations. When we consider issues that we feel strongly about (e.g., global warming), confirmation bias reaches a new height: it transitions into “motivated reasoning.” Motivated reasoning is the additional tendency to defensively reject information that contradicts deeply held worldviews and opinions. One example of this is the “motivated rejection of science”; if you are personally convinced that global warming is a hoax, you are likely to reject any scientific information to the contrary – regardless of its accuracy.
Yet, if our personal values, beliefs and worldviews really dictate our reality, then aren’t science communicators just blowing in the wind? Not necessarily so. Although some research has indeed shown that factors such as “scientific literacy” are not always associated with, say, more concern for climate change, we have investigated a different, social type of fact: “expert consensus.” Our research shows that highlighting how many experts agree on a controversial issue has a far-reaching psychological influence. In particular, it has the surprising ability to “neutralize” polarizing worldviews and can lead to greater science acceptance.
A recent study by one of us showed that perceived scientific consensus functions as an important “gateway belief.” In the experiment, we asked a national sample of the US population to participate in a public opinion poll about popular topics (participants did not know that the study was really about climate change). Participants were first asked to estimate what percentage of scientists they thought agree that human-caused climate change is happening (0 to 100 percent). We then exposed participants to a number of different experimental treatments that all contained the same basic message: that “97% of climate scientists have concluded that human-caused climate change is happening.” After several quizzes and distractor tasks, we asked participants once more about their perception of the scientific consensus.
You might expect that, given the contested and politicized nature of the climate change problem, such a simple message would have little effect or could even backfire. Indeed, some research has shown that disagreements between parties can become more extreme when both are exposed to the same evidence. Yet, contrary to the motivated-reasoning hypothesis, our results showed that, on average, participants who were exposed to one of the consensus messages increased their estimate of the consensus by about 13 percentage points (up to as much as 20 points in some conditions). Moreover, we found that when respondents’ perception of the level of scientific agreement increased, this led to significant changes in other key beliefs about the issue, such as the beliefs that climate change is happening, human-caused and a worrisome problem. In turn, changes in these beliefs propelled an increase in support for public action. Thus, people’s perception of the degree of scientific consensus seems to act as a “gateway” to other key attitudes about the issue.
What’s even more interesting is that we found the same effect for two differentially motivated audiences: Democrats and Republicans. In fact, the change was significantly more pronounced among Republican respondents, who normally tend to be the most skeptical about the reality of human-caused climate change. These findings are remarkable, even surprising, given that we exposed participants only once, to a single, simple message.
Nonetheless, these new results are consistent with two previous Nature studies. Some years ago, our colleagues showed that people’s perception of the level of scientific agreement was associated with belief in climate change and policy support for the issue. A subsequent experimental study by one of us revealed a causal link between highlighting expert consensus and increased science acceptance. In that study, too, information about the degree of consensus “neutralized” the effect of ideological worldviews.
Since then, numerous studies have reported similar results. One study showed that even a small amount of scientific dissent can undermine support for (environmental) policy. A new paper published just this month reported that respondents across the political spectrum responded positively to information about the scientific consensus on climate change.
Why is “consensus-information” so far-reaching, psychologically speaking?
One feature that clearly distinguishes “consensus” from other types of information is its normative nature. That is, consensus is a powerful descriptive social fact: it tells us about the number of people who agree on important issues (i.e., the norm within a community). Humans evolved living in social groups, and much psychological research has shown that people are particularly receptive to social information. Indeed, consensus decision-making is widespread in human and non-human animals. Because decision strategies that require widespread agreement lie at the very basis of the evolution of human cooperation, people may be biologically wired to pay attention to consensus data.
In the case of experts, consensus describes how many scientists agree on an important issue and, as such, implicitly embodies a rich amount of authoritative information. Imagine reading a road sign informing you that 97% of engineers have concluded that the bridge in front of you is unsafe to cross. You would likely base your decision to cross or avoid that bridge on the expert consensus, irrespective of your personal convictions. Few people would get out of their car and spend the rest of the afternoon personally assessing the structural condition of the bridge (even if they happened to be engineers themselves). Similarly, not everyone can afford the luxury of carving out a decade or so to study geophysics and learn how to interpret complex climatological data. Thus, it makes perfect sense for people to use expert consensus as a decision heuristic to guide their beliefs and behavior. Society has evolved to a point where we routinely defer to others for advice—from our family doctors to car mechanics; we rely on experts to keep our lives safe and productive. Most of us are constrained by limited time and resources, and reliance on consensus efficiently reduces the cost of individual learning.
Back to facts. A recent study showed that people are more likely to cling to their personal ideologies in the absence of “facts.” This suggests that to increase acceptance of science, we need more “facts.” We agree, but suggest that this is particularly true for an underleveraged but psychologically powerful type of fact: expert consensus.
The consensus on human-caused climate change is among the strongest observed in the sciences—about as strong as the consensus surrounding the link between smoking and lung cancer. Yet, as Harvard science historian Naomi Oreskes has documented, vested-interest groups have long understood that people make (or fail to make) decisions based on consensus information. Accordingly, so-called “merchants of doubt” have orchestrated influential misinformation campaigns, including denials of the links between smoking and cancer, and between CO2 emissions and climate change. If polarization over science is to be reduced, we need to harness the psychological power of consensus for the purpose it evolved to serve: the public good.