Why Do White Men and Scientists Tend to Downplay the Risks of Technology?

The naive answer is that white men and scientists are coldly rational—but that’s not the whole story


Scientists often complain that people are irrational in their opposition to technologies such as nuclear power and genetically modified (GM) crops. From a statistical perspective, these are very safe, and so (it is argued) people's fear can be explained only by emotion, undergirded by ignorance. Electricity from nuclear power has led to far fewer direct deaths than has coal-fired power, yet many people are afraid of it, and hardly anyone is afraid of coal plants. Similar arguments can be made about GM crops, which studies have shown are generally safe for most people to eat.

Scientific illiteracy may be part of the problem. Most of us are afraid of things we don't understand, and studies have shown that scientists tend to be more accepting of potentially risky technologies than laypeople. This suggests that when people know a lot about such technologies, they are usually reassured.

But there's more to the issue than meets the eye. It is true that many of us fear the unknown, but it is also true that we can be cavalier about routine risks. Part of the explanation is complacency: we tend not to fear the familiar, and thus familiarity can lead us to underestimate risk. The bipartisan commission that reviewed the Deepwater Horizon blowout and oil spill concluded that complacency—among executives, among engineers and among government officials responsible for oversight—was a major cause of that disaster. So the fact that experts are unworried about a threat is not necessarily reassuring.

Scientists also make a mistake when they assume that public concerns are wholly or even mostly about safety. Pope Francis, for example, rejects genetic modification of organisms in part because he views it as an inappropriate interference in God's domain; this is a theological position that cannot be refuted by scientific data. Some people object to GM crops such as Roundup Ready corn and soy because they facilitate the increased use of pesticides. Others have a problem with the social impacts that switching to GM organisms can have on traditional farming communities or with the political implications of leaving a large share of the food supply in the hands of a few corporations.

Geoengineering to lessen the impacts of climate change is another example. Some concerns about geoengineering—not just among laypeople but among scientists as well—have more to do with regulation and oversight than with safety. Who will decide whether this is a good way to deal with climate change? If we undertake the project of setting the global temperature by controlling how much sunlight reaches Earth's surface, who will be included in that “we,” and by what process will the “right” global temperature be chosen?

Such considerations may help explain the results of a classic study of perceptions of health risks from a polluted environment, which showed that white women, as well as nonwhite men and women, were substantially more worried about these risks than white men. Because scientists are for the most part less worried about risks than laypeople, we might conclude that the insouciant white men are right and the others unnecessarily troubled.

Of course, the majority of scientists are white men, so it's not entirely surprising that their views track with those of the demographic group to which they belong. And there is a more important point here: risks are not equally distributed. Women and people of color are more likely to be the victims when things go wrong (think the Marshall Islands or Flint, Mich.), so it makes sense that they tend to be more worried. Moreover, women and people of color have historically been excluded from important decision-making processes, not just in science and technology but in general. When you're excluded from a decision-making process, it is not irrational for you to view that process as unfair or to be skeptical about what it yields.

Can we say whether men or women are more rational about risk? Can we say which group's view is closer to an accurate assessment? Well, here's one relevant datum: women are more likely than men to wear seat belts.
