No well-intentioned researcher expects that their work will be used to justify violence. But following the racist massacre of 10 Black people in a Buffalo, N.Y., supermarket on May 14, one of us experienced just that. We join other researchers in condemning any use of genetics to justify racism or hate.

In a rambling 180-page screed posted online just before the shooting, the Buffalo shooter writes in a style that mimics an academic monograph. He cites recent developments in human molecular genetics to falsely assert that there are innate biological differences between races in an attempt to validate his hateful, white supremacist worldview. Although misuse of science to support bigotry is not new, this latest atrocity is another wake-up call to geneticists and the scientific community at large to consider how we are conducting and communicating science—and how we can do these things better.

Let’s first correct the record about the science. In his document, the shooter contorts many scientific studies, including the findings from a 2018 genetic study co-authored by one of us (Wedow), to try to “prove” that white people have a genetic intellectual advantage over Black people. The 2018 study cited by the shooter gathered DNA from one million people of predominantly European estimated genetic ancestry and sought to identify genetic variants correlated with outcomes such as years of completed schooling and cognitive performance. Importantly, the genetic variants identified in this study, like those in any genomic study of a complex outcome, are time- and context-dependent. In a different time, place and social structure, a different set of variants might emerge as statistically linked. Genes do not predestine one individual to complete fewer years of schooling than another or one individual to score higher on a cognitive performance test than another. The 2018 study concluded that the environment plays a substantial role in shaping these outcomes.

The shooter’s document deceptively extracts data from the 2018 study, combining it with another genetics study to present statistical artifacts to bolster the shooter’s false claims. Had the initial study instead been done on a million individuals of estimated African genetic ancestry, then based on his misguided exercise the shooter could have instead concluded that Black people have a genetic intellectual advantage over white people. Even putting aside the inaccurate and dangerous conflation of genetic ancestry and race, the shooter’s argument is just bad, utterly invalid science. There is absolutely no evidence that there are genetic differences in cognitive performance between racial, ethnic or genetic ancestral groups of people.

Although the 2018 genomic study does not make any claims about genetic differences between racial groups, or any groups for that matter, the results of a study do not prevent others from constructing alternate realities. The Buffalo shooter is one of many people who have misappropriated genetic studies; he probably did not come up with his interpretation of the research in a vacuum. Instead, he is part of a long, dark and violent history. Genetics has been used time and time again in service of white supremacy. Failure to place the shooter’s document within this larger context makes it too easy for the scientific community to point fingers elsewhere.

We scientists could all view the 2018 study as nothing more than the unfortunate choice of weapon for a domestic terrorist driven by delusions instead of facts. However, doing so enables a level of moral disengagement that just won’t cut it anymore. We live in an age of mistrust, disinformation and deep polarization. Researchers cannot assume that the rigor and reproducibility of their research will weather this storm, or lead to a singular interpretation. As hard as it might be, and it certainly will be challenging, scientists need to consider their moral responsibilities as producers of this research. Otherwise, we stay caught in the delusion that science can speak accurately for itself.

Ethical scientific research requires a delicate weighing of risks and benefits. When this weighing occurs, risks to the individual are factored into the equation, but broader risks to society seldom are. The scientific community has been incentivized to outsource responsibility to existing regulations and review boards to make these calculations. Any research involving human participants must obtain Institutional Review Board (IRB) approval, and researchers working with human subjects in the U.S. are subject to federal policies such as the Common Rule. Yet these safeguards cannot on their own ensure that research maximizes benefits and minimizes harms. There are no existing regulatory mechanisms that explicitly factor in the social risks of research. In fact, IRBs are prohibited from considering the broader social impacts, focusing only on individual-level risks.

Most genomic studies do not undergo extensive evaluations of potential risks and benefits. These studies use de-identified genomic data—meaning data that isn’t tied to a name or other identifiable characteristic—and are therefore not considered to be research on human subjects. These studies typically do not require IRB approval, nor are they subject to the Common Rule. Although there is minimal direct risk to the individual participants who provide their DNA for these studies, the results and communication of what comes from their DNA clearly can affect real people in the real world.

We are not advocating for academic censorship here. Scientists cannot and should not be expected to anticipate every possible risk or misuse of their research. That burden is too big to bear for one community. Yet, as the shooter’s document illustrates, minimizing one’s responsibility to mitigate the social risks of a body of research does not make these risks go away.

Scientists funded with taxpayer dollars are tasked with discovering truth and innovating in order to support the flourishing of all humans. To realize this aim, it’s time we rethink how we weigh the risks and benefits of research. For example, what if we incentivized future generations of scientists to prioritize considering the social risks of their work in the same way they do the scientific impact? What if funding agencies, which help steer the course of research by deciding whom and what to fund, routinely required researchers to develop plans for mitigating potential social risks? And what if we taught genetics in schools in a way that reflects actual human variation, rather than promoting a false determinism?

The intricacies of scientific interpretation can have unintended consequences. The costs of continuing as is are simply too high.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.