The federal government announced on Tuesday that it is lifting a three-year moratorium on funding controversial research that involves genetically altering viruses in ways that could make them more contagious, more deadly, or both—and that critics say risks triggering a catastrophic pandemic.

Called gain-of-function experiments, the studies aim to understand genetic changes that can make viruses such as bird flu, SARS (severe acute respiratory syndrome), and MERS (Middle East respiratory syndrome) more transmissible from person to person. But if they escaped from the lab, perhaps through human error, the modified viruses could in theory spread quickly or be extremely virulent, increasing the toll of an outbreak.

The moratorium was imposed a few months after two mishaps at government labs, one handling anthrax and one handling avian flu, which together suggested that biosafety and biosecurity at even the most respected labs fell well short of what is needed to protect the public.

Dr. Francis Collins, director of the National Institutes of Health, said the new policy didn’t represent a significant shift, since the NIH has continued to assess and fund some gain-of-function experiments even during the moratorium. Such studies will continue to be vetted by a federal panel before they can receive funding.

But the decision to lift the moratorium did not sit well with scientists who have long warned of the risks of such research—and questioned its benefits.

“I am not persuaded that the work is of greater potential benefit than potential harm,” said molecular biologist Richard Ebright of Rutgers University, who has argued that U.S. labs working with dangerous pathogens regularly suffer serious biosafety lapses. Experiments to create enhanced viruses, he and others argue, could lead to the pathogens’ accidental release, most likely by a lab worker becoming infected unknowingly and then walking out the door.

“A human is better at spreading viruses than an aerosol” that might breach a lab’s physical containment, said epidemiologist Marc Lipsitch of Harvard T.H. Chan School of Public Health, who has calculated that the risk of a lab-acquired infection sparking a pandemic is greater than recognized. “The engineering is not what I’m worried about. Accident after accident has been the result of human mistakes.”

He nevertheless called the new policy “a small step forward” because it sets up a formal process for evaluating whether the controversial experiments should receive federal funding. But because genetically modified viruses “risk creating an accidental pandemic” and “have done almost nothing to improve our preparedness for pandemic,” he said, “my view is that a review of the sort proposed should disallow such experiments.”

In fact, he argued that the moratorium on federal funding of the experiments should be extended to cover those conducted with private money.

Collins, who announced the new policy, told reporters that 21 proposed studies were “paused” when the moratorium was imposed. But 10 were eventually funded through a case-by-case exemption process. The new policy, he said, is “just a way of regularizing the process” of approving studies that enhance the transmissibility or virulence of viruses. He added that he does not see the policy as “a particularly significant change.”

The basic justification for gain-of-function studies is to get out ahead of nature.

Viruses evolve constantly. If one sample of, say, H5N1 avian flu were found to contain a new mutation, and a second sample to contain a different mutation, in principle a double-mutant H5N1 could emerge in the wild. By creating such a double mutant in the lab, proponents of the experiments argue, virologists would learn whether such a double mutant is viable, how virulent it is, and whether it can spread from one person to another.

Or virologists could create mutations not yet seen in nature, again getting a clue as to whether such mutants can survive, how dangerous they are, and how many mutations are required to create an extremely dangerous H5N1.

“Evolution guarantees that naturally pathogenically ‘enhanced’ [strains] of influenza and other pathogens will emerge,” said Dr. Samuel Stanley, president of Stony Brook University and chairman of the National Science Advisory Board for Biosecurity. “Nature is the ultimate bioterrorist and we need to do all we can to stay one step ahead” by conducting research “to help us better recognize and countermand these strains.”

Such studies are “important in helping us identify, understand, and develop strategies and effective countermeasures against rapidly evolving pathogens that pose a threat to public health,” Collins said in a statement, adding that he is “confident” that the studies can be done in a way that is “safe, secure, and responsible.”

One scientist whose gain-of-function experiments raised concerns, virologist Ron Fouchier of Erasmus Medical Center in the Netherlands, said he is “happy to see that this is finally moving forward.” He hopes to resubmit proposals for gain-of-function studies of the H5 and H7 strains of bird flu, which had been approved before the moratorium but then put on hold.

The new policy, explained in a notice from the NIH, covers what are called “enhanced potential pandemic pathogens.” Those are viruses that are made more virulent or more contagious by genetic manipulation. Any decisions on funding such research will include an evaluation by high-level officials at the Department of Health and Human Services.

Funding decisions will be based on whether the research is “scientifically sound” and on whether the pathogen that scientists propose to create is “a credible source of a potential future human pandemic.” That might cover studies on whether there are genetic changes that would allow Ebola to be transmitted through airborne particles, said epidemiologist Michael Osterholm of the University of Minnesota, something that is crucial for public health officials to know, he said, but which should not be made public.

A decision on whether to fund a gain-of-function experiment would also turn on whether its overall potential risks “as compared to the potential benefits to society are justified.” But there is no explicit requirement that the benefits outweigh the risks, leading Ebright to call that standard “insufficient to protect the public.”

Scientists proposing studies that make viruses more dangerous will have to convince the HHS board that there are no feasible, equally effective alternative ways to obtain the new information—such as, does a specific genetic change make avian flu more deadly?—in a way that poses less risk.

Critics of such research see that as a problematic loophole. Some questions—such as whether a naturally occurring strain can evolve to become highly contagious—“can uniquely be answered with dangerous experiments,” Lipsitch said, “and cannot be answered safely.”

Republished with permission from STAT. This article originally appeared on December 19, 2017.