Top climate scientists say their field can improve its transparency.
A group of researchers presented their findings on reproducibility in climate science to the National Academies of Sciences, Engineering and Medicine yesterday as part of a monthslong examination of scientific transparency.
Awareness of issues around reproducing scientific data has been driven by the politically charged nature of climate science, said Andrea Dutton, a geologist at the University of Florida and an expert in sea-level rise.
“Climate science has undergone a lot of public scrutiny as we’re all aware,” she said. “And I think dealing with that has really increased our awareness as a community of being very rigorous about quantifying our uncertainties and being transparent in reporting, being transparent in data archiving.”
There is a broad effort underway by researchers to address the challenge of data transparency in science. A group of researchers from the academies is reviewing the issue at the behest of Rep. Lamar Smith (R-Texas), chairman of the House Science, Space and Technology Committee. The National Academies will produce a report by the end of the year that explores the issue.
Smith has accused federal climate scientists of committing fraud and misrepresenting humanity’s role in driving climate change. He was also instrumental in helping shape a new rule proposed by EPA Administrator Scott Pruitt that would require research used by EPA to craft regulations to have data that are public and transparent. Critics say the effort is really designed to exclude definitive studies that have driven air pollution regulations and other public health protections.
EPA’s proposed rule appears targeted toward air pollution regulations in particular, and some of those who helped shape the policy have for decades criticized studies connecting soot to serious human health problems. Less clear is the rule’s effect on climate science used by the agency.
Critics, including Smith, have often targeted the data relied on by climate scientists, calling their conclusions into question. Scientists speaking at the academies meeting said the increased public awareness presented an opportunity to increase the public’s understanding of the underlying data that inform their work.
“This public scrutiny has I think helped us to up our game in all these areas and be better about being transparent and making it open to the public so that the people who want to see it and how reproducible things are or are not as the case may be,” Dutton said.
Researchers yesterday said there was a need to make data sets more widely available for anyone to download and understand. In particular, there is a lack of established standards for archiving data and metadata, they said, and lax enforcement of data-reporting rules by funding agencies and journal editors. They are now looking to create discipline-wide standards for data reporting, as well as requirements that data be reported within two years of collection or when the research is published.
Making data available is part of publishing in the modern era, and better methods are needed for verifying that the results of a study are statistically valid, said Rich Loft, director of the technology development division at the National Center for Atmospheric Research.
“In the age of big data, journal publications which would have been suitable a hundred years ago [are] not suitable anymore because it’s not actually enough information to reproduce the thing, so somehow we have to extend our definition of peer-reviewed into these analyses,” he said.
One of the challenges faced by researchers trying to make their work more transparent is the complexity of dealing with a vast amount of data, said Gavin Schmidt, director of the Goddard Institute for Space Studies at NASA. In addition to storing the data, researchers must make the code used to synthesize it available, he said. In the science community, reproducibility often consumes a lot of time without a clear benefit to the individual researcher beyond altruism, he said.
“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exploring something that you thought was interesting,” he said.* “So there is a cost to the community, but the benefit is of course understanding how robust particular results are.”
*Correction: This story has been updated to reflect that Schmidt said "exploring" in the preceding quote, not "exporting."
Reprinted from Climatewire with permission from E&E News. E&E provides daily coverage of essential energy and environmental news at www.eenews.net.