CAPTURE AND BURY: If power operators begin separating CO2 from their plants' emissions so that it can be buried underground on a large scale, it would behoove regulators to have a system for monitoring the wells and detecting leaks. Image: Flickr/Virtual Scott
It could take decades, at least, to replace cheap, abundant fossil fuels with low-carbon energy sources. In the meantime, many scientists and government officials around the world think the next best option for keeping Earth's rising levels of atmospheric carbon dioxide (CO2) in check is to prevent the gas from escaping in the first place. This can be done by using a chemical solvent to separate it from the emitted byproducts of power plants and other high-polluting facilities like aluminum manufacturing plants and then burying (technically injecting) it deep underground—a process known as carbon capture and sequestration (CCS). Ideal storage areas include depleted oil and gas reservoirs, unmineable coal seams or deep saline formations, because they are all under sufficient pressure to force the greenhouse gas to stay put and are made of porous rock that can soak up CO2 like a sponge.
The Department of Energy estimates that deep saline formations in the U.S. could hold up to 12,000 gigatons of CO2—enough, at the roughly 33 gigatons of CO2 that human activities currently emit each year, to store centuries' worth of emissions, making them a viable long-term option. Although burying billions of tons of CO2 underground may sound like a daunting, perhaps even dangerous task, engineers have a pretty good idea how to do it, and scientists have reason to think it can work safely on a large scale. The oil and gas industry began injecting various fluids underground in the 1930s; since that time, researchers have been working to understand the effects of the process on the geochemistry of storage sites and the risks it may pose to human safety. A handful of CO2 storage sites, including a Norwegian project beneath the North Sea initiated in 1996, are already active around the world, showing that the concept, on a small scale, can work.
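The capacity claim can be checked with simple division. The two figures below come from the paragraph above; the calculation itself is only a back-of-the-envelope illustration, and it assumes all global emissions could be routed to U.S. saline formations, which no one proposes:

```python
# Back-of-the-envelope check of the storage-capacity figures above.
capacity_gt = 12_000      # estimated U.S. deep-saline capacity, gigatons of CO2
emissions_gt_per_yr = 33  # approximate current global emissions, gigatons of CO2 per year

years_of_capacity = capacity_gt / emissions_gt_per_yr
print(f"~{years_of_capacity:.0f} years of current global emissions")  # ~364 years
```

Even allowing for large uncertainties in the capacity estimate, the result is on the order of centuries, which is what makes deep saline formations attractive for long-term storage.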
One potential risk that has garnered a lot of research attention is that of an inadvertent leak—especially a hypothetical case in which CO2 seeps into drinkable groundwater supplies. This was the focus of a study published online November 11 in the journal Environmental Science & Technology.
The study authors acquired freshwater samples from four of the nation’s largest aquifers—the Aquia and Virginia Beach aquifers beneath Maryland and Virginia, respectively, the Mahomet Aquifer in Illinois and the Ogallala Aquifer in Texas—each of which overlies a potential sequestration site. Then, in the laboratory, the researchers exposed the water samples to a flow of CO2 designed to simulate a slow leak and observed the chemical changes that occurred over the course of more than 300 days. The CO2 caused the pH of the water in all the samples to drop 1–2 units as the gas reacted with the water to form carbonic acid. The drop in pH caused the rock in the samples to weather, increasing the concentrations in the water of elements that had previously been locked in the rock.
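A pH drop of that size is consistent with basic carbonate chemistry. As a rough sketch—using approximate textbook equilibrium constants at 25 degrees C rather than values from the study, and ignoring the buffering minerals present in real aquifer samples—the pH of water equilibrated with CO2 can be estimated from Henry's law and the first dissociation of carbonic acid:

```python
import math

# Rough estimate of the pH of otherwise pure water in equilibrium with CO2 gas.
# Constants are approximate 25 C textbook values, not taken from the study.
K_H = 0.034      # Henry's law constant for CO2, mol/(L*atm)
K_A1 = 4.45e-7   # first dissociation constant of carbonic acid

def ph_at_pco2(p_co2_atm):
    """Estimate pH from the dominant equilibrium H2CO3* <-> H+ + HCO3-."""
    co2_aq = K_H * p_co2_atm           # dissolved CO2, mol/L (Henry's law)
    h_plus = math.sqrt(K_A1 * co2_aq)  # [H+] = sqrt(Ka1 * [CO2(aq)])
    return -math.log10(h_plus)

ph_ambient = ph_at_pco2(4e-4)  # water under today's atmosphere, ~400 ppm CO2
ph_leak = ph_at_pco2(1.0)      # water saturated with CO2 at 1 atm
print(f"{ph_ambient:.1f} -> {ph_leak:.1f}")  # about 5.6 -> 3.9
```

Raising the CO2 pressure from atmospheric levels toward saturation drops the estimated pH by nearly two units in this idealized model, the same order of change the researchers measured; the real samples fell somewhat less because dissolving rock partly buffers the acid.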
Although the specific chemical changes depended on the unique geochemistry of each sample’s respective site, the authors report that on the whole, CO2 caused concentrations of alkali and alkaline earth elements, as well as manganese, cobalt, nickel and iron, to increase—in some cases by more than two orders of magnitude. Concentrations of aluminum, manganese, iron, zinc, cadmium, selenium, barium, thallium and uranium in some samples neared or exceeded maximum contaminant levels set by the Environmental Protection Agency (EPA). Additionally, in some cases the amounts of dissolved lithium, cobalt, uranium and barium kept increasing throughout the whole experiment, which the authors say shows the value of long-term investigations such as this one.