The fires that smelt iron also heat up the planet, but researchers are working on ways to produce higher-quality metals with fewer greenhouse gas emissions, potentially giving U.S. steelmakers an edge in a competitive global market.
A study published yesterday in the journal Nature highlights a step in this direction: using electricity instead of heat to extract iron.
With thousands of years of development and two centuries of industrialization, making iron and steel is a mature process around the world. In 2011, manufacturers produced around 1.1 billion metric tons of iron globally.
From mining ores to smelting to tempering alloys, the process is energy intensive, and engineers have chased improvements about as long as steel has topped axes, formed armor and driven machinery.
"What that means is that the majority of low-hanging fruit have been picked and the processes we have to make [metals] are nearing the limits of what is physically possible," explained Lawrence Kavanagh, president of the Steel Market Development Institute.
Though iron is one of the most abundant elements in the Earth's crust, it usually occurs as an ore. Conventional processing methods use a high-temperature blast furnace to heat the iron ore and other compounds, removing oxygen to yield a desired alloy, a method that creates a lot of carbon dioxide, according to a report last year from U.S. EPA on greenhouse gas emissions from the iron and steel sector.
Discovery of an inexpensive anode
"For integrated steelmaking, the primary sources of GHG emissions are blast furnace stoves (43 percent), miscellaneous combustion sources burning natural gas and process gases (30 percent), other process units (15 percent) and indirect emissions from electricity usage (12 percent)," the report said, estimating that the U.S. steel industry produced 117 million tons of carbon dioxide in 2010.
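Multiplying the EPA report's percentage shares against its 117-million-ton total gives a rough tonnage per source. The sketch below is illustrative arithmetic derived only from the figures quoted above, not from the report itself:

```python
# Approximate 2010 CO2 breakdown for U.S. integrated steelmaking,
# derived from the EPA figures quoted in the article (illustrative only).
TOTAL_MT = 117  # million tons of CO2 in 2010, per the report

shares = {
    "blast furnace stoves": 0.43,
    "misc. combustion (natural gas / process gases)": 0.30,
    "other process units": 0.15,
    "indirect (electricity usage)": 0.12,
}

# Scale each share to millions of tons and round for readability.
breakdown = {src: round(TOTAL_MT * frac, 1) for src, frac in shares.items()}
for src, mt in breakdown.items():
    print(f"{src}: ~{mt} million tons CO2")

# The quoted shares account for the full total.
assert abs(sum(shares.values()) - 1.0) < 1e-9
```

By this arithmetic, blast furnace stoves alone account for roughly 50 million tons of the total.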
But there are other ways to pull iron out of rocks. Dissolving ores in a molten electrolyte and passing a current through it could reduce iron oxides to a more usable form and produce oxygen at the same time. "The real issue here is finding a nonconsumable anode that can sustain this process," said Donald Sadoway, a professor of materials chemistry at the Massachusetts Institute of Technology and a co-author of the new report.
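The process described above can be sketched as a pair of electrolysis half-reactions. This is a simplified illustration of molten oxide electrolysis in general, not the specific electrolyte chemistry reported in the paper:

```latex
% Cathode: iron cations are reduced to liquid metal
\mathrm{Fe^{3+}} + 3\,e^- \longrightarrow \mathrm{Fe\,(l)}
% Anode: oxide anions are oxidized to oxygen gas
2\,\mathrm{O^{2-}} \longrightarrow \mathrm{O_2\,(g)} + 4\,e^-
% Overall
2\,\mathrm{Fe_2O_3} \longrightarrow 4\,\mathrm{Fe} + 3\,\mathrm{O_2}
```

The anode is where the oxygen evolves, which is why, as Sadoway notes, finding an anode material that survives this environment is the central challenge.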
Previous attempts to electrolyze ores used anodes made of expensive elements like platinum and iridium, or the components broke down in the 1,600-degree-Celsius temperatures needed to maintain a liquid metal-oxide electrolyte.
Sadoway and his team found that an anode made from chromium-based alloys could withstand the process. These materials are also cheap. "If you end up with something that's superior but far costlier, nobody wants it," Sadoway said.
Using electrolysis to make metals has several advantages over a blast furnace. The resulting metals are purer because there are fewer contaminants introduced in the process. "The electrolytic route actually consumes less energy," Sadoway noted, adding that it can be 30 percent more efficient than conventional methods.
Cost advantage for U.S. industry
These methods can help U.S. manufacturers forge a path to more energy-efficient steelmaking, creating a market advantage from higher-quality metals with smaller carbon footprints. Electrolysis could help reduce prices as well.
"Energy is a big cost in steel," said Kavanagh. He also observed that about 40 percent of steel is traded internationally, so any changes in production will have global consequences.
However, Kavanagh pointed out that electrolysis is only as clean as the grid that feeds it, so if the energy comes from a coal-fired power plant, there may not be any carbon emissions savings.
"If your source of electricity was renewable power, it would reduce carbon dioxide by 10 percent," said Derek Fray, an emeritus professor of materials chemistry at the University of Cambridge. Fray wrote a piece in Nature yesterday commenting on Sadoway's work.