The U.S. government itself more than quadrupled the number of data centers it operated between 1998 and 2010, growing from 432 facilities to more than 2,000. Now, it is trying to scale back the sprawl by cutting and consolidating duplicative centers.
"The proliferation of infrastructure has created an environment that enables redundant systems and applications to sprout like weeds," said federal Chief Information Officer Vivek Kundra in a blog post last month.
By the end of 2011, 178 federal data centers will close, with a total of 800 planned to shutter by 2015 (Greenwire, July 20).
Facility closures and tech efficiency (along with the global economic slump) have already contributed to a deceleration in the growth of electricity consumption in data centers since 2005, according to another study, this one released by Stanford University professor Jonathan Koomey in early August.
Advocates say steps toward consolidation and efficiency are a no-brainer for the government and other operators looking to cut their energy costs: a more efficient facility uses less energy, which means lower bills.
Consolidation's better, but what's the energy source?
"It's not magic; it's mathematics," said Radu Gheorghievici-Pohl, an executive at IT efficiency firm 1E. With energy costs fluctuating, he says, companies must address what's in their control.
"You have to deal with energy in a more conscious way," he said. "Nobody knows how prices will be developing in the next period."
To that end, server machines are becoming less wasteful; data center designers are making optimal use of physical space; and HVAC innovation is achieving the same temperature and ventilation controls with less energy. These gains are measured by power usage effectiveness (PUE), a metric adopted by industry group the Green Grid that compares the total power delivered to a data center with the power that actually reaches the server equipment.
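As a back-of-the-envelope illustration, the PUE metric reduces to a simple ratio; the figures below are made up for the example, not drawn from any real facility:

```python
def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / power reaching the IT equipment.

    1.0 is the theoretical ideal (every watt goes to the servers);
    higher values mean more overhead for cooling, lighting and
    power-conversion losses.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kw / it_equipment_kw


# Hypothetical facility: 1,500 kW drawn from the grid,
# of which 1,000 kW reaches the servers.
print(power_usage_effectiveness(1500, 1000))  # → 1.5
```

A PUE of 1.5 means the facility burns half a watt of overhead for every watt of useful computing, which is why operators chase ratios closer to 1.0.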
A recent Greenpeace report, however, says this calculus is overlooking a major variable: the energy source. Companies are relying on healthy PUE ratings, the report says, to "communicate externally that their data centers are 'green' and sustainable without accounting for the full environmental picture."
Though internal efficiency is a worthy cause, the April report says, energy savings are superficial if the data center is powered with a non-renewable source.
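The Greenpeace argument comes down to simple arithmetic: two facilities with identical PUE ratings can have very different carbon footprints depending on the grid behind them. The emission factors below are hypothetical round numbers chosen for illustration, not measured utility data:

```python
def annual_co2_tonnes(it_load_kw: float, pue: float, kg_co2_per_kwh: float) -> float:
    """Rough annual CO2 for a data center running at constant load.

    Total energy = IT load * PUE * hours in a year; emissions scale
    with the carbon intensity of the supplying grid.
    """
    hours_per_year = 8760
    total_kwh = it_load_kw * pue * hours_per_year
    return total_kwh * kg_co2_per_kwh / 1000  # kg -> metric tons


# Same 1,000 kW IT load and same PUE of 1.2, two assumed grids:
coal_heavy = annual_co2_tonnes(1000, 1.2, 0.9)    # coal-heavy mix
hydro_heavy = annual_co2_tonnes(1000, 1.2, 0.05)  # largely hydro mix
print(round(coal_heavy), round(hydro_heavy))  # → 9461 526
```

Under these assumptions the coal-supplied facility emits roughly 18 times as much CO2 as its hydro-supplied twin, even though both would report the same "green" PUE.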
That is often the case. Take North Carolina, for example: The western part of the state is becoming a data center hub, with Apple, Facebook and Google all setting up shop there. The geography is inviting: Risk of natural disasters is low, telecommunications infrastructure is already in place and energy is dirt-cheap. The state's major utility is Duke Energy Corp., which generates more than 60 percent of its power from coal, with nuclear making up most of the rest. A pending merger with Progress Energy Inc. is expected to even out that ratio.
Similarly, Facebook's announcement of a new data center in Prineville, Ore., was met with criticism from environmentalists. They say the 150,000-square-foot building (which is planned to double in size) should have been located in an area powered by clean energy. Instead, it will be powered by PacifiCorp, which depends mostly on coal and natural gas for generation. Renewables make up 10 percent of its energy portfolio.
Facebook defends the site, which opened in April, pointing to its robust efforts to minimize the center's effects on the environment. South-central Oregon's low-humidity climate allows the building to draw in outside air, process it through layered filters and use it to cool the space and machines. Other energy-saving measures are in place, including rainwater reclamation and reuse of server heat to regulate office temperature. Facebook says the Prineville site uses 38 percent less energy to do the same work as the company's existing facilities.