SERVER FARM: Facebook engineer Joshua Crass holds up a server board installed at the new data center in Prineville, Ore. The exact number of dual-socket boards is proprietary, but it's "many tens of thousands," an example of the growing energy demands of the Internet. Image: Intel / Flickr.com
More than 750 million users, 532 million kilowatt-hours of energy consumption and the attendant 285,000 metric tons of carbon dioxide: those are Facebook's numbers for 2011.
That means, as the social networking company wrote in an August 1 Facebook post (naturally) releasing the data on energy use, that "one person's Facebook use for all of 2011 had roughly the same carbon footprint as one medium latte. Or three large bananas. Or a couple of glasses of wine." That's 269 grams of CO2 per "active user," and another invisible impact of the computing cloud.
But that cloud has a very tangible physical impact. Although the per-user number may sound small, when added up, Facebook's—and the world's—use of row after row of computer servers stored on racks in massive, refrigerated, windowless warehouses in places like Prineville, Ore., and Forest City, N.C., consumes a growing share of the globe's energy. For example, to keep Amazon ever ready to take an order, racks of computers in its data centers are chilled below 21 degrees Celsius. There are now more than 500,000 data centers worldwide, hosting the bulk of the world's more than 32 million individual servers. Server farms, according to data center expert Jonathan Koomey of Stanford University, now account for roughly 1.5 percent of global electricity use, or about 300 billion kilowatt-hours of electricity per year. Google's data centers, for example, dwarf Facebook's, using two billion kilowatt-hours per year as the world searches for the latest article on server energy use.
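Those totals imply some useful back-of-the-envelope figures. A minimal sketch: the 300-billion-kilowatt-hour, 1.5-percent and 32-million-server numbers come from Koomey's estimates above, while the derived quantities are plain arithmetic, not figures from the article:

```python
# Back-of-the-envelope numbers from the estimates quoted above.
DATA_CENTER_KWH_PER_YEAR = 300e9   # ~300 billion kWh/year (Koomey's estimate)
SHARE_OF_GLOBAL = 0.015            # ~1.5 percent of global electricity use
SERVER_COUNT = 32e6                # ~32 million servers worldwide
HOURS_PER_YEAR = 8760

# Implied global electricity consumption: about 20 trillion kWh/year
implied_global_kwh = DATA_CENTER_KWH_PER_YEAR / SHARE_OF_GLOBAL

# Average energy per server, including cooling and other overhead
kwh_per_server = DATA_CENTER_KWH_PER_YEAR / SERVER_COUNT        # ~9,400 kWh/year
avg_watts_per_server = kwh_per_server * 1000 / HOURS_PER_YEAR   # ~1.1 kW continuous

print(f"Implied global electricity: {implied_global_kwh:.2e} kWh/year")
print(f"Per server: {kwh_per_server:,.0f} kWh/year (~{avg_watts_per_server:,.0f} W average)")
```

Note that the roughly one kilowatt of continuous draw per server includes the cooling and infrastructure overhead discussed below, not just the machine itself.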
That makes the Internet a larger emitter of greenhouse gases—230 million metric tons—than all the countries of Scandinavia put together.
Internet companies, of course, are not looking for massive energy bills—or catastrophic climate change. Wringing the most energy efficiency out of such cloud computing has become an important part of a company like Facebook's profitability—and cooling all those computers remains the single largest use of energy for these companies.
The only thing that has kept servers from sucking up ever more energy has been a little-known corollary of Moore's law: over the past 65 years, the number of computations that can be done per kilowatt-hour of electricity used has doubled every 1.6 years, according to Koomey's research. But the small server rooms, or even closets, that smaller companies the world over rely on typically lack the most efficient cooling and use up to twice as much electricity per computation as the machines run by many large computing companies.
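That doubling rate compounds dramatically. A quick sketch of the total improvement it implies over those 65 years, where the 1.6-year doubling time comes from Koomey's research and the rest is arithmetic:

```python
# Computations per kWh double every 1.6 years (Koomey's research).
DOUBLING_TIME_YEARS = 1.6
SPAN_YEARS = 65

doublings = SPAN_YEARS / DOUBLING_TIME_YEARS   # ~40.6 doublings
improvement = 2 ** doublings                   # roughly a trillion-fold gain

print(f"{doublings:.1f} doublings -> ~{improvement:.1e}x more computations per kWh")
```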
One idea to cut down on energy use in server rooms (as well as large server farms) is simply to raise the temperature at which such servers operate. "Why are data centers cooled to 18 to 21 degrees Celsius? People are concerned about reliability," says Charles Rego, Intel's chief architect for high-density data centers and cloud infrastructure. But today's servers can comfortably operate at as much as 27 degrees C, and Intel specifies that its chips must tolerate up to 35 degrees C without a loss in performance. "For every degree Celsius you move, it's 4 to 5 percent energy savings," he notes.
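Rego's 4-to-5-percent-per-degree figure can be applied to the gap between the typical 21-degree setpoint and the 27 degrees servers comfortably tolerate. A sketch assuming the midpoint of his range and that the savings compound per degree (a simple linear reading gives a slightly larger number):

```python
# ~4-5 percent energy savings per degree Celsius raised (per Rego).
SAVINGS_PER_DEGREE = 0.045   # midpoint of the quoted 4-5 percent range
CURRENT_SETPOINT_C = 21      # upper end of today's typical 18-21 C range
COMFORTABLE_MAX_C = 27       # temperature servers can comfortably run at

degrees_raised = COMFORTABLE_MAX_C - CURRENT_SETPOINT_C   # 6 degrees

# Compounding: each extra degree saves 4.5% of the remaining energy use
compound_savings = 1 - (1 - SAVINGS_PER_DEGREE) ** degrees_raised
linear_savings = SAVINGS_PER_DEGREE * degrees_raised

print(f"Raising the setpoint {degrees_raised} C saves ~{compound_savings:.0%} "
      f"(compounded) to ~{linear_savings:.0%} (linear)")
```

Either way of reading the figure, warming the room by six degrees cuts roughly a quarter of the cooling-related energy bill.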