Compared to familiar climate-saving programs that aim to stuff greenhouse gases into the ground or harness the power of the wind, ideas like "cloud computing" are hard to penetrate. Still, the practice is gaining attention as the information technology (IT) industry promotes it as a tool to save both energy and money.
Meanwhile, though, companies, environmentalists and consumers find themselves struggling with a new question: How do you measure the carbon footprint of a "cloud"?
First there is the murky business of understanding exactly what cloud computing is. IT professionals use the term broadly to describe data processing operations that are outsourced to server farms instead of being handled on-site (in the server room of an office, for example). For businesses, this could include websites or networks that are hosted remotely. For individuals, it could be Google documents, digital storage space and so forth.
There is no question that computing needs are expanding at a rapid clip. Data centers -- which can be as big as two or three Walmart stores -- have become nerve centers of the digital age. First promoted as a pragmatic way for businesses to avoid IT maintenance hassles, they've now been adopted as a cause by energy efficiency advocates who want to rein in the escalating growth of IT power demands.
The claims are numerous. Cloud computing lowers energy costs for users and cuts greenhouse gas emissions by consolidating information-crunching into single facilities running on speedy machines, proponents say. And with efficiency upgrades to the centers themselves on the rise, less and less energy is needed to power the digital world.
A July report commissioned by the nonprofit Carbon Disclosure Project, which tracks corporate climate change information, outlines more specifics: Cloud computing, it says, is projected to help large U.S. companies save $12.3 billion on energy costs and cut out 85.7 million metric tons of carbon dioxide emissions annually by 2020.
The forecast, conducted by independent analyst firm Verdantix, considers likely IT trends among the 2,600 or so American companies generating more than $1 billion a year in revenue. It also analyzes electricity consumption data, adjusted to account for the varying carbon intensity of energy sources throughout the country.
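The arithmetic behind such a forecast can be sketched simply: electricity saved in each region is multiplied by that region's carbon intensity, then summed. The sketch below uses made-up regions and placeholder intensity values for illustration; it is not Verdantix's actual model or data.

```python
# Illustrative sketch of the emissions arithmetic behind such forecasts.
# All regions and figures below are hypothetical placeholders, not
# Verdantix's actual inputs.

# Grid carbon intensity varies by region (metric tons CO2 per MWh).
CARBON_INTENSITY = {
    "coal_heavy_midwest": 0.85,
    "gas_heavy_northeast": 0.45,
    "hydro_heavy_northwest": 0.15,
}

def avoided_emissions(mwh_saved_by_region: dict) -> float:
    """Convert regional electricity savings (MWh) into metric tons of CO2."""
    return sum(
        mwh * CARBON_INTENSITY[region]
        for region, mwh in mwh_saved_by_region.items()
    )

# Example: a company trims data center load in three regions.
savings = {
    "coal_heavy_midwest": 12_000,
    "gas_heavy_northeast": 8_000,
    "hydro_heavy_northwest": 5_000,
}
print(f"{avoided_emissions(savings):,.0f} metric tons CO2 avoided per year")
```

The same savings figure yields very different emissions numbers depending on where the electricity comes from, which is why the report adjusts for regional carbon intensity rather than using a national average.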
Bigger energy bills loom for do-it-yourselfers
The savings result from the data center operators' interest in letting no bit of power capacity go to waste. Scattered servers are energy guzzlers, the report says. Much of the power needed is for heating and cooling of space and machines. Maintaining HVAC controls for spread-out sets of servers is more energy-intensive than managing a central facility.
On-site servers also must be equipped for peak data usage -- heavy hits on a retail website during a new product launch, for example. The rest of the time, traffic on that site subsides and the built-in capacity of the servers sits idle. Shared data centers, on the other hand, can redistribute that excess capacity to other clients, using already highly efficient servers, as the sketch below illustrates.
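A toy model makes the capacity argument concrete. The traffic numbers below are invented for illustration: each client hosting on-site must provision for its own peak, while a shared facility only needs enough capacity for the combined peak, which is much smaller because the clients' spikes rarely coincide.

```python
# Hypothetical sketch: why pooling workloads cuts provisioned capacity.
# Traffic numbers are synthetic and chosen only to illustrate the idea.
import random

random.seed(42)
HOURS = 24 * 7  # one week, sampled hourly

def client_load(peak_hour: int) -> list[float]:
    """Synthetic load: low baseline with one spike (e.g. a product launch)."""
    return [
        100.0 if h == peak_hour else random.uniform(5.0, 20.0)
        for h in range(HOURS)
    ]

# Ten clients whose traffic spikes land at different hours.
clients = [client_load(peak_hour=i * 16) for i in range(10)]

# On-site model: every client provisions for its own peak.
on_site_capacity = sum(max(load) for load in clients)

# Shared facility: provision once, for the peak of the combined load.
combined = [sum(load[h] for load in clients) for h in range(HOURS)]
pooled_capacity = max(combined)

print(f"Sum of individual peaks:   {on_site_capacity:,.0f} units")
print(f"Peak of the combined load: {pooled_capacity:,.0f} units")
# The gap between the two is idle capacity -- and idle power draw --
# that the shared model avoids.
```

Running the sketch, the sum of individual peaks is several times the combined peak, which is the efficiency the report's authors attribute to consolidation.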
The IT industry has been trying to generate some buzz. Last month, Hewlett-Packard Co. released its design of a modular data center that it called the "world's most efficient." Google Inc. has made the same claim of its centers over the years. And in April, Facebook launched the Open Compute Project, a collaborative effort to build one even better.
The flurry of action comes after years of unchecked and unorganized growth, said Paul Dickinson, executive chairman of the Carbon Disclosure Project. A 2007 Energy Star report from U.S. EPA listed increased digital communication, record-keeping and financial transactions as forces pushing up the demand for data processing and storage. Greenpeace estimated that cloud computing worldwide demanded 662 billion kilowatt-hours of electricity in 2007, more power than was consumed that year by the entire country of India, or by Germany.