Telecommuting, Internet shopping and online meetings may save energy compared with their in-person alternatives, but as the digital age moves on, its green reputation is turning a lot browner. E-mailing, number crunching and Web searches in the U.S. consumed as much as 61 billion kilowatt-hours last year, or 1.5 percent of the nation’s electricity—half of which comes from coal. In 2005 the world’s servers ate up 123 billion kilowatt-hours of energy, a number that will double by 2010 if present trends continue, according to Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory. As a result, the power bill to run a computer over its lifetime will surpass the cost of buying the machine in the first place—giving Internet and computer companies a business reason to cut energy costs, as well as an environmental one.

One of the biggest energy sinks comes not from the computers themselves but from the air-conditioning needed to keep them from overheating. For every kilowatt-hour of energy used for computing in a data center, another kilowatt-hour is required to cool the furnacelike racks of servers.
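That one-to-one ratio of computing energy to cooling energy corresponds to a power usage effectiveness (PUE) of about 2.0, the metric data-center operators use for facility overhead. A minimal sketch of the arithmetic (the function name and sample values are illustrative assumptions, not figures from the article):

```python
def total_energy_kwh(it_energy_kwh, pue=2.0):
    """Total facility energy given IT-equipment energy and a PUE ratio.

    PUE = total facility energy / IT-equipment energy.
    A PUE of 2.0 matches the one-to-one ratio described above:
    every kilowatt-hour of computing requires another for cooling.
    """
    return it_energy_kwh * pue

# One kilowatt-hour of computing implies two kilowatt-hours drawn overall.
print(total_energy_kwh(1.0))   # 2.0
# A server rack consuming 10 kWh of IT load costs 20 kWh at the facility meter.
print(total_energy_kwh(10.0))  # 20.0
```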

For Internet giant Google, this reality has driven efforts such as the installation of a solar array that can provide 30 percent of the peak power needs of its Mountain View, Calif., headquarters, as well as growing purchases of renewable energy. But to deliver Web pages within seconds, the firm must maintain hundreds of thousands of computer servers in cavernous buildings. “It’s a good thing to worry about server energy efficiency,” remarks Google’s green energy czar Bill Weihl. “We are actively working to maximize the efficiency of our data centers, which account for most of the energy Google consumes worldwide.” Google will funnel some of its profits into a new effort, dubbed RE&lt;C (for renewable energy cheaper than coal, as Google translates it), to make sources such as solar-thermal, high-altitude wind and geothermal cheaper than coal “within years, not decades,” according to Weihl.

In the meantime, the industry as a whole has employed a few tricks to save watts. Efforts include cutting down on the number of transformations the electricity itself must undergo before achieving the correct operating voltage; rearranging the stacks of servers and the mechanics of their cooling; and using software to create multiple “virtual” computers, rather than having to deploy several real ones. Such virtualization has allowed computer maker Hewlett-Packard to consolidate 86 data centers spread throughout the world to just three, with three backups, says Pat Tiernan, the firm’s vice president of social and environmental responsibility.

The industry is also tackling the energy issue at the computer-chip level. In recent years, every doubling of processing power has come with a doubling in power consumption. But to save energy, chipmakers such as Intel and AMD have shifted to so-called multicore technology, which packs several processor cores onto a single chip rather than putting each on its own. “When we moved to multicore—away from a linear focus on megahertz and gigahertz—and throttled down microprocessors, the energy savings were pretty substantial,” says Allyson Klein, Intel’s marketing manager for its Ecotech Initiative. Chipmakers continue to shrink circuits on the nanoscale as well, which “means a chip needs less electricity” to deliver the same performance, she adds.
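The savings Klein describes follow from the standard approximation for CMOS dynamic power, P ≈ C·V²·f: throttling down the clock permits a lower supply voltage, and because voltage enters squared, two slower cores can do the same aggregate work for less power than one fast core. A rough sketch with hypothetical numbers (not actual Intel or AMD specifications):

```python
def dynamic_power(capacitance, voltage, frequency_hz):
    """Approximate CMOS dynamic power: P ~ C * V^2 * f."""
    return capacitance * voltage**2 * frequency_hz

# Hypothetical comparison: one core at 3 GHz and 1.4 V versus
# two cores at 1.5 GHz and 1.1 V (same total clock cycles per second).
single_core = dynamic_power(1.0, 1.4, 3.0e9)
dual_core = 2 * dynamic_power(1.0, 1.1, 1.5e9)

# The dual-core configuration draws less power for the same throughput.
print(dual_core < single_core)  # True
```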

With such chips, more personal computers will meet various efficiency standards, such as Energy Star compliance (which mandates that a desktop consume no more than 65 watts). The federal government, led by agencies such as NASA and the Department of Defense, may soon require all its purchases to meet the Electronic Product Environmental Assessment Tool standard. And Google, Intel and others have formed the Climate Savers Computing Initiative, an effort to cut power consumption from all computers by 50 percent by 2010.

Sleep modes and other power management tools built into most operating systems can offer savings today. Yet about 90 percent of computers do not have such settings enabled, according to Klein. Properly activated, those settings could keep a single computer from causing the emission of thousands of kilograms of carbon dioxide at power plants every year. But if powering down or unplugging the computer (the only way it uses zero power) is not an option, then perhaps the most environmentally friendly use of all those wasted computing cycles is in helping to model climate change. The University of Oxford’s Climateprediction.net project offers an opportunity to at least predict the consequences of all that coal burning.
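As a rough illustration of what enabling those sleep settings is worth, the back-of-the-envelope estimate below combines a hypothetical desktop load with an assumed grid emission factor of 0.5 kilogram of CO2 per kilowatt-hour; both numbers are illustrative assumptions, not figures from the article:

```python
def annual_co2_kg(avg_watts, hours_per_day, kg_co2_per_kwh=0.5):
    """Rough CO2 attributable to a computer's electricity use per year.

    The 0.5 kg-CO2/kWh default is an assumed grid-average emission
    factor for illustration only.
    """
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * kg_co2_per_kwh

# A desktop idling at 100 W around the clock versus one that drops
# to a 5 W sleep state for 16 of every 24 hours (hypothetical loads).
always_on = annual_co2_kg(100, 24)
with_sleep = annual_co2_kg(100, 8) + annual_co2_kg(5, 16)

# Kilograms of CO2 avoided per year by enabling sleep mode.
print(round(always_on - with_sleep))
```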