Plan B for Energy: 8 Revolutionary Energy Sources

If efficiency improvements and incremental advances in today's technologies fail to halt global warming, could revolutionary new carbon-free energy sources save the day? Don't count on it—but don't count it out, either




Editor's Note: We are posting this feature from our September 2006 issue in light of the Obama administration's renewed focus on how to power the country without overloading the atmosphere with greenhouse gases.

To keep this world tolerable for life as we like it, humanity must complete a marathon of technological change whose finish line lies far over the horizon. Robert H. Socolow and Stephen W. Pacala of Princeton University have compared the feat to a multigenerational relay race. They outline a strategy to win the first 50-year leg by reining back carbon dioxide emissions from a century of unbridled acceleration. Existing technologies, applied both wisely and promptly, should carry us to this first milestone without trampling the global economy. That is a sound plan A.

The plan is far from foolproof, however. It depends on societies ramping up an array of carbon-reducing practices to form seven “wedges,” each of which keeps 25 billion tons of carbon in the ground and out of the air. Any slow starts or early plateaus will pull us off track. And some scientists worry that stabilizing greenhouse gas emissions will require up to 18 wedges by 2056, not the seven that Socolow and Pacala forecast in their most widely cited model.
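The wedge arithmetic is easy to verify: in Socolow and Pacala's scheme each wedge ramps linearly from zero to 1 billion tons of avoided carbon per year over 50 years, so its total is the area of a triangle. A quick sketch of that arithmetic:

```python
# Each Socolow-Pacala "wedge" ramps linearly from 0 to 1 GtC/yr of
# avoided emissions over 50 years; its total is the triangle's area.
GTC_PER_YEAR_AT_END = 1.0   # billion tons of carbon per year
YEARS = 50

wedge_total = 0.5 * GTC_PER_YEAR_AT_END * YEARS   # area of the triangle
print(wedge_total)          # 25.0 GtC per wedge, as the text states

seven_wedges = 7 * wedge_total
print(seven_wedges)         # 175.0 GtC kept out of the air by plan A
```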

It is a mistake to assume that carbon releases will rise more slowly than will economic output and energy use, argues Martin I. Hoffert, a physicist at New York University. As oil and gas prices rise, he notes, the energy industry is “recarbonizing” by turning back to coal. “About 850 coal-fired power plants are slated to be built by the U.S., China and India—none of which signed the Kyoto Protocol,” Hoffert says. “By 2012 the emissions of those plants will overwhelm Kyoto reductions by a factor of five.”

Even if plan A works and the teenagers of today complete the first leg of the relay by the time they retire, the race will be but half won. The baton will then pass in 2056 to a new generation for the next and possibly harder part of the marathon: cutting the rate of CO2 emissions in half by 2106.

Sooner or later the world is thus going to need a plan B: one or more fundamentally new technologies that together can supply 10 to 30 terawatts without belching a single ton of carbon dioxide. Energy buffs have been kicking around many such wild ideas since the 1960s. It is time to get serious about them. “If we don’t start now building the infrastructure for a revolutionary change in the energy system,” Hoffert warns, “we’ll never be able to do it in time.”

But what to build? The survey that follows sizes up some of the most promising options, as well as a couple that are popular yet implausible. None of them is a sure thing. But from one of these ideas might emerge a new engine of human civilization.

* Reality factors represent estimated technical feasibility from 1 (implausible) to 5 (ready for market)

1. Nuclear Fusion -- Reality Factor: 3*
Starry-eyed physicists point to the promise of unlimited fuel and minimal waste. But politicians blanch at fusion’s price tag and worry about getting burned

Fusion reactors—which make nuclear power by joining atoms rather than splitting them—top almost everyone’s list of ultimate energy technologies for humanity. By harnessing the same strong thermonuclear force that fires the sun, a fusion plant could extract a gigawatt of electricity from just a few kilograms of fuel a day. Its hydrogen-isotope fuel would come from seawater and lithium, a common metal. The reactor would produce no greenhouse gases and relatively small amounts of low-level radioactive waste, which would become harmless within a century. “Even if the plant were flattened [by an accident or attack], the radiation level one kilometer outside the fence would be so small that evacuation would not be necessary,” says Farrokh Najmabadi, a fusion expert who directs the Center for Energy Research at the University of California, San Diego.

The question is whether fusion can make a large contribution to the 21st century or is a 22nd-century solution. “A decade ago some scientists questioned whether fusion was possible, even in the lab,” says David E. Baldwin, who as head of the energy group at General Atomics oversees the largest fusion reactor in the U.S., the DIII-D. But the past 20 years have seen dramatic improvements in tokamaks, machines that use giant electromagnetic coils to confine the ionized fuel within a doughnut-shaped chamber while it is heated to more than 100 million degrees Celsius.

“We now know that fusion will work,” Baldwin says. “The question is whether it is economically practical”—and if so, how quickly fusion could move from its current experimental form into large-scale commercial reactors. “Even with a crash program,” he says, “I think we would need 25 to 30 years” to develop such a design.

So far political leaders have chosen to push fusion along much more slowly. Nearly 20 years after it was first proposed, the International Thermonuclear Experimental Reactor (ITER) is only now nearing final approval. If construction begins on schedule next year, the $10-billion reactor should begin operation in southeastern France in 2016.

Meanwhile an intermediate generation of tokamaks now nearing completion in India, China and Korea will test whether coils made of superconducting materials can swirl the burning plasma within its magnetic bottle for minutes at a time. Current reactors manage a few dozen seconds at best before their power supplies give out.

ITER aims for three principal goals. First it must demonstrate that a large tokamak can control the fusion of the hydrogen isotopes deuterium and tritium into helium long enough to generate 10 times the energy it consumes. A secondary aim is to test ways to use the high-speed neutrons created by the reaction to breed tritium fuel—for example, by shooting them into a surrounding blanket of lithium. The third goal is to integrate the wide range of technologies needed for a commercial fusion plant.

If ITER succeeds, it will not add a single watt to the grid. But it will carry fusion past a milestone that nuclear fission energy reached in 1942, when Enrico Fermi oversaw the first self-sustaining nuclear chain reaction. Fission reactors were powering submarines 11 years later. Fusion is an incomparably harder problem, however, and some veterans in the field predict that 20 to 30 years of experiments with ITER will be needed to refine designs for a production plant.

Najmabadi is more optimistic. He leads a working group that has already produced three rough designs for commercial fusion reactors. The latest, called ARIES-AT, would have a more compact footprint—and thus a lower capital cost—than ITER. The ARIES-AT machine would produce 1,000 megawatts at a price of roughly five cents per kilowatt-hour, competitive with today’s oil- and gas-fired plants. If work on a commercial plant began in parallel with ITER, rather than decades after it goes online, fusion might be ready to scale up for production by midcentury, Najmabadi argues.

Fusion would be even more cost-competitive, Hoffert suggests, if the fast neutrons produced by tokamaks were used to transmute thorium (which is relatively abundant) into uranium (which may be scarce 50 years hence) to use as fuel in nuclear fission plants. “Fusion advocates don’t want to sully its clean image,” Hoffert observes, “but fusion-fission hybrids may be the way to go.”

2. High-Altitude Wind -- Reality Factor: 4*
The most energetic gales soar far over the tops of today’s turbines. New designs would rise higher—perhaps even to the jet stream

Wind is solar energy in motion. About 0.5 percent of the sunlight entering the atmosphere is transmuted into the kinetic energy of air: a mere 1.7 watts, on average, in the atmospheric column above every square meter of the earth. Fortunately, that energy is not distributed evenly but concentrated into strong currents. Unfortunately, the largest, most powerful and most consistent currents are all at high altitude. Hoffert estimates that roughly two thirds of the total wind energy on this planet resides in the upper troposphere, beyond the reach of today’s wind farms.

Ken Caldeira of the Carnegie Institution of Washington once calculated how wind power varies with altitude, latitude and season. The mother lode is the jet stream, about 10,000 meters (33,000 feet) up between 20 and 40 degrees latitude in the Northern Hemisphere. In the skies over the U.S., Europe, China and Japan—indeed, many of the countries best prepared to exploit it—wind power surges to 5,000 or even 10,000 watts a square meter. The jet stream does wander. But it never stops.
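Those power densities follow from the standard kinetic-energy flux formula P/A = ½ρv³, which rewards wind speed cubed. A back-of-the-envelope check (the densities and speeds below are typical textbook values, not figures from Caldeira's study):

```python
# Kinetic-energy flux through a surface facing the wind:
# P/A = 0.5 * rho * v**3, in watts per square meter.
def wind_power_density(rho_kg_m3, v_m_s):
    return 0.5 * rho_kg_m3 * v_m_s ** 3

# A good surface site: sea-level air (~1.2 kg/m^3), 8 m/s wind
print(wind_power_density(1.2, 8.0))    # ~307 W/m^2

# The jet stream: thin air at 10 km (~0.4 kg/m^3) but ~40 m/s wind
print(wind_power_density(0.4, 40.0))   # ~12,800 W/m^2
```

The cubic dependence on speed is why the jet stream's thin, fast air still carries thousands of watts per square meter.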

If wind is ever to contribute terawatts to the global energy budget, engineers will have to invent affordable ways to mine the mother lode. Three high-flying designs are in active development.

Magenn Power in Ottawa, Ontario, plans to begin selling next year a rotating, helium-filled generator that exploits the Magnus effect (best known for giving loft to spinning golf balls) to float on a tether up to 122 meters above the ground. The bus-size device will produce four kilowatts at its ground station and will retail for about $10,000—helium not included. The company aims to produce higher-flying, 1.6-megawatt units, each the size of a football field, by 2010.

“We looked at balloons; the drag they produce seemed unmanageable in high winds,” says Al Grenier of Sky WindPower in Ramona, Calif. Grenier’s venture is instead pursuing autogiros, which catch the wind with helicopterlike rotors. Rising to 10,000 meters, the machines could realize 90 percent of their peak capacity. The inconstancy of surface winds limits ground turbines to about half that. But the company has struggled to gather the $4 million it needs for a 250-kilowatt prototype.

Still in the conceptual stages is the “laddermill,” designed by astronaut Wubbo J. Ockels and his students at the Delft University of Technology in the Netherlands. Ockels envisions a series of computer-controlled kites connected by a long tether. The ladder of kites rises and descends, turning a generator on the ground as it yo-yos up and down. Simulations of the system suggest that a single laddermill reaching to the jet stream could produce up to 50 megawatts of power.

Until high-altitude machines are fielded, no one can be certain how well they will hold up under turbulence, gusts and lightning strikes. Steep maintenance costs could be their downfall.

There are regulatory hurdles to clear as well. Airborne wind farms need less land than their terrestrial counterparts, but their operators must persuade national aviation agencies to restrict aircraft traffic in the vicinity. There is precedent for this, Grenier points out: the U.S. Air Force has for years flown up to a dozen large tethered aerostats at high altitude above the country’s southern border.

By the standards of revolutionary technologies, however, high-altitude wind looks relatively straightforward and benign.

3. Sci-Fi Solutions -- Reality Factor: 1*
Futuristic visions make for great entertainment. Too bad about the physics

3-A: Cold Fusion and Bubble Fusion
B. Stanley Pons and Martin Fleischmann spun a tempest in a teacup in 1989 with their claim of room-temperature fusion in a bottle. The idea drew a coterie of die-hard supporters, but mainstream scientists have roundly rejected that variety of cold fusion.

Theoretically more plausible—but still experimentally contentious—is sonofusion. In 2002 Rusi Taleyarkhan, a physicist then at Oak Ridge National Laboratory, reported in Science that beaming high-intensity ultrasound and neutrons into a vat of acetone caused microscopic bubbles to form and then implode at hypersonic speeds. The acetone had been made using deuterium, a neutron-bearing form of hydrogen, and Taleyarkhan’s group claimed that the extraordinary temperatures and pressures created inside the imploding bubbles forced a few deuterium atoms to fuse with incoming neutrons to form tritium (hydrogen with two neutrons per atom). Another group at Oak Ridge replicated the experiment but saw no clear signs of fusion.

Taleyarkhan moved to Purdue University and continued reporting success with sonic fusion even as others tried but failed. Purdue this year investigated allegations that Taleyarkhan had interfered with colleagues whose work seemed to contradict his own. The results of the inquiry were sealed—and with them another chapter in the disappointing history of cold fusion. Other researchers hold out hope that different methods might someday turn a new page on sonofusion.

3-B: Matter-Antimatter Reactors
The storied Enterprise starships fueled their warp drives with a mix of matter and antimatter; why can’t we? The combination is undoubtedly powerful: a kilogram of each would, through their mutual annihilation, release about half as much energy as all the gasoline burned in the U.S. last year. But there are no known natural sources of antimatter, so we would have to synthesize it. And the most efficient antimatter maker in the world, the particle accelerator at CERN near Geneva, would have to run nonstop for 100 trillion years to make a kilogram of antiprotons.
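The energy figure comes straight from E = mc²: annihilation converts the entire two kilograms of combined mass to energy. A quick check:

```python
C = 2.998e8            # speed of light, m/s

mass_kg = 1.0 + 1.0    # one kilogram each of matter and antimatter
energy_j = mass_kg * C ** 2
print(f"{energy_j:.2e} J")   # ~1.8e17 joules, roughly 43 megatons of TNT
```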

So even though physicists have ways to capture the odd antiatom [see “Making Cold Antimatter,” by Graham P. Collins; Scientific American, June 2005], antimatter power plants will never materialize.

4. Space-Based Solar -- Reality Factor: 3*
With panels in orbit, where the sun shines brightest— and all the time—solar could really take off. But there’s a catch

When Peter Glaser proposed in 1968 that city-size satellites could harvest solar power from deep space and beam it back to the earth as invisible microwaves, the idea seemed pretty far out, even given Glaser’s credentials as president of the International Solar Energy Society. But after the oil crises of the 1970s sent fuel prices skyrocketing, NASA engineers gave the scheme a long hard look. The technology seemed feasible until, in 1979, they estimated the “cost to first power”: $305 billion (in 2000 dollars). That was the end of that project.

Solar and space technologies have made great strides since then, however, and space solar power (SSP) still has its champions. Hoffert cites two big advantages that high-flying arrays could lord over their earthbound brethren. In a geostationary orbit well clear of the earth’s shadow and atmosphere, the average intensity of sunshine is eight times as strong as it is on the ground. And with the sun always in their sights, SSP stations could feed a reliable, fixed amount of electricity into the grid. (A rectifying antenna, or “rectenna,” spread over several square kilometers of land could convert microwaves to electric current with about 90 percent efficiency, even when obstructed by clouds.)
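The factor of eight is an average over night, weather and atmospheric losses. A rough sketch of where it comes from (the ground-average figure below is a typical mid-latitude value, assumed for illustration, not a number from the article):

```python
SOLAR_CONSTANT = 1366.0   # W/m^2 above the atmosphere, around the clock

# On the ground, night, sun angle, clouds and air absorption cut the
# year-round average at a decent mid-latitude site to roughly 170 W/m^2.
GROUND_AVERAGE = 170.0    # W/m^2, assumed typical value

print(SOLAR_CONSTANT / GROUND_AVERAGE)   # ~8, matching Hoffert's factor
```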

“SSP offers a truly sustainable, global-scale and emission-free electricity source,” Hoffert argues. “It is more cost-effective and more technologically feasible than controlled thermonuclear fusion.” Yet there is minimal research funding for space-based solar, he complains, while a $10-billion fusion reactor has just been approved.

NASA did in fact fund small studies from 1995 to 2003 that evaluated a variety of SSP components and architectures. The designs took advantage of thin-film photovoltaics to create the electricity, high-temperature superconductors to carry it, and infrared lasers (in place of microwave emitters) to beam it to ground stations. Such high-tech innovations enabled SSP engineers to cut the systems’ weight and thus reduce the formidable cost of launching them into orbit.

But here’s the catch: the power-to-payload ratio, at a few hundred watts per kilogram, has remained far too low. Until it rises, space-based solar will never match the price of other renewable energy sources, even accounting for the energy storage systems that ground-based alternatives require to smooth over nighttime and poor-weather lulls.

Technical advances could change the game rapidly, however. Lighter or more efficient photovoltaic materials are in the works. In May, for example, researchers at the University of Neuchâtel in Switzerland reported a new technique for depositing amorphous silicon cells on a space-hardy film that yields power densities of 3,200 watts per kilogram. Although that is encouraging, says John C. Mankins, who led NASA’s SSP program from 1995 to 2003, “the devil is in the supporting structure and power management.” Mankins sees more promise in advanced earth-to-orbit space transportation systems, now on drawing boards, that might cut launch costs from more than $10,000 a kilogram to a few hundred dollars in coming decades.
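Launch cost and power-to-payload ratio combine into a single figure of merit: dollars of launch cost per watt delivered. A sketch using the numbers above:

```python
# Launch cost per delivered watt = ($/kg to orbit) / (W/kg of array)
def launch_cost_per_watt(launch_usd_per_kg, watts_per_kg):
    return launch_usd_per_kg / watts_per_kg

# Today: ~$10,000/kg to orbit at a few hundred watts per kilogram
print(launch_cost_per_watt(10_000, 300))   # ~$33 per watt -- hopeless

# Mankins's target: launch costs of a few hundred dollars per kilogram
print(launch_cost_per_watt(300, 300))      # $1 per watt -- competitive

# Or the Neuchatel film at 3,200 W/kg, even at today's launch prices
print(launch_cost_per_watt(10_000, 3_200)) # ~$3 per watt
```

Either path, cheaper launches or lighter arrays, attacks the same ratio.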

JAXA, the Japanese space agency, last year announced plans to launch by 2010 a satellite that will unfurl a large solar array and beam 100 kilowatts of microwave or laser power to a receiving station on the earth. The agency’s long-term road map calls for flying a 250-megawatt prototype system by 2020 in preparation for a gigawatt-class commercial SSP plant a decade later.

NASA once had similarly grand designs, but the agency largely halted work on SSP when its priorities shifted to space exploration two years ago.

5. Nanotech Solar Cells -- Reality Factor: 4*
Materials engineered from the atoms up could boost photovoltaic efficiencies from pathetic to profitable

Five gigawatts—a paltry 0.038 percent of the world’s consumption of energy from all sources. That, roughly, is the cumulative capacity of all photovoltaic (PV) power systems installed in the world, half a century after solar cells were first commercialized. In the category of greatest unfulfilled potential, solar-electric power is a technology without rival.
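That percentage implies a world primary-power figure consistent with the article's 10-to-30-terawatt target:

```python
PV_CAPACITY_GW = 5.0     # cumulative installed photovoltaic capacity
SHARE = 0.00038          # 0.038 percent of world energy consumption

world_power_gw = PV_CAPACITY_GW / SHARE
print(world_power_gw)    # ~13,000 GW, i.e. about 13 terawatts
```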

Even if orbiting arrays never get off the ground, nanotechnology now looks set to rescue solar from its perennial irrelevance. Engineers are working on a wide range of materials that outshine the bulk silicon used in most PV cells today, improving both their efficiency and their cost.

The most sophisticated (and expensive) second-generation silicon cells eke out about 22 percent efficiency. New materials laced with quantum dots might double that, if discoveries reported this past March pan out as hoped. The dots, each less than 10 billionths of a meter wide, were created by groups at the National Renewable Energy Laboratory in Colorado and Los Alamos National Laboratory in New Mexico.

When sunlight hits a silicon cell, most of it ends up as heat. At best, a photon can knock loose one electron. Quantum dots can put a wider range of wavelengths to useful work and can kick out as many as seven electrons for every photon. Most of those electrons soon get stuck again, so engineers are testing better ways to funnel them into wires. They are also hunting for dot materials that are more environmentally friendly than the lead, selenium and cadmium in today’s nanocrystals. Despite their high-tech name, the dots are relatively inexpensive to make.

Nanoparticles of a different kind promise to help solar compete on price. Near San Francisco, Nanosolar is building a factory that will churn out 200 million cells a year by printing nanoscopic bits of copper-indium-gallium-diselenide onto continuous reels of ultrathin film. The particles self-assemble into light-harvesting structures. Nanosolar’s CEO says he is aiming to bring the cost down to 50 cents a watt.

The buzz has awakened energy giants. Shell now has a subsidiary making solar cells, and BP in June launched a five-year project with the California Institute of Technology. Its goal: high-efficiency solar cells made from silicon nanorods.

6. A Global Supergrid -- Reality Factor: 2*
Revolutionary energy sources need a revolutionary superconducting electrical grid that spans the planet

“A basic problem with renewable energy sources is matching supply and demand,” Hoffert observes. Supplies of sunshine, wind, waves and even biofuel crops fade in and out unpredictably, and they tend to be concentrated where people are not. One solution is to build long-distance transmission lines from superconducting wires. When chilled to near absolute zero, these conduits can wheel tremendous currents over vast distances with almost no loss.

In July the BOC Group in New Jersey and its partners began installing 350 meters of superconducting cable into the grid in Albany, N.Y. The nitrogen-cooled link will carry up to 48 megawatts’ worth of current at 34,500 volts. “We know the technology works; this project will demonstrate that,” says Ed Garcia, a vice president at BOC.

At a 2004 workshop, experts sketched out designs for a “SuperGrid” that would simultaneously transport electricity and hydrogen. The hydrogen, condensed to a liquid or ultracold gas, would cool the superconducting wires and could also power fuel cells and combustion engines.

With a transcontinental SuperGrid, solar arrays in Australia and wind farms in Siberia might power lights in the U.S. and air conditioners in Europe. But building such infrastructure would most likely take generations and trillions of dollars.

7. Waves and Tides -- Reality Factor: 5*
The surging ocean offers a huge, but virtually untapped, energy resource. Companies are now gearing up to catch the wave

The tide has clearly turned for the dream of harnessing the incessant motion of the sea. “Ocean energy is about 20 years behind wind power,” acknowledges Roger Bedard, ocean energy leader at the Electric Power Research Institute. “But it certainly isn’t going to take 20 years to catch up.”

Through the 1980s and 1990s, advocates of tidal and wave power could point to only two commercial successes: a 240-megawatt (MW) tidal plant in France and a 20-MW tidal station in Nova Scotia. Now China has jumped on board with a 40-kilowatt (kW) facility in Daishan. Six 36-kW turbines are soon to start spinning in New York City’s East River. This summer the first commercial wave farm will go online in Portugal. And investors and governments are hatching much grander schemes.

The grandest is in Britain, where analysts suggest ocean power could eventually supply one fifth of the country’s electricity and fulfill its obligations under the Kyoto Protocol. The U.K. government in July ordered a feasibility study for a 16-kilometer dam across the Severn estuary, whose tides rank second largest in the world. The Severn barrage, as it is called, would cost $25 billion and produce 8.6 gigawatts when tides were flowing. Proponents claim it would operate for a century or more.
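As a crude capital-cost yardstick (ignoring the tidal cycle, which keeps the barrage well below peak output much of the time):

```python
COST_USD = 25e9          # estimated cost of the Severn barrage
PEAK_W = 8.6e9           # output while tides are flowing, in watts

print(COST_USD / PEAK_W) # ~$2.9 per peak watt of capacity
```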

Environmental groups warn that the barrage would wreak havoc on the estuarine ecosystem. Better than a dam, argues Peter Fraenkel of Marine Current Turbines, would be arrays of the SeaGen turbines his company has developed. Such tide farms dotting the U.K. coast could generate almost as much electricity as the Severn dam but with less capital investment, power variation and environmental impact.

Fraenkel’s claims will be put to a small test this year, when a tidal generator the company is installing in Strangford Lough begins contributing an average power of 540 kW to the grid in Northern Ireland. The machine works much like an underwater windmill, with two rotors sharing a single mast cemented into the seabed.
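An underwater windmill obeys the same ½ρv³ flux law as a wind turbine, but seawater is some 800 times denser than air, so modest currents carry serious power. A sketch with assumed, SeaGen-like figures (rotor size, current speed and efficiency here are illustrative guesses, not the company's published specifications):

```python
import math

RHO_SEAWATER = 1025.0     # kg/m^3

def rotor_power_w(diameter_m, current_m_s, efficiency):
    area = math.pi * (diameter_m / 2) ** 2
    return efficiency * 0.5 * RHO_SEAWATER * area * current_m_s ** 3

# Assumed: two 16-m rotors, a 2.4 m/s spring tide, 45% efficiency
peak_w = 2 * rotor_power_w(16.0, 2.4, 0.45)
print(f"{peak_w / 1e6:.1f} MW peak")   # ~1.3 MW

# Tides ebb, flow and vary over the month, so the average is far lower;
# a capacity factor near 40 percent would give roughly 540 kW.
print(f"{0.42 * peak_w / 1e3:.0f} kW average")
```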

“The biggest advantage of tidal power is that it is completely predictable,” Bedard says. “But on a global scale, it will never be very large.” There are too few places where tides move fast enough.

Energetic waves are more capricious but also more ubiquitous. An analysis by Bedard’s group found that if just 20 percent of the commercially viable offshore wave resources in the U.S. were harnessed with 50-percent-efficient wave farms, the energy produced would exceed all conventional hydroelectric generation in the country.

Four companies have recently completed sea trials of their wave conversion designs. One of them, Ocean Power Delivery, will soon begin reaping 2.25 MW off the coast of Portugal from three of its 120-meter-long Pelamis machines. If all goes well, it will order another 30 this year. Surf’s up.

8. Designer Microbes -- Reality Factor: 4*
Genetic engineers think they can create synthetic life-forms that will let us grow energy as easily as we do food

“We view the genome as the software, or even the operating system, of the cell,” said J. Craig Venter. It’s time for an upgrade, he suggested. Venter was preaching to the choir: a large group of biologists at the Synthetic Biology 2.0 conference this past May. Many of the scientists there have projects to genetically rewire organisms so extensively that the resulting cells would qualify as synthetic species. Venter, who gained fame and fortune for the high-speed methods he helped to develop to sequence the human genome, recently founded a company, Synthetic Genomics, to commercialize custom-built cells. “We think this field has tremendous potential to replace the petrochemical industry, possibly within a decade,” he said.

That assessment may be overly optimistic; no one has yet assembled a single cell from scratch. But Venter reported rapid progress on his team’s efforts to create artificial chromosomes that contain just the minimum set of genes required for self-sustaining life within a controlled, nutrient-rich environment. “The first synthetic prokaryotic cell [lacking a nucleus] will definitely happen within the next two years,” he predicted. “And synthetic eukaryotic genomes [for cells with nuclei] will happen within a decade at most.”

Venter envisions novel microbes that capture carbon dioxide from the smokestack of a power plant and turn it into natural gas for the boiler. “There are already thousands, perhaps millions, of organisms on our planet that know how to do this,” Venter said. Although none of those species may be suited for life in a power plant, engineers could borrow their genetic circuits for new creations. “We also have biological systems under construction that are trying to produce hydrogen directly from sunlight, using photosynthesis,” he added.

Steven Chu, director of Lawrence Berkeley National Laboratory, announced that his lab is readying a proposal for a major project to harness the power of the sun and turn it into fuels for transportation. With the tools of genetic engineering, Chu explained, “we can work on modifying plants and algaes to make them self-fertilizing and resistant to drought and pests.” The novel crops would offer high yields of cellulose, which man-made microbes could then convert to fuels. Chu expects biological processing to be far more efficient than the energy-intensive processes, such as steam explosion and thermal hydrolysis, currently used to make ethanol.

With oil prices approaching $80 a barrel, bioprocessing may not have to wait for life-forms built from scratch. GreenFuel in Cambridge, Mass., has installed algae farms at power plants to convert up to 40 percent of the CO2 they spew into raw material for biofuels. The company claims that a large algae farm next to a 1-GW plant could yield 50 million gallons a year of ethanol. “There are great opportunities here,” Chu avers. “And not only that—it will help save the world.” 
