Getting Serious about Flu
A combination of public health measures and technology raises hope for the flu fight

As the specter of a global flu pandemic looms ever larger, both veteran flu scientists and newcomers to the field are making important progress against the disease. Robert G. Webster, now at St. Jude Children's Research Hospital in Memphis, Tenn., first discovered during the 1960s that the novel flu viruses that seem to sweep through the human population every 30-odd years can arise from combinations of bird and human flu strains. He realized then that preventing new pandemics would require humans to control the avian half of the threat at its source.

After a 1997 human outbreak in Hong Kong of an avian flu virus called H5N1, Webster turned his insight into action, guiding a reform of the territory's live-animal markets to avert new opportunities for birds, animals and people to infect one another with flu viruses that might combine into a pandemic strain. The H5N1 virus has since raged throughout Asia's bird populations and infected more than 100 people, yet Hong Kong has been largely spared. That stark contrast has in the past year prompted several Asian nations as well as global agriculture and public health authorities to begin seriously discussing regional agricultural reforms that would follow the Hong Kong model.

Much of the intensified awareness of the avian flu threat to humans is also owed to the efforts of Klaus Stöhr, head of flu surveillance and response for the World Health Organization. Stöhr has been working tirelessly behind the scenes to break down barriers to better global preparedness for the crisis: fostering international scientific collaborations, advising countries on how to bolster their detection of flu cases, and brokering negotiations between vaccine makers and national governments.

Both vaccines and antiviral drugs will be in short supply during a pandemic, but expanded choices are in the pipeline. Robert B. Belshe of Saint Louis University and his colleagues demonstrated a dose-sparing approach to flu vaccination this year, showing that administering vaccine just under the skin, instead of into muscle, prompts a greater immune response with less vaccine. This insight could also lead to an array of new techniques for administering regular flu vaccine to groups, such as the elderly, who often have a weak response to the traditional flu shot.

The Iomai Corporation in Gaithersburg, Md., is working on intradermal vaccination toward the goal of doing away with flu shots entirely and instead delivering the vaccine through a skin patch. Iomai has already shown that a patch containing an adjuvant--a substance that enhances immune system response--improves the effect of an intradermal vaccine when the patch is placed on or near the vaccination site. A patch that combines vaccine with an adjuvant boost would represent a radically new approach to flu immunization.

Innovative thinking also distinguishes NexBio, Inc., in San Diego, which is gearing up for the first clinical trials of its antiflu drug, Fludase. Most flu antivirals work by disabling specific parts of the virus, so their effectiveness can vary considerably depending on the individual strain's defenses. Fludase instead blocks the doorway in lung cells that flu viruses use to enter them. By targeting the door rather than the intruder, the company hopes to create a drug that is equally effective against all flu strains and offers the virus no way to develop resistance. This work illustrates how science, technology and policy are being marshaled to combat future flu pandemics. --Christine Soares

More Power to Solar
Photovoltaic advances make the ever lagging technology more of a competitor

Brazilians joke that theirs is the country of the future--and always will be. Likewise, solar power has always been the ultimate green technology of the future. But maybe the sun is finally rising. The photovoltaic market, though small, has been growing briskly: by more than 60 percent in 2004. Plastering your roof with solar cells now runs as little as 20 cents per kilowatt-hour over the system's estimated lifetime, a cost approaching what most households pay for electricity.

One especially promising technology that emerged in the 1990s was to make solar cells from plastic spiked with nanometer-scale crystals. Even those composite devices, though, were restricted to absorbing visible light. This year a group led by Edward H. Sargent at the University of Toronto coaxed them to absorb infrared light as well. A concoction of lead sulfide particles a few nanometers in size can absorb wavelengths as long as two microns. Thus able to harvest a wider swath of the solar spectrum, inexpensive plastic cells could rival the performance of pricey silicon ones.
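The payoff of reaching into the infrared can be made concrete with the standard relation between wavelength and photon energy, E = hc/λ. A minimal sketch of the arithmetic (the 2-micron cutoff comes from the text; the 700-nanometer visible-light edge is a standard reference point, not a figure from the article):

```python
# Photon energy for a given wavelength: E = h*c / lambda.
H_C_EV_NM = 1239.84  # h*c expressed in eV·nm

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in electron volts for a wavelength in nanometers."""
    return H_C_EV_NM / wavelength_nm

# A cell absorbing out to 2 microns can harvest photons down to ~0.62 eV,
# whereas a visible-only cell cuts off near 700 nm (~1.77 eV).
print(round(photon_energy_ev(2000), 2))  # 0.62
print(round(photon_energy_ev(700), 2))   # 1.77
```

Every photon between those two energies is sunlight a visible-only plastic cell simply wastes, which is why the wider absorption window matters.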

Other avant-garde photovoltaic devices consist of nanoparticles coated with dye and doused in electrolyte, an approach pioneered by Michael Grätzel of the Swiss Federal Institute of Technology in Lausanne a decade ago. The dye handles the job of absorbing photons and generating a current of electrons. Because the source of the electrons (the dye) is divorced from the matrix through which they flow in (the electrolyte) and out (the nanoparticles), electrons are less likely to get prematurely recaptured by atoms, a process that impairs current flow in conventional cells. Consequently, the dye-based cells work better under weak lighting conditions.

Tsutomu Miyasaka and Takurou N. Murakami of the Toin University of Yokohama have extended the technique to create the world's first photocapacitor: a solar cell that both generates and stores electricity. Alongside the dye-coated particles, the researchers slapped down layers of activated carbon, which traps electrons and holds them until a switch completes the circuit. Under a 500-watt bulb, their latest design takes a couple of minutes to charge up to 0.8 volt. It has a capacitance of about 0.5 farad per square centimeter, which would give a typical solar panel the same energy storage capacity as the so-called ultracapacitors developed to replace or supplement batteries in hybrid cars and uninterruptible power supplies. In 2004 Miyasaka founded a company, Peccell Technologies, to commercialize this and other innovations.
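The storage figures quoted above imply a simple back-of-the-envelope calculation, since the energy held by a capacitor is E = ½CV². A sketch using the article's numbers (0.5 farad per square centimeter charged to 0.8 volt):

```python
def stored_energy_j_per_cm2(capacitance_f_per_cm2: float, voltage_v: float) -> float:
    """Energy per unit area stored in a charged capacitor: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_f_per_cm2 * voltage_v ** 2

# The photocapacitor's quoted figures: ~0.5 F/cm^2 at 0.8 V.
print(round(stored_energy_j_per_cm2(0.5, 0.8), 3))  # 0.16 joule per square centimeter
```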

Another way to store energy is in the form of hydrogen gas. In the late 1960s Japanese researchers Akira Fujishima and Kenichi Honda discovered that a solar cell can act like an artificial tree leaf, splitting water into its constituent elements. The trouble was that the materials involved, such as titanium dioxide, absorb mostly ultraviolet light. Restricted to such a narrow band of spectrum, the process was pitifully inefficient. Tinkering with their chemical properties allowed the cells to absorb visible light but also made them prone to corrosion.

Grätzel recently developed a way around this unhappy trade-off: put two solar cells together. The first contains tungsten trioxide or iron oxide, which soaks up the ultraviolet. The second is one of his dye-sensitized cells, which absorbs the rest of the visible spectrum and provides more electrons to aid the photolysis.

A year ago Hydrogen Solar, a British company trying to commercialize the work, announced a nearly 10-fold improvement in the efficiency of water splitting. It estimates that hydrogen produced this way would still cost about twice as much as hydrogen from natural gas but might become competitive if greenhouse gas emissions were restricted. You wouldn't need to go to a gas station to refill your fuel-cell car; the solar panel on the roof of your house could be your private gas station. --George Musser

Stem Cell Imperative
Despite political obstacles, research and commercial endeavors advance

Embryonic stem cells can become any other cell in the body, a capability researchers hope one day to direct toward healing organs ravaged by disease. In the U.S., President George W. Bush restricted federal funding of human embryonic stem cell research in 2001, with just $24.8 million doled out in 2004. But in a dramatic rejection of Bush's policy, the state of California in the past year became the world's largest single backer of stem cell research, a move spearheaded by Palo Alto, Calif.-based housing developer Robert Klein.

The California Institute for Regenerative Medicine, created in November 2004, has the power to issue $3 billion in grants over 10 years for embryonic stem cell and other biomedical research. Klein, a Stanford University-educated lawyer, was the chief architect of the campaign for the institute and, at $2.6 million, the campaign's largest financial supporter. Klein, who was unanimously elected as the institute's chairman, is hopeful that stem cells can help cure his youngest son's diabetes [see "A Proposition for Stem Cells," by Sally Lehrman; Insights, Scientific American, September].

The institute's creation has prompted 10 states to consider establishing their own more modest stem cell funding initiatives, if only to halt a brain drain of researchers to California. Yet although Klein has undoubtedly won a victory for U.S. embryonic stem cell research, his success remains controversial. Besides the criticism that research on stem cells from human embryos typically attracts, Klein and the institute have drawn fire for alleged secrecy over meetings and for potential conflicts of interest of board members who represent corporations, universities and nonprofit groups that stand to gain from research grants.

Stem cell companies are also growing more global, and quite likely the most multinational among them is Stem Cell Sciences. Headquartered in Scotland, the company employs roughly 40 people in research and development centers in the U.K., Japan and Australia, and it anticipates establishing a U.S. operation this year.

Stem Cell Sciences' bold plan focuses on commercializing human embryonic stem cells. Its technology can generate an unlimited supply of highly purified stem cells and their differentiated progeny for drug development. That promise has led to licensing agreements with pharmaceutical giants such as Pfizer, GlaxoSmithKline and Aventis. The company's primary goal is to be the first to develop an embryonic stem cell-derived therapy for diseases such as diabetes and Parkinson's disease.

One problem with embryonic stem cells is that they can spontaneously differentiate into other cell types when scientists do not want them to. R. Michael Roberts and his colleagues at the University of Missouri at Columbia have now discovered a way to grow human embryonic stem cells more predictably. They noted that mammal embryos grow in low-oxygen environments in the early stages of their development but that human embryonic stem cells are generally cultured in normal atmospheric oxygen conditions. In atmospheres of only 3 or 5 percent oxygen, the scientists found that the cells proliferated as well as they did under normal conditions while differentiation was markedly suppressed. The outlook is improving. Good science, new funding and commercial endeavors assure a place for stem cells, no matter what the federal stance. --Charles Q. Choi

Repairing Broken Hearts
Research in zebra fish could transform cardiology

The human heart possesses stem cells but cannot regenerate after injury, instead replacing damaged muscle with scar tissue. That inability to mend an ailing heart helps to make heart disease the primary cause of death in the developed world.

In May cardiologist Mark T. Keating of Harvard Medical School and his colleagues accomplished the long-sought goal of enticing adult mammalian heart muscle cells to multiply, the first step on the road to heart-repair therapies. In 2002 Keating discovered that zebra fish could regrow up to a fifth of their hearts within two months, without scarring, following removal of 20 percent of the muscle from the lower chamber. He then sought to achieve the same feat in mammals. His laboratory found that activity of the enzyme p38 MAP kinase was lowest in fetal rats when the heart was growing and highest when heart muscle cell growth slowed or halted, suggesting that the enzyme puts the brakes on cell division.

When the researchers paired a p38-inhibiting drug with the growth factor FGF1, the combination spurred adult rat heart muscle cells to proliferate. Keating and his team believe a drug-based approach to repairing hearts could prove more elegant than ones based on stem cells. The scientists are currently testing mixtures of p38 inhibitor and growth factor on animals that have suffered heart attacks to see if they can help the body's most important muscle heal. Such drugs would mark a revolution in cardiology. --Charles Q. Choi

Protections for the Earth's Climate
Industry, local governments and academia look for solutions to global warming

The battle to prevent or at least slow global warming has intensified in the past year as scientists have learned more about the magnitude of the problem. One of the leading climate experts, Inez Y. Fung, director of the Atmospheric Sciences Center at the University of California, Berkeley, recently showed that the earth may soon lose its ability to absorb much of the greenhouse gas that is raising temperatures. The oceans and continents currently soak up about half the carbon dioxide produced by the burning of fossil fuels. In the oceans, the gas combines with water to form carbonic acid; on land, plants take in more carbon dioxide and grow faster. But computer modeling done by Fung and her colleagues indicates that these carbon sinks will become less effective as the earth continues to warm. For example, as the tropics become hotter and drier in the summer, plants will close the pores in their leaves to avoid water loss, curtailing their uptake of carbon dioxide. Atmospheric measurements over the past decade have confirmed this effect. If the oceans and land take in less carbon dioxide, more will remain in the atmosphere and global warming could accelerate catastrophically.

Despite these warning signs, the administration of President George W. Bush has opposed ratification of the Kyoto Protocol, the international treaty mandating reductions in greenhouse gas emissions. (Signed by more than 150 nations, the treaty went into effect this past February.) But nine states in the northeastern U.S. are attempting to sidestep the federal government's opposition by taking action on their own. In 2003 the governors of Connecticut, Delaware, Maine, Massachusetts, New Hampshire, New Jersey, New York, Rhode Island and Vermont created the Regional Greenhouse Gas Initiative. Last August the group reached a preliminary agreement to freeze power plant emissions of carbon dioxide by 2009 and then reduce them by 10 percent by 2020. The plan requires approval by the state legislatures, but environmentalists are already hoping that other regions of the U.S. will follow suit. If adopted nationwide, the proposal would lower greenhouse gas emissions by roughly as much as the Kyoto Protocol would have.

Steve Howard, chief executive of the Climate Group, is tackling the global-warming problem from a different angle. Founded in 2004, the Climate Group is a coalition of corporations and state and local governments that have voluntarily committed to reducing their greenhouse gas emissions. Members include oil giant BP, drugmaker Johnson & Johnson, and Starbucks. Businesses in the Climate Group have discovered that improvements in energy efficiency can enhance profits as well as cut fossil-fuel emissions; BP, for instance, slashed its energy bills by $650 million over 10 years. "We have seen important evidence about successful emission reduction scattered here and there in the most surprising places all over the globe," Howard says. "We are working to bring all of it together so that it forms a body of evidence." --Mark Alpert

A Future in Plastics
The march toward less expensive, more flexible electronics continues

Organic semiconducting materials will never replace the silicon chips in your computer, but now they are finding their way into applications ranging from flexible displays to low-cost radio-frequency identity tags, for which silicon chips are not suited. The past year has witnessed advances both in the development of specific devices and in the understanding of the basic physics of the materials.

On the device front, Paul W. M. Blom and his student Ronald C. G. Naber of the University of Groningen in the Netherlands and their collaborators developed an inexpensive nonvolatile memory chip out of a physically tough polymer. The working element of the device is a field-effect transistor containing a layer of ferroelectric polymer that can be switched between two states by a voltage pulse. Similar structures have been studied before, but the Groningen device is the first to combine several desirable properties, including a long data-retention time after the power is turned off and a short programming time (it takes only a millisecond to write data to the transistor). In addition, the devices can be manufactured by depositing the transistor's various layers, including the all-important ferroelectric layer, from liquid solution. Large-scale industrial production should therefore be feasible using low-cost techniques such as spin-coating or printing. The work was done in collaboration with researchers at Philips Research Eindhoven in the Netherlands.

Of crucial importance for the future of plastic electronics is the cultivation of a good understanding of precisely how electric currents flow in the devices. Most organic semiconductor devices suffer from numerous material defects, which dominate the behavior of moving charges and obscure efforts to understand the intrinsic properties of the material. In August 2004 a research group led by John A. Rogers of the University of Illinois and Michael E. Gershenson of Rutgers University reported a major advance in unraveling these effects. The group made an extremely pure and defect-free crystal of rubrene by vapor deposition. (Rubrene consists of four benzene rings in a chain with four more attached individually as side groups, like two pairs of wings.) They constructed electrodes separately in the form of a "stamp" that was pressed against the rubrene to create a transistor. This technique avoids damage to the rubrene by the electrode-making process. Measurements of the transistor's properties revealed that the flow of charges in organics is slower than in silicon largely because the charges distort the flexible organic crystal lattice and then drag around the distortions with them.

Samuel I. Stupp and his co-workers at Northwestern University have pursued a different technique to reduce the amount of defects and disorder in organic materials. They worked with a short chainlike molecule called phenylene vinylene, attaching a water-repelling molecule to one end of the chain and a water-attracting molecule to the other end. Then they poured a water-based solution of the molecules onto glass, where the molecules self-assembled into well-ordered layers.

Such tightly packed and orderly films have two advantages over more typically disordered polymers: charges flow through the material far more efficiently, and when used as a light source (phenylene vinylene is widely used to make organic light-emitting diodes), the material has fewer luminescence-quenching defects. The group plans to make light-emitting diodes and solar cells out of the material. It won't be long before these various new findings make their way into the designs of commercial devices. --Graham P. Collins

New Offensives against HIV
A research insight, a new drug target and an advocacy group assist in fighting the disease

In the past decade, HIV infection in the industrial world has largely evolved from a virtual death sentence to more of a chronic disease, which is a testament to the efforts of researchers and patient advocates. But the 40 million HIV-positive people worldwide are a somber reminder of the work ahead. Resistant strains of the virus have appeared; citizens in developing countries lack access to lifesaving drugs; and basic questions about the progression of the virus postinfection remain. Yet 2005 brought hopeful news on all fronts.

Researchers know that HIV infection leads to a massive depletion of CD4 white blood cells, but why this happens is still up for debate. Is the virus killing all the cells directly, or is there an indirect mechanism that explains the widespread death? Daniel C. Douek, an immunologist in the Vaccine Research Center at the National Institutes of Health, implicates both direct and indirect mechanisms. His work shows that HIV starts in the gut, home to the largest population of the virus's preferred CD4 targets--those with a receptor called CCR5. HIV attacks and kills these cells directly early in the course of the infection.

The mechanism behind CD4 cell death turns indirect as the disease continues. The lack of immunity in the gut allows other pathogens to thrive. This condition overstimulates the lymph nodes, and they activate large numbers of CD4 cells. Once activated, the cells progress through a natural process that eventually leads to their death, whether or not they are infected. Continuous rounds of activation and death slowly deplete CD4s. Douek's research not only sheds light on how HIV wreaks havoc but also suggests that HIV vaccines might do well to initiate an immune response in the gut.

With vaccines still on the horizon and with HIV strains developing resistance to antiretrovirals, identifying new drugs is critical. Current treatments focus on disrupting viral proteins, which have many opportunities to mutate. But HIV also relies on various host cell proteins, so researchers are beginning to investigate these human proteins, which are much less likely to mutate, as therapeutic targets.

Joachim Hauber of the Heinrich Pette Institute in Hamburg, Germany, has discovered one such target: deoxyhypusine synthase (DHS), an enzyme that activates a host protein necessary for viral replication. Hauber's group found it could inhibit the virus by blocking DHS with an experimental drug. The drug did not appear to harm host cells or induce resistance after prolonged use and was effective against strains resistant to current antiretroviral therapies. Even though potential treatments are still years away, Hauber's work demonstrates a powerful new offensive strategy.

Today's drugs have made HIV infection a manageable disease for many, but millions in developing countries cannot afford them. A South African advocacy group founded in 1998, Treatment Action Campaign (TAC), has worked toward greater access to HIV therapies for the estimated 5.3 million people infected in its country. TAC's legal actions have forced the government to provide free antiretrovirals to HIV-positive pregnant women. More recently, TAC has negotiated a discount from Bristol-Myers Squibb on amphotericin B, used to treat a deadly opportunistic infection common to HIV patients. And this past February, TAC launched a campaign urging the government to treat 200,000 people with no-cost antiretrovirals by 2006. TAC chairman Zackie Achmat has said that the government must wake up to AIDS; TAC is ringing the alarm. --Aimee Cunningham

Creative Paths to Open Access
Technology supplies new protections against threats of a fortress society

It seems a truism that science and technology function best when new discoveries and ideas can circulate freely and find the widest audience. But governments and businesses face constant pressures toward secrecy. Ideally, society should strike a balance between transparency in government and the privacy that citizens have come to expect, between openness in research and the protections that commercialization requires.

Individuals and companies have launched initiatives recently that enhance open access in many welcome ways. Patrick O. Brown and Michael B. Eisen have served as effective champions for the burgeoning open-access movement in fundamental research. Eisen and Brown are among the founders of the Public Library of Science and were instrumental in its successful launch of several new journals that are freely available online, including a medical journal inaugurated in October 2004. They worked behind the scenes to persuade Congress and the National Institutes of Health to increase public access to taxpayer-funded research, and they guided the formation of new policies that require that all scientific articles generated by NIH grants be deposited in PubMed Central.

A substantial endowment of intellectual property to the public domain came from IBM. The tech giant permanently waived royalties on 500 of its software patents and their counterparts in other countries, in essence donating them to the fertile community of "open source" programmers. Although the 500 patents represent a small fraction of Big Blue's portfolio, they reinforce numerous other steps that make IBM a noteworthy leader in embracing the open-source movement. IBM buttressed Linux by offering the operating system on a number of its products years ago, and the company has assigned many of its own programmers to develop new software for that free and open platform.

Freeing information of a different kind, MyPublicInfo, Inc., launched a novel and valuable service in 2004 that, for an affordable fee, allows citizens to view the complete contents of public records attached to their identity. Whereas credit reports have been available for a long time, MyPublicInfo gathers a much broader dossier from legal, government and educational records. The company wisely takes pains to authenticate the identity of its customers so that it does not inadvertently abet the very identity thieves that it aims to thwart.

Customers of MyPublicInfo may be unpleasantly surprised to see how much of their "private" information is available in the public domain--and how much of it is erroneous. In the same vein, Senator Arlen Specter of Pennsylvania and Senator Patrick Leahy of Vermont introduced legislation earlier this year that would give Americans more control over information that businesses and government agencies collect on them. The Personal Data Privacy and Security Act would require data brokers to let people see what information the brokers hold about them and would allow citizens to correct many kinds of inaccuracies in the databases. The bill is scheduled for consideration by the Senate this year. --W. Wayt Gibbs

A Force for Change
China's homegrown NGOs serve as the nation's environmental conscience

"New Social Power in China" read one 2004 cover of Economics, a Beijing-based magazine. The headline referred to the work of domestic environmental groups that oppose the building of massive dams in the country. These nongovernmental organizations--along with others that campaign on environmental, public health and legal issues--have begun to serve as a vital counterpoint to the government's otherwise unchecked push to propel the nation's blisteringly fast-paced economic development. The NGOs have become a new force for political activism in China's post-Tiananmen era.

They survive not by confronting the government directly but by adopting more subtle paths to social change. "Environmental NGOs ... play a critical role in advancing transparency, rule of law and official accountability within the Chinese political system," noted Elizabeth C. Economy of the Council on Foreign Relations at a hearing before the U.S. Congress this past February. "Through this process, they have become a significant force for political reform."

Green Watershed, a Chinese NGO dedicated to river management issues, has organized peasants in the province of Yunnan to protect wetlands and to oppose dam construction. There, government projects have threatened the ability of farmers and fishers, many of them ethnic minorities, to earn a living. The group has waged a successful campaign to suspend one plan to build 13 dams on the Nu River, which slices through remote gorges in Yunnan. The government had announced the project in 2003, weeks after Unesco named the surrounding area a World Heritage site. In the summer of 2004 the group made an underground documentary about the poor living conditions of peasants located near the 12-year-old Manwan dam, touted by the government as a paragon of development.

The organizer of the group, Yu Xiaogang, is an environmental scientist and a Communist Party member who illustrates the ambivalent relationship the government has with these groups. A competition organized by several government agencies chose his group as one of the top 10 examples of sustainable development within China for its work on preserving the Lashi watershed. More recently, police have confiscated his passport and prohibited him from leaving China.

The emergence of environmental NGOs in China is a new phenomenon dating only to 1994, when the government gave permission to establish independent organizations that typically survive without government funding. The first group to take advantage was Friends of Nature, which has adopted positions on issues such as preserving the golden snub-nosed monkey and the Tibetan antelope.

Seventy-three-year-old Liang Congjie, a former history professor who founded the group, continues to speak out about the environmental price paid to pursue development of the economy. Some of Liang's oft-cited remarks throw cold water on the Chinese economic miracle. "If Chinese wanted to live like Americans, we would need the resources of four worlds to do so," he has said.

Although Friends of Nature often collaborates with the government, the relationship can still be a tenuous one. In 2002 the government gave Friends of Nature an ultimatum: expel Wang Li-xiong or shut its doors. Wang is one of the group's founding board members and was a supporter of two Tibetan monks who faced execution. Such tensions may establish a dynamic that delineates the pathway to political as well as environmental reform. --Gary Stix

Waiting for Wi-Far
New standards and hardware expand the reach of wireless

Wireless networking is expanding beyond the scale of the next room or the floor upstairs. Standards developers have set their sights on unused television bands to create regional-area networks. Makers of fractal-shaped antennas, for their part, hope to integrate suites of wireless services unobtrusively into cars, cell phones and other devices.

The most common networks, based on Wi-Fi technology, radiate only about 100 meters from their source. Compare that with a range of kilometers for the frequencies between 54 and 862 megahertz, home to VHF and UHF television. Lower frequencies propagate with less loss, penetrate foliage and buildings more readily, and handle non-line-of-sight transmission better; hence, frequencies below one gigahertz suit longer-range wireless better than the 2.4-gigahertz and higher bands used in current wireless networks. In 2002 the Federal Communications Commission began soliciting public comments on the feasibility of wireless networks broadcasting over the largely unused frequencies separating these TV channels. Access to these frequencies would mean a huge growth in wireless service.
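The range advantage of the lower bands can be made concrete with the textbook free-space path-loss formula: at equal loss, range scales inversely with frequency. A minimal sketch (the 100-meter Wi-Fi figure comes from the text; 700 megahertz is an illustrative UHF TV frequency, and real-world gains from better building penetration come on top of this):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in decibels: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def equal_loss_range(d1_m: float, f1_hz: float, f2_hz: float) -> float:
    """Range at frequency f2 with the same free-space loss as d1 at f1 (d ∝ 1/f)."""
    return d1_m * f1_hz / f2_hz

# 100 m of Wi-Fi range at 2.4 GHz corresponds in free space to ~343 m at 700 MHz.
print(round(equal_loss_range(100, 2.4e9, 700e6)))  # 343
```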

To pave the way for UHF wireless, the Institute of Electrical and Electronics Engineers (IEEE) has formed a working group to develop standards for wireless interoperability in the UHF channels. Led by chairman Carl Stevenson, a pioneer in wireless standardization, the group is charged with formulating a standard, known as 802.22, specifying how wireless transmitters and receivers must coordinate so as not to interfere with one another or with TV stations. The idea is to use new technology called cognitive (or "smart") radio capable of sensing the spectral environment. Base stations and user terminals would check for an incumbent's presence on a channel, looking for an open channel or adjusting power levels so they do not interfere with the incumbent's signal. Stevenson's group, which works closely with TV broadcasters and other licensed users, hopes to finish in early 2007. The FCC would then have to establish final rules for transmitter powers and frequencies, if it had not already.
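The "sense first, then transmit" step described above can be caricatured in a few lines: measure the power on each channel and pick an idle one. This is only a toy sketch of the general idea, not the 802.22 procedure; the channel numbers, power readings and detection threshold below are entirely hypothetical.

```python
NOISE_FLOOR_DBM = -95.0  # assumed detection threshold (hypothetical value)

def pick_free_channel(sensed_power_dbm, threshold_dbm=NOISE_FLOOR_DBM):
    """Return the quietest channel whose measured power is below the
    threshold (i.e., no incumbent detected), or None if all are occupied."""
    free = {ch: p for ch, p in sensed_power_dbm.items() if p < threshold_dbm}
    if not free:
        return None
    return min(free, key=free.get)

# Example sweep: channels 21 and 36 carry broadcasts; channel 28 is quietest.
readings = {21: -60.0, 28: -110.0, 33: -101.0, 36: -55.0}
print(pick_free_channel(readings))  # 28
```

A real cognitive radio must also re-sense periodically and vacate a channel the moment an incumbent appears, which is where most of the standard's complexity lies.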

As wireless services proliferate, manufacturers have to figure out how to combine them into one device. The obvious approach is multiple antennas. But makers of fractal-shaped antennas contend that a single antenna might do it all. Fractals--branching shapes that look identical at all size scales--scrunch a tremendously long curve into a small space. Fractal antennas would behave like several traditional whip antennas of different lengths all twisted together, allowing them to receive multiple frequency bands.
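
The space-filling trick is easy to quantify with the classic Koch curve, a standard textbook fractal used here purely for illustration: each refinement replaces every straight segment with four segments one-third as long, so the wire length grows by a factor of 4/3 per iteration while the antenna's footprint stays fixed.

```python
def koch_length(base_length, iterations):
    """Wire length of a Koch curve after n refinements of a straight segment."""
    return base_length * (4 / 3) ** iterations

# A fixed 10-centimeter footprint holds ever more conductor:
for n in (0, 3, 6, 9):
    print(f"{n} iterations: {koch_length(10.0, n):.1f} cm of wire")
```

Because an antenna's resonant frequencies depend on the conductor lengths packed into it, one self-similar shape can respond at several widely spaced bands at once.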

In 2002 Fractus, a Spanish designer of fractal antennas, partnered with automotive supplier Ficosa International to bring the technology to vehicles. (Fractus also supplies antennas for devices including headsets, gaming systems and European cell phones; the U.S.-based Fractal Antenna Systems develops antennas for defense and other applications.) The joint venture, called Advanced Automotive Antennas (or A3) and now wholly owned by Ficosa, has supplied fractal antennas for the Fiat Ducato, Peugeot Boxer and Citroën Jumper. In January, A3 signed a licensing agreement with Nippon Antenna, a supplier to Mazda and Nissan. The company produces two antennas that can fit inside the external rearview mirror: a miniature AM/FM radio antenna and a triple-function antenna combining radio, the GSM cellular phone standard and GPS. --JR Minkel

Designing Artificial Life
Biologists move a few steps toward building cells from scratch

Think of a biological cell as a tiny programmable device that happens to be alive, and you have the basic idea behind an emerging field called synthetic biology. Instead of trying to unravel the complexities of natural biological systems, "synthetic biologists" ultimately want to build highly predictable, simple living cells from scratch, using off-the-shelf parts. No one has done that just yet, but a growing number of scientists and engineers are now taking the first steps toward manufacturing life-forms to order.

Figuring out how to program an artificial cell is high on the list of priorities. The functions of a natural cell are controlled by complex networks, or circuits, of interacting genes. Much the way that engineers can assemble toggle switches and oscillators in an electronic circuit, the new breed of biologists hopes to build modular "plug and play" genetic circuitry.

One such module was described by James J. Collins and his colleagues at Boston University in the June 1, 2004, issue of the Proceedings of the National Academy of Sciences USA. Collins's team designed genetic toggles--on/off switches--that could control natural networks, such as those that direct the production of proteins, inside a bacterial cell. The work not only demonstrates that cells can be programmed using modular-design strategies, it also serves as a model for a new class of therapeutics that could regulate the networks. Collins is now busy attempting to reverse-engineer some genetic networks--a technique that may someday help scientists determine the molecular targets of new drugs.
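
The flavor of such a genetic toggle can be captured with the standard two-repressor model: two genes whose protein products each shut off the other's expression. The equations below are the textbook form of that circuit, and the parameter values are illustrative, not taken from Collins's paper.

```python
def toggle_switch(u0, v0, alpha=10.0, beta=2.0, dt=0.01, steps=5000):
    """Euler-integrate a two-repressor toggle: each protein represses the other.

    du/dt = alpha / (1 + v**beta) - u
    dv/dt = alpha / (1 + u**beta) - v
    """
    u, v = u0, v0
    for _ in range(steps):
        du = alpha / (1 + v ** beta) - u
        dv = alpha / (1 + u ** beta) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

# Two starting conditions settle into two distinct stable states --
# the bistability that makes the circuit an on/off memory element.
print(toggle_switch(5.0, 1.0))  # ends with u high, v low
print(toggle_switch(1.0, 5.0))  # ends with v high, u low
```

Either protein, once dominant, holds the other down indefinitely, which is what lets a transient chemical signal flip the cell into a lasting "on" or "off" state.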

Collins's genetic modules were constructed using standard cloning techniques, basically by cutting and pasting natural DNA into place. Late last year another group of scientists, led by George M. Church of Harvard Medical School, described a new method of making synthetic DNA. The chemical synthesis of DNA--life's programming code--has been a laboratory technique for many years. Church and his colleagues used their new method, however, to manufacture all 21 genes needed to make a subunit of a ribosome, the cellular machine that assembles proteins. The ability to construct long sequences of synthetic DNA gives scientists the power to create genes that never existed before.

More recently, Church announced a new DNA-sequencing technology that promises to be faster and about one ninth the cost of conventional methods. It is a crucial step toward developing affordable genome maps that could become part of everyone's medical record.

As scientists begin to manufacture genetic circuits and artificial molecules in greater numbers, they will undoubtedly wish to package them inside a membrane of their own design--in due course producing a truly artificial cell. Last December, Albert Libchaber of the Rockefeller University described the creation of a cell-like assembly, which he called a "vesicle bioreactor." The vesicle consists of a fluid extracted from Escherichia coli bacteria that is encircled by a laboratory-made lipid bilayer--much like the membrane of a real cell. The vesicles did not have their own DNA, but they were able to metabolize nutrients acquired from the surrounding medium through special proteins in the membrane. Libchaber thinks of the vesicles as enclosed laboratories that not only may have practical applications in chemistry and medicine but also might help us understand how the first natural cells evolved. --Michael Szpir

New Aircraft, Big and Small
A 570-metric-ton mammoth and a craft that burns alcohol are now flying

One of the biggest and one of the smallest commercial airplanes took to the skies during the past year. In April, the world's largest passenger airliner, the Airbus A380, made its maiden flight over the company's Toulouse, France, assembly plant. Soon thereafter the first alcohol-powered aircraft, the EMB 202 Ipanema crop duster, was introduced by Brazil's Indústria Aeronáutica Neiva, a subsidiary of Embraer SA.

A few months later at the Paris Air Show, the massive A380 superjumbo jet wowed the crowds of onlookers, who were amazed to hear how quiet it was. Designed to carry 555 to 800 passengers, which is at least a third more than the current airline heavyweight, the Boeing 747, the twin-aisle double-decker from Airbus will weigh 570,000 kilograms when fully loaded. The plane's wings span 80 meters, 15 meters more than a 747's, and the jet provides 50 percent more floor space. Yet on a per-seat basis, the A380's four turbofans burn 12 percent less fuel than a 747's engines do.

This year's debut follows a complex, $15-billion effort by French, German, Spanish and British aerospace firms to develop what promises to be a significant step forward for the civil airliner. Airbus designers and engineers have enhanced the A380's flight operations and economic performance by incorporating several cutting-edge technologies into its structures and systems. The new mega-transporter, for example, achieves significant weight savings by using lightweight but strong carbon-fiber and other advanced epoxy-resin composite materials. About 800 kilograms are saved per plane by replacing conventional aluminum fuselage panels with ones constructed of Glare, a glass-fiber-reinforced aluminum laminate that is about one quarter lighter and has much better resistance to mechanical fatigue and damage. A new high-pressure hydraulic system for controlling the flight surfaces provides reliability and cost benefits and reduces weight. The giant airliner also boasts a high-tech cockpit with the latest interactive displays and fly-by-wire avionic systems.

After test flights are completed and the A380 is certified, it is slated to enter service in the second half of 2006 with its first operator, Singapore Airlines. If Airbus planners are correct, the European company's flagship will ease congestion at major airports by transporting more people more efficiently than ever on the world's principal air routes.

With oil prices at record levels, pollution limits in place at many airports and the threat of emission-control regulations, the global aviation industry has good reason to embrace alternative fuel technology. The single-seat EMB 202 Ipanema agricultural utility aircraft from Neiva/Embraer is the first production-series model to burn ethanol produced from sugarcane. This achievement is a natural progression for Brazil because its automobiles have been running on this type of renewable alcohol fuel for more than two decades, an effort that was launched in response to the 1970s oil crisis.

Not only is ethanol a third to a fourth the price of aviation gasoline and a cleaner energy source, it also helps to improve the aircraft's overall performance. The new Ipanema piston engine brings other advantages as well, including lower maintenance costs and a 20 percent reduction in operating costs. So far Neiva/Embraer has received more than 100 orders for the novel crop duster and has plans to install alcohol-burning engines in some of its other models. Company engineers say that conversion of existing aviation gas engines is not only feasible but cost-effective. --Steven Ashley

Watching the Brain at Work
Innovations in imaging let scientists ascertain what's going on in your head

Neuroscientists are exploring previously uncharted territories in the microscopic world of the neuron--observing brain cells while they work, detecting microscopic evidence of Alzheimer's disease in the living brain, and even engaging in some mind reading. The 1990s were touted as the "decade of the brain," but scientists in the 2000s are examining the living brain with far greater precision.

To study the functions of neurons at the microscopic scale, researchers typically use fine glass electrodes, but that method cannot reveal the precise location of the cells while the animal is alive. A researcher can pinpoint a cell's whereabouts only by injecting it with a chemical marker that can be seen under a microscope once the animal is sacrificed.

Now a new technique--called single-neuron functional imaging--allows scientists to watch brain cells at work while they are still in the brain. Earlier in 2005 R. Clay Reid and his colleagues at Harvard Medical School made time-lapse images using a laser and a microscope that recorded the simultaneous activity of hundreds of neurons in the visual cortex of laboratory cats and rats. The method, which was described in the February 10 Nature, should enable neuroscientists to make architectural maps of brain functions such as vision, movement and learning with single-cell accuracy.

Neuroscientists are not only constructing better maps of the brain, they are taking steps toward reading the human mind. Yukiyasu Kamitani of the ATR Computational Neuroscience Laboratories in Japan and Frank Tong of Vanderbilt University showed that there is a tight coupling between brain states, as measured by functional magnetic resonance imaging, and subjective mental states. Writing in the May Nature Neuroscience, the scientists describe how they were able to predict which one of eight visual patterns a person was looking at by decoding the activity among small groups of neurons in the visual cortex. They believe this type of mind reading can be extended to investigate the neural basis of awareness, memory and other types of "mental content."
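
The decoding idea can be illustrated with a toy nearest-centroid classifier applied to synthetic "voxel" patterns. Everything here is a random stand-in invented for illustration; the actual fMRI study used real brain data and more sophisticated machine-learning methods.

```python
import random

def centroid(samples):
    """Mean pattern of a list of equal-length vectors."""
    n = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(n)]

def decode(pattern, centroids):
    """Return the class whose centroid is nearest (squared Euclidean)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(centroids)), key=lambda k: dist(pattern, centroids[k]))

rng = random.Random(0)
n_voxels, n_classes = 20, 8
# Pretend each visual pattern evokes a characteristic (here random) response
means = [[rng.gauss(0, 1) for _ in range(n_voxels)] for _ in range(n_classes)]
scan = lambda k: [m + rng.gauss(0, 0.3) for m in means[k]]  # one noisy "scan"

centroids = [centroid([scan(k) for _ in range(10)]) for k in range(n_classes)]
correct = sum(decode(scan(k), centroids) == k
              for k in range(n_classes) for _ in range(5))
print(f"{correct}/40 decoded correctly (chance would be about 5)")
```

The point of the toy is only this: if each stimulus leaves a reproducible spatial fingerprint of activity, even a very simple classifier can read the stimulus back out of the measurement.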

Watching the brain do its job is fine when everything is working well, but when disease strikes, scientists would like to look inside to find out what might be causing the problem. Neurologist Bradley Hyman of Massachusetts General Hospital and his colleagues have developed tools that can track microscopic neural changes in a living mouse model of Alzheimer's disease. Using multiphoton microscopy and a fluorescent tracer, the scientists were able to detect the presence of amyloid plaques--a hallmark of the disease--with microscopic accuracy. The scientists are currently exploring a similar method that uses positron-emission tomography to diagnose and study the progress of the disease in humans.

These new techniques allow scientists to catch a glimpse of what is going on in the brain, but another recent development will help them to understand what they are seeing. In the April 22 Physical Review Letters, Nathan N. Urban of Carnegie Mellon University and his co-workers describe a method that enables them to predict how groups of neurons synchronize their activity. Because synchronized activity is the basis for coding and storing information in the brain, their work has broad implications for sorting out how the brain makes the remarkable thing we call the mind. --Michael Szpir

Practical Nanotubes
Molecular-scale fabrication points toward commercial carbon electronics

It is a long way from the slender nanotube--a chicken-wirelike cylinder of carbon a billionth of a meter thick--to a revolution in electronics. The very smallness that makes nanoscale materials so attractive as components of next-generation electronics also makes them extremely challenging to manipulate collectively. Investigators in the field hope, therefore, to realize commercial devices by piggybacking on existing manufacturing techniques. This year has seen several demonstrations of how nanoscale components might be integrated with conventional manufacturing as well as a report outlining a regulatory protocol for nanomaterials.

Motorola Physical Sciences Research Lab in May unveiled a prototype high-definition television screen, eschewing the cathode-ray tube for a glass panel coated with a brushy array of nanotubes. Nanotubes usually will not grow in precise arrays below 1,200 degrees Celsius, but Motorola's James E. Jaskie and his colleagues devised a metal catalyst that brought the requisite temperature down to a few hundred degrees, low enough to be achieved in the conventional ovens used to deposit thin silicon films. Other companies had built nanotube screens, but the tubes were suspended randomly in a paste. The paste-based screens have lower resolution, and the addition of a filter adds complexity.

Nanotubes are also front and center in the quest for displays printed from bendable polymer components, so-called flexible electronics. Several groups have mixed nanotubes with a polymer to boost the material's conductivity. In the summer of 2004 a DuPont Central Research and Development team reported the first printing of such a polymer, in large sheets, using an existing technology. Called thermal printing, it uses a laser to fuse the polymer to a substrate, like an iron-on transfer. This year the researchers reported printing polymer conductors, semiconductors and dielectrics all onto the same surface.

A more advanced question is how to conveniently turn nanotube arrays into more complex devices. Bradley J. Nelson of the Swiss Federal Institute of Technology in Zurich aligns hundreds to thousands of multiwalled nanotubes on and between tiny electrodes by applying a standard two-dimensional electric field to a suspension of tubes. He then burns off the nanotubes' top layers, breaks them in the middle, or otherwise tweaks them to create electronically controlled emitters, rotating actuators and telescoping linear actuators. Arrays of such devices might serve as robust chemical sensors or self-focusing light emitters, for example.

Building precise electronic circuits out of nanotubes or other nanowires is a more challenging problem. Today's chipmakers simply etch the pattern they want. Hewlett-Packard Laboratories investigators were some of the first to suggest building nanoscale circuits from scores of crisscrossing nanowires, or crossbar arrays, which could be chemically self-assembled at low cost. Electronically activating some of those junctions would create the circuit. The same researchers recently simulated chips made of such nanowire crossbar arrays. They found that given enough redundancy, they could overcome the crossbars' high defect rates and still pack 100 times more devices into a given area than today's chips have.
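
The redundancy argument is easy to check with a toy Monte Carlo simulation: give each wire an independent chance of being defective and ask how often a bundle still contains enough working wires. The defect rate and bundle sizes below are made up for illustration, not drawn from the Hewlett-Packard work.

```python
import random

def bundle_yield(n_wires, n_needed, defect_rate, trials=20000, seed=1):
    """Fraction of wire bundles with at least n_needed defect-free wires."""
    rng = random.Random(seed)
    good_bundles = 0
    for _ in range(trials):
        working = sum(rng.random() > defect_rate for _ in range(n_wires))
        if working >= n_needed:
            good_bundles += 1
    return good_bundles / trials

# Needing 8 working wires with a 10 percent per-wire defect rate:
print(bundle_yield(8, 8, 0.10))   # no spares: yield near 0.43
print(bundle_yield(12, 8, 0.10))  # four spare wires: yield above 0.99
```

A handful of spare wires turns an unusable process into a near-perfect one, which is the economic case for tolerating defects rather than eliminating them.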

A major policy concern in recent years has been whether and how to regulate nanomaterials, which can penetrate cells more easily than larger particles can. Last summer the U.K.'s Royal Society and Royal Academy of Engineering addressed those fears, concluding after a 12-month study that nanomaterials being produced in large quantities should be classified as new chemical entities under existing U.K. or European Union regulations and recommending that toxicity studies begin at once. --JR Minkel

True Green
Architects and chemists strive to place an environmental stamp on their work

The color green--read "environmentally friendly"--now prefaces everything from gasoline to mutual funds. But is there anything truly green about these products other than the profits that they make for their purveyors? In a few instances, industry and the professions have begun to earn their colors. One force steering the chemical industry in this direction is the Green Chemistry Institute (GCI) in Washington, D.C., which has stewardship of the annual Presidential Green Chemistry Challenge Awards. In 1998 GCI director Paul T. Anastas set out the 12 guiding principles of "green chemistry." Number one: "It is better to prevent waste than to treat or clean up waste after it has been created." Green chemistry, according to GCI, is not only easier on the environment, it also saves companies millions of dollars they would otherwise spend on cleanup and disposal.

In 2005 Archer Daniels Midland Company and Novozymes jointly won a Presidential Green Chemistry Challenge Award for developing a process to replace unhealthful trans-fatty acids in soybean oil (used in vegetable shortening) with healthier unsaturated fats. The Food and Drug Administration will require the labeling of trans-fatty acids on nutrition panels beginning January 1, 2006.

The process relies on Lipozyme, an enzyme (a biological catalyst). It will save hundreds of millions of pounds of sodium methoxide, detergents and bleaching clay, as well as 60 million gallons of water, every year.

In addition to reducing waste, green chemistry seeks to eliminate poisonous reagents in industry. Toxic chemicals are also a concern for architects designing "green buildings," which are friendly to both their occupants and the surrounding environment. Adhesives and paints, the sources of "new car smell," can give off volatile organic compounds that cause headaches and nausea. Other troublemakers include mold spores and dust particles, which can lead to respiratory problems that reduce employee productivity in poorly ventilated office buildings. Innovators in environmentally oriented architecture exist at both the institutional and individual firm level.

In 2000 the U.S. Green Building Council (USGBC) defined the LEED (Leadership in Energy and Environmental Design) standards. Architects can have their buildings LEED-certified under a system that awards points in categories such as "indoor environmental quality" and "energy and atmosphere." The USGBC launched a pilot LEED program in August for home design. As with green chemistry, green architecture is profitable for its practitioners when all aspects of the process are accounted for, including long-term power bills and disposal of waste generated by construction.

The Condé Nast building at 4 Times Square in Manhattan, the world's most famous LEED-certified building, features an integrated recycling system, solar-cell wall panels and gas-powered fuel cells. Built in 1999, the structure was designed by Fox & Fowle, the New York-based architectural firm founded by Robert Fox and Bruce Fowle. The Helena, a 37-story LEED-certified apartment building on Manhattan's Upper West Side designed by Fox & Fowle, was completed in 2005.

The Frito-Lay plant in Henrietta, N.Y., which opened this June, won a LEED Gold award for its innovative use of permeable parking lots (to filter storm water and reduce waste flow), solar cells and nonvolatile furnishings. It was designed by William McDonough & Partners, which also drew the plans to cover the Ford plant in Dearborn, Mich., with the world's largest "living roof"--10 acres of roof planted with vegetation that simultaneously insulates buildings, filters rainwater and reduces heat absorption.

Norman Foster founded Foster and Partners, a London-based firm that designed the Swiss Re building, a giant chrysalis-shaped glass spire finished in 2004--and London's first green skyscraper. Wind flow across its curved walls creates a pressure differential that drives the ventilation system and reduces the need for air-conditioning. Along with the natural lighting, the air movement halves the power needed to operate the building. A structure's natural environment can be integrated into a building's engineering design by an inspired architect such as Foster. --Kaspar Mossman

Hope for Fixing Gene Defects
Studies have shown improved hearing in animals and demonstrated a new gene delivery method

Gene therapy tries to replace, repair, augment or manipulate a patient's own genes with the goal of treating illness. The technique not only can save lives but also can treat chronic conditions such as hearing impairment.

Hearing loss affects roughly 28 million Americans, because the nearly 50,000 inner ear hair cells that humans are born with die off gradually over time. Unlike the hair cells of fish, amphibians and birds, those of mammals cease proliferating early in life, meaning hearing loss is usually permanent. Two research teams have now demonstrated the possibility of regrowing hair cells.

In 2003 Yehoash Raphael of the University of Michigan Medical School at Ann Arbor and his colleagues triggered inner ear hair cell growth in guinea pigs using adenoviruses that inserted a gene called Atoh1. Normally Atoh1 is active only during embryonic development, in cells that go on to become hair cells. Expanding on those experiments, the team has since reported the first restoration of inner ear hair cell function in live adult mammals.

In March they published a study in which they applied their gene therapy to the left ears of 10 guinea pigs deafened by drugs. Eight weeks afterward the nonsensory cells lining the ears had transformed into new inner ear hair cells, leading to improved hearing. Raphael's group is not the only one working in this area.

Zheng-Yi Chen of Massachusetts General Hospital and his colleagues also found they could regenerate inner ear hair cells in mice. They started with a broad survey of gene expression patterns during embryonic development of the inner ear in mice and isolated a gene, Rb1, that appears to permanently brake hair cell growth. Their study, published in January, reported that deleting Rb1 led to mice with more apparently functional inner ear hair cells than normal. They also found that cultured mature inner ear hair cells from mice were able to regenerate when they had Rb1 knocked out.

Raphael and his team caution that they improved hearing but did not restore normal hearing and that it will take many years before Atoh1 gene therapy will prove ready for humans. Chen and his colleagues remarked that knocking out Rb1 made hair cells divide continuously, which could potentially lead to tumors. Future research should focus on inactivating Rb1 only long enough for clinical benefit.

As effective as gene therapy might prove in the future, the viruses it often employs to carry genes into the body can sometimes kill a patient or cause cancer. Paras N. Prasad, executive director of the University at Buffalo Institute for Lasers, Photonics and Biophotonics, and his team are developing silica particles roughly 30 nanometers wide as nonviral vectors for carrying out gene therapy.

Organic molecules coating the nanoparticle surfaces bind to genetic payloads and protect the delicate DNA from enzymatic digestion. Prasad and his colleagues reported in July that when injected into mouse brains, the nanoparticles affected more than one third of targeted cells, with equal or greater effectiveness than existing viral delivery systems. No mice showed adverse side effects one month after the injections. Research on both new therapies and gene delivery methods points toward ways of overcoming the tremendous obstacles that this form of therapy has confronted. --Charles Q. Choi

Photons, Electrons and Silicon
Silicon lasers enable integration of optics and electronics

Lasers have become indispensable for even the most humdrum tasks, from highlighting PowerPoint presentations to burning music CDs. Lasers are also essential for high-speed communications along optical fiber, which has vastly greater bandwidth and much less crosstalk than electrical transmissions in copper wire. Recently scientists developed lasers made from silicon, an important first step in the development of high-speed chips that will fully integrate light-speed communications with the processing power of silicon electronics.

As CPU processing gets faster, the need increases for near-instantaneous clock synchronization within CPUs as well as for fast interchip communications for parallel computation. The integrated-circuit industry is rooted in silicon technology. Anything that can be made out of silicon can be fabricated with submicron dimensions and in huge volume, with great reliability. But silicon's electronic properties prevent it from functioning as a conventional laser. The material has an "indirect bandgap," which means that electrons cannot emit photons by dropping directly from one energy level to another. Solid-state lasers have until recently been made from direct-bandgap materials such as gallium arsenide (GaAs), which can spit out photons in the desired manner. Making an interface between the GaAs devices and silicon systems is difficult, and results are hard to reproduce to industry specifications.

A technique called Raman scattering, though, has overcome this problem. The process begins when electrons first absorb photons. The excited electrons then "scatter" energy by emitting both phonons--vibrations of the silicon crystal lattice--and photons of lower energy than the ones absorbed. In October 2004 Ozdal Boyraz and Bahram Jalali, two engineers at the University of California, Los Angeles, announced that they had demonstrated the first silicon Raman laser. It was an infrared device that emits pulses, each lasting 25 trillionths of a second, far shorter than the interval between them. The short pulses were necessary because of a two-photon absorption effect. Silicon atoms can absorb two photons simultaneously, generating an electron and a hole (the absence of an electron). The electron-hole pair remains in the material for a long time, absorbing power and weakening the laser amplification. The long gaps between the pulses in the Raman laser allow the electrons and holes to dissipate.
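
The output wavelength of a silicon Raman laser follows from simple bookkeeping: each emitted photon gives up the energy of one lattice phonon, and silicon's characteristic Raman shift is about 15.6 terahertz. For a pump near the 1,550-nanometer telecommunications band:

```python
C_NM_THZ = 299_792.458  # speed of light in nanometer-terahertz units

def stokes_wavelength_nm(pump_nm, raman_shift_thz=15.6):
    """Output wavelength after silicon's ~15.6 THz Raman (Stokes) shift."""
    pump_thz = C_NM_THZ / pump_nm
    return C_NM_THZ / (pump_thz - raman_shift_thz)

print(f"{stokes_wavelength_nm(1550):.0f} nm")  # roughly 1686 nm
```

The fixed shift means the laser's color is set by the pump, which is convenient for engineering but also why the output always lands at a longer wavelength than the light driving it.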

In February, Intel's Haisheng Rong and his colleagues published a paper in Nature detailing construction of a continuous-output silicon laser that attacked the two-photon absorption effect in a different way. Their device, a five-centimeter-long, S-shaped silicon waveguide, sidesteps the issue with a classic silicon structure, the PIN diode: Rong doped one side of the waveguide with positive charge and the other with negative charge and then applied a voltage sideways across the waveguide to sweep out electron-hole pairs generated by two-photon absorption before they could absorb laser power.

Rong's innovative device exploited the same five-centimeter length of silicon for use as both an infrared laser and a semiconductor diode. His silicon laser is a significant advance because continuous beams, which can be modulated and chopped, provide the basis for digital communications. Low-cost optoelectronic devices made entirely from an industry-standard silicon process are still a long way off, but these lasers build a foundation from which we can expect to see light-speed information processing technology develop into reality. --Kaspar Mossman