The flimsy strip of golden film lying on John Wyatt's desk looks more like a candy wrapper than something you would willingly put in your eye. Blow on it, and the two-millimeter foil curls like cellophane. Rub it, and the shiny film squeaks faintly between your fingers. In fact, you have to peer rather closely to spot a neat patchwork: a tiny photodiode array, designed to bypass damaged cells in a retina and, Wyatt hopes, allow the blind to see.
This small solar panel is part of a prototype retinal implant. For more than 15 years, Wyatt--an engineer at the Massachusetts Institute of Technology--and his colleagues have pursued an implant to electrically stimulate the retina. At first, even Wyatt doubted the project could succeed. The retina, he says, is more fragile than a wet Kleenex: it is a quarter of a millimeter thin and prone to tearing. In about 10 million Americans--those with the disorders retinitis pigmentosa and macular degeneration--the delicate rod and cone cells lining the retina's farthest edges die, although ganglion cells closer to the lens in the center survive. In 1988 Harvard Medical School neuro-ophthalmologist Joseph Rizzo asked Wyatt two key questions: Could scientists use electricity to jolt these leftover ganglion cells and force them to perceive images? Could they, in effect, engineer an electronic retina?
They decided to try. Today Wyatt and Rizzo are perfecting their second implant prototype, a subretinal device that processes images viewed through a tiny camera mounted on special eyeglasses. Supported principally by the U.S. Department of Veterans Affairs, their team--known as the Boston Retinal Implant Project--plans to begin testing the implant in animals soon. Wyatt calls the project a classic case of science: 10 seconds of brilliance followed by 10 years of dogged work.
A realist, Wyatt compares vision via retinal implants to playing the piano with boxing gloves. "Despite the best of scientific advances, our instruments remain crude, and the nervous system is very refined," he says. A truly noninvasive subretinal implant, commercially available to people the world over, remains a dot on the horizon.
Still, that horizon beckons--and Wyatt is not the only sensory scientist marching toward it. In the coming years, if scientific dreams become reality, we will see even when our eyes are damaged, taste sweet foods with less sugar, hear even when our ears grow old. As a bonus, we will have electronic noses to sniff out environments and taste chips to diagnose disease [see box on opposite page].
Researchers are increasingly discovering the powerful yin and yang of engineering and biology, says neuroscientist John S. Kauer of Tufts University, who is developing an electronic nose. "We start with biology and use mathematical or computer modeling as the basis for building an engineering device," Kauer explains. "That sparks questions relevant to biology, and around we go. Engineering informs biology, and vice versa."
TO SOME, the blend of biology and engineering holds the allure of sweet success. For decades, a few companies have poured, tasted and tinkered with promising artificial sweeteners to rival sugar. The sweetener market has big potential. This year the average American will consume an estimated 140 pounds of pure cane sugar, corn syrup and other natural sweeteners. Today's most popular artificial sweeteners include aspartame (used in Equal), sucralose (in Splenda) and saccharin (in Sweet'N Low).
But if companies could just find the perfect formula for a fake sweetener--an elusive chemical concoction to give a bright, clear and brief sugary taste, stable when stirred into coffee or baked into cake--they could make a mint. What is more, corporate spokespeople hasten to note, this iconic artificial sweetener could significantly cut the calorie content of the average American diet.
That's where biology comes in. Over the past six years several teams of researchers--at the University of California, San Diego; the National Institutes of Health; Harvard University; the Monell Chemical Senses Center in Philadelphia; and elsewhere--have identified and characterized major cell receptors on the human tongue required for us to taste sweet, bitter and savory (umami) flavors. (Salty and sour flavors remain a molecular mystery.)
"We've used several molecular genetic approaches to prove that the cells expressing sweet and bitter responses are highly selectively tuned to respond only to attractive or aversive stimuli, respectively, and are hardwired to trigger appropriate behavioral responses," says Nicholas J. P. Ryba of the National Institute of Dental and Craniofacial Research, whose team worked with U.C.S.D. molecular biologist Charles S. Zuker's group on some of the pivotal taste cell studies.
Ryba and Zuker studied RNA sequences from the tongue to reveal likely genes for taste receptor cells. Next they bred knockout mice missing these putative genes and tested the mutants for telltale changes in taste. Finally, they homed in on specific cell receptors responsible for a mouse's ability to taste certain flavors. In this fashion, the team has, over the past five years, found a family of roughly 30 bitter receptors, one major sweet receptor and one major savory receptor.
What scientists have documented is simple: both mouse and man are suckers for taste--we lap up the sweet and avoid the bitter. From an evolutionary standpoint, this culinary bias keeps us from eating bitter poisons or foul food. On the downside, it can also make us fat.
To capitalize on these finds, Zuker in 1998 teamed with a group of scientists and businesspeople to launch Senomyx, a La Jolla, Calif.-based biotechnology company working to develop novel flavor ingredients for packaged foods and drinks. With corporate partners such as Coca-Cola, Nestlé and Kraft, roughly 85 Senomyx scientists are closing in on a chemical compound known as a taste potentiator--a flavor booster that will allow companies to manufacture sweet-tasting foods with less sugar.
Now Hear This
RATHER THAN AMPLIFY sensory experience, neuroscientist Jeffrey T. Corwin of the University of Virginia hopes to re-create it--in the ear. Worldwide, an estimated 250 million people endure disabling hearing impairments, according to the World Health Organization. The major culprit is the permanent loss of sensory hair cells in the inner ear.
The inner ear is home to the pea-size cochlea, which holds some 16,000 sound-detecting cells, each of which is equipped with hairlike projections that have earned them the name hair cells. This precious stock of cells is a gift at birth: they never multiply, but they do die. Loud noise, disease and just plain aging damage hair cells, muffling one's ability to hear sounds that once seemed crystal clear.
Scientists know that animals as diverse as zebra fish and chickens continue adding cells for hearing or balance throughout life. The adult shark has some 240,000 sensory cells in its inner ear, up from 20,000 in its younger days. Why not us? Biologically, human hair cells are held in a kind of mitotic arrest--shut down from cell division, so they cannot replicate.
One key protein that suspends the human hair cell's cycle is the retinoblastoma protein (pRb). This protein inhibits the expression of genes needed to kick-start cell division. And that can be good: pRb is thought to suppress the complex runaway cell growth that is cancer. But neuroscientists have long wondered whether they could effectively modulate pRb in the inner ear, essentially dimming the protein to allow hair cells to safely regenerate.
In an important first step, Corwin, together with Zheng-Yi Chen and colleagues at Harvard Medical School, Tufts-New England Medical Center and Northwestern University, recently found that shutting off pRb in mouse hair cells prompted those cells to divide and multiply. Most important, the new cells worked normally.
"There are solid scientific reasons to believe that we can develop a pharmaceutical that will encourage cell growth, production and regeneration," Corwin remarks. "That is the holy grail." Richard J. H. Smith, director of molecular otolaryngology at the University of Iowa, goes a step further: "One day we will be able to prevent hearing loss altogether."
In preliminary experiments, Smith and his colleagues have used a technique known as RNA interference, or RNAi, to silence a potential deafness gene in mice. He hopes this early work will eventually translate into gene therapy for patients with inherited progressive hearing loss. Although skeptics question whether gene therapy--or another hyped technique, the use of stem cells--can be reliably used as clinical treatment, Smith maintains that genetics offers a unique molecular window into hearing.
"Investigators have identified more than 40 specific genes that are essential for normal hearing function," Smith says. "For example, if a mutated protein has an abnormal function that results in hearing loss, by preventing that protein from being made, it should be possible to prevent the hearing loss."
In the meantime, conventional hearing treatments keep improving, Corwin adds. In particular, he notes advances with the cochlear implant, a surgically implanted set of tiny electrodes that stimulates inner-ear cells, basically to turn up life's volume. Today more than 100,000 people worldwide wear these roughly $50,000 implants. Although scientists agree it is impossible to re-create completely the complex workings of the human ear, they can improve the frequencies and fluidity of sounds heard through an implant. Duke University engineers, for instance, are using mathematical algorithms to develop sound-processing software that eventually may help implant wearers enjoy music again.
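The core sound-processing idea behind a cochlear implant--split incoming audio into frequency bands and route each band's energy to a different electrode--can be caricatured in a few lines of Python. The sketch below uses the Goertzel algorithm to measure the energy at a handful of band-center frequencies; the four-channel layout and the band centers are illustrative assumptions, not the design of any real device, which uses many more channels and far more sophisticated processing.

```python
import math

# Toy cochlear-implant processing strategy: estimate the energy of an
# audio snippet at a few band-center frequencies (Goertzel algorithm)
# and "stimulate" the electrode for the loudest band. The four band
# centers are illustrative; real implants use many more channels.

def goertzel_power(samples, freq, rate):
    """Signal power at `freq` Hz via the Goertzel recurrence."""
    w = 2 * math.pi * freq / rate
    coeff = 2 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def pick_electrode(samples, rate, bands=(250, 500, 1000, 2000)):
    """Return the index of the band (electrode) with the most energy."""
    powers = [goertzel_power(samples, f, rate) for f in bands]
    return powers.index(max(powers))

rate = 8000
tone = [math.sin(2 * math.pi * 500 * t / rate) for t in range(400)]
print(pick_electrode(tone, rate))  # a 500 Hz tone lights up electrode 1
```

A pure 500 Hz tone concentrates its energy in the second band, so electrode index 1 fires; a higher-pitched sound would shift the stimulation to an electrode farther along the cochlea, mimicking the ear's own frequency map.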
RESEARCHERS are also developing better implants for the eye. M.I.T.'s Wyatt quips that the retina, which is sensitive to even the slightest pressure, doesn't welcome a brick of a microchip "any more than you would like being caressed by a bulldozer." In fact, he says, this machine-man combination is the real showstopper: "After the retinal implants work, and we prove them, and the surgeons are familiar with them, then the interesting story starts. It's not about the hardwiring. It's about how patients translate the images they see. How do we learn to speak the neural code? Just what sense can patients make of this visual data, months or years down the line? What is their visual reality?"
If Wyatt's retinal implant makes it to market, that reality should work like this: A patient who has received an implant will wear special glasses equipped with a miniature camera that captures images. The glasses will sport a small laser that receives the camera's pictures and converts the visual information into electrical signals that travel to the implant, surgically inserted just below the retina. The implant, in turn, will activate the retina's ganglion cells to pick up the sensation of the image coming in and convey it to the brain, where it will be perceived as vision.
"If it sounds complicated," Wyatt comments drily, "that's because it is." Their biggest challenge, he says, is encapsulation, or waterproofing the retinal implant to last in the human eye for years. "A chronic implant has to be removable in the future or good for life," he points out. "That's a pretty high hurdle."
Wyatt's team is not the only one trying to jump it. Optobionics, a start-up company in Wheaton, Ill., is also developing a subretinal implant, called the artificial silicon retina (ASR). This self-contained microchip contains roughly 5,000 solar cells that convert light into an electrical signal similar to that normally produced by the retina's own photoreceptor cells. The solar cells stimulate the remaining functional cells, which process and send signals to the brain via the optic nerve.
Because the ASR does not have an outside power source, camera or other device, it may provide only moderate enhancement for people who still have some sight. Privately held Optobionics--co-founded by brothers ophthalmologist Alan Y. Chow and engineer Vincent Chow--completed the first FDA-approved clinical trials of a subretinal implant in 2002. Follow-up trials continue, with implants tested at the Wilmer Eye Institute of Johns Hopkins University, Emory University and Rush University. Since initially publishing trial results in 2004, however, the company has remained tight-lipped about ASR's efficacy, with no further peer-reviewed journal articles.
Other efforts to develop retinal implants are under way at Stanford University, the Kresge Eye Institute in Detroit and a German company called Retina Implant AG. In addition, the National Science Foundation has funded a national engineering research center at the University of Southern California for developing microelectronic devices that mimic lost neurological functions. U.S.C.'s Center for Biomimetic MicroElectronic Systems is devoting $17 million to three projects, including a retinal implant.
Despite the hype, bioengineering advances are not finger-snap miracles but rather the slow, steady progression of science. "No retinal implant will ever be perfect," Wyatt cautions. "Either the electrodes are too big, or the wrong cells get stimulated, or something. But you can make the sensory experience better. And we're firmly on that path."
KATHRYN S. BROWN is a science writer based in Alexandria, Va. She is principal of EndPoint Creative, LLC, and serves on the board of the D.C. Science Writers Association. She would use an e-nose to stop and smell the roses (or lavender) and an e-tongue to savor even more dark chocolate.