Boulder smells of peppermint…and crisp snow. The frozen water smells pure, as if still trapped in the clouds hanging just overhead. The sun glints off the Rocky Mountains, their iron musk mixing with mountain pine. Before I cross the road to enter the University of Colorado Boulder, a truck dashes by, muffling these scents with sulfurous exhaust.
As I approach, John Crimaldi, a fluid mechanist, pushes open an eastern door, so he can show me what these smells look like.
The halls of the CU Engineering Center are wide and tall, designed so scientists can construct mechanical goliaths. We walk into Crimaldi’s shop to view his team’s massive creation: a 50-foot-long tank with lavender railings. As I glance at the water, I half expect salmon or tiny sharks to dart through its 5,000 gallons.
Suddenly, the overhead lights switch off, and a maze of high-powered lasers appears underneath the flume. A sheet of light slices upward, and the middle of the tank explodes in what looks like green flames. This fire swirls in slow motion. One spot bulges like a kid pressing a thumb through Play-Doh. Other parts stretch, tugged by invisible strings.
This underwater blaze is an odor, or at least what an odor looks like when drifting through space.
This experiment was created so scientists could study how animals and humans use smells to navigate their surroundings. That question forms the heart of a new $6.4 million project, sponsored by the National Science Foundation and the White House BRAIN Initiative, called Cracking the Olfactory Code.
“We want to understand if different organisms have evolved different strategies for olfactory navigation, or if they’ve all converged on a similar strategy,” Crimaldi said about the project, which involves scientists spread across seven universities.
Think of the project as an A-Team for odors. This science squad has a singular mission: Unravel how the oldest guidance system in the world, smell, works. And by doing so, the team aims to teach robots how to smell too.
If a smellbot seems far-fetched, it shouldn’t, Crimaldi said. Modern life contains plenty of technology geared toward mimicking our other senses. Cameras with facial recognition can spot shoplifters or swap faces on Snapchat. Cochlear implants allow deaf infants to hear their moms for the first time. But similar technology doesn’t exist for smell.
“There’s been much more focus on other senses, especially in humans and primates,” said Nathan Urban, a neurophysiologist at the University of Pittsburgh School of Medicine and member of the COC squad.
As a result, society knows relatively little about the mechanics of how we smell. We know the nose is packed with olfactory neurons—gatekeepers that distinguish scents—but far less about what comes next. Which area of the brain pinpoints the location of a mother’s freshly baked cookies? Or distinguishes the smell of banana from that of peanut butter? What brain center judges the concentration of gas as it leaks from a stove?
This missing knowledge explains why we don’t have smell bots. When police officers need to find a bomb or a person trapped in an avalanche, they rely on canines or other animals, which put those creatures at risk. Crimaldi and his colleagues want to outsource this risk to robots by teaching them how to smell.
But step one on the road to building a smellbot is knowing what smells look like.
Crimaldi, the COC project leader, studies the physics and architecture of scents.
“We’re able to visualize with our own eyes something that’s normally invisible,” Crimaldi said.
An odor is a chemical molecule light enough to be swept around by the environment. Scents travel through air or water before ultimately tripping those sensors in our noses—the olfactory neurons.
Imagine that you’re standing in front of One World Trade Center. Your eyes and visual mind dissect the structure, instinctively detecting contours in the glass and steel as the building rises toward the sky. Based on the smellscapes revealed by Crimaldi’s team, our noses and olfactory brain areas interpret scents in a similar way.
“An odor cloud is very three-dimensional,” Crimaldi said, as the green “flames” flitter through his lab’s giant tank behind him.
This smellscape is created by laser light striking a fluorescent compound injected at one end of the tank. The compound is a surrogate odor—a chemical with the same physical properties as a scent. It flows underwater with the same motions as a fragrance in the air, but at a slower pace.
The slow-motion effect is primarily due to viscosity, Crimaldi said. Viscosity describes a fluid’s resistance to flow. Honey, for instance, is more viscous than grape juice. Water is more viscous than air.
“If we want to model something in water that’s meant to mimic something in air, then we have to account for that difference in viscosity,” Crimaldi said. “Since it’s moving through a liquid, it just looks a lot slower. But the shape is the same.”
The physical laws governing these two settings are exactly the same, Crimaldi said.
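Crimaldi’s scaling argument can be sketched with a back-of-the-envelope calculation. In fluid mechanics, two flows of different fluids behave the same when their Reynolds numbers match; the viscosity values below are standard textbook figures, and the function name and speeds are illustrative assumptions, not the lab’s actual parameters.

```python
# A minimal sketch of dynamic similarity: matching Reynolds numbers
# (Re = U * L / nu) between an airborne plume and its underwater
# surrogate shows how much slower the water flow must run.

NU_AIR = 1.5e-5    # kinematic viscosity of air, m^2/s (~20 C)
NU_WATER = 1.0e-6  # kinematic viscosity of water, m^2/s (~20 C)

def matched_water_speed(air_speed, length_scale):
    """Water speed giving the same Reynolds number as the air flow."""
    re_air = air_speed * length_scale / NU_AIR
    return re_air * NU_WATER / length_scale

# A 1 m/s breeze over a 0.1 m scale only needs ~1/15th the speed in water:
u_water = matched_water_speed(1.0, 0.1)
print(f"{u_water:.3f} m/s")  # → 0.067 m/s
```

Because water is roughly fifteen times less kinematically viscous than air, the matched water flow runs about fifteen times slower, which is why the plume appears to swirl in slow motion while keeping the same shape.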
When the laser sheet cuts through the tank, it doesn’t reveal the whole odor cloud—but a slice of it. This single layer tells a lot about an odor cloud’s overall shape.
Odors don’t move in a single blob, like fog or clouds. They creep more like an octopus, shooting off arms in one direction and then another. These tentacles are created by the water’s turbulent flow as it passes through the tank. As the liquid shifts, the odor is pulled like taffy into thin filaments.
This tug-of-war becomes apparent when you consider how a nose interacts with a smell. Aaron True, a postdoctoral researcher in Crimaldi’s lab, has built a mechanical version of a nostril using a small tank and a tube.
“As you inhale, the odor essentially gets stretched out, so you end up with very thin regions of strong concentrated smells,” True said. “But then right next to it, you’ll have a region with a very low signal, very low odor.”
So when you’re in a garden and you stop to smell the roses, every sniff you take changes the aroma for everyone else. Inhalation creates large, blank pockets of space inside an odor cloud. These voids are known as intermittency, and they’re the olfactory equivalent of negative space in a photograph.
By analyzing these movements with cameras and computers, the researchers learned that intermittency is not chaotic.
“Filaments of odor will tend to arrange themselves,” graduate student Kenneth Pratt told me as we stared at a split screen of swirling colors.
The left side showed the concentrations of the odor in shades of green—dark green meant odor-rich, while black areas showed intermittency. The right panel expressed the physical strain placed on an odor as it stretches. Pratt explained that odors start by tightening into long, tense filaments, but then buckle and fold on top of each other.
Virtual reality for smell
These visual patterns create the basis for mathematical equations that calculate how odors move. The formulas, in theory, should apply to all mediums, but subtle differences might exist between water and air.
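The kind of formula involved here is the advection–diffusion equation, which carries an odor concentration along with the flow while it spreads out. The lab’s actual models aren’t given in this article, so the snippet below is only a minimal one-dimensional illustration with made-up parameters.

```python
# A minimal 1-D sketch of odor transport: dc/dt = -u dc/dx + D d2c/dx2.
# The flow speed u and diffusivity D are illustrative values, not
# measurements; the domain is periodic so total odor is conserved.
import numpy as np

def step(c, u=1.0, D=0.05, dx=0.1, dt=0.01):
    """Advance odor concentration c one time step (periodic domain)."""
    adv = -u * (np.roll(c, -1) - np.roll(c, 1)) / (2 * dx)       # carried by flow
    diff = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # spreading out
    return c + dt * (adv + diff)

x = np.linspace(0, 10, 100, endpoint=False)
c = np.exp(-((x - 2.0) ** 2))  # an initial puff of odor
for _ in range(500):
    c = step(c)
# the puff has drifted downstream and flattened, but no odor was
# created or destroyed along the way
```

Even this toy model reproduces two features visible in the flume: the scent travels as a coherent patch, and its peak concentration decays as it smears into the surrounding fluid.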
First-year graduate student Maggie McHugh is double-checking by constructing an air-based version of the lab’s odor flume. Her experiment uses acetone—the compound responsible for the smell of nail polish. An ultraviolet laser will reveal its otherwise invisible motion.
“There shouldn’t be a whole lot of difference, but we’re doing this so we can then recreate these systems for our collaborators working on animal-based experiments,” McHugh said.
The work in Crimaldi’s lab sets the stage for creating virtual arenas for smell, both underwater and in the air. Right now, their laser sheet maps a single slice of the odor cloud. But ultimately, the team plans to move the laser sheet back and forth, so the computer can build a 3-D model of an entire odor plume.
“We plan on taking these measurements made in 3-D space and time to then create a database of smell landscapes,” Crimaldi said.
Crimaldi’s collaborators plan to puff these smellscapes into a closed arena—sized and designed specifically for fruit flies, mice, dogs or humans—creating virtual reality for smell.
“The team members are looking into the brains of the animals, to directly visualize their neurological responses to smell inputs,” Crimaldi said.
Once the COC teams compare notes, they should have a precise outline of how different brains respond to a variety of odor landscapes. This project could then, theoretically, encode these behavioral reflexes into a robot.
“Because at some level, from programming a robot, you may not need to do it the same way our brain does it, or the same way a rat brain does it, or the same way a fly brain does it,” Crimaldi said. “You may simply want to replicate the basic instinct.”
The road to a smellbot
Scents are chemical beacons that convey interesting things about the environment. If you’ve ever caught a whiff of a bakery and turned your head, then you’ve used smell for navigation. Yet these smell maps are an overlooked part of everyday human existence, and this bias has bled into how scientists approach our senses.
It’s ironic that smelling—or olfaction—has landed last on the scientific priority list, because it arrived first.
“Smell was part of very ancient evolution,” said COC teammate Katherine Nagel, an electrophysiologist at New York University who is exploring how fruit flies turn in response to smell. “Bacteria use olfaction, single-celled organisms use olfaction, worms use olfaction.”
This utility expresses itself as behavior. Sweet smells draw bees to flowers, fueling pollination and the survival of crops. A wandering albatross can sniff a piece of floating carrion from three miles away, while sulfur compounds found in decaying animals were once added to 40-mile gas lines so turkey vultures and their giant beaks could be used to spot leaks. Adult salmon cross hundreds of miles, using olfactory cues to find and spawn in their place of birth.
The team is studying various animals to unpack these behaviors. Lucia Jacobs, an evolutionary ecologist at the University of California, Berkeley, is examining rescue dogs to see how their skills stack up against other super smellers, like hermit crabs and cockroaches. Urban and Pitt mathematical neuroscientist Bard Ermentrout use infrared lights to track the whiskers of mice as the rodents follow odor trails. Ermentrout can then program those instincts into computerized mice.
“Can we create smell-o-vision or a smellbot? I don’t know. What we’re really trying to do is understand mechanistically how the brain works,” Crimaldi said. “We don’t necessarily know what we’re going to find. That’s the exciting part of it.”
This is the first of a three-part series.