You know that a carrier of an airborne strain of Ebola is about to board a plane where he will share the same stale air with scores of strangers. Do you allow him to risk infecting fellow passengers or do you kill him if that is the only way to prevent him from getting on the flight?

Psychologists and neuroscientists (and of course, philosophers) have long pondered such quandaries, which come down to a single question: Should one person be sacrificed for the greater good?

Most people see the value in the utilitarian option of harming one person if it protects scores of others. But there is also a significant emotional component, given that the decision involves hurting another human being. Some neuroscientists theorize that the choice ultimately comes down to a moral tug-of-war between compassion and cold reasoning.

According to a new report, published in Nature, damage to the ventromedial prefrontal cortex (VMPC)—a region in the forebrain associated with emotional response—can blunt a person's emotional response to sacrificing a single person to save many others.

"Moral decision making is based on our emotional reaction to situations as much as it is to any kind of rational thought," says Mario Mendez, a neurologist at the University of California, Los Angeles. "When [the former] is taken away, you have a Mr. Spock, who's just rational about decisions."

Researchers involved in the new study, conducted at the University of Iowa, confronted 30 volunteers with a set of situations in which one person had to be killed to save others. A dozen of the participants were healthy controls and 18 had brain damage, including impairment of the VMPC in six subjects. The initial set of scenarios varied how personal the act of killing was: In one setup, the choice was whether or not to push someone onto a railroad track to prevent a runaway train from killing five other people; in another, the choice was whether to flip a switch that would route the train from a track where it could strike five people to another track where it would kill only one. The case of remotely switching the train's trajectory produced no difference between control and brain-damaged subjects. But when it came to actually pushing someone into the train's path, those with damage to the VMPC were, on average, three times more likely to advocate throwing the man to his certain death for "the good of the many," as Spock would say.

A second set of scenarios was sorted into "low conflict" dilemmas, such as abandoning a child to avoid caring for it, and "high conflict" ones, in which a parent would smother his or her own child to prevent harm to others. In the low-conflict case, everyone said that it was unacceptable to hurt the child. In the "for the greater good" circumstance, however, the patients with VMPC damage were five times more likely to advocate smothering the baby.

"The decisions of VMPC patients are not amoral," says senior study author Antonio Damasio, formerly a University of Iowa neurologist and now director of the University of Southern California Brain and Creativity Institute. "They are just different from the decisions of other subjects." He adds that these subjects seem to lack the human conflict between emotion and reason. "Because of their brain damage, they have abnormal social emotions in real life," says Ralph Adolphs, a neuroscientist at the California Institute of Technology. "They lack empathy and compassion."

Joshua Greene, an assistant professor of psychology at Harvard University who first proposed that utilitarian decision making involves overcoming an emotional response, says that the new study is "a really nice demonstration of the idea that moral decisions—at least in cases like these—are not driven by a single moral faculty but rather by two different kinds of processes that can be in competition with each other." He adds that patients with damaged VMPCs are left only with an intact reasoning faculty, which he believes is seated in the dorsolateral prefrontal cortex at the top of the forebrain. In effect, no competition occurs between this reasoning process and an emotional aversion to harming another human being.

U.C.L.A.'s Mendez notes that the study provides evidence that people do not need cultural and social taboos to form morality. "Part of normal development is this emotional responding to another human being," he says. "It's not something you have to learn or you have to go have a specific religious experience to pick up, or have a cultural experience…. It is based on emotionally responding to others, and there's a part of the brain dedicated to that."