When Pixar screened a computer-animated short film called "Tin Toy" in 1988, test audiences hated the sight of the pseudo-realistic baby named "Billy" who terrorized the toys. Such a strong reaction persuaded Pixar to avoid making uncannily realistic human characters — it has since focused its efforts on films about living toys, curious robots and talking cars to win Academy Awards and moviegoers' hearts.

Today, the "uncanny valley" phenomenon remains almost as mysterious as when Japanese roboticist Masahiro Mori first coined the term in 1970. But scientists have begun venturing deeper into the metaphorical valley to better understand why robots or virtual characters with certain human characteristics can trigger such mental uneasiness. That understanding may prove crucial as humanlike robots or virtual companions enter homes and businesses in coming years.

"We still don't understand why it occurs or whether you can get used to it, and people don't necessarily agree it exists," said Ayse Saygin, a cognitive scientist at the University of California, San Diego. "This is one of those cases where we're at the very beginning of understanding it."

The uncanny valley metaphor suggests that a human appearance or behavior can make an artificial figure seem more familiar to viewers — but only up to a point. The sense of familiarity drops sharply into the uncanny valley once the artificial figure tries but fails to mimic a realistic human.

"If you look humanlike but your motion is jerky or you can't make proper eye contact, those are the things that make them uncanny," Saygin told InnovationNewsDaily. "I think the key is that when you make appearances humanlike, you raise expectations for the brain. When those expectations are not met, then you have the problem in the brain."

All too human
Saygin and fellow researchers don't think the phenomenon follows the valley metaphor exactly. Instead, they suggest the uncanny valley sensation arises when an artificial figure looks or behaves real enough to trigger a mental switchover — the viewer's brain suddenly begins to consider the figure as a possible human. The artificial figure almost inevitably fails such close inspection.

"Pixar took a lesson from 'Tin Toy,'" said Thalia Wheatley, a psychologist at Dartmouth College. "We have to nail the human form or not even go there."

Wheatley's lab has found that everyone from Dartmouth College students to members of a remote tribe in Cambodia shows a strong sensitivity to what does or does not appear human. But such findings held up only when the researchers showed people human faces that were familiar to their ethnic group.

When shown a series of doll-like and human faces made with "morphing" software, people judged a face as more human than doll only if it contained at least a 65 percent mix of a human face. People could even judge an artificial figure's human appearance from the sight of a single eye.

"Evolutionary history has tuned us to detect minor distortions that indicate disease, mental or physical problems," Wheatley explained. "To go after a human-looking robot or avatar is to go up against millions of years of evolutionary history."

When it matters
Today's world has gotten by without conquering the uncanny valley. Most people don't yet expect (or want) perfectly humanlike robot lovers, servants or virtual companions in their lives. But some cases already exist where a more humanlike artificial figure could prove helpful.

Medical students perform better in real-life emergencies if they have trained with a simulator that appears and behaves like a real person, said Karl MacDorman, a robotics researcher at Indiana University. More ambitious Hollywood films that want to use computer-animated figures in live-action scenes could also benefit — whether they need a virtual stunt double or a realistic emotional performance to match the gravitas of films such as "Schindler's List."

"For medical applications or certain films, aiming for the first peak [of the uncanny valley] is not adequate," MacDorman said. "We really do need to overcome the uncanny valley."

Most experiments until now have focused on how people perceive a "mismatch" in an artificial figure's human realism. But MacDorman has begun developing an interactive experiment that has volunteers talk with either real actors or their digital doubles — a next step toward clearing the mists from the uncanny valley.

"We predict that uncanniness will interfere with participants' normal empathetic response within this scenario," MacDorman said. "This will help us understand how the uncanny valley influences emotional empathy during an interaction."

Copyright 2012 InnovationNewsDaily, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.