Cyber Sensitive: Therapeutic Buddy Bots Get Emotional

Robotic companions capable of expressing some emotion might make better pals for autistic children, as well as mentors and health advisors for young diabetic patients



Tony Belpaeme, University of Plymouth

Diabetic children who enter the San Raffaele Hospital in Milan, Italy, are often full of apprehension about their disease, their diet and the prospect of giving themselves injections. Hospitals have tried introducing pets to calm young patients down. "Pets don't mind being at hospitals, can reduce patient hospital stays, but are expensive to train and keep, and are not very hygienic," says Tony Belpaeme of the University of Plymouth in England. Belpaeme coordinates ALIZ-E, a European Union consortium of schools and institutions that is trying to develop a robot to take the place of a pet, one that may eventually serve as an older companion who can not only bond with the younger patient but also offer counsel about diet and health.

Lola Cañamero of the University of Hertfordshire in Hatfield, England, says that young patients are quite willing to suspend disbelief and bond with the robot, with one caveat: the robot has to be capable of expressing emotion. "And the robot must not only learn to express emotions itself but read them in the patient, all of which is a tremendous challenge."

The dream of a friendly bot that could emote is an old one, but fiction remains far ahead of science. The Massachusetts Institute of Technology's "Nexi" and Meiji University of Japan's "Kansei" were among the first efforts of the past decade. Both tried to replicate real human faces with lifelike expressions. But according to Belpaeme, "The effect was eerie. You enter the room with one of these things and your brain is screaming at you, 'Don't get so close to that!'" That is why ALIZ-E and other efforts at robot caregivers for children have shied away from anything beyond simple, toylike facial expressions, choosing to express emotion in other ways.

Cañamero coordinated an earlier project, completed this year, called FEELIX GROWING, in which she and other E.U. scientists trained robots to express emotions through body language alone. The robots were programmed via the motion-capture technology used in special effects for films: actors are wired at critical spots on their bodies, and their gestures are used to program the robot's moves and stances expressing anger, sadness, fear, pride, happiness and excitement. Cañamero is now continuing her efforts with the ALIZ-E project at hospitals in Italy, the Netherlands and the U.K., cultures that vary greatly in how people use body language. "We expect some variety of gestures, of course, but believe there will be a basic core that will emerge from them all," she says. The robots are not allowed the full emotional range of their human models, she adds: "They can't go crazy like a baby."
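How captured gestures become robot motion is easier to see in miniature. The sketch below is purely illustrative, not ALIZ-E or FEELIX GROWING software: the gesture library, joint names and angles are all invented, and each captured angle is clamped into a restricted range, mirroring the point that the robots are not granted their human models' full expressive extremes.

```python
# A minimal, hypothetical sketch of replaying emotion-labeled motion-capture
# keyframes on a robot; names and angles are invented for illustration.
from dataclasses import dataclass

@dataclass
class Keyframe:
    time_s: float   # seconds from the start of the gesture
    joints: dict    # joint name -> captured actor angle, in degrees

# Invented gesture library: per-emotion pose sequences recorded from an actor.
GESTURES = {
    "sadness": [
        Keyframe(0.0, {"head_pitch": 0.0, "shoulder_left": 0.0}),
        Keyframe(1.5, {"head_pitch": -35.0, "shoulder_left": -25.0}),
    ],
    "excitement": [
        Keyframe(0.0, {"head_pitch": 0.0, "shoulder_left": 0.0}),
        Keyframe(0.5, {"head_pitch": 25.0, "shoulder_left": 90.0}),
    ],
}

# The robot gets less range than its human models ("they can't go crazy
# like a baby"): captured angles are clamped into safe, toy-scale limits.
JOINT_LIMITS = {"head_pitch": (-20.0, 20.0), "shoulder_left": (-45.0, 45.0)}

def clamp(joint, angle):
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, angle))

def play_gesture(emotion):
    """Print the clamped pose sequence a robot controller would execute."""
    for frame in GESTURES[emotion]:
        pose = {j: clamp(j, a) for j, a in frame.joints.items()}
        print(f"t={frame.time_s:.1f}s  {pose}")

play_gesture("sadness")
```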

ALIZ-E will take further steps toward a robot that kids can relate to by eventually adding speech, and perhaps facial expressions, to the robot's repertoire. The task is rife with challenges. "Most speech synthesis programs are written to sound like adults, but a small plastic robot sounds unnatural speaking like an adult," says Belpaeme. So programmers may have to design higher-pitched speech to sound like a child. "And it can't talk completely like a human. After all, it is plastic," Belpaeme says.
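As a toy illustration of why pitch matters (my own sketch, assuming nothing about ALIZ-E's actual speech stack), the crudest way to make an adult-register voice read as more childlike is to resample the waveform upward, which raises the pitch but also shortens the sound; production systems shift pitch without changing duration.

```python
# Toy illustration: naive resampling raises pitch (and shortens the clip).
# Pure NumPy; a sine tone stands in for recorded speech.
import numpy as np

SAMPLE_RATE = 16_000  # samples per second

def tone(freq_hz, dur_s=0.5):
    """A sine wave standing in for a voiced sound at the given pitch."""
    t = np.linspace(0.0, dur_s, int(SAMPLE_RATE * dur_s), endpoint=False)
    return np.sin(2 * np.pi * freq_hz * t)

def naive_pitch_shift(wave, ratio):
    """Resample by `ratio` (>1 raises pitch but also shortens the sound)."""
    positions = np.arange(0, len(wave) - 1, ratio)
    return np.interp(positions, np.arange(len(wave)), wave)

adult = tone(120.0)                        # ~120 Hz adult fundamental
childlike = naive_pitch_shift(adult, 2.0)  # ~240 Hz after the shift
print(len(adult), len(childlike))          # the shifted clip is half as long
```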

Computing power is a primary bottleneck to creating robots that can speak and emote. Currently ALIZ-E uses a 56-centimeter-tall, five-kilogram robot with a small onboard Linux computer developed by Gostai, a robotics company in Paris. But the consortium is also experimenting with robots whose brains live online. Although these robots will be in the room with kids, talking and emoting, their brains will sit kilometers away on a server with far more computing capacity than can fit inside an ALIZ-E "mini-me." The big problem there is data-transmission delay over the Internet. "If it takes too long to respond, it sounds unnatural," Belpaeme says. So researchers are working with what they call conversational fillers, such as "uh-huh" and "that's interesting," leading a kid to believe the robot is mulling over what he or she is saying even when it is merely stalled by a server bottleneck somewhere.
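The filler trick reduces to a simple latency rule. The sketch below is an assumed reconstruction, not ALIZ-E code: the one-second response budget, the stand-in for the remote server and the names FILLERS and respond are all invented for illustration.

```python
# An assumed reconstruction of the conversational-filler trick: if the
# remote "brain" takes too long, say a stock phrase so the pause feels
# like thought rather than network lag. Names and timings are invented.
import concurrent.futures
import random
import time

FILLERS = ["Uh-huh...", "Hmm, that's interesting...", "Let me think..."]
LATENCY_BUDGET_S = 1.0  # assumed pause length before speech feels unnatural

def remote_brain(utterance):
    """Stand-in for the distant server; real round-trip delays vary."""
    time.sleep(random.uniform(0.2, 3.0))  # simulated network + compute delay
    return f"Robot reply to: {utterance!r}"

def respond(utterance):
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(remote_brain, utterance)
        while True:
            try:
                reply = future.result(timeout=LATENCY_BUDGET_S)
                break
            except concurrent.futures.TimeoutError:
                # Server still busy: bridge the silence with a filler phrase.
                print(random.choice(FILLERS))
        print(reply)

respond("I don't like doing my injections.")
```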

But such problems are ultimately dwarfed by the complexity of getting robots to express emotion accurately. According to Ian Horswill, an associate professor in Northwestern University's Department of Electrical Engineering and Computer Science who has worked with emotional expression in both robots and avatars, "Some emotions, like approach and avoidance (run toward food, run away from fire), are very simple stuff that doesn't require a lot of fancy cognition. But the anxiety of wanting to ask someone on a date and wanting to run away is language behavior, which is a much higher order of cognition." Scientists do not completely understand such states in humans, let alone in robots, Horswill says.

Still, robots may be able to do some things better than humans can, such as caring for autistic children. Says Cañamero, "Autism patients have problems reading people's facial expressions. Some people feel these patients may be able to relate to a robot better."

And if ALIZ-E robots do become caregivers for diabetic children and therapists for autistic children, they may also help both patients and researchers better understand how human emotions work.
