Sunday, May 4, 2008

The rise of the emotional robot


Resplendent in the dark blue and white colours of Duke University in Durham, North Carolina, Duke is careering noisily across a living-room floor. He's no student but a disc-shaped robotic vacuum cleaner called the Roomba. Not only have his owners dressed him up, they have also given him a name and a gender.
Duke is not alone. Such behaviour is common, and takes myriad forms according to a survey of almost 400 Roomba owners, conducted late last year by Ja-Young Sung and Rebecca Grinter, who research human-computer interaction at the Georgia Institute of Technology in Atlanta.
"Dressing up Roomba happens in many ways," Sung says. People also often gave their robots a name and gender, according to the survey (see Diagram) which Sung presented at the Human-Robot Interaction conference earlier this month in Amsterdam, the Netherlands.
Kathy Morgan, an engineer based in Atlanta, said that her robot wore a sticker saying "Our Baby", indicating that she viewed it almost as part of the family. "We just love it. It frees up our lives from so much cleaning drudgery," she says.
Sung believes that the notion of humans relating to their robots almost as if they were family members or friends is more than just a curiosity. "People want their Roomba to look unique because it has evolved into something that's much more than a gadget," she says. Understanding these responses could be the key to figuring out the sort of relationships people are willing to have with robots.
Until now, robots have been designed for what the robotics industry dubs "dull, dirty and dangerous" jobs, like welding cars, defusing bombs or mowing lawns. Even the name robot comes from robota, the Czech word for drudgery. But Sung's observations suggest that we have moved on. "I have not seen a single family who treats Roomba like a machine if they clothe it," she says. "With skins or costumes on, people tend to treat Roomba with more respect."
The Roomba, which is made by iRobot in Burlington, Massachusetts, isn't the only robot that people seem to bond with. US soldiers serving in Iraq and interviewed last year by The Washington Post developed strong emotional attachments to Packbots and Talon robots, which dispose of bombs and locate landmines, and admitted feeling deep sadness when their robots were destroyed in explosions. Some ensured the robots were reconstructed from spare parts when they were damaged and even took them fishing, using the robot arm's gripper to hold their rod.

Figuring out just how far humans are willing to go in shifting the boundaries towards accepting robots as partners rather than mere machines will help designers decide what tasks and functions are appropriate for robots. Meanwhile, working out whether it's the robot or the person who determines the boundary shift might mean designers can deliberately create robots that elicit more feeling from humans. "Engineers will need to identify the positive robot design factors that yield good emotions and not bad ones - and try to design robots that promote them," says Sung.
To work out which kinds of robots are more likely to coax social responses from humans, researchers led by Frank Heger at Bielefeld University in Germany are scanning the brains of people as they interact with robots. The team starts by getting humans to "meet" four different "opponents": a computer program running on a laptop; a pair of robotic Lego arms that tap the keys of a laptop; a robot with a human-shaped body and a rubbery human-like head, which also taps at a laptop; and a human. The volunteers then don video goggles and enter an MRI machine. While they are inside the machine, a picture of the opponent they must play against flashes up inside their goggles.
The game, a modified version of the prisoner's dilemma, asks volunteers to choose between cooperating with their opponent or betraying them. As they can't tell what their opponent will do, it requires them to predict what their opponent is thinking. The volunteers indicate their choice from inside the scanner using a handset that controls their video display. The team carried out the experiment on 32 volunteers, who each played all four opponents. Then they compared the brain scans for each opponent, paying particular attention to the parts of the brain associated with assessing someone else's mental state, known as theory of mind. This ability is considered a vital part of successful social interactions.
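The basic structure of such a game can be sketched in a few lines of code. This is an illustrative sketch only, not the study's actual protocol: the article says the Bielefeld team used a "modified" version of the prisoner's dilemma, so the payoff values below are the standard textbook ones, assumed here for demonstration.

```python
# Hypothetical payoff table for one round of a standard prisoner's
# dilemma: each entry maps (player A's choice, player B's choice)
# to (points for A, points for B). These values are assumptions,
# not the figures used in the Bielefeld experiment.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def play_round(choice_a: str, choice_b: str) -> tuple[int, int]:
    """Return (points for A, points for B) for a single round."""
    return PAYOFFS[(choice_a, choice_b)]
```

The table makes clear why the game engages theory-of-mind reasoning: a player's best move depends entirely on what they believe their opponent will do, so each volunteer must model the opponent's mental state before choosing.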
Unsurprisingly, the team found that the brain regions associated with theory of mind were active to some extent when volunteers played all four opponents. However, these regions were more active the more human-like the opponent was, with the human triggering the most activity, followed by the robot with the human-like body and head. The team says this shows that the way a robot looks affects the sophistication of an interaction.
Yet while there are similarities between the way people view robots and the way they view other human beings, there are also differences. Daniel Levin and colleagues at Vanderbilt University in Nashville, Tennessee, showed people videos of robots in action and then interviewed them. Levin says that people are unwilling to attribute intentions to robots, no matter how sophisticated the machines appear to be.
Further complicating the matter, researchers have also shown that the degree to which someone socialises with and trusts a robot depends on their gender and nationality (See "Enter the gender-specific robot").
These uncertainties haven't stopped some researchers from forming strong opinions. Herbert Clark, a psychologist at Stanford University in California, is sceptical about humans ever having sophisticated relationships with robots. "Roboticists should admit that robots will never approach human-like interaction levels - and the sooner they do the sooner we'll get a realistic idea of what people can expect from robots." He says that robots' lack of desire and free will is always going to limit the way humans view them.
But Hiroshi Ishiguro of Osaka University in Japan thinks that the sophistication of our interactions with robots will have few constraints. He has built a remote-controlled doppelgänger, which fidgets, blinks, breathes, talks, moves its eyes and looks eerily like him (New Scientist, 12 October 2006, p 42). Recently he has used it to hold classes at his university while he controls it remotely. He says that people's reactions to his doppelgänger suggest that they are engaging with the robot emotionally. "People treat my copy completely naturally and say hello to it as they walk past," he says. "Robots can be people's partners and they will be."


Enter the gender-specific robot
How people view robots may inform what future robots can do, but it seems that gender and nationality feed into our reaction, too.
Cognitive scientist Paul Schermerhorn and colleagues at Indiana University in Bloomington asked 24 men and 23 women to cooperate with a machine-like robot on solving a mathematical problem and filling in a survey form. The robot consisted of a base with metre-high posts on either side supporting a head with two cameras that looked like eyes. A voice synthesiser allowed it to speak. The team found that the men were more likely than the women to describe the robot as "human-like" and engaged well with it at a social level, while the women remained socially aloof and described it as "machine-like".
However, the researchers say the difference in perception may be due to the way this particular robot interacted with the women - perhaps for some reason that robot appealed to men. They say that robots might need to acquire gender-specific behaviours to engage with humans. "People might prefer to interact with robots that exhibit characteristics of their gender, or of the opposite gender," says Schermerhorn. "This could lead to tailoring of the robot's characteristics to the [gender of the] human in future interactions."
Meanwhile, Vanessa Evers of the University of Amsterdam, the Netherlands, together with researchers at Stanford University in California, has found that US volunteers of European descent perceive robots differently from volunteers raised in China who had lived outside that country for less than six years. They asked their volunteers how they would react in a hypothetical space emergency in which a robot was on hand that might save them. It turned out that the US participants were more willing to trust the robot's decisions, and were happier giving it control of the spacecraft, than the Chinese participants. "This confirms that people from different national cultures may respond differently to robots," Evers says.

Fausto Intilla - www.oloscience.com
