Can You Teach a Robot How Not to Be a Jerk?


What's This?



For some people, the only thing scarier than a robot is a rude robot. But what if you could teach robots how to be polite — even ethical?


Robots can be programmed to think and use absolute logic to make good decisions. They're best, though, when the task and desired outcomes are known. Variables, like human interaction, can be accounted for, but good luck getting a robot to intuit what we emotional humans are thinking or understand our intent. Some experts, though, believe robots can be trained to do the right thing.



AJung Moon has been working on developing robots that know right from wrong and can act accordingly. She's focused, in particular, on what she calls "robot ethics." She's so committed to the cause that her Twitter handle is actually @RoboEthics.


Moon, currently a University of British Columbia Ph.D. student studying human-robot interaction, and a team from the Open RoboEthics Initiative recently designed an ethical challenge using a Willow Garage PR2 robot, an elevator and some patient humans.


The idea was fairly simple: first, survey people to find out under what circumstances they would cede control of an elevator to a robot carrying either urgent or non-urgent mail. Moon presented people with four robot response options, which respondents graded on levels of acceptability. The four responses and actions the robot could provide were:




  • Yield by saying, “Go ahead. I will ride the next one,”




  • Do nothing and remain standing by the elevator,




  • Not yield, say, “I have urgent mail to deliver and need to ride the elevator. Please exit the elevator,” and take the elevator once the person exits, or




  • Engage in a dialogue by telling the person that it’s on an urgent mission and asking if they are in a hurry.




Overall, respondents said, "the most appropriate action chosen for the robot was to engage in dialogue with the person, and the least appropriate behavior was to take no action at all." In other words, no one wanted the robot to just stand there, looking weird. Respondents also expected that if the robot has a non-urgent letter, it should always yield to people. Researchers used that data to program the Willow Garage robot and then filmed the results.


In the video above, the robot never moves quickly or pushes anyone out of the way. Instead, if it has urgent mail, it announces its need for the elevator, but if someone refuses to get out (the robot is so large that it and a person may not fit comfortably), it simply tells the person to go ahead and waits for the next one. Moon also programmed it to act inappropriately. So when the robot and a wheelchair user are both waiting for the elevator, the robot announces it has urgent mail and then bounds forward into the elevator, leaving the person with a disability pretty pissed off.
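In rough pseudocode, the polite behavior described above boils down to a couple of simple rules. The sketch below is my own illustration of that logic, not the team's actual PR2 control code, and the function and parameter names are invented for clarity:

```python
def elevator_action(mail_is_urgent: bool, person_insists: bool) -> str:
    """Decide how the mail-delivery robot should behave at the elevator.

    A minimal sketch of the yield policy described in the article;
    names and structure are illustrative, not the researchers' code.
    """
    if not mail_is_urgent:
        # Respondents expected a robot with non-urgent mail to always yield.
        return "yield"
    if person_insists:
        # Even on an urgent run, if the person refuses to give way,
        # the polite robot tells them to go ahead and waits.
        return "yield"
    # Urgent mail and the person agrees to step out: take the elevator.
    return "take_elevator"
```

The key design point is that "urgent" alone never overrides the human; it only earns the robot the right to ask.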


Ultimately, the researchers found that there are no real hard-and-fast rules. The robot should, like people, assess the context of the situation and use communication and whatever other means of interaction are at its disposal to act appropriately. Moon presented her findings on the research blog Footnote1.


Even if robots can't be taught to distinguish right from wrong, they can fake it. Honda ASIMO designers recently explained to me how they taught the humanoid robot to pause and look at everyone seated around a table before serving them tea. That simple act makes ASIMO look like it understands social mores, when, in fact, it's simply running a routine.


Similarly, Moon programmed a robot to pause before grabbing something from a bowl if it sees that a person is also grabbing at the same time. Again, the robot doesn't understand that it's impolite to reach into the bowl when someone else's hand is already in there, but its programming and sensors can detect the other hand and know that, in that instance, it has to pull back and wait.
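That pull-back-and-wait behavior can be sketched in a few lines. Again, this is my own illustration assuming a boolean sensor reading for "another hand is in the bowl"; the names are hypothetical, not Moon's actual code:

```python
def bowl_action(other_hand_detected: bool) -> str:
    """Decide whether the robot arm should grab from the bowl or wait.

    A minimal sketch of the pause behavior described in the article;
    the sensor input and names are illustrative assumptions.
    """
    if other_hand_detected:
        # Sensors report a human hand in the bowl: pull back and wait.
        return "pull_back_and_wait"
    # No hand detected: safe to reach in.
    return "grab"
```

As the article notes, nothing here constitutes understanding politeness; the robot is just mapping a sensor reading to an action that happens to look courteous.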


Robots may never understand right from wrong, but it does appear that they can be programmed to be less of a jerk.





Image: University of British Columbia





