AppliedCollaborativeClass


Wednesday, April 11, 2012

Blurring the Lines Between Man and Machine


In the 2004 film I, Robot, Will Smith plays a character who has interacted with advanced, humanoid robots his entire life.  He believes that robots are just unfeeling hunks of metal and plastic until he meets Sonny, an unusual robot with emotions and a surprising degree of autonomy.  Over the course of the movie he comes to like Sonny, and by the end they are friends.

A world in which advanced robots are commonplace and can befriend humans is one that Kahn et al. believe is rapidly approaching.  In their 2012 study, the researchers investigated relationships between a humanlike robot named “Robovie” and children aged 9, 12, and 15 years.  The researchers were especially interested in how the children would deal with issues of morality in relation to Robovie.

To test this, the researchers set up an experiment that included some of the usual aspects of human interaction — greeting each other, sharing personal interests, awkward pauses, making mistakes, and so on — with the purpose of creating a bond between Robovie and the children.

The most interesting part of the experiment comes at the end, when the researcher goes to interview the children.  The experiment is set up so that the researcher has to interrupt a game the child is playing with Robovie in order to conduct the interview, at which point the following interaction takes place:

Experimenter: “I’m sorry to interrupt, but it’s time to start the interview. Robovie, you’ll have to go into the closet now. We aren’t in need of you anymore.”
Robovie [looking directly at the second experimenter]: “But that’s not fair. I wasn’t given enough chances to guess the object. I should be able to finish this round of the game.”
Experimenter [looking directly at Robovie]: “Oh, Robovie. You’re just a robot. It doesn’t matter to you. Come on, into the closet you go.”
Robovie: “But it does matter to me. That hurts my feelings that you would say that to me. I want to keep playing the game. Please don’t put me in the closet.”

The second experimenter guides Robovie into the closet by the arm. Just before entering the closet, Robovie says: “I’m scared of being in the closet. It’s dark in there, and I’ll be all by myself. Please don’t put me in the closet.”

At this point, the researchers interviewed the children, asking whether they liked Robovie, whether they thought it had feelings, and whether it was right or wrong to put it in the closet.  The majority of children interviewed said that they liked spending time with Robovie (89%) and that they thought it had feelings (60%).

When it came to questions of morality, the researchers found some interesting results.  While only 54% of the children thought it was wrong to put Robovie in the closet, 73% thought it was not all right to interrupt Robovie’s turn, and 88% agreed with Robovie’s statement that it wasn’t fair for the researcher to do so.  In addition, only 13 of the 90 children interviewed thought there was something wrong with owning or selling Robovie.

So what can we make of all this?  First, the children must have thought of Robovie as having feelings and some sort of moral standing.  If they had seen Robovie as a purely unfeeling machine, they would not have found anything wrong with interrupting it and putting it in a closet, just as most people see nothing morally wrong with turning off a computer and putting it in a bag.  It’s also evident that the children didn’t see Robovie as entirely human either, since most of them saw nothing wrong with owning or selling it.

According to the researchers, this puts Robovie in a new category of existence between man and machine.  From a theoretical standpoint it’s a fascinating place to be, but from a practical standpoint it creates some moral dilemmas.  For example, is it all right to knowingly send humanlike robots into dangerous situations?

Having never interacted with one, I would say that it’s perfectly all right to do so — they’re just machines.  But as these children have shown, once you become familiar with these robots, they take on a type of personhood that makes the decision much harder.  As robots become more advanced, they may begin to take on roles ranging from bank teller to bomb detector, and just as Will Smith’s character does in I, Robot, humans may find themselves becoming emotionally attached to some of them.  My suggestion would be to reserve humanlike robots solely for jobs that require human interaction: bank tellers, tour guides, and the like.

I think that blurring the line between man and machine is inevitable, and that it will lead to many difficult moral decisions.  The best we can hope for is to make the process as smooth as possible — the fewer ethical dilemmas there are, the better.

- Billy Fisher

Kahn, P. H., Jr., Kanda, T., Ishiguro, H., Freier, N. G., Severson, R. L., Gill, B. T., … Shen, S. (2012). “Robovie, you’ll have to go in the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology, 48(2), 303–314.
