Elizabeth Barrette (ysabetwordsmith) wrote,

Robots and Moral Choices

This article explores whether people feel that moral rules should apply to robots. Now if the robot is simply a machine, it is not morally relevant; but if it is self-aware, then it is. The study started out showing surprisingly good results: the more a robot was presented as personlike, the more humans were inclined to protect it. This is great! Maybe we're not Matrix murderers after all. That would be awesome. Except then the researchers drew this conclusion:

"The more the robot was depicted as human -- and in particular the more feelings were attributed to the machine -- the less our experimental subjects were inclined to sacrifice it," says Paulus. "This result indicates that our study group attributed a certain moral status to the robot. One possible implication of this finding is that attempts to humanize robots should not go too far. Such efforts could come into conflict with their intended function -- to be of help to us."

They don't want tools.  They don't want AI offspring.  They just want slaves.  That's a really stupid idea.  >_<
Tags: news, science

  • 1 comment
"Now if the robot is simply a machine, it is not morally relevant; but if it is self-aware, then it is."

Ahhh, self-awareness. Before one can say "then it is", one first has to define what it is, and that is not a simple question. Saying "moral relevance begins with self-awareness" is like saying "human life begins at conception": in one sense it's technically true, but in another it's a fuzzy statement of opinion, and in neither case is it very useful in determining right conduct.

"They don't want tools. They don't want AI offspring. They just want slaves. That's a really stupid idea."

No. The really stupid idea is trying to turn robots into pets, or children, or sex partners, or life companions. The whole point of robots is that they ARE tools: non-living artifacts that can do the work of a human but have no 'moral relevance'.

Humans are weird, and will bond emotionally with all kinds of non-living items - not just things like talking dolls or stuffed animals. I bonded with my robots when I was building them, for all that they were mindless little things with nothing even resembling a face. But I had no illusion or desire that they would bond with me, and - most importantly - no fear of hurting them, no possibility of doing wrong to them. I stand by what I said then:
"As for the Sindarin word mûl - it means "slave", which is the literal meaning of the word robot, and after due consideration, I don't have any problem with using that word for my pretty little 'bots. There's a lot of tedious, weary work to be done in the world, and who should do it? Living, breathing beings, like that poor woman in Hood's poem, or mechanical devices that can never feel pain, weariness or despair? People aren't made to do the same repetitive task over and over, but machines, why, that's exactly what they're made for; that's the function in which they are fulfilled. Aren't all tools, from the stone axe up, created to make work easier? So yeah, I am a mûldan, a slave-smith - I make robots to do the boring work, so people don't have to be slaves."
If it were possible to hurt them - if they had 'moral relevance' - it would be morally wrong to make them at all.