The Sword and Laser discussion

against the laws of robotics

Comments Showing 1-14 of 14

message 1: by Kamil (new)

Kamil | 372 comments I recently found many people talking about robots having romantic feelings towards humans. As much as the idea pleases me, I think it would be against the Three Laws of Robotics, since the robot would have to hurt a human one way or another. It might be just me, but I don't recall any movies/books/games where this kind of love story worked out.


message 2: by Micah (last edited Apr 03, 2012 11:43AM) (new)

Micah (onemorebaker) | 1071 comments Terminator doesn't count?

They loved us so much they didn't want us to hurt ourselves anymore.


message 3: by Rob (new)

Rob Osterman (robosterman) I'm working on that concept now. It can work but it's problematic if you make your robots "Three Laws Safe."

In a very simple sense, affection begets pain. Either it is returned, in which case there is pain when the relationship ends (through a break-up or death), or it is dismissed, which also causes pain, putting any robot told "I love you" into a permanent cycle of conflict.

On the other hand, can you boil "romance" down to basic stimuli and the pursuit of positive stimuli? We have a good feeling, so we seek out ways to have that good feeling again. And often that good feeling is triggered by things rooted in our experiences, our own "programming".

So in a sense a robot could be programmed to respond to stimuli, then 'learn' what kinds of things trigger the 'pleasure' response, so that when it is queuing through action options, things that give pleasure migrate up its stack towards the top. Of course other things may push those actions down: needs such as power, or orders from a human, may force a robot to put off the pursuit of pleasure.
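A minimal sketch of that action-stack idea, purely illustrative: every name here (`choose_action`, the action strings, the weights) is hypothetical, not from any real robotics framework. Learned pleasure associations raise an action's priority, while hard needs and human orders override pleasure-seeking entirely.

```python
# Sketch of a priority stack where learned "pleasure" scores migrate
# actions toward the top, and overrides (power needs, human orders)
# push pleasure-seeking down. All names are invented for illustration.
import heapq

def choose_action(candidates, pleasure_weights, overrides):
    """Pick the next action. Overrides (e.g. a low battery, a direct
    human order) always win; otherwise the action with the highest
    learned pleasure score rises to the top of the stack."""
    if overrides:                      # needs/orders trump pleasure
        return overrides[0]
    # Max-priority via negated scores: heapq is a min-heap.
    heap = [(-pleasure_weights.get(a, 0.0), a) for a in candidates]
    heapq.heapify(heap)
    return heap[0][1]

# The robot has 'learned' that conversation triggers its pleasure response:
weights = {"talk_to_human": 0.9, "tidy_room": 0.2}
print(choose_action(["tidy_room", "talk_to_human"], weights, []))
# -> talk_to_human
print(choose_action(["talk_to_human"], weights, ["recharge_battery"]))
# -> recharge_battery
```

The override list is what lets "needs such as power, or orders from a human" force the robot to defer pleasure, exactly as described above.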

Most of it boils down to how you define "romance".

Perhaps the pleasure principle, and the short-term pleasure that romance can provide, overrides the long-term potential harm that would make it forbidden by the First Law?


message 4: by John (new)

John | 6 comments Kamil wrote: "i recently found many people talking about robots having romantic feelings towards humans. and as much as the idea pleases me i think it would be against the 3 laws of the robotics since the robot ..."
Bicentennial Man had a romance between the android and "Little Miss". But after she dies of old age, he falls for another android instead, while they both strive to become mortal. A romantic tragedy, maybe?


message 5: by John (new)

John | 6 comments Micah wrote: "terminator doesn't count?

they loved us so much they didn't want us to hurt ourselves anymore."

How about A.I.? The love of a child robot for its imprinted "mother"?


message 6: by Adrian (new)

Adrian (aashdown) Three laws safe? Bah, where's the fun in that!

Please excuse me while I go fire up my 100,000 strong Kill-o-Tron robotic army and deal with a few pesky rebels ...


message 7: by Kevin (new)

Kevin Xu (kxu65) | 1081 comments What about the Zeroth Law?


message 8: by Kev (new)

Kev (sporadicreviews) | 667 comments Will robots ever be able to truly experience emotions? Will it just be a software approximation of emotion? Or will it emulate true emotion so closely that it doesn't matter?

Am I recalling correctly that R. Jander died/shut down because he was providing a service to Gladia and realized that, in doing so, he was hurting her?

The Zeroth Law could be argued as a form of caring for humanity - caring being an emotion. Caring doesn't equal romantic feelings, but it's related.


message 9: by Stan (new)

Stan Slaughter | 359 comments "...robots having romantic feelings towards humans. and as much as the idea pleases me i think it would be against the 3 laws of the robotics since the robot would have to hurt a human..."

Isn't it weird that after 7 replies I am the first one to mention that having romantic feelings is not supposed to hurt? It's really more the opposite of that.

P.S.

You find more stories about AI and human love than about robot and human. Remember, most robots in sci-fi are not necessarily intelligent and self-aware.


message 10: by Rob (new)

Rob Osterman (robosterman) Thing is, it ~will~ hurt, and most people know that. Relationships end at some point, even if only as a temporary suspension because you believe in a reunion in the afterlife. Either someone dies, or the couple breaks up. That means there will be pain on some level. The First Law of Robotics forbids creating pain, or, through inaction, allowing pain to occur. It does not, however, allow for "greater net pleasure than pain."

And that's what separates us from them. I love my wife dearly. I cherish every moment we spend together. I have an indescribable contentment when I hold her at night. All of these things add up to some kind of positive factor, call it P.

However, eventually she'll leave me, through death, divorce, etc. This will cause pain. There are also the moments of pain that come in every relationship, when there's a horrific fight, a massive misunderstanding, when I disappoint her. These all add up to some negative factor. Call it S.

On the margin, P - S > 0. Thus, we're happy to engage in relationships because for the most part we know we get out more than it will cost, emotionally. Of course there are other cases where someone is able to convince themselves that S = 0, either because they see no suffering, or because they believe it will never happen. Usually at the beginning of a relationship we tend to do that as well.

A robot, at least a Three-Laws-safe robot, cannot do that. The First Law says that if S =/= 0 then the robot cannot act. It does not allow the robot to attempt a calculation of net benefit; it binds the robot to inaction.
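Rob's P and S argument can be sketched in a few lines. This is a toy model of his reasoning, not any real decision algorithm, and the numbers are made up: a human acts when the margin P - S is positive, while the hard-constraint reading of the First Law forbids acting whenever S is nonzero at all.

```python
# Toy model of the argument above: net-benefit decision (human) versus
# a First-Law hard constraint (Three-Laws-safe robot). Illustrative only.

def human_decides(pleasure: float, suffering: float) -> bool:
    # Humans act on the margin: P - S > 0 is enough.
    return pleasure - suffering > 0

def three_laws_robot_decides(pleasure: float, suffering: float) -> bool:
    # Hard constraint: any S != 0 forbids the action, no matter how
    # large P is. No net-benefit calculation is permitted.
    return suffering == 0

P, S = 100.0, 10.0   # a relationship: great joy, some inevitable pain
print(human_decides(P, S))             # True  -> we pursue love anyway
print(three_laws_robot_decides(P, S))  # False -> the robot cannot
```

The only way the toy robot ever "falls in love" is the self-deception case Rob mentions: convincing itself that S = 0.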

Of course there are ways that an advanced AI might slip around this by doing a mass calculation. The movie adaptation of I, Robot, whether or not you liked it, plays with exactly that idea. (spoiler hidden)

I like the concept of the Three Laws, but for a truly advanced AI you almost need to demote them to something more like guidelines than actual rules.


message 11: by Stan (new)

Stan Slaughter | 359 comments "Thing is, it ~will~ hurt and most people know that. Relationships end at some point, even if it's a temporary..."

Well, all I have to say is that you have had a different experience than I have. I had my first girlfriend when I was 16. Lovely girl; I never had a better time than when I was with her.

I guess that is why I married her, and now I'm 50 and she is still my best friend.

P.S. Using your logic, no one should ever have a relationship. After all, anyone can die at any minute. A car could run you over as you simply walk across the street. So why bother getting to know anyone? (Sounds like a real depressing way to live to me.)

P.P.S. My wife and I both have a secret. We have both agreed not to die before the other. So, effectively, we are now both immortal :)


message 12: by Otto (new)

Otto (andrewlinke) | 110 comments Also... "pain" is not the law. The law is "harm."
Pain can be good for people (recovery from surgery, muscles aching after a strenuous workout).


message 13: by Rob (new)

Rob Osterman (robosterman) "P.S. Using your logic no one should ever have a relationship."

Quite the opposite. That we have relationships despite knowing there will be pain (harm?) at the end is what makes love such a glorious thing. It lets us forget that life is a fleeting experience and enjoy the pleasures we have while we have them.

I believe in love, and, as I noted, I am an amazingly happily married man myself. But I know one of us is going to die. It's all but inevitable, really, and when that happens the other will experience significant loss and pain. We have two GLORIOUS children. More than likely we'll shuffle off this mortal coil before they do. And they will feel pain.

But we can ignore that. We can push it from our minds as insignificant, as too far away to think about, as too removed from day-to-day life. We're human, which means we can prioritize better than a computer can.

But computers don't know that. Binary is Yes or No, On or Off.

Robots, by the laws, could not pursue love for that reason. They cannot ignore the eventual loss that comes with romantic entanglement. But ~we~ can.

"The law is 'harm.'"

True, but in the story "Liar!", the robot was caught in a loop where "harm" was tied to the disappointment a human felt at being bested by a robot. If that pain was significant enough to drive the robot insane, then how does it compare to the "harm" inflicted by the death of a loved one?

I think that for the most part this is where we see that the Laws are not really great programming but rather a really clever plot device, a tool to drive storytelling rather than simply a backdrop within the story.

Don't think me a clinical sociopath. I revel in my relationship with my wife and my family. I just don't believe that a robot that is "Three Laws Safe" could do the same without adjustment to the laws.


message 14: by Kev (new)

Kev (sporadicreviews) | 667 comments I completely agree with Rob. Romantic relationships, even successful, loving, lifetime relationships, have some element of pain, even if you've been happily married for decades. You can't tell me you've never had an argument, or a disagreement, or a point in your life when something that normally makes you happy hurt you in some way. That's the nature of humanity, imo.

And robots programmed not to cause harm (which would most likely include pain under that heading) would have a hard time dealing with that.

