The Sword and Laser discussion: against the laws of robotics

message 1: by Kamil (new) Apr 03, 2012 10:59AM


In a very simple sense, affection begets pain. Either it is returned, in which case there is pain when the relationship ends (through a breakup or death), or it is dismissed, which causes pain as well, putting any robot told "I love you" into a permanent death cycle of conflict.
On the other hand, can you boil "romance" down to basic stimulus response and the pursuit of positive stimuli? We have a good feeling, so we seek out ways to have that good feeling again. And often that good feeling is triggered by things rooted in our experiences, our own "programming".
So in a sense a robot could be programmed to respond to stimuli and then 'learn' what kinds of things trigger the 'pleasure' response; when it queues up its action options, things that give pleasure can migrate toward the top of its stack. Of course, other things may push those actions down: needs such as power, or orders from a human, may force a robot to put off the pursuit of pleasure.
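If you wanted to sketch that in code, it might look something like this: a minimal, hypothetical model (the Robot class, the weights, and the action names are all invented for illustration, not anything from Asimov) where learned "pleasure" raises an action's priority and hard needs or human orders outrank it:

```python
import heapq

class Robot:
    # Hard overrides (power needs, direct human orders) always outrank
    # learned pleasure-seeking. The value is an arbitrary large number.
    OVERRIDE_PRIORITY = 1000

    def __init__(self):
        self.pleasure_weights = {}  # learned: action -> reward estimate
        self.queue = []             # max-heap via negated priorities

    def learn(self, action, reward):
        """Update the learned 'pleasure' response for an action.
        A simple running average stands in for whatever learning rule
        a real robot would use."""
        old = self.pleasure_weights.get(action, 0.0)
        self.pleasure_weights[action] = 0.8 * old + 0.2 * reward

    def propose(self, action, override=False):
        """Queue an action; learned pleasure raises its priority,
        but overrides push everything else down the stack."""
        priority = (self.OVERRIDE_PRIORITY if override
                    else self.pleasure_weights.get(action, 0.0))
        heapq.heappush(self.queue, (-priority, action))

    def next_action(self):
        return heapq.heappop(self.queue)[1] if self.queue else None

robot = Robot()
robot.learn("play music", reward=5.0)
robot.propose("play music")
robot.propose("recharge battery", override=True)  # need trumps pleasure
print(robot.next_action())  # -> 'recharge battery'
print(robot.next_action())  # -> 'play music'
```

Under a model like that, the robot never actually weighs pleasure against its needs or orders; the overrides simply win, which is loosely how the hierarchy described above would behave.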
Most of it boils down to how you define "romance".
Perhaps the pleasure principle, and the short-term pleasure that romance can provide, override the long-term potential harm that would otherwise make it forbidden by the First Law?

Bicentennial Man had a romance between the android and "Little Miss". But after she dies of old age, he falls for her granddaughter instead, and in the end they both choose mortality. A romantic tragedy, maybe?

"...they loved us so much they didn't want us to hurt ourselves anymore."
How about A.I.? The love of a child robot for its imprinted "mother"?

Please excuse me while I go fire up my 100,000-strong Kill-o-Tron robotic army and deal with a few pesky rebels...

Am I recalling correctly that R. Jander died/shut down because he was providing a service to Gladia and realized that, in doing so, he was hurting her?
The Zeroth Law could be argued as a form of caring for humanity - caring being an emotion. Caring doesn't equal romantic feelings, but it's related.

Isn't it weird that, after seven replies, I am the first one to mention that having romantic feelings is not supposed to hurt? It's really quite the opposite.
P.S.
You find more stories about love between an AI and a human than between a robot and a human. Remember, most robots in sci-fi are not necessarily intelligent and self-aware.

And that's what separates us from them. I love my wife dearly. I cherish every moment we spend together. I have an indescribable contentment when I hold her at night. All of these things add up to some kind of positive factor, call it P.
However, eventually she'll leave me, through death, divorce, etc. This will cause pain. There are also the moments of pain that come in every relationship, when there's a horrific fight, a massive misunderstanding, when I disappoint her. These all add up to some negative factor. Call it S.
On the margin, P - S > 0. Thus, we're happy to engage in relationships because for the most part we know we get out more than it will cost, emotionally. Of course there are other cases where someone is able to convince themselves that S = 0, either because they see no suffering, or because they believe it will never happen. Usually at the beginning of a relationship we tend to do that as well.
A robot, at least a 3-Laws-safe robot, cannot do that. The First Law says that if S ≠ 0, then the robot cannot act. It does not allow the robot to attempt a net-benefit calculation; it binds the robot to a single permissible course: not acting at all.
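To make the contrast concrete, here's a toy sketch of the two decision rules (the function names and numbers are hypothetical, just to illustrate the argument above):

```python
def human_decides(P: float, S: float) -> bool:
    # Humans weigh net benefit on the margin.
    return P - S > 0

def three_laws_robot_decides(P: float, S: float) -> bool:
    # The First Law read as a hard constraint: any nonzero expected
    # harm forbids the action, no matter how large P gets.
    return S == 0

P, S = 100.0, 10.0  # a relationship: a lot of joy, some inevitable pain
print(human_decides(P, S))             # True: worth it on the margin
print(three_laws_robot_decides(P, S))  # False: S != 0, so no action
```

The point being that under the hard-constraint rule, no value of P can ever buy back a nonzero S.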
Of course there are ways that an advanced AI might slip around this by doing a mass calculation. Whether or not you liked the movie adaptation of I, Robot... (spoiler hidden)
I like the concept of the 3 Laws, but for a truly advanced AI you almost need to demote them to something more like guidelines than actual rules.

Well, all I have to say is that you have had a different experience than I have. I had my first girlfriend when I was 16. Lovely girl; I never had a better time than when I was with her.
I guess that is why I married her, and now I'm 50 and she is still my best friend.
P.S. Using your logic, no one should ever have a relationship. After all, anyone can die at any minute; a car could run you over while you're simply walking across the street. So why bother getting to know anyone? (Sounds like a really depressing way to live to me.)
P.P.S. My wife and I both have a secret. We have both agreed not to die before the other. So, effectively we are now both immortal :)

Pain can be good for people (recovery from surgery, muscles aching after a strenuous workout).

Quite the opposite. That we have relationships despite knowing there will be pain (harm?) at the end is what makes love such a glorious thing. It lets us forget that life is a fleeting experience and enjoy the pleasures we have while we have them.
I believe in love and, as I noted, I am an amazingly happily married man myself. But I know one of us is going to die. It's all but inevitable, really, and when that happens the other will experience significant loss and pain. We have two GLORIOUS children. More than likely we'll shuffle off this mortal coil before they do. And they will feel pain.
But we can ignore that. We can push it from our minds as insignificant, as too far away to think about, as too removed from day-to-day life. We're human, which means we can prioritize better than a computer can.
But computers don't know that. Binary is Yes or No, On or Off.
Robots, by the laws, could not pursue love for that reason. They cannot ignore the eventual loss that comes with romantic entanglement. But ~we~ can.
The law is "harm."
True, but in the story "Liar!" the robot was caught in a loop where "harm" was tied to the disappointment felt at being bested by a robot. If that pain was significant enough to drive the robot insane, then how does it compare to the "harm" inflicted by the death of a loved one?
I think that for the most part this is where we see that the Laws are not really great programming but rather a really clever plot device, a tool to drive storytelling rather than simply a backdrop within the story.
Don't think me a clinical sociopath. I revel in my relationship with my wife and my family. I just don't believe that a robot that is "3 Laws Safe" could do the same without adjustment to the laws.

And robots programmed not to cause harm (and they would most likely include pain under that heading) would have a hard time dealing with that.