The Sword and Laser discussion

Do Androids Dream of Electric Sheep?
2014 Reads > DADOES: Empathy


message 1: by Jay (new) - rated it 3 stars

Jay (snakedok79) | 10 comments So... Why can't empathy be programmed?


message 2: by Dwayne (new)

Dwayne Caldwell | 141 comments I feel ya.


message 3: by Warren (new)

Warren | 1556 comments I think you know what the problem is just as well as I do.
Quite honestly, I wouldn't worry myself about that.


message 5: by Alex (new) - added it

Alex (asato) In 1965, you had a bunch of mainframes--no PCs. So, for its time, asking for the suspension of disbelief on this fact is reasonable. Even now, the problem is difficult.

But on second "search", maybe not too difficult. This is pretty interesting:

http://www.ai.mit.edu/projects/humano...

(it's amazing what people have posted on the web and how easy it is to find now)


message 6: by Jay (new) - rated it 3 stars

Jay (snakedok79) | 10 comments I kind of see your point, Alex, but... It's hard for me to imagine a world where robot animals and androids exist that are so complex they are nearly indistinguishable from the real thing, yet empathy can't be programmed.


message 7: by Alex (new) - added it

Alex (asato) (I'm on p.54, so I just passed the section on the empathy test.)

I think from a 2014 point-of-view, it might be hard to believe but not for a 1965 reader. Asimov's I, Robot came out in 1950.

Interestingly, earlier this year, there was a winner for one kind of Turing test:

http://www.bbc.com/news/technology-27...


message 8: by Tassie Dave, S&L Historian (last edited Nov 02, 2014 04:27PM) (new) - rated it 4 stars

Tassie Dave | 3459 comments Mod
Alex wrote: "Interestingly, earlier this year, there was a winner for one kind of Turing test:"

A 33% success rate was always considered a failure when I was at school :-?
I think it is more a sad indictment on the 10 people who were fooled than a success for the programmers.

The fact that the programme was pretending to be a 13-year-old conversing in a foreign language would subconsciously sway humans to forgive quirks in its grammar.


message 9: by Alex (new) - added it

Alex (asato) Sad indeed… I found the rules somewhat lacking in rigor as well.


Rochelle | 69 comments I did find it interesting that the humans in the book had to build such elaborate tests to be able to tell whether someone had true empathy or not. At some level, I think people usually believe they can tell whether someone is being empathic or not, whether someone understands and feels emotions correctly. But humans are also really good at projecting emotions onto animals and inanimate objects, so in the end they are not good judges of what another person/android is feeling. That lent it a little more credibility for me. It's not like these androids are so poorly programmed they immediately stand out. It takes a test from a trained professional to find the flaws.


Joe Informatico (joeinformatico) | 888 comments I think the other bounty hunter, Resch, was introduced for the purpose of showing that human beings lacking empathy can exist and function in human society.

There was a lot of 20th century philosophy and thought dealing with how industrialized society was dehumanizing and decreasing empathy in society. I think in a lot of SF of this era, robots and AI are used as a metaphor for the alienating effects of industrialization, and not necessarily a warning against them specifically.


message 12: by John (Taloni) (new)

John (Taloni) Taloni (johntaloni) | 3824 comments If you're following Asimov, empathy can be programmed. His positronic brains encoded the Three Laws not by a program, but by fields of potential within the design of the positronic brain. R. Daneel Olivaw showed empathy, especially in the later books. The robot politician in one of the shorts also showed empathy.


Rob Secundus (quintessential_defenestration) | 1035 comments I think there's an intentional incongruity going on here. I'm only 1/3 of the way through, but there's some *really* weird stuff going on with empathy and emotions in general. First of all, from the very first scene it seems like the human brain, at the very least, is programmable with empathy--Deckard's box is capable of making him and his wife feel everything from despair to genuine empathy towards each other, or of cutting off all the empathy they feel. Then we get, through Isidore, an introduction to the Mercer empathy box, whose entire point seems to be imparting empathy. And then when we actually see the Voigt-Kampff test, all of the empathic responses (except for the last clincher Deckard uses) have to do with animals, and more specifically, how a human being living in a society of Mercerists would react to questions about animals. And Deckard buys the explanation that Robowoman is from a spaceship without having grown up in Earth culture (Mercerism) or around animals.

And so this is to say that it seems like, in this novel, it's the humans who act like programmable robots, and they misunderstand something basic about the androids they produce-- it's not that they don't feel empathy (Robowoman is not at all a fan of using the skin of human babies as a trade good, even if she shows her revulsion a fraction of a second too late), it's that the empathy they feel is different from, and expressed differently from, the weird, twisted things that the remaining humans on earth feel. But because they're human, the human beings just assume that this means they can't actually feel empathy.

Does that make sense?


message 14: by Jay (new) - rated it 3 stars

Jay (snakedok79) | 10 comments PKD's description of the society/public makes me think the decision to say robots can't have empathy is almost a political one, not a statement that it's actually impossible.


message 15: by Buzz (new) - rated it 5 stars

Buzz Park (buzzpark) | 306 comments I think if empathy is purely a biological function, albeit an incredibly complex one, then eventually it should be possible to program it.

However, if empathy is part of a spiritual nature, then it probably can never be programmed. And if it can't be programmed, it seems to follow that there will likely be a way to test for it.


message 16: by Alex (last edited Nov 14, 2014 09:34AM) (new) - rated it 5 stars

Alex (alexcpierce) | 47 comments Rob Secundus wrote: "I think there's an intentional incongruity going on here. I'm only 1/3 of the way through, but there's some *really* weird stuff going on with empathy and emotions in general. First of all, from th..."

The human brain IS programmable with empathy. I'm of the opinion that our definition and level of empathy are very much learned. Yes, there's an innate biological component that allows us to learn it, but without sharing experience with those around us and learning through cause and effect, the human capacity for empathy can be stunted and weak.

At the same time, sociopaths are considered largely incapable of empathy AND of fully understanding and determining cause and effect without having a template or rule for that experience. That alone makes me think that at least the semblance of empathy can be programmed, even if we can't do it now.

Still, having read Do Androids Dream of Electric Sheep a few times, I think the entire book is (as is typical for Philip K. Dick) a deconstruction of what it is that makes us human. Even the humans in the story don't empathize in what we would consider a "normal" way, often relying on artificial external stimuli to enable what should be a standard human response.


message 17: by Ally (new) - rated it 4 stars

Ally (leopardqueen) Maybe I misunderstood something about the V-K test. The way I understood it, the questions seemed designed to provoke a verbal response, but they were actually asked to provoke an involuntary physical response: the dilation of the eye. To me, this suggests that although an android may have empathy programmed into them (I assume that if an android is programmed with memories enough to make them believe they are human, they have actually "learned" empathy in the same way a human does), they are incapable of producing the involuntary response associated with empathy, the dilation. I had just assumed that this was a flaw in the design of the Nexus-6.

Regardless of this, I have a counterpoint: maybe empathy truly can't be achieved by a non-human or artificial intelligence. Perhaps empathy is a trait associated with the controversial concept of the "soul". I won't really get into my opinion of that, because I don't want to say anything too taboo, but as a Christian I have often looked at the subject in that light as an explanation.


Barak Raguan (shiningheart) | 40 comments [Content warning: discussion of the Holocaust and other atrocities]
I have to say that the more I think about how empathy is handled in this book, the more unsettled I become.
First, the importance placed on empathy as the defining characteristic of humanity seems impossibly naive to me. If historical atrocities like the treatment of indigenous cultures, chattel slavery or the Holocaust have taught us anything, it's that empathy is hardly permanent, and far from universal. It seems relatively common to simply decide that my empathy for other humans only stretches so far, only to a certain group. Only to others like me.
The Holocaust, for example, supplies us with many examples of German men and women who were considered the peak of culture and breeding in the so-called civilized world, who set themselves to serve their country by systematically annihilating Jews, Roma and other "undesirables" in death camps, while continuing to pursue their interests in literature, music, philanthropy, vegetarianism and other "enlightened" pursuits.

Back to the book, this brings me to another disturbing point. Compare for a moment the attitude most characters have towards animals with the one they have towards the "specials". They consider the "specials" to be degenerate, to be hardly human due to their genetic differences. And yet they treat animals, a different species entirely, with reverence.


message 19: by Alex (new) - rated it 5 stars

Alex (alexcpierce) | 47 comments Barak, I've always thought that was exactly the sort of commentary Philip K. Dick was making.

Whereas many authors write without deliberately targeting a message or point, Dick always built his books around a theme and a reflection of society or humanity as a whole. He liked to dig into our deepest and darkest points and assumptions about ourselves and shine a light on them.


message 20: by Ruth (tilltab) Ashworth (last edited Nov 27, 2014 12:42PM) (new) - rated it 4 stars

Ruth (tilltab) Ashworth | 1854 comments Barak wrote: "It seems relatively common to simply decide that my empathy for other humans only stretches so far, only to a certain group. Only to others like me."

Barak, I never thought about it in that light, so thank you. It is an interesting idea to ponder, and now that you bring it to mind, I guess that is the thought behind Deckard questioning why the test has never taken into consideration feelings for other androids. He says many times during the book that androids don't feel anything for each other, but this is never actually proven. Perhaps that is because it would be harder to kill the androids if the bounty hunters thought they had feelings like everybody else. Actually, it is when Deckard starts questioning this that he starts doubting himself, and at that point he seems to conclude more firmly that androids do not care about one another; perhaps a defensive reaction?

Actually, this reminds me of a truly horrible thing a person I once knew said. A local Pakistani family had been killed in a fire and he said he was glad. "What?" he protested against my look of disgust. "They don't feel anything."


message 21: by Warren (new)

Warren | 1556 comments Happy People Aren't Always Great At Empathy
http://www.huffingtonpost.com/2014/11...

People who are happy first thing in the morning lack empathy.
Therefore they're "skin jobs" and need to be retired.
Works for me. ;-}

