The Sword and Laser discussion

This topic is about Do Androids Dream of Electric Sheep?
2014 Reads > DADOES - Uncomfortable question

Bookshelf wrote: "The Tyrell corporation claims that the replicants are androids (robots).
Therefore they have no rights and it's OK to kill them or use them as slaves.
Every criterion for distinguishing humans from ro..."
You don't. That's the unsettling part of the entire book. The Nexus-6es die from one shot to the chest, where a heart would be. They are labelled as androids and written down as androids on paper but you don't know if they're humans raised on a different planet who tried to escape and were punished for it.
After all, Pris and Rachael had empathy for humans. Just not for animals.

Because killing members of your own species is slightly more horrifying than killing members of other species. Having a government program that's used to kill humans off who escape a planet is slightly more horrifying than having a program to control robots that escaped. (Albeit, still horrifying, I agree.)
Also, the problem with sapience is that every time an animal meets a definition of sapience, the definition gets changed. For all we know, in this book the definition of sapience was redefined to require empathy. When it was discovered that animals are self-aware, the definition got changed.
Sean wrote: "Why would it matter whether they're robots or biological humans? If they're sapient, they deserve rights regardless of what kind of mechanics their bodies use."
That's the nice thing about Star Trek. Mechanical beings with sentience are extended rights.

When was this decided, exactly? Was it put to a vote, or are we relying on a court ruling?
Sean wrote: "Anja wrote: "Because killing members of your own species is slightly more horrifying than killing members of other species."
When was this decided, exactly? Was it put to a vote, or are we relying..."
Excuse me? Why the hostility? Just say, "I disagree" and explain your point. It is based on my experience with the metaphors of older literature. The killing of another human being was the "uncrossable line". The main protag spent most of the intro talking about how he's "not a killer" and using the non-human status of the robots to justify it. He went out of his way to avoid performing the test on the residents of the house because he needed the money to pay for his goat. He sought a loophole so that he'd avoid the risk of finding out they were human. It may not matter to you, but it very much mattered to the protagonist. And my initial point was more about how much it matters to him.
What happened to the androids was devastating and I never saw Deckard as a positive protagonist and I wanted them to get away from him and was happy when Rachel got her form of revenge. But from Deckard's point of view, the horror for him would've been finding out that he killed other human beings.
John wrote: "Rachel's revenge killed an innocent, though."
Not from her point of view. Just like how from Deckard's point of view he wasn't killing innocents.
She had the choice between killing him, his wife or the goat for revenge and from her perspective killing the goat was the best way to get revenge without having blood on her hands.

Anja wrote: "That's the nice thing about Star Trek. Mechanical beings with sentience are extended rights."
Did you ever watch TOS? Kirk verbally jiu-jitsu'd to "death" every sentient computer he came in contact with... And there were a few.

Did you ever watch TOS? Kirk verbally jiu-jitsu'd to "death" every sentient computer he ca..."
Dude, having rights extended to sentient machines is the plot of one of the most famous episodes of TNG.
It would be similarly nonsensical to respond to someone mentioning the prime directive with something like "Bro? Are you a real trekkie?! Kirk interfered with low tech civilizations all the time!"

with some fava beans and a nice chianti and the table gets real quiet.
;-}

It doesn't matter whether Batty and Pris are indistinguishable from us, or if they look like Bender and Robby the Robot -- killing them is every bit as wrong as killing a naturally born human. To suggest otherwise is morally repugnant.
It is based off of my experience with the metaphors of older literature. The killing of another human being was the "uncrossable line".
Older SF is generally okay with mass genocide against anyone who isn't a white European -- robots, aliens, or just plain Asians. (In this it is completely unlike modern SF.)
The main protag spent most of the intro speaking about how he's "not a killer" and using the non-human status of the robots to justify it. He went out of his way to avoid performing the test on the residents of the house because he needed the money to pay for his goat. He sought a loophole to cut through so that he'd avoid risking finding out they were human. It may not matter to you but it very much mattered to the protagonist. And my initial point was more about how much it matters to him.
But you didn't say that. You made normative statements about the relative wrongness of killing androids vs killing humans.
Rob Secundus wrote: "Dave wrote: "
Did you ever watch TOS? Kirk verbally jiu-jitsu'd to "death" every sentient computer he ca..."
Dude, having rights extended to sentient machines is the plot of one of the most famous ..."
I never watched TOS. Only TNG. I didn't realize that that meant I lost all credibility to talk about Star Trek.

I have to admit that I only picked up on the androids-are-human subtext the second time I read the book. Clearly, they have no mechanical parts. Otherwise, it would be easy to identify them.
Sean wrote: "Anja wrote: "Excuse me? Why the hostility? Just say, "I disagree" and explain your point."
It doesn't matter whether Batty and Pris are indistinguishable from us, or if they look like Bender and R..."
I wasn't trying to make huge ethical and moral statements. I thought this community was one where I didn't have to edit my language 500x to make it clear that I'm talking about the book. I am more or less tired of books that talk about robots needing to be treated as people while there are people in the book not being treated like people, and the author never seems to care about that.
The hostility of your tone, however, makes me not want to continue discussing this. I don't want to be on pins and needles when I'm sharing my opinion, thinking someone is going to call me "morally repugnant".

And I only watched TOS... I have just started watching TNG along with the Mission Log Podcast, so I'm sure I will see what you are talking about!

The main difference (besides some kind of unspecified bone goo test) is that they don't have empathy in the same way that humans do... but wasn't there somewhere in the book that mentioned that Androids were purposefully excluded from having that ability? As if they WOULD be able to empathize if we hadn't "made" them the way we did? Maybe I misread something, but I thought it was in reference to not being able to use the empathy-box devices.
Then again, it doesn't explain Androids like Buster who are operating what seems like 24/7... that's a bit of a clincher for the "yes, they're machines, not humans" angle.

Therefore they have no rights and it's OK to kill them or use them as slaves.
Every criterion for distinguishing humans from robots is shown to be flawed.
So how do you know they aren't humans grown in a lab?