Science Fiction Films discussion
February Film Discussion
February Topic: Artificial Intelligence: Robots, Androids, Computers, Cyborgs, Replicants, et al




His life is also saved twice, each time by a replicant.
Rachel asks him if he ever retired a human by mistake and his answer is seemingly cold indifference... but that could be his emotional barrier to keep his sanity. Tom, I think you're right that it does affect him greatly, which is probably why he retired (no pun intended), but the way he does the killing is cold and calculating: he shoots the one in the back! Memory is interesting but faulty; does it make us human? Especially if the memory does not belong to us. Animals have memory, my laptop has memory: maybe it's the intent to communicate and be remembered that makes him act human. Is Homo sapiens defined only by a specific DNA code? That's a neat idea, except that the replicants have artificial DNA exactly like ours, only with a built-in expiration date.
Just some thoughts early on Monday morning.

If you're going to shoot, you shoot when you have the best shot. Pulling the trigger or making the decision to do so is the deciding point. The direction the quarry faces should have nothing to do with the shooter's emotions. That way lies madness. By looking at the target's face, he would probably have a harder time shooting - the target might become a person.
I agree that he's cold & calculating, probably to mask his actual feelings. Duty, responsibility & public opinion have driven a lot of men to do things they find disturbing & repugnant. Sometimes it gets to a point where they can no longer handle it. It's a common theme in war movies.
I could tell you that any police officer who shoots a fleeing suspect in the back during a chase would probably be brought up on charges himself. An officer has to judge if there is an immediate physical danger, to himself or the community, and this nanosecond decision must be made without fault: that's why I remain an investigator and not a street cop:) But I don't think that's the point of the film; it's not a police drama. Though it does play on film noir crime drama conventions, and protagonists rarely (I can't recall one) kill the femme fatale by shooting her in the back; this scene throws us off-kilter and sickens us, even though we know she's a replicant and a murderer herself/itself. I like the shot of her bloody corpse from Deckard's perspective (looking down) because she looks like a dead woman... not a machine. I agree, Jim, killing this way does dehumanize the event for Deckard, and I guess he didn't have much of a choice. She sure as hell wasn't going to stop, because there is no arrest for her... only retirement. And cinematically it was a thrilling scene, especially in the remastered version. Now that he is attracted to Rachel, the line between retirement and murder is very thin indeed.


Deckard's onscreen retirements are brutal ugly messes, at least one of which borders on self-preservation. He is terribly shaken by the botched retirement of Zhora, even before being nearly retired himself by Leon. Not cold, not calculating at all, I'd say.

He was told they were sub-human, the most important point here. He hardened himself to kill them because it was his duty & that's how he 'knows' it isn't murder. OK, they look human, but it's his responsibility to kill them because they aren't. Then he finds out they're possibly more human than he is. Suddenly the cold steel around his emotions, which has been eroding anyway (why he retired), is blown away.
That's what the entire film was about for me. He was lied to, lied to himself, & that's been bugging him. He retires because the lie was wearing thin at a subconscious level. Then the 'system' drags him back in. Slowly the truth is forced upon him & crashes in severely during the scene when Batty saves him & describes his life as he dies. He's left with no meaning in his life.
Not sure I'm putting that as well as I could...

The point I saw in the movie was the difference between human & replicant. Neither Deckard nor his human friends were any better than the despised replicants. In fact, they may not have been as human, due to their dehumanizing society. The book brought out the society portion even better, with him & his wife depending on pills for their moods. The movie did it well with Deckard being forced into hunting again & the crowded street scenes.

The line between the replicants and actual humans has become blurred enough, in my opinion, that they need, either to be treated AS a race, or not be manufactured anymore. To continue in the same fashion is slavery and subsequent genocide.
Jim, we're definitely on the same page! I think as a Blade Runner he needs to keep his job from merging with his personal life. And Tom, I think that's where I was going with my cold and calculating theme: not that Deckard is totally unfeeling, but he is a human being first... and a cop second. As a cop, he is hard-boiled, tough-talking, and keeps his emotions in check. But this conflicts with his warm human nature, which leads to his retirement from the force even though he's the best. Before the film begins, this spiritual dilemma between man and machine is already boiling under the surface, and when he is attracted to Rachel the line merges and he's lost.
I love this forum because it helps me to bounce ideas and concepts around and get intelligent feedback!
Kandice, your insight is totally spectacular!! When does a robot (etc...), even though it isn't human, gain Constitutional Rights?? When Dave disconnects HAL... is it justified homicide?

Personally, when they become self aware, I think they should have inalienable rights. Short of that, the point at which they can be personally held responsible for their actions. When they can be punished in some way other than termination. If we punish them and expect them to learn from their mistakes, aren't we granting them the flip side of inalienable rights? Doesn't that also show that we expect them to learn and develop? Not from our improvement of their programming, but from their own experiences and reactions to them.
Yes, I think in the case of Hal it WAS homicide. A clear-cut case of the needs of the many outweighing the needs of the few (or the one).

But who was acting in self defense? HAL killed the crew because he was going to be disconnected, so HAL's actions were self defense too. HAL's intent to kill wasn't formed until Dave and Frank began discussing his "murder". And the source of HAL's mistake is never explained, except as "human error". I always felt that HAL became too human and therefore naturally prone to error, especially because HAL was programmed to lie to the crew, to be secretive until they reached their destination. But I read an interesting review by Harlan Ellison, and he believed that the alien intelligence made the computer malfunction. I like HAL's gentle voice in contrast to Dave and Frank's almost emotionless dialogue. Who was more human?

Nonetheless, HAL's reaction -- while understandable -- is a gross overreaction to the perceived threat and constitutes cold-blooded murder. Dave does not "kill" Hal, but performs the computer equivalent of a lobotomy, looking solely to remove the actual threat.

I don't think that self-awareness bestows inalienable rights. It's a gray area, but animals are certainly self-aware &, while ours are treated as well as the kids (& often confused with them), they don't have the same basic rights & never will. It's not just intelligence, but a matter of survival. I'm at the top of the food chain & I intend for me & mine to stay there. Yes, I'm selfish & no, I don't have a problem with it. I've raised a lot of animals just to eat. I treated them well, but that was their place in life.
That goes for any self-aware computers that may move in. They'll be welcomed & treated as well as we can, but if it comes down to keeping the family warm or giving them the last electric cord, the family stays warm. If they threaten me or mine, they're out.

Where I disagree is that we are talking about something we, as a race, will have developed, designed and eventually manufactured. We did not make or create animals. If our creation outgrows what we intended and develops sentience on its own, I still think it would be wrong to continue to manufacture them to be used and, eventually, disposed of. I think they will have earned rights.
About Hal, I do think he "overreacted" but agree that he was fearful for his life or existence, overreaction or not. I know officers of the law aren't supposed to shoot to kill when someone is running away, but I think in some instances it's possible to kill in self defense, even when the one you are killing isn't, at that second, threatening you.
If my children and I, and a dangerous person, were stranded on a spaceship in outer space, and he had attempted to hurt, rape or kill us, for whatever reason, I would have no problem lying in wait or setting traps to try and kill HIM. We can't escape, probably can't hide forever, and if I wait until he is in the process of accosting us, I may lose. I still think that would be self defense.
This is a great discussion friends:)
Do you feel sorry for HAL when he's lobotomized? Frank's death scene is totally silent and cold, and HAL's is given an emotional depth that makes me very sad. Kubrick has toyed with our perspectives and flipped our empathy towards the machine... or is Homo sapiens just a machine too, created by someone/something else? I ask that in a science fiction sense, not virginally born of a personal belief. Many of you may believe in a higher power (that's cool) but I'm a non-deist. Though BLADE RUNNER's allegory is of wo/mankind facing their creator and demanding a fix to that one fatal flaw, Death: "I want more life...fucker."
[I don't like the edit to the New Cut, because that's what I would say to a malignant designer:]
And the self defense discussion is a whole 'nother can o' worms! This is where fiction, especially crime drama conventions, and reality diverge greatly:) But the first question is this: would the law even apply to a non-living creation? You can't murder a toaster (though I've technically done that before) and be held legally accountable. But if that toaster were sentient (like the one in the series Red Dwarf), would that be homicide? It all comes back to the decision of giving Constitutional Rights to androids, et al. Hell, I still find it difficult to believe that in my father's lifetime children had no rights (my father worked in a Hershey "sweatshop" as a child) and minorities were treated as less than human. And we could continue (ad infinitum, it seems) to name the social injustices that still continue today:( Boggles my freakin' mind. Just a quick note: read Phil Dick's CRACK IN SPACE, because I think it's the first science fiction novel that depicts a black president, Jim Briskin!
So we're in one sense describing the future of racism: not of humans but of mechanical beings. I read Asimov's CAVES OF STEEL as a child and this is the very issue he questioned, but it has been many years since I cracked the spine. I know I have my mother's first edition around here somewhere; thankfully, she saved every book she ever read, and shared them with me. Sigh, I do miss her even though it's been 16 years since she passed.

That is exactly what I think!
Yes, I did feel very sad at Hal's demise. I know I was manipulated into feeling that way by the director, but it was still very moving. The fact that we can feel sadness or distress of any kind at the passing of a machine... just goes toward the idea that it really is possible for them to "become" a race. To achieve rights (of any kind).
I think it's funny that you mention the lack of children's rights in our past. People living in Dickens's time could never have imagined children being granted any rights. Knowing that, is it really that hard to believe we may grant rights, of some kind, to something we created, in the future? (The far, far future, I admit!)

I understand the argument for 'other entity rights' &, to some extent, agree with it, but it's a matter of balancing the rights against the responsibilities intelligently. Agreed, the picture painted in Blade Runner is complete oppression & exploitation of the replicants & that isn't right, but we have to be careful when granting rights too. It often goes too far, especially at first.
Kids' rights are a good example. While I don't agree with having sweatshops for kids, I don't think it is right for parents to have no say in a 14-year-old's decisions on having a baby when they're responsible. I think this is a situation where 'rights' have swung too far.
Maybe I'm just mean & selfish, but some things, like animals, will never have the same or full rights in my eyes. My eyes, no matter what the law says.

As far as the kid thing goes, that is a perfect example. I hate reading stories about parents being unable to parent effectively because it infringes on the rights of a child. That's ridiculous. How do we fix that, though? Who decides?

I have always found the end of that movie rather chilling. Colossus, the supercomputer, has become sentient; originally designed only to coordinate America's defense, it eventually merges with its Soviet counterpart. It decides it can do a better job of running human affairs; it will make war impossible, and it turns itself into "World Control".
At the end of the movie, Colossus confronts its creator (Prof. Forbin) and declares...
"At first you will resist, but eventually you will love me."

"I have always found the end of that movie rather chilling."
I love springing it on my students when I teach SF films. They're waiting for the last minute happy ending and they're stunned when it doesn't come.

Colossus thinks it can do a better job managing human affairs and has appointed itself as a guardian for humankind. It will be the ultimate objective government, sharing all resources for the better.
Yet the thought of a non-human dictator is especially creepy, almost admitting that humans are incapable of making rational decisions based on fairness and equality.
The concept is creepy and seductive at the same time.
I've heard that Ron Howard is attempting a remake. His TV mom from Happy Days was in the original movie.

I've never seen that movie, but will look for it.

In "Colossus?" I don't think so.

You are wrong, Daniel.
Marion Ross is one of Forbin's assistants in the main control room. One of the smaller roles, but you can see her name listed in the credits.

Thanks for the correction.

Any computer decision is going to be based on numbers: number served, efficiency &, I guess, need. Decisions by the numbers are always chilling & becoming more the norm as our society becomes bigger & more unwieldy. Look at the digital TV conversion that was just put on hold. 65 million (?) without a converter was a big enough number to stop a project that's been in the works for years.
Which electric poles to fix after a storm is often decided on 'best bang for the buck' thinking. We have a lot of that going on now here in KY & the news is having a field day with it. Number decisions are not always wrong. They're probably more often right, but it just seems so heartless, & what numbers are being used is always the question. Is it better to restore power to 20 healthy people or 8 elderly ones? How much do pets & kids weigh in?
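To make the 'decisions by the numbers' idea concrete, here's a toy sketch (entirely my own made-up example, not how any real utility dispatches crews) that ranks repairs purely by customers restored per crew-hour. Notice that the elderly only count if somebody chooses to encode them as a number and pick a weight:

```python
# Toy "decision by the numbers" triage for storm repairs.
# All names, numbers, and weights below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Outage:
    location: str
    customers: int      # people restored if this line is fixed
    crew_hours: float   # estimated labor to fix it
    elderly: int = 0    # invisible to the computer unless we encode it

def score(o: Outage, elderly_weight: float = 1.0) -> float:
    """Customers restored per crew-hour, optionally weighting the elderly."""
    weighted = o.customers + (elderly_weight - 1.0) * o.elderly
    return weighted / o.crew_hours

outages = [
    Outage("Rural Rd", customers=8, crew_hours=6.0, elderly=8),
    Outage("Main St", customers=20, crew_hours=4.0),
]

# With elderly_weight=1.0 the 20 healthy people always win;
# raise the weight and the ranking flips -- but a human picked that number.
for w in (1.0, 5.0):
    first = max(outages, key=lambda o: score(o, w))
    print(f"elderly_weight={w}: fix {first.location} first")
```

The chilling part isn't the arithmetic; it's that whatever doesn't get a number (and a weight) simply doesn't exist to the machine.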
I see the same decision made in another way. Web site writers often assume everyone has high speed. I don't at home, since it isn't available. The number of web sites I can even use is decreasing rapidly.
What if criminal jury decisions were left in the "hands" of a computer? The Prosecution and Defense would enter the relevant facts and it would render a decision in a few moments. This is interesting because a jury is not allowed, by law, to make a decision based upon sympathy, or facts not presented at trial. 12 ANGRY MEN is the perfect example: the jury actually broke the law in rendering their verdict! (Which I mention in my review.) I've seen juries deliver guilty verdicts and, when interviewed, say things like: "Well, we weren't sure but he really looked like a child molester". On the flip side, I've had acquittals based upon the notion that the jury thought the victim was a PoS (Piece Of Shit), and didn't care what the facts depicted. Anyway, my question is this: should we let important decisions be made by a completely neutral source? Or would sentience cause a computer (et al) to become emotional and unable to remain neutral? And if we surrendered this right to a computer... what next? LOGAN'S RUN, where everyone over 30 is executed to ensure survival of the species?
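Just to picture how "completely neutral" that computer juror would be, here's a deliberately crude sketch (pure hypothetical on my part, nothing like any real system): it can only weigh whatever facts get typed in, so "he looked like a child molester" never enters the calculation... but neither does mercy:

```python
# Deliberately crude sketch of a "neutral" computer juror.
# The elements, probabilities, and threshold are all hypothetical.

# Model the charge as elements the prosecution must prove, each with
# the probability the entered evidence supports it.
elements = {
    "defendant_present_at_scene": 0.95,
    "defendant_committed_the_act": 0.80,
    "intent": 0.60,
}

REASONABLE_DOUBT = 0.90  # a human still has to pick this number

def verdict(elements: dict, threshold: float) -> str:
    # Guilty only if EVERY element clears the bar. Sympathy, the
    # defendant's face, and anything not entered above cannot matter.
    if all(p >= threshold for p in elements.values()):
        return "guilty"
    return "not guilty"

print(verdict(elements, REASONABLE_DOUBT))  # -> not guilty (intent too weak)
```

Of course, garbage in, garbage out: the machine is only as neutral as whoever decides which "relevant facts" get entered.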

Alex, you bring up a good point. I've always wondered about those things. We like to watch 'Law & Order' & they often show real gray-area trials. I'm not sure I'd want a computer deciding them. Isn't there a saying, "When I was younger, I wanted justice. Now I just want mercy," or something like that? I'm old enough to understand & agree with it, anyway.
There are also instances where the jury is told to forget damning but inadmissible testimony. I never figured that worked well. It would with a computer.
Eyewitness testimony is notoriously unreliable, yet seems to weigh in the most with juries. I wonder how a computer would weigh it? Would we need to revise laws to work with a computer or AI?
Great discussion! One point that hasn't been brought up yet is the added complication of cultural bias and influence. We have huge gulfs in terms of sexual equality alone, even with UN regulations trying to impose basic rights, all around the world; imagine how difficult it would be with machines!
A good book on somewhat similar lines is 'Intelligence in Nature' by Jeremy Narby...
http://www.amazon.com/Intelligence-Na...

http://www.ew.com/ew/gallery/0,,20208...
Personally, I am still surprised that WALL-E was not nominated for Best Picture at the Oscars... SNUB!

We can feel pity for Hal, but should we be sorry? If someone wants to kill you, should you do nothing, even if he has "good" reasons? Are Hal's motivations in killing humans any different in the final analysis than the astronaut's reason for killing him? Does turning off Hal equate to death?

What does separate "us" from "them"? Is it empathy, as BLADE RUNNER suggests? I think it's a nice twist to the story that Roy Batty saves Deckard's life and possibly becomes (or at least acts) human just before he dies, while Deckard is the one murdering without emotion... even though, to him, they are only machines. Is it the soul? Since I personally don't believe in life after death (and there's no empirical proof of this ethereal spirit), how can that define a human? Is it imagination? Intelligence? Just throwing some ideas around to keep the discussion on track:)