Science Fiction Films discussion

February Film Discussion > February Topic: Artificial Intelligence - Robots, Androids, Computers, Cyborgs, Replicants, et al


message 1: by Alex DeLarge (new)

Alex DeLarge | 342 comments Mod
We've already begun a very interesting discussion so let's continue under this heading:) I think we can discuss the theme in general or as it pertains to a specific film. I'm waiting for THE STEPFORD WIVES (original) so I'll bring my feelings here in a few days.

What separates "us" from "them"? Is it empathy, as BLADE RUNNER suggests? I think it's a nice twist to the story that Roy Batty saves Deckard's life and possibly becomes (or at least acts) human just before he dies, while Deckard is the one murdering without emotion...even though, to him, they are only machines. Is it the soul? Since I personally don't believe in life after death (and there's no empirical proof of this ethereal spirit), how can that define a human? Is it imagination? Intelligence? Just throwing some ideas around to keep the discussion on track:)

message 2: by Daniel (new)

Daniel | 39 comments I'd argue, in the case of "Blade Runner," it's memory. Deckard puts down Rachel by pointing out all her memories belong to other people and have been implanted in her. Later in the film, when Roy is dying, he recounts the things he's actually experienced and seen, the memories of which will die with him. Why does he tell them to Deckard? Because he wants to be remembered after he's gone.

message 3: by Tom (new)

Tom | 166 comments Alex, I'm going to have to disagree with you. I don't see Deckard in BLADE RUNNER murdering without emotion. The only actual onscreen retirements he performs upset him terribly, and there has to be some reason why he quit the police force.

message 4: by Daniel (new)

Daniel | 39 comments Well it's interesting that the only retirements we see him perform are of the two women who came back with Roy, and both are presented as long, drawn out, brutal affairs.

His life is also saved twice, each time by a replicant.

message 5: by Alex DeLarge (last edited Feb 02, 2009 06:12AM) (new)

Alex DeLarge | 342 comments Mod
Rachel asks him if he ever retired a human by mistake and his answer is seemingly cold indifference...but that could be his emotional barrier to keep his sanity. Tom, I think you're right in that it does affect him greatly which is probably while he retired (no pun intended), but the way he does the killing is cold and calculating: he shoots the one in the back! Memory is interesting but faulty, does it make us human? Especially if the memory does not belong to us. Animals have memory, my laptop has memory: maybe it's the intent to communicate and be remembered that make him act human. Is homo sapien defined only by a specific DNA code? That's a neat idea except that the replicants have artificial DNA exactly like ours, only with a built in expiration date.
Just some thoughts early on Monday morning.

message 6: by Jim (new)

Jim (jimmaclachlan) Alex DeLarge wrote: "...but the way he does the killing is cold and calculating: he shoots the one in the back! ..."

If you're going to shoot, you shoot when you have the best shot. Pulling the trigger or making the decision to do so is the deciding point. The direction the quarry faces should have nothing to do with the shooter's emotions. That way lies madness. By looking at the target's face, he would probably have a harder time of shooting - the target might become a person.

I agree that he's cold & calculating, probably to mask his actual feelings. Duty, responsibility & public opinion have driven a lot of men to do things they find disturbing & repugnant. Sometimes it gets to a point where they can no longer handle it. It's a common theme in war movies.

message 7: by Alex DeLarge (new)

Alex DeLarge | 342 comments Mod
I could tell you that any police officer who shoots a suspect in the back during a chase would probably be brought up on charges himself. An officer has to judge if there is an immediate physical danger, to himself or the community, and this nanosecond decision must be made without fault: that's why I remain an investigator and not a street cop:) But I don't think that's the point of the film; it's not a police drama. Though it does play on Film Noir crime drama conventions, and protagonists rarely (I can't recall one) kill the femme fatale by shooting her in the back; this scene throws us off-kilter and sickens us, even though we know she's a replicant and a murderer herself/itself. I like the shot of her bloody corpse from Deckard's perspective (looking down) because she looks like a dead woman...not a machine. I agree, Jim, killing this way does dehumanize the event for Deckard, and I guess he didn't have much of a choice. She sure as hell wasn't going to stop, because there is no arrest for her...only retirement. And cinematically it was a thrilling scene, especially in the remastered version. Now that he is attracted to Rachel, the line between retirement and murder is very thin indeed.

message 8: by Kandice (new)

Kandice I have seen numerous "versions" of the movie (Blade Runner) and wonder, does anyone hold with the idea that Deckard may be a replicant himself? Dick's book DID NOT portray him that way, but Scott says he purposely made that point vague and up to the viewer's discretion. He did point more in that direction with subsequent releases, but how does this change the debate? If Deckard himself is possibly not human, how can he judge and mete out justice? Can he?

message 9: by Tom (new)

Tom | 166 comments Okay, maybe I'm just getting some terms wrong. For me, cold and calculating is James Mason in NORTH BY NORTHWEST saying that these matters are best disposed of from a great height, over water. Or Alex Sebastian and his mother in NOTORIOUS, or Mr. Potter keeping the money he's found after realizing what it is in IT'S A WONDERFUL LIFE, or Roy Batty and Pris exchanging some rather loaded glances after getting Sebastian to agree to take Roy to see Mr. Tyrell.

Deckard's onscreen retirements are brutal ugly messes, at least one of which borders on self-preservation. He is terribly shaken by the botched retirement of Zhora, even before being nearly retired himself by Leon. Not cold, not calculating at all, I'd say.

message 10: by Jim (new)

Jim (jimmaclachlan) Alex DeLarge, I guess my unstated point was that Deckard isn't a cop, but has a very specific job, like a guy who wants to bag a deer or a military sniper. He's an exterminator, a hunter-killer. Part of that persona is the decision to kill the target he's pointed at before he actually sees it. So cold & calculating are prerequisites & prey direction is meaningless.

He was told they were sub-human, the most important point here. He hardened himself to kill them because it was his duty & that's how he 'knows' it isn't murder. OK, they look human, but it's his responsibility to kill them because they aren't. Then he finds out they're possibly more human than he is. Suddenly the cold steel around his emotions, that's been eroding anyway (why he retired), is blown away.

That's what the entire film was about for me. He was lied to, lied to himself & that's been bugging him. He retired because the lie was wearing thin at a subconscious level. Then the 'system' drags him back in. Slowly the truth is forced upon him & crashes in severely during the scene when Batty saves him & describes his life as he dies. He's left with no meaning in his life.

Not sure I'm putting that as well as I could...

message 11: by Jim (new)

Jim (jimmaclachlan) Kandice, I also read the book & no matter how different the movie was, it couldn't be that much different. No way was Deckard a replicant. It would defeat the entire point of the story, IMO.

The point I saw in the movie was the difference between human & replicant. Neither Deckard the human nor his human friends were any better than the despised replicants. In fact, they may not have been as human, due to their dehumanizing society. The book brought out the society portion even better, with him & his wife depending on pills for their moods. The movie did it well with Deckard being forced into hunting again & the crowded street scenes.

message 12: by Kandice (new)

Kandice I agree. I always try to see movies adapted from books as different entities, so I am not confused. I have never liked the idea of Scott trying to push on us the idea that Deckard may be a replicant. I think there is enough internal conflict to prove the point without that. I also agree that the point IS that just because the humans ARE humans does not make them better than the replicants. They too have hopes, desires, memories (as someone said), and urges to do what is right, such as saving Deckard's butt! The fact that they are created by men and have less time to realize these hopes is immaterial. They should still be allowed them.
The line between the replicants and actual humans has become blurred enough, in my opinion, that they need either to be treated AS a race, or not be manufactured anymore. To continue in the same fashion is slavery and subsequent genocide.

message 13: by Alex DeLarge (new)

Alex DeLarge | 342 comments Mod
Jim, we're definitely on the same page! I think as a Blade Runner he needs to keep his job from merging with his personal life. And Tom, I think that's where I was going with my cold and calculating theme: not that Deckard is totally unfeeling, but he is a human being first...and a cop second. As a cop, he is hard-boiled, tough-talking, and keeps his emotions in check. But this conflicts with his warm human nature, which leads to his retirement from the force even though he's the best. Before the film begins, this spiritual dilemma between man and machine is already boiling under the surface, and when he is attracted to Rachel the line merges and he's lost.

I love this forum because it helps me to bounce ideas and concepts around and get intelligent feedback!

Kandice, your insight is totally spectacular!! When do robots (etc...), even though they aren't human, gain Constitutional rights?? When Dave disconnects HAL, is it justified homicide?

message 14: by Kandice (new)

Kandice Thanks Alex. *blushes*

Personally, when they become self aware, I think they should have inalienable rights. Short of that, the point at which they can be personally held responsible for their actions. When they can be punished in some way other than termination. If we punish them and expect them to learn from their mistakes, aren't we granting them the flip side of inalienable rights? Doesn't that also show that we expect them to learn and develop? Not from our improvement of their programming, but from their own experiences and reactions to them.

Yes, I think in the case of Hal it WAS homicide. A clear cut case of the needs of the many outweighing the needs of the few. (or the one)

message 15: by Daniel (new)

Daniel | 39 comments I think in the case of HAL, tragic though his "death" is, it was rather clearly a case of self-defense. HAL had killed everyone else on the ship and tried to kill Dave (by not letting him back on). What more evidence would be needed?

message 16: by Alex DeLarge (last edited Feb 03, 2009 08:47AM) (new)

Alex DeLarge | 342 comments Mod
But who was acting in self defense? HAL killed the crew because he was going to be disconnected, so HAL's actions were self defense too. HAL's intent to kill wasn't formed until Dave and Frank began discussing his "murder". And the source of HAL's mistake is never explained, except as "human error". I always felt that HAL became too human and therefore naturally prone to error, especially because HAL was programmed to lie to the crew, to be secretive until they reached their destination. But I read an interesting review by Harlan Ellison, and he believed that the alien intelligence made the computer malfunction. I like HAL's gentle voice in contrast to Dave and Frank's almost emotionless dialogue. Who was more human?

message 17: by Daniel (new)

Daniel | 39 comments Clearly HAL is the most "human" character in "2001," even though Kubrick scrapped dialogue recorded by Martin Balsam and had it redone by Canadian actor Douglas Rain. HAL has "feelings" whereas no one else in the film engages in anything except small talk and exchange of technical information.

Nonetheless, HAL's reaction -- while understandable -- is a gross overreaction to the perceived threat and constitutes cold-blooded murder. Dave does not "kill" Hal, but performs the computer equivalent of a lobotomy, looking solely to remove the actual threat.

message 18: by Jim (new)

Jim (jimmaclachlan) Hmmm... If someone told me they were going to lobotomize me or kill me, I wouldn't see a difference. Death of me (ego, self) in any case. I'd try my best to do them in first.

I don't think that self-awareness bestows inalienable rights. It's a gray area, but animals are certainly self-aware & while ours are treated as well as the kids (& often confused with them), they don't have the same basic rights & never will. It's not just intelligence, but a matter of survival. I'm at the top of the food chain & I intend for me & mine to stay there. Yes, I'm selfish & no, I don't have a problem with it. I've raised a lot of animals just to eat. I treated them well, but that was their place in life.

That goes for any self-aware computers that may move in. They'll be welcomed & treated as well as we can, but if it comes down to keeping the family warm or giving them the last electric cord, the family stays warm. If they threaten me or mine, they're out.

message 19: by Kandice (new)

Kandice I am certainly in agreement with you about our being the top of the food chain, and have no problem reaping the benefits of that. I'm not a vegetarian, and I wear leather, and am still able to treat animals very well.
Where I disagree is that we are talking about something we, as a race, will have developed, designed and eventually manufactured. We did not make or create animals. If our creation outgrows what we intended and develops sentience on its own, I still think it would be wrong to continue to manufacture them to be used and, eventually, disposed of. I think they will have earned rights.

About Hal, I do think he "overreacted" but agree that he was fearful for his life or existence, overreaction or not. I know officers of the law aren't supposed to shoot to kill when someone is running away, but I think in some instances it's possible to kill in self defense, even when the one you are killing isn't, at that second, threatening you.

If my children and I, and a dangerous person, were stranded on a spaceship in outer space, and he had attempted to hurt, rape or kill us, for whatever reason, I would have no problem lying in wait or setting traps to try and kill HIM. We can't escape, probably can't hide forever, and if I wait until he is in the process of accosting us, I may lose. I still think that would be self defense.

message 20: by Alex DeLarge (last edited Feb 03, 2009 03:30PM) (new)

Alex DeLarge | 342 comments Mod
This is a great discussion friends:)

Do you feel sorry for HAL when he's lobotomized? Frank's death scene is totally silent and cold, and HAL's is given an emotional depth that makes me very sad. Kubrick has toyed with our perspectives and flipped our empathy towards the machine...or is Homo sapiens just a machine too, created by someone/something else? I ask that in a science fiction sense, not born of a personal belief. Many of you may believe in a higher power (that's cool) but I'm a non-deist. Though BLADE RUNNER's allegory is of wo/mankind facing their creator and demanding a fix to that one fatal flaw, Death: "I want more life...fucker."
[I don't like the edit to the New Cut, because that's what I would say to a malignant designer:]

And the self defense discussion is a whole 'nother can 'o worms! This is where fiction, especially crime drama conventions, and reality diverge greatly:) But the first question is this: would the law even apply to a non-living creation? You can't murder a toaster, though I've technically done it before, and be legally accountable. But if that toaster were sentient (like the one in the series Red Dwarf), would that be homicide? It all comes back to the decision of giving Constitutional rights to androids, et al. Hell, I still find it difficult to believe that in my father's lifetime children had no rights (my father worked in a Hershey "sweatshop" as a child) and minorities were treated as less than human. And we could continue (ad infinitum it seems) to name the social injustices that still continue today:( Boggles my freakin' mind. Just a quick note: read Phil Dick's THE CRACK IN SPACE, because I think it's the first science fiction novel that depicts a black president, Jim Briskin!

So we're in one sense describing the future of racism: not of humans but of mechanical beings. I read Asimov's CAVES OF STEEL as a child and this is the very issue he questioned, but it has been many years since I cracked the spine. I know I have my mother's first edition around here somewhere; thankfully, she saved every book she ever read, and shared them with me. Sigh, I do miss her even though it's been 16 years since she passed.

message 21: by Kandice (new)

Kandice "So we're in one sense describing the future of racism: not of humans but of mechanical beings."

That is exactly what I think!
Yes, I did feel very sad at Hal's demise. I know I was manipulated into feeling that way by the director, but it was still very moving. The fact that we can feel sadness or distress of any kind at the passing of a machine...just goes toward the idea that it really is possible for them to "become" a race. To achieve rights (of any kind).

I think it's funny that you mention the lack of children's rights in our past. People living in Dickens's time could never have imagined children being granted any rights. Knowing that, is it really that hard to believe we may grant rights, of some kind, to something we created, in the future? (the far, far future, I admit!)

message 22: by Jim (new)

Jim (jimmaclachlan) I haven't read Asimov's books in years. I probably should. Smart man, but I do vaguely recall that issue & you're right. I also worry about the pendulum swinging too far the other way.

I understand the argument for 'other entity rights' &, to some extent, agree with it, but it's a matter of balancing the rights against the responsibilities intelligently. Agreed, the picture painted in Blade Runner is complete oppression & exploitation of the replicants & that isn't right, but we have to be careful when granting rights too. It often goes too far, especially at first.

Kids' rights are a good example. While I don't agree with having sweatshops for kids, I don't think it is right for parents to have no say in a 14-year-old's decisions on having a baby when they're the ones responsible. I think this is a situation where 'rights' have swung too far.

Maybe I'm just mean & selfish, but some things, like animals, will never have the same or full rights in my eyes. My eyes, no matter what the law says.

message 23: by Kandice (new)

Kandice I don't think that makes you seem mean or small in any way! You're right. There would need to be a definition and very firm line to separate human rights and android/computer/robot rights (whatever), but at some point they would have earned at least a modicum of rights. (a lot of "rights" in that sentence;)
As far as the kid thing goes, that is a perfect example. I hate reading stories about parents being unable to parent effectively because it infringes on the rights of a child. That's ridiculous. How do we fix that, though? Who decides?

message 24: by Manuel (new)

Manuel | 144 comments Does anyone remember "Colossus" from the movie "The Forbin Project?" 1970

I have always found the end of that movie rather chilling. Colossus, the supercomputer, has become sentient; originally designed only to coordinate America's defence, it eventually merges with its Soviet counterpart. It decides it can do a better job of running human affairs, will make war impossible, and turns itself into "World Control".
At the end of the movie, Colossus confronts its creator (Prof Forbin) and declares....
"at first you will resist, but eventually you will love me"

message 25: by Daniel (new)

Daniel | 39 comments Manuel wrote: "Does anyone remember "Colossus" from the movie "The Forbin Project?" 1970

I have always found the end of that movie rather chilling. "

I love springing it on my students when I teach SF films. They're waiting for the last minute happy ending and they're stunned when it doesn't come.

message 26: by Manuel (last edited Feb 04, 2009 02:05PM) (new)

Manuel | 144 comments I've heard some people compare Colossus to the computers in "Terminator", but the big difference in the Terminator series is that the computers decide they don't need human beings.

Colossus thinks it can do a better job managing human affairs and has appointed itself as a guardian for humankind. It will be the ultimate objective government, sharing all resources for the better.

Yet the thought of a non-human dictator is especially creepy, almost an admission that humans are incapable of making rational decisions based on fairness and equality.

The concept is creepy and seductive at the same time.

I've heard that Ron Howard is attempting a remake. His TV mom from Happy Days was in the original movie.

message 27: by Kandice (new)

Kandice Well, if they can weigh both sides of an issue objectively, without emotion, they may be able to make a better decision. How horrible that I just typed that, but it's true, right? The only drawback is that there ARE decisions that need to be tempered with humanity. It's not a black and white world.
I've never seen that movie, but will look for it.

message 28: by Daniel (new)

Daniel | 39 comments Manuel wrote: "Ive heard that Ron Howard is attempting to make a remake. His TV mom from Happy Days was in the original movie. "

In "Colossus?" I don't think so.

message 29: by Alex DeLarge (new)

Alex DeLarge | 342 comments Mod
I need to see COLOSSUS again, it's been years. Now moved to top of my queue!

message 30: by Manuel (new)

Manuel | 144 comments
You are wrong Daniel.
Marion Ross is one of Forbin's assistants in the main control room. One of the smaller roles, but you can see her name listed in the credits.

message 31: by Daniel (new)

Daniel | 39 comments Then it's either a tiny role or she's almost unrecognizable. I'll have to look the next time I use the film in class.

Thanks for the correction.

message 32: by Jim (new)

Jim (jimmaclachlan) I vaguely remember reading a trilogy about 'Colossus' many years ago. I think the last one was 'Colossus & the Crab'. Author Jones? It was chilling. I don't recall the 'Forbin Project' but now I want to see it. That sounds similar.

Any computer decision is going to be based on numbers: number served, efficiency &, I guess, need. Decisions by the numbers are always chilling & becoming more the norm as our society becomes bigger & more unwieldy. Look at the digital TV conversion that was just put on hold. 65 million (?) without a converter was a big enough number to stop a project that's been in the works for years.

Which electric poles to fix after a storm is often decided on 'the best bang for the buck' thinking. We have a lot of that going on now here in KY & the news is having a field day with it. Number decisions are not always wrong. They're probably more often right, but it just seems so heartless, & what numbers are being used is always the question. Is it better to restore power to 20 healthy people or 8 elderly ones? How much do pets & kids weigh in?

I see the same kind of decision made in another way. Web site designers often assume everyone has high-speed internet. I don't at home, since it isn't available. The number of web sites I can even use is decreasing rapidly.

message 33: by Daniel (new)

Daniel | 39 comments "Colossus: The Forbin Project" is the film version of the first book in the series.

message 34: by Alex DeLarge (last edited Feb 05, 2009 07:57AM) (new)

Alex DeLarge | 342 comments Mod
What if criminal jury decisions were left in the "hands" of a computer? The Prosecution and Defense would enter the relevant facts and it would render a decision in a few moments. This is interesting because a jury is not allowed, by law, to make a decision based upon sympathy, or facts not presented at trial. 12 ANGRY MEN is the perfect example: the jury actually broke the law in rendering their verdict! (Which I mention in my review.) I've seen juries deliver Guilty verdicts and, when interviewed, say things like: "Well, we weren't sure but he really looked like a child molester". On the flip side, I've had Acquittals based upon the notion that the jury thought the victim was a PoS (Piece Of Shit), and didn't care what the facts depicted. Anyway, my question is this: should we let important decisions be made by a completely neutral source? Or would sentience cause a computer (et al) to become emotional and unable to remain neutral? And if we surrendered this right to a computer.....what next? LOGAN'S RUN, where everyone over 30 is executed to ensure survival of the species?

message 35: by Jim (new)

Jim (jimmaclachlan) Thank you, Daniel. I wonder if I still have those books. I think I got rid of them when I moved.

Alex, you bring up a good point. I've always wondered about those points. We like to watch 'Law & Order' & they often show real gray area trials. I'm not sure I'd want a computer deciding them. Isn't there a saying, "When I was younger, I wanted justice. Now I just want mercy." or something like that? I'm old enough to understand & agree with it, anyway.

There are also instances where the jury is told to forget damning but inadmissible testimony. I never figured that worked well. It would with a computer.

Eyewitness testimony is notoriously unreliable, yet seems to weigh in the most with juries. I wonder how a computer would weigh it? Would we need to revise laws to work with a computer or AI?

message 36: by [deleted user] (new)

Great discussion! One point that hasn't been brought up yet is the added complication of cultural bias and influence. We have huge gulfs in terms of sexual equality alone, even with UN regulations trying to impose basic rights, all around the world; imagine how difficult it would be with machines!

A good book on somewhat similar lines is 'Intelligence in Nature' by Jeremy Narby...

message 37: by Angie (new)

Angie Perfect timing for this list of 20 hotshot movie robots:,,20208...

Personally I am still surprised that Wall-E was not nominated for best picture at the Oscars... SNUB!

message 38: by George (last edited Feb 12, 2009 06:14AM) (new)

George | 63 comments The triumph of both films is the fact that the question of what it means to be human remains so open. Why wouldn't Deckard shoot a machine in the back? What back? What's the difference between shooting a replicant and flipping a switch? What's to be gained in taking one on head on? What's the point in turning off a TV by hitting the power switch versus pulling the plug? Deckard's problem is that he's no longer capable of perceiving replicants as machines, and he questions the humanity of the biological humans directing his actions. He can no longer assume one is better than the other, or make any real distinction.

We can feel pity for Hal, but should we be sorry? If someone wants to kill you, should you do nothing, even if he has "good" reasons? Are Hal's motivations in killing humans any different in the final analysis than the astronaut's reason for killing him? Does turning off Hal equate to death?

message 39: by Angie (new)

Angie Anyone interested in a March movie? Maybe we could talk about a series? Alien? And watch all the films? Or we could pick a movie like Silent Running?

message 40: by Jim (new)

Jim (jimmaclachlan) Sure.

message 41: by Alex DeLarge (new)

Alex DeLarge | 342 comments Mod
I've been busy with work and haven't thought about it. Is February really almost over?

message 42: by Phillip (new)

Phillip it ends this weekend!

message 43: by Angie (new)

Angie So do we want to do a series? Maybe Predator, Aliens, Terminator? Or do a particular movie? Silent Running? 2001 Space Odyssey?
