I, Robot
Read September 9–10, 2025
18%
the three fundamental Rules of Robotics—the three rules that are built most deeply into a robot’s positronic brain.”
18%
“We have: One, a robot may not injure a human being, or, through inaction, allow a human being to come to harm.”
“Right!”
“Two,” continued Powell, “a robot must obey the orders given it by human beings except where such orders would conflict with the First Law.”
“Right!”
“And three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”
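(Note: the Laws form a strict priority hierarchy, each yielding to the ones above it. Below is a minimal sketch of that ordering in Python; it is not from the book, and the Law enum and permitted helper are hypothetical names of my own.)

```python
from enum import IntEnum

class Law(IntEnum):
    # Lower value = higher priority: the First Law outranks the Second,
    # which outranks the Third.
    FIRST = 1   # may not injure a human, or through inaction allow harm
    SECOND = 2  # must obey human orders, except against the First Law
    THIRD = 3   # must protect itself, except against the First or Second

def permitted(violates: set, upholds: set) -> bool:
    """An action is allowed only if every law it would violate is
    outranked by some higher-priority law it upholds."""
    if not violates:
        return True
    if not upholds:
        return False
    return min(upholds) < min(violates)

# Obeying an order into danger: violates Law 3, upholds Law 2 -> allowed.
assert permitted({Law.THIRD}, {Law.SECOND})
# Harming a human, even under orders: nothing outranks the First Law.
assert not permitted({Law.FIRST}, {Law.SECOND})
```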
18%
The conflict between the various rules is ironed out by the different positronic potentials in the brain. We’ll say that a robot is walking into danger and knows it. The automatic potential that Rule 3 sets up turns him back. But suppose you order him to walk into that danger. In that case, Rule 2 sets up a counterpotential higher than the previous one and the robot follows orders at the risk of existence.”
18%
“So Rule 3 has been strengthened—that was specifically mentioned, by the way, in the advance notices on the SPD models—so that his allergy to danger is unusually high. At the same time, when you sent him out after the selenium, you gave him his order casually and without special emphasis, so that the Rule 2 potential set-up was rather weak. Now, hold on; I’m just stating facts.”
18%
“You see how it works, don’t you? There’s some sort of danger centering at the selenium pool. It increases as he approaches, and at a certain distance from it the Rule 3 potential, unusually high to start with, exactly balances the Rule 2 potential, unusually low to start with.”
18%
Rule 3 drives him back and Rule 2 drives him forward—”
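(Note: the mechanism Powell describes is an equilibrium between two opposing potentials: a constant, weak Rule 2 drive toward the goal, set when the order was given casually, against a Rule 3 aversion that grows as the danger nears. The sketch below models that balance; the potential functions and constants are invented for illustration and are not in the text.)

```python
def rule2_potential(order_emphasis: float) -> float:
    # Drive toward the ordered goal; fixed by how forcefully the order
    # was given (Speedy's was "casual and without special emphasis").
    return order_emphasis

def rule3_potential(distance: float, self_preservation: float) -> float:
    # Aversion to the danger; rises as the robot closes the distance.
    # The SPD models carry an unusually high self_preservation gain.
    return self_preservation / max(distance, 1e-9)

def net_drive(distance: float, order_emphasis: float,
              self_preservation: float) -> float:
    # Positive: Rule 2 wins and the robot advances.
    # Negative: Rule 3 wins and the robot retreats.
    # Zero: the potentials balance and the robot stalls.
    return rule2_potential(order_emphasis) - rule3_potential(distance, self_preservation)

# With a weak order and a strengthened Rule 3, the drives cancel at
# r = self_preservation / order_emphasis, and the robot circles there.
weak_order, strong_rule3 = 1.0, 5.0
print(net_drive(10.0, weak_order, strong_rule3))  # > 0: still advancing
print(net_drive(5.0, weak_order, strong_rule3))   # = 0: equilibrium radius
print(net_drive(2.0, weak_order, strong_rule3))   # < 0: driven back
```

(Inside the equilibrium radius the net drive turns negative and the robot is pushed back out, which is why Speedy ends up circling the selenium pool rather than reaching it.)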
54%
All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and, to an extent, mentally, a robot—any robot—is superior to human beings. What makes him slavish, then? Only the First Law! Why, without it, the first order you tried to give a robot would result in your death. Unstable? What do you think?”
54%
Frankenstein Complex you’re exhibiting has a certain justification—hence the First Law in the first place. But the Law, I repeat and repeat, has not been removed—merely modified.”
82%
you just can’t differentiate between a robot and the very best of humans.”