SciFi and Fantasy eBook Club discussion

General Topics > 3 Laws of robotics, are they valid?


message 1: by Varun (new)

Varun (varunsayal) | 16 comments 3 Laws of robotics, are they valid? Or were they just a wishful thought Asimov had?

message 2: by Will (new)

Will Once (willonce) | 121 comments The three laws aren't needed just yet and may never be. We don't yet have AI with sufficient sentience to understand the concepts of harming a human or protecting its own existence. Existing autonomous or semi-autonomous systems, such as self-driving cars, are programmed to avoid operations that could harm humans or the systems themselves, but they have no concept of "humans" or "existence".

When we do develop systems that are truly capable of "thinking" for themselves, we will indeed need to program them to obey a wide range of restrictions, such as protecting the environment, avoiding damage to property, limiting noise, etc. For example, a self-driving car will be programmed not to exceed the speed limit or run red lights or ... you get the picture.
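To give a flavour of what "programmed restrictions" might look like in practice, here is a minimal sketch (all names and numbers are hypothetical, not any real vehicle's software): the system's restrictions are hard-coded rule checks applied to a proposed action, rather than anything the AI "understands".

```python
# Hypothetical sketch: hard-coded restrictions as rule checks on a
# proposed action. Names, structure, and the limit value are invented
# for illustration only.

SPEED_LIMIT_KPH = 50  # assumed local speed limit

def is_action_allowed(action):
    """Reject any proposed action that violates a hard-coded rule."""
    # Rule: never exceed the speed limit.
    if action.get("speed_kph", 0) > SPEED_LIMIT_KPH:
        return False
    # Rule: never proceed through a red light.
    if action.get("light") == "red" and action.get("move") == "proceed":
        return False
    return True

proposed = {"move": "proceed", "speed_kph": 60, "light": "green"}
print(is_action_allowed(proposed))  # False: exceeds the speed limit
```

The point of the sketch is that the car never reasons about "harm"; it just fails a check, which is very different from Asimov's laws.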

Military AIs will presumably not have restrictions on harming humans or allowing themselves to be destroyed. And they almost certainly won't be programmed to allow any human to give them instructions.

So I'm afraid the three laws are mostly a literary device. They're a way of creating enough tension to generate stories, but actual AI is likely to be far more complex.

BTW, E.C., I think the short story you are looking for is called "Runaround".
