Stupid Worse Than Evil? #science #robotics #technology #artificialintelligence #AI

In my science fiction story about colonizing Mars, I send robots ahead to prove out construction methods and then to work with the colonists. The colonists have an Artificial Intelligence to rely on, too. It seems unreasonable to write about humanity's near future in space without robots and AIs.


But are they a good idea?


Some have speculated that robots might kill us off even if programmed to create maximum happiness: since humans are a sorrowful lot, an AI might decide that eliminating humans would make the world a happier place. Or maybe they'll just decide we're too stupid to keep around (which is sorta what happens in The Terminator).


Or disaster may be our own fault.


Humans can be way too trusting of robots, and our inclination to follow our robotic overlords could be a very dangerous behavior, one that needs to be taken into account when designing everything from autonomous vehicles to emergency evacuation tech.

(Image: HAL – not evil but deadly)


Scientists developing robots to lead people to safety during high-rise fires discovered that people would follow the robot even when it made obviously dangerous and ridiculous errors. We seem all too ready to shift our brains into neutral and follow orders.


I see echoes of this problem in myself. I can no longer remember the date because my phone will tell me. I get in a car without a thought to where I'm going because the GPS will tell me. I'm ready to hop into a self-driving car, too. Good luck to me.


How about you? Ready to let an AI drive you to the grocery store? Or fly you to Mars?


Thanks to fastcodesign.com


Filed under: Kate's Books, Neat Science News, Science Fiction Tagged: AI, Artificial intelligence, best sci fi, hardsf, Mars colony, near future in space, programming, robot, science fiction, sf
Published on March 12, 2016 06:47