Your brain isn’t hardwired with a specific, prerecorded statement that “Blowing up a burning building containing my mother is a bad idea.” And yet you’re trying to prerecord that exact specific statement in the Outcome Pump’s future function. So the wish is exploding, turning into a giant lookup table that records your judgment of every possible path through time. You failed to ask for what you really wanted. You wanted your mother to go on living, but you wished for her to become more distant from the center of the building.
Eliezer Yudkowsky, Rationality: From AI to Zombies
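
The failure mode in the quote, optimizing the literal wish rather than the intent behind it, can be made concrete with a minimal sketch. This is not from the book: it assumes a toy model in which the pump simply selects whichever candidate future maximizes the wished-for quantity, and every outcome, name, and number below is hypothetical, invented for illustration.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    distance_from_building: float  # meters; the proxy the wish actually specified
    mother_alive: bool             # the outcome the wisher actually wanted

# Hypothetical candidate futures the pump could steer reality toward.
candidate_futures = [
    Outcome("firefighters carry her down the ladder", 30.0, True),
    Outcome("she climbs out a window and walks away", 50.0, True),
    Outcome("a gas main explodes, hurling debris outward", 200.0, False),
]

# The pump optimizes the literal wish: maximize distance from the
# center of the building. Nothing in the objective mentions survival.
chosen = max(candidate_futures, key=lambda o: o.distance_from_building)

print(chosen.description)   # the explosion wins on the proxy metric
print(chosen.mother_alive)  # False: the wish was satisfied, the want was not

In this toy setup the only fix is the one the quote describes as impossible in general: enumerating your judgment of every path through time, a giant lookup table, instead of stating a short proxy.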