3 Answers from Robin, by Bryan Caplan

Robin's answers to my three questions:


Let me emphasize yet again that I present myself in the
book as an expert on facts, not on values, so my personal values should
not be relevant to the quality of my book.



The fact that you ask #3 suggests you seek an integral over the value
of the entire future. In which case I guess I need to interpret #1 as
no age of em or any form of AI ever appearing, and it is hard to
interpret "hell" as I don't know how many people suffer it for how long.



But if 0 is the worst case and 10 is the best case, then humans
continuing without any AI until they naturally go extinct is a 2, while
making it to the em stage and therefore having a decent chance to
continue further is a 5. Biological humans going extinct a year later
is 4.9.

This is as I expected and feared.  Given Robin's values, The Age of Em's disinterest in mankind makes perfect sense.  But to be blunt, treating imminent human extinction as a minor cost of progress makes Robin's other baffling moral views seem sensible by comparison.

In practice, I'm sure Robin would be horrified to see robots wipe out everyone he knows.  Why he sees no need to reconcile his near-horror with his far-optimism is a deep mystery to me.

To be clear, I'm not even slightly worried about robots wiping out mankind.  But if it happened, it would be the worst thing that ever happened, and an infinite population of ems would not mitigate this unparalleled disaster.

Published on June 13, 2016 22:08