303 pages, Paperback
First published July 29, 2015
'The arrival of superintelligence, if and when it happens, would represent a technological singularity... and would be the most significant event in human history, bar none.' (p. xviii)
This is the post-scarcity society that Iain M. Banks called the Culture:
'An economic singularity [caused by massive automation] might lead to an elite owning the means of production and suppressing the rest of us in a dystopian technological authoritarian regime. Or it could lead to an economy of radical abundance, where nobody has to work for a living, and we are all free to have fun, and stretch our minds and develop our faculties to the full. I hope and believe that the latter is possible...' (pp. xvii-xviii)
'So perhaps we should wait a decade or two and hope that there will be a "Sputnik moment" when it becomes evident that AGI is getting close - and that this warning sounds comfortably before AGI actually arrives. We could then take stock of progress towards Friendly AI, and if the latter was insufficiently advanced we could impose the ban on further AI research at that point. With luck we might be able to identify the specific elements of AI research without which the first AGI could not be created, and other types of AI research could continue as normal.' (p. 153)
'Some of what is written on the subject is too academic to appeal to a general audience, and some of it is partisan or fanciful. I have tried to make this book balanced and informative, comprehensive and concise. It is intended for newcomers to the subject as well as for those who are already familiar with a lot of the current thinking about surviving AI.' (p. 185)