An argument that achieving millennial life spans or monumental intellects will destroy values that give meaning to human lives. Proposals to make us smarter than the greatest geniuses or to add thousands of years to our life spans seem fit only for the spam folder or trash can. And yet this is what contemporary advocates of radical enhancement offer in all seriousness. They present a variety of technologies and therapies that will expand our capacities far beyond what is currently possible for human beings.

In Humanity's End, Nicholas Agar argues against radical enhancement, describing its destructive consequences. Agar examines the proposals of four prominent radical enhancers: Ray Kurzweil, who argues that technology will enable our escape from human biology; Aubrey de Grey, who calls for anti-aging therapies that will achieve "longevity escape velocity"; Nick Bostrom, who defends the morality and rationality of enhancement; and James Hughes, who envisions a harmonious democracy of the enhanced and the unenhanced.

Agar argues that the outcomes of radical enhancement could be darker than the rosy futures described by these thinkers. The most dramatic means of enhancing our cognitive powers could in fact kill us; the radical extension of our life span could eliminate experiences of great value from our lives; and a situation in which some humans are radically enhanced and others are not could lead to a tyranny of posthumans over humans.
Humanity's End offers one of the more sensible arguments against the pursuit of radical enhancement. Agar develops his argument (what he calls the species-relative argument) in the first two chapters, then goes on to offer an interesting wager argument against mind-uploading (Kurzweil). The chapter on de Grey is not all that critical, even though SENS is highly speculative. The chapter on Bostrom wasn't that memorable, but the chapter on Hughes was useful (more so than anything Fukuyama has written on moral status and human dignity). Final thoughts: Agar writes well and is very clear, but he needs to do more work to make his species argument compelling (i.e., show that we would lose some objective, special value if we were to transcend the human species). I also find the strictness of his precautionary approach problematic.
Two and a half stars. If he had cut the subtitle and left out his opinions, it would have been 5/5, and a wonderfully depressing but well-informed prediction of, indeed, humanity's end.
Would serve as a decent introduction to the whole future-tech craziness, as he nicely presents a little of it all, with a good sprinkling of the latest advances in science and tech, plus the ones predicted soon, and all in quite easily parsable prose too. He quite rightly presents the worrying aspects of longevity, radical enhancement, AI, and 'posthuman' society in general... but he derails often. He has injected sentimentality about his feelings of being human into the whole thing, even though he's not sure what being human really is (not that I claim to be either), and it comes off very weak and woolly. Half the time he claims to be presenting his 'rational' arguments for why we shouldn't do all these things, but I keep waiting for the punchline of those arguments, and it never comes. Pop culture references, weird analogies, frequent references to others' opinions, repetition, but in the end it's just his gut feeling that we should stay human, whatever that means.
It is scary thinking about the future. It seems unavoidable that posthumans will come about, and democracy may well become a more difficult thing. We may now be living in the brief window of closest approach to a fully democratic society.