Gendou's Reviews > The Singularity is Near: When Humans Transcend Biology

The Singularity is Near by Ray Kurzweil

Gendou's review
Jun 16, 2011

did not like it
bookshelves: non-fiction
Read from June 16 to 24, 2011

The book opens with the thesis: technological change is exponential!
This has been true for many measures such as micro-processor size, cost of mass-produced goods, etc.
It is not, however, a general rule of thumb to apply blindly to all things "technological"!
This seems to be Kurzweil's big mistake.
He extrapolates features of technology to an unrealistic infinity.

For example, Moore's law is running up against the quantum limit, so micro-processor scaling is exponential only up to a fast-approaching ceiling.
To take another example, the cost of an iPod may drop exponentially as you scale up production, but you can only sell so many iPods.
Once everyone has an iPod, you reach a market saturation limit, and the price stabilizes.
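The distinction being drawn here (growth that looks exponential early on but quietly turns into an S-curve as it hits a ceiling) can be sketched with a toy logistic model. The parameter values below are illustrative assumptions of mine, not anything from Kurzweil's book:

```python
import math

def exponential(t, x0=1.0, r=0.5):
    """Unbounded exponential growth: x(t) = x0 * e^(r*t)."""
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.5, K=100.0):
    """Logistic growth: nearly exponential at first, but it
    saturates at the carrying capacity K (the 'everyone already
    owns an iPod' or 'transistors hit atomic scale' limit)."""
    return K / (1 + (K / x0 - 1) * math.exp(-r * t))
```

Early on the two curves agree to within about a percent, so extrapolating from early data cannot tell them apart; by t = 50 the exponential has blown far past 100 while the logistic has flattened at K. That is exactly the trap of projecting an exponential trend "to an unrealistic infinity."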

The major claim in the book is that brains will merge with computers.
Kurzweil argues that since transistors are faster than neurons, they will make better brains.
The fallacy here is that you would ever WANT to build a brain out of transistors in the first place!
Neurons can network widely at a low price, but wide networks of transistors are slow and costly in power and heat.
No engineer would try to build a brain out of transistors.
The lesson here is that biological evolution, while its scope has been limited, will ALWAYS win in a contest with human engineers.
The better your artificial brain performs, the more it will look like a human brain.
This is not coincidence.

One important point made in the book is that our only problem need be to produce an intelligence greater than our own. Once this is accomplished, all other tasks can be left to that greater intelligence. While this is absurd applied to most practical problems like baking a cake, it makes some sense in the realm of AI. Here I disagree with Kurzweil, who asserts any such improved intelligence would be "non-biological". Heck, many parents achieve this goal by giving their child a good education!

Another thing he gets right is the demystification of Searle's Chinese Room thought experiment. Searle's objection to an artificial brain is wrong. Mind is platform-independent, but the port is yet to be written.

Ray makes the same tired arguments over and over, redundantly redundant.
You can almost hear him TRYING to keep alive his delusional dream of living forever!
It's sad.

Ray argues that a future AI will be produced with the ability to iteratively improve its own intelligence.
Among his many skills, he's an accomplished software engineer, so he should know better.
This will never, ever happen in this millennium.
Even if we assume computing resources increased by a dozen powers of ten.
Even if we "reverse-engineer the human brain" (whatever that means).

I know this review is getting long, but he does make some speculations about the year 2010, most of which never came to pass. He predicts we will have virtual assistants that can look up movie actors, etc., and that will respond to our vocal cues in virtual-vision contact lenses. Instead, we have Wikipedia on our iPhones. Pretty far off the mark, if you ask me.

By far the most annoying thing Ray brings up no fewer than 10 times is that the speed of light may be "circumvented". I swear, there's nothing sacred to this man! He embarrassingly bungles an explanation of quantum entanglement, calling it "quantum disentanglement" and mistaking spin axis for wave function phase. Yikes.

This book is a big house of mirrors meant to disguise the lunacy of the thesis.
Don't get me wrong, the mirrors are interesting to look at in their own right:
Nano technology, genetic engineering, genetic algorithms, neural networks.
It's fun stuff!
But the singularity is not near, it is the delusion of an old man who would like very much to live forever.
29 likes



Comments (showing 1-7 of 7)

David I agree with your review. I tried reading this book about a year ago. I gave up on it, because it seemed too optimistic and swimming in fantasy.

Thomas So yeah, they recently found that neutrinos travel faster than light...

Yaser Sulaiman ... and they recently found that the anomalous results may have been caused by a loose fiber optic cable. That is not to say that the speed of light is forever "sacred." It's not always foolish to challenge "sacred" assumptions or experimental scientific facts, but Kurzweil takes it to a whole new level.

Wayne And now it's 2013 and we have virtual assistants that can look up movie actors. Siri on iPhones and Google voice search/Google Now on Android phones have been out for ~1-2 years. So 1-2 years is not far off the mark in the grand scheme of things.

Elling Borgersrud Virtual assistants are crap outside of the English-speaking world, though.
Anyway: great book review! I have been looking for some serious criticism of the book. I was irritated by the breaking of the speed of light and the use of black holes as computers, but couldn't really find flaws in his other arguments. But I wanted to. This stuff is extremely radical, and if I am going to believe Kurzweil is right, I would have to rethink a couple of things and perhaps prioritize my life differently.
I didn't quite get the way you dismissed the argument about reverse-engineering the brain, though? Why is that stupid? The more we learn about the brain, the closer we will get to mimicking it, surely?

Jhora Zakaryan You haven't read the book carefully! Most of the "issues" with the author's logic that you raise here are answered inside the book...

It's true that Moore's law will not hold forever, and major companies like Intel understand that. Just try searching for Intel's roadmap, and you'll see that transistor shrinking will reach its limit in 2021, when 5nm processors are produced. After that, it will take a paradigm shift to keep exponentially improving processor designs, and the author suggests the next paradigm shift will be a transition to molecular computers. As the author predicted, the topic is now heavily researched and we are approaching that transition.

You argue that the fact that transistors are more efficient than neurons does not mean that we can build better brains. That wasn't his point: he is not talking about 1:1 low-level brain emulation. He claims that the biological brain is the product of evolution, which is based on natural selection, and that we can come up with a better approach to building a brain: we just need a simulation system (a system that solves the same problems but does not necessarily share the biological brain's design), not a low-level brain emulator.

From what I have read in your review, my guess is that you have a background in biology and/or physics. I am a software engineer, and I pretty much understand most of his thoughts (reverse engineering, for example). I'm not claiming that this is "the ultimate book", and I agree that the author extrapolates too much and does not provide proofs where necessary. Yes, the book has its flaws, but you can't just say that "it is the delusion of an old man who would like very much to live forever".

P.S. After all, he wrote that he didn't want to live "forever", he wanted to live "as long as he wanted" (and those two are very different things).

Jonathan S. Harbour All I can say, without sounding like a fanboy, is go back to the 1960s and read speculative fiction. They imagined smaller mainframes with super fast card readers. No one--or few--could imagine a PC, GPS nav, smartphones, because these things were inconceivable. So are arguments against progress. Stating that something will never happen, not in this millennium, reminds me of computer science 50 years ago. They had NO idea what was coming. In fact, Mr. Smug Reviewer, go back to 2005. 2006. Could anyone have imagined smartphones, app stores, TABLETS, and the like? I say not. Imagine, yes; like a Microsoft Tablet--something no one wants to use. It takes a Steve Jobs to throw a wrench into things. For the record, I didn't like this book either, but I also recognize closed-minded pessimism when I see it, as I am an expert.
