The Singularity is Near
discussion
The Likelihood of the Singularity

message 1: by Jordan, Jan 06, 2009 07:06PM


I'm not completely through the book and have only stumbled into the 3rd chapter, but I'm inclined to get through the rest and see what he's thinking.
To answer your question of likelihood... well, that's a bit tough. I do believe he's accurate in saying that Biotech, Nanotech, and Super AI will ultimately lead to a greater humanity, but I'm skeptical of how society will adopt these changes even when there's nothing to hold them back.
As for his estimate of reaching this Singularity within a few decades, well... I don't know. I hope it's within the next 30 years since I may be able to take advantage of it myself :)

Space here doesn't allow for a lot of elaboration. But the truly engaged reader may want to get a copy of the newly published DVD documentary THE TRANSCENDENT MAN: it's an excellent profile of Kurzweil, but even-handed enough to disclose some of the seams in his "reasoning."

Unfortunately, nature may not be willing to live in harmony with us indefinitely. More specifically, there are existential threats such as large asteroids and supervolcanoes that are not of mankind's making. Therefore, although we could probably do with less mindless consumption, it behooves us to build up our scientific know-how, our mastery of nature, if we want to be able to deal with whatever nature dishes out.
One important way to increase our know-how is to increase our intelligence. Also, since things could go very wrong for us on earth, if we want to preserve our knowledge we shouldn't keep all of our eggs in one basket. Therefore, building settlements in space should be one of our objectives. And, since it appears that humans might have a hard time living in space, we may need to adapt the human form to make this possible. Hence the interest in transhumanism.
If one adopts a transhumanist view of economics, the traditional factors of production (land, labor, capital) should be replaced with more general factors such as knowledge, energy, and materials. The buildup of knowledge gives us greater power to capture and direct energy to transmute matter into more useful forms. The buildup of knowledge also allows us to deal with the grand challenges that were introduced many years ago in books like The Limits to Growth.
One way to accelerate the buildup of knowledge, and to integrate knowledge from diverse sources, is to develop artificial intelligence software. Of course, since many fear that AI software may become smarter than us, we look for ways to keep up. In his latest book - How to Create a Mind - Kurzweil predicts that we will develop brain-computer interfaces that expand the capacity of our neocortex. In his vision of the future, enhanced humans will eventually do most of their thinking in the cloud.
If I'm still alive when/if this becomes an option, I'd probably want to be one of the enhanced humans. And, if immortality became an option, I'd probably go for that, too. Of course, earth could become fairly crowded if we all became immortal beings capable of replicating ourselves. Therefore, space colonization will also be in the cards. Since it may be very hard for unenhanced humans to live long in space, my guess is that those who do live in space will become cyborgs or robots. Now, we may want to be robots that retain human form (or we may want to switch from one form to another). This idea has been around for a long time in science fiction - e.g. Blade Runner, Battlestar Galactica, etc.
In any event, that's my take on why some of us may indeed end up as robots in a post-singularity world far, far away.


That human factor is called fear. Fear begets all sorts of other things, like overregulation and social stigma for folks who even contemplate such technological advances as being good for humanity. I think it is possible for fear to push us in the opposite direction, to create a negative singularity with its own kind of gravity. The negative singularity could pull us away from the exponential growth described in The Singularity is Near.
Right now, there are people who are afraid ChatGPT will take over their lives and potentially pollute the flow of information because it has ingested garbage from the internet. The problem is that they are not wrong to believe that, but their thinking on the matter is incomplete.
I'm an author. I wrote an entire science fiction space opera about a technological singularity, and ChatGPT emerged during the writing process. I was not afraid that it would write a better story than mine, because it has no meaningful way to ideate. That said, I can certainly see how it could one day be much more successful at replacing authors. Instead, I use it as a tool. I ask it questions to test ideas and themes. I even ask it basic questions about editing. You'd be surprised how many editing problems can be resolved by asking a simple question about a sentence that for some reason doesn't feel right. My point is, like all technology, AI is a tool. If it becomes sentient, it can be a friend if we choose to be friends with it, or an enemy if we allow our fear of it to manifest as hate.
Nothing about life is static. Change is inevitable. Learn to adapt and you'll be able to navigate the chaos.
The eom Expression: Beautiful Chaos
