The Singularity is Near: When Humans Transcend Biology
Read between March 29 - April 7, 2023
12%
The full-immersion visual-auditory virtual-reality environments, which will be ubiquitous during the second decade of this century, will hasten the trend toward people living and working wherever they wish. Once we have full-immersion virtual-reality environments incorporating all of the senses, which will be feasible by the late 2020s, there will be no reason to utilize real offices. Real estate will become virtual.
12%
another ramification of the law of accelerating returns is the exponential growth of human knowledge, including intellectual property.
12%
None of this means that cycles of recession will disappear immediately. Recently, the country experienced an economic slowdown and technology-sector recession and then a gradual recovery.
12%
because the rapid dissemination of information, sophisticated forms of online procurement, and increasingly transparent markets in all industries have diminished the impact of this cycle, “recessions” are likely to have less direct impact on our standard of living.
12%
The overall growth of the economy reflects completely new forms and layers of wealth and value that did not previously exist, or at least that did not previously constitute a significant portion of the economy, such as new forms of nanoparticle-based materials, genetic information, intellectual property, communication portals, Web sites, bandwidth, software, databases, and many other new technology-based categories. The overall information-technology sector is rapidly increasing its share of the economy and is increasingly influential on all other sectors,
12%
There has been a hundredfold increase in the number of college students. Automation started by amplifying the power of our muscles and in recent times has been amplifying the power of our minds. So for the past two centuries, automation has been eliminating jobs at the bottom of the skill ladder while creating new (and better-paying) jobs at the top of the skill ladder. The ladder has been moving up, and thus we have been exponentially increasing investments in education at all levels
12%
these views on exponential growth will ultimately prevail but only over time, as more and more evidence of the exponential nature of technology and its impact on the economy becomes apparent. This will happen gradually over the next decade, which will represent a strong long-term updraft for the market.
12%
Although people realized that stock values would increase rapidly, that same realization also increased the discount rate (the rate at which we need to discount values in the future when considering their present value). Think about it. If we know that stocks are going to increase significantly in a future period, then we’d like to have the stocks now so that we can realize those future gains. So the perception of increased future equity values also increases the discount rate. And that cancels out the expectation of higher future values.
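The cancellation argument can be checked with a toy present-value calculation. The growth and discount figures below are illustrative, not from the text:

```python
def present_value(future_value, discount_rate, years):
    """Standard discounting: PV = FV / (1 + r)^t."""
    return future_value / (1 + discount_rate) ** years

# An asset worth 100 today, under two perceived growth scenarios.
# If perceived growth rises from 5% to 15% per year AND the discount
# rate rises by the same amount, the present value is unchanged:
fv_slow = 100 * (1 + 0.05) ** 10   # expected value in 10 years, slow growth
fv_fast = 100 * (1 + 0.15) ** 10   # expected value in 10 years, fast growth
pv_slow = present_value(fv_slow, 0.05, 10)
pv_fast = present_value(fv_fast, 0.15, 10)
# Both come out to 100: the higher expected future value is exactly
# cancelled by the higher discount rate.
```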
12%
Although the trajectories predicted by the law of accelerating returns are remarkably smooth, that doesn’t mean we can readily predict which competitors will prevail.
12%
if you can build genuine AI, there are reasons to believe that you can build things like neurons that are a million times faster. That leads to the conclusion that you can make systems that think a million times faster than a person. With AI, these systems could do engineering design.
12%
Combining this with the capability of a system to build something that is better than it, you have the possibility for a very abrupt transition. This situation may be more difficult to deal with even than nanotechnology, but it is much more difficult to think about it constructively at this point. Thus, it hasn’t been the focus of things that I discuss, although I periodically point to it and say: “That’s important too.”
13%
“The future of integrated electronics is the future of electronics itself. The advantages of integration will bring about a proliferation of electronics, pushing this science into many new areas.”
13%
But the basic idea—the exponential growth of the price-performance of electronics based on shrinking the size of transistors on an integrated circuit—was both valid and prescient. Today, we talk about billions of components rather than thousands. In the most advanced chips of 2004, logic gates are only fifty nanometers wide, already well within the realm of nanotechnology (which deals with measurements of one hundred nanometers or less). The demise of Moore’s Law has been predicted on a regular basis, but the end of this remarkable paradigm keeps getting pushed out in time.
13%
nanotechnology offers many new knobs we can turn to continue improving the number of components on a die.”
13%
Computing with DNA. DNA is nature’s own nanoengineered computer, and its ability to store information and conduct logical manipulations at the molecular level has already been exploited in specialized “DNA computers.” A DNA computer is essentially a test tube filled with water containing trillions of DNA molecules, with each molecule acting as a computer.
13%
The goal of the computation is to solve a problem, with the solution expressed as a sequence of symbols.
13%
Here’s how a DNA computer works. A small strand of DNA is created, using a unique code for each symbol. Each such strand is replicated trillions of times using a process called “polymerase chain reaction” (PCR).
13%
Because DNA has an affinity to link strands together, long strands form automatically, with sequences of the strands representing the different symbols, each of them a possible solution to the problem.
13%
Since there will be many trillions of such strands, there are multiple strands for each possible answer (that is, ea...
This highlight has been truncated due to consecutive passage length restrictions.
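The generate-and-test flavor of DNA computing described in the passages above can be sketched in ordinary code. The symbol codewords and the constraint below are hypothetical stand-ins for the chemical encoding and the test-tube filtering steps:

```python
import random

# Hypothetical codewords: each symbol gets a unique short base sequence.
CODE = {"A1": "ACCT", "B2": "GGTA", "C3": "TCAG"}

def random_strand(symbols, length):
    """Mimic self-assembly: long strands form as random symbol sequences."""
    return [random.choice(symbols) for _ in range(length)]

def is_solution(strand):
    """Toy constraint standing in for the chemistry-based filtering steps:
    a valid strand contains each symbol exactly once."""
    return sorted(strand) == sorted(CODE)

def dna_compute(trials=100_000):
    """Massively parallel generate-and-test, one trial per 'molecule'."""
    symbols = list(CODE)
    for _ in range(trials):
        strand = random_strand(symbols, len(symbols))
        if is_solution(strand):
            return "".join(CODE[s] for s in strand)  # answer as base sequence
    return None
```

Like the test tube, this wastes almost all of its trials on non-answers and relies on sheer volume to make finding a valid strand overwhelmingly likely.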
14%
when an electrical charge moves, it causes a magnetic field, which is real and measurable. An electron can spin in one of two directions, described as “up” and “down,” so this property can be exploited for logic switching or to encode a bit of memory.
14%
The prediction that the Singularity—an expansion of human intelligence by a factor of trillions through merger with its nonbiological form—will occur within the next several decades does not depend on the precision of these calculations.
14%
To accomplish this, Watts explains, means “precisely measuring the time delay between sound sensors that are separated in space and that both receive the sound.” The process involves pitch analysis, spatial position, and speech cues, including language-specific cues. “One of the important cues used by humans for localizing the position of a sound source is the Interaural Time Difference (ITD), that is, the difference in time of arrival of sounds at the two ears.”
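A minimal digital version of ITD estimation picks the lag that maximizes the cross-correlation of the two ear signals and converts it to seconds. The sample rate, tone, and delay here are illustrative, not from the text:

```python
import math

def estimate_itd(left, right, sample_rate):
    """Pick the lag (in samples) that maximizes the cross-correlation
    of the two ear signals, then convert it to seconds."""
    n = len(left)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-n // 2, n // 2 + 1):
        score = sum(left[i] * right[i + lag]
                    for i in range(n) if 0 <= i + lag < len(right))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag / sample_rate

# Illustrative setup: a 1 kHz tone reaches the right ear 5 samples
# after the left ear (the sound source is off to the left).
rate = 48_000
tone = [math.sin(2 * math.pi * 1000 * i / rate) for i in range(256)]
delay = 5
left = tone[delay:]                  # left ear signal (arrives first)
right = tone[:len(tone) - delay]     # right ear signal (delayed copy)
itd = estimate_itd(left, right, rate)   # ≈ 5 / 48000 s ≈ 104 µs
```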
14%
Functional simulation of the brain is sufficient to re-create human powers of pattern recognition, intellect, and emotional intelligence. On the other hand, if we want to “upload” a particular person’s personality (that is, capture all of his or her knowledge, skills, and personality, a concept I will explore in greater detail at the end of chapter 4), then we may need to simulate neural processes at the level of individual neurons and portions of neurons, such as the soma (cell body), axon (output connection), dendrites (trees of incoming connections), and synapses (regions connecting axons…
15%
We will also be able to amplify the power of personal computers by harvesting the unused computation power of devices on the Internet. New communication paradigms such as “mesh” computing contemplate treating every device in the network as a node rather than just a “spoke.” In other words, instead of devices (such as personal computers and PDAs) merely sending information to and from nodes, each device will act as a node itself, sending information to and receiving information from every other device. That will create very robust, self-organizing communication networks.
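The node-versus-spoke idea can be sketched as a simple flooding scheme in which every device relays to its peers; the device names and topology are hypothetical:

```python
from collections import deque

def flood(adjacency, source):
    """Breadth-first flooding: every device relays to its peers, so a
    message reaches the whole mesh without any central hub."""
    delivered = {source}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency[node]:
            if neighbor not in delivered:
                delivered.add(neighbor)
                queue.append(neighbor)
    return delivered

# Hypothetical mesh of PCs and PDAs; each entry lists a device's peers.
mesh = {
    "pc-a":  ["pda-1", "pc-b"],
    "pda-1": ["pc-a", "pc-b"],
    "pc-b":  ["pc-a", "pda-1", "pda-2"],
    "pda-2": ["pc-b"],
}
reached = flood(mesh, "pc-a")   # all four devices, no router needed
```

Because every node forwards, the network self-heals: removing any single link here still leaves every device reachable through its remaining peers.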
15%
another approach to accelerate the availability of human-level computation in a personal computer is to use transistors in their native “analog” mode. Many of the processes in the human brain are analog, not digital. Although we can emulate analog processes to any desired degree of accuracy with digital computation, we lose several orders of magnitude of efficiency in doing so.
15%
the engineering design time required for such native analog computing is lengthy, so most researchers developing software to emulate regions of the brain usually prefer the rapid turnaround of software simulations.
15%
The number of “chunks” of knowledge mastered by an expert in a domain is approximately 10^5 for a variety of domains. These chunks represent patterns (such as faces) as well as specific knowledge.
15%
Based on my own experience in designing systems that can store similar chunks of knowledge in either rule-based expert systems or self-organizing pattern-recognition systems, a reasonable estimate is about 10^6 bits per chunk (pattern or item of knowledge), for a total capacity of 10^13 (10 trillion) bits for a human’s functional memory.
15%
As we reverse engineer our bodies and brains, we will be in a position to create comparable systems that are far more durable and that operate thousands to millions of times faster than our naturally evolved systems. Our electronic circuits are already more than one million times faster than a neuron’s electrochemical processes, and this speed is continuing to accelerate.
15%
Most of the complexity of a human neuron is devoted to maintaining its life-support functions, not its information-processing capabilities. Ultimately, we will be able to port our mental processes to a more suitable computational substrate. Then our minds won’t have to stay so small.
15%
Because computation underlies the foundations of everything we care about, from the economy to human intellect and creativity, we might well wonder: are there ultimate limits to the capacity of matter and energy to perform computation? If so, what are these limits, and how long will it take to reach them?
15%
A major factor in considering computational limits is the energy requirement. The energy required per MIPS for computing devices has been falling exponentially,
15%
Until just recently Intel emphasized the development of faster and faster single-chip processors, which have been running at increasingly high temperatures. Intel is gradually changing its strategy toward parallelization by putting multiple processors on a single chip.
15%
Reversible Computing. Ultimately, organizing computation with massive parallel processing, as is done in the human brain, will not by itself be sufficient to keep energy levels and resulting thermal dissipation at reasonable levels. The current computer paradigm relies on what is known as irreversible computing, meaning that we are unable in principle to run software programs backward.
15%
Programs generally do not retain all intermediate results, as that would use up large amounts of memory unnecessarily. This selective erasure of input information is particularly true for pattern-recognition systems.
15%
When a bit of information is erased, that information has to go somewhere. According to the laws of thermodynamics, the erased bit is essentially released into the surrounding environment, thereby increasing its entropy, which can be viewed as a measure of information (including apparently disordered information) in an environment.
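The thermodynamic cost of erasure is quantified by Landauer's principle: erasing one bit dissipates at least kT ln 2 of energy at temperature T. A quick calculation:

```python
import math

BOLTZMANN = 1.380649e-23   # joules per kelvin

def landauer_limit(temperature_kelvin):
    """Minimum energy dissipated when one bit is irreversibly erased."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

e_bit = landauer_limit(300.0)   # room temperature: ≈ 2.9e-21 joules
e_gigabyte = e_bit * 8e9        # erasing a gigabyte: ≈ 2.3e-11 joules
```

These numbers are tiny compared with what real chips dissipate, which is why conventional logic has so much room for efficiency gains before hitting this floor.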
15%
If, on the other hand, we don’t erase each bit of information contained in the input to each step of an algorithm but instead just move it to another location, that bit stays in the computer, is not released into the environment, and therefore generates no heat and requires no energy from outside the computer.
15%
Fredkin goes on to show that the efficiency of a computer built from reversible logic gates can be designed to be very close (at least 99 percent) to the efficiency of ones built from irreversible gates.
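Fredkin's canonical reversible primitive is the controlled-swap gate. A small sketch shows both its reversibility and how it can compute AND without erasing information:

```python
def fredkin(c, a, b):
    """Controlled-swap: when the control bit c is 1, a and b trade places.
    No information is destroyed, so the gate is its own inverse."""
    if c:
        a, b = b, a
    return c, a, b

# Reversibility: applying the gate twice restores every input triple.
assert all(fredkin(*fredkin(c, a, b)) == (c, a, b)
           for c in (0, 1) for a in (0, 1) for b in (0, 1))

# Universality sketch: with the third input wired to 0, the third
# output computes x AND y, and both inputs survive in the other outputs.
def and_gate(x, y):
    return fredkin(x, y, 0)[2]
```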
15%
How much energy does a perfectly efficient computer have to dissipate in order to compute something? The answer is that the computer does not need to dissipate any energy.
15%
It is hard to overstate the significance of this insight. A key observation regarding the Singularity is that information processes—computation—will ultimately drive everything that is important. This primary foundation for future technology thus appears to require no energy.
15%
The practical reality is slightly more complicated. If we actually want to find out the results of a computation—that is, to receive output from a computer—the process of copying the answer and transmitting it outside of the computer is an irreversible process, one that generates heat for each bit transmitted. However, for most applications of interest, the amount of computation that goes into executing an algorithm vastly exceeds the computation required to communicate the final answers, so the latter does not appreciably change the energy equation.
16%
The potential of matter to compute is also governed by a very small number, Planck’s constant: 6.6 × 10^−34 joule-seconds (a joule is a measure of energy). This is the smallest scale at which we can apply energy for computation. We obtain the theoretical limit of an object to perform computation by dividing the total energy (the average energy of each atom or particle times the number of such particles) by Planck’s constant.
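Following that recipe literally for a 1-kilogram object, using its rest energy E = mc^2 as the total energy, gives an order-of-magnitude bound; tighter published bounds differ by small constant factors:

```python
PLANCK = 6.62607015e-34          # joule-seconds
SPEED_OF_LIGHT = 299_792_458.0   # metres per second

def max_ops_per_second(mass_kg):
    """Crude upper bound from the recipe in the text: total (rest)
    energy divided by Planck's constant."""
    rest_energy = mass_kg * SPEED_OF_LIGHT ** 2   # E = m c^2, in joules
    return rest_energy / PLANCK

ops = max_ops_per_second(1.0)   # ≈ 1.36e50 operations per second
```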
16%
I set the date for the Singularity—representing a profound and disruptive transformation in human capability—as 2045. The nonbiological intelligence created in that year will be one billion times more powerful than all human intelligence today. That will indeed represent a profound change.
16%
By the second decade of this century, however, most computing will not be organized in such rectangular devices but will be highly distributed throughout the environment. Computing will be everywhere: in the walls, in our furniture, in our clothing, and in our bodies and brains.
16%
the amount of time required for our human civilization to achieve scales of computation—and intelligence—that go beyond our planet and into the universe may be a lot shorter than you might think.
16%
Memory and Computational Efficiency: A Rock Versus a Human Brain. With the limits of matter and energy to perform computation in mind, two useful metrics are the memory efficiency and computational efficiency of an object. These are defined as the fractions of memory and computation taking place in an object that are actually useful. Also, we need to consider the equivalence principle: even if computation is useful, if a simpler method produces equivalent results, then we should evaluate the computation against the simpler algorithm.
16%
Our brains have evolved significantly in their memory and computational efficiency from pre-biology objects such as stones. But we clearly have many orders of magnitude of improvement to take advantage of during the first half of this century.
16%
At these scales, we would require computing with subatomic particles. With such smaller size comes the potential for even greater speed and density.
16%
At this time, we have to regard the feasibility of pico- and femtocomputing as speculative. But nano-computing will provide massive levels of intelligence, so if it’s at all possible to do, our future intelligence will be likely to figure out the necessary processes. The mental experiment we should be making is not whether humans as we know them today will be capable of engineering pico- and femtocomputing technologies, but whether the vast intelligence of future nanotechnology-based intelligence (which will be trillions of trillions of times more capable than contemporary biological human…
16%
In addition to making computing smaller, we can make it bigger—that is, we can replicate these very small devices on a massive scale. With full-scale nanotechnology, computing resources can be made self-replicating and thus can rapidly convert mass and energy into an intelligent form. However, we run up against the speed of light, because the matter in the universe is spread out over vast distances.