Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos
1%
At first, one might think that an apple embodies an infinite number of bits, but this is not so. In fact, the laws of quantum mechanics, which govern all physical systems, make finite the number of bits required to specify the microscopic state of the apple and its atoms. Each atom, by its position and velocity, registers only a few bits; each nuclear spin in an atom’s core registers but a single bit.
3%
The digital revolution under way today is merely the latest in a long line of information-processing revolutions stretching back through the development of language, the evolution of sex, and the creation of life, to the beginning of the universe itself. Each revolution has laid the groundwork for the next, and all information-processing revolutions since the Big Bang stem from the intrinsic information-processing ability of the universe. The computational universe necessarily generates complexity. Life, sex, the brain, and human civilization did not come about by mere accident.
12%
In general, the best way to get more information is not to increase the precision of measurements on a continuous quantity, but rather to put together measurements on more and more quantities, each one of which may register only a few bits. This compiling of bits—or digital representation—is effective because the number of total alternatives described grows much faster than does the number of bits.
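A minimal Python sketch (an illustration, not from the book) of why compiling bits wins: n two-valued measurements distinguish 2^n alternatives, so the number of alternatives grows exponentially while the bit count grows only linearly.

# Each added yes/no measurement doubles the number of distinguishable alternatives.
for n in (1, 2, 10, 20, 100):
    print(n, "bits ->", 2**n, "distinguishable alternatives")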
14%
Ideas also make machines. First rock, then wood: what material would supply the next advance in information processing? Bone. In the early seventeenth century, the Scottish mathematician John Napier discovered a way of changing the process of multiplication into addition. He carved ivory into bars, ruled marks corresponding to numbers on the bars, and then performed multiplication by sliding the bars alongside each other until the marks corresponding to the two numbers lined up. The total length of the two bars together then gave the product of the two numbers. The slide rule was born.
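A hedged sketch of the mechanism as Lloyd describes it, assuming logarithmically ruled bars: placing two such lengths end to end adds logarithms, and adding logarithms multiplies numbers.

import math

# Sliding logarithmic scales: log(a*b) = log(a) + log(b), so adding the two
# ruled lengths and reading the result off the combined scale gives the product.
a, b = 6.0, 7.0
combined_length = math.log10(a) + math.log10(b)
print(10 ** combined_length)  # ~42.0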
18%
This lovely inscrutability of pure reason harks back to an earlier account of the role of logic in the universe. From his home in Cordova, the twelfth-century Muslim philosopher Averroës (Ibn Rushd) in his studies of Aristotle concluded that what is immortal in human beings is not their soul but their capacity for reason. Reason is immortal exactly because it is not specific to any individual; instead, it is the common property of all reasoning beings.
19%
The conventional history of the universe pays great attention to energy: How much is there? Where is it? What is it doing? By contrast, in the story of the universe told in this book, the primary actor in the physical history of the universe is information. Ultimately, information and energy play complementary roles in the universe: Energy makes physical systems do things. Information tells them what to do.
19%
If we could look at matter at the atomic scale, we would see atoms dancing and jiggling every which way at random. The energy that drives this random atomic dance is called heat, and the information that determines the steps of this dance is called entropy. More simply, entropy is the information required to specify the random motions of atoms and molecules—motions too small for us to see. Entropy is the information contained in a physical system that is invisible to us. Entropy is a measure of the degree of molecular disorder existing in a system: it determines how much of the system’s thermal ...more
20%
The laws of thermodynamics guide the interplay between our two actors, energy and information. To experience another example of the first and second laws, take a bite of an apple. The sugars in the apple contain what is called free energy. Free energy is energy in a highly ordered form associated with a relatively low amount of entropy. In the case of the apple, the energy in sugar is stored not in the random jiggling of atoms but in the ordered chemical bonds that hold sugar together. It takes much less information to describe the form energy takes in a billion ordered chemical bonds than it ...more
20%
While you run, the free energy in the sugar is converted into motion by your muscles. By the time you’re finished running, you’re hot: the free energy in the sugar has been converted into heat and work. The number of calories of heat and work exactly matches the calories of free energy in the apple’s sugar. In obedience to the first law of thermodynamics, the total amount of energy remains the same. (In obedience to the second law, the amount of information required to describe the extra jiggling of molecules in your hot muscles and sweaty skin is much greater than the amount of information ...more
21%
In either scenario, though, it’s clear that energy and information (visible and invisible) are the two primary actors in the universal drama. The universe we see around us arises from the interplay between these two quantities, interplay governed by the first and second laws of thermodynamics. Energy is conserved. Information never decreases. It takes energy for a physical system to evolve from one state to another. That is, it takes energy to process information. The more energy that can be applied, the faster the physical transformation takes place and the faster the information is ...more
21%
As soon as it began, though, the universe began to expand. As it expanded, it pulled more and more energy out of the underlying quantum fabric of space and time. Current physical theories suggest that the amount of energy in the early universe grew very rapidly (a process called “inflation”), while the amount of information grew more slowly. The early universe remained simple and orderly: it could be described by just a few bits of information. The energy that was created was free energy.
22%
As the universe expanded, it cooled down. The elementary particles jiggled around more slowly. The amount of information required to describe their jiggles stayed almost the same, though, increasing gradually over time. It might seem that slower jiggles would require fewer bits to describe, and it’s true that fewer bits were required to describe their velocities. But, at the same time, the amount of space in which they were jiggling was increasing, requiring more bits to describe their positions. Thus, the total amount of information remained constant or increased in accordance with the second ...more
23%
At the small scale, quantum mechanics describes the behavior of molecules, atoms, and elementary particles. At larger scales, it describes the behavior of you and me. Larger still, it describes the behavior of the universe as a whole. The laws of quantum mechanics are responsible for the emergence of detail and structure in the universe. The theory of quantum mechanics gives rise to large-scale structure because of its intrinsically probabilistic nature. Counterintuitive as it may seem, quantum mechanics produces detail and structure because it is inherently uncertain.
23%
The early universe was uniform: the density of energy was everywhere almost the same. But it was not exactly the same. In quantum mechanics, quantities such as position, velocity, and energy density do not have exact values. Instead, their values fluctuate. We can describe their probable values—the most likely location of a particle, for example—but we cannot claim perfect certainty. Because of these quantum fluctuations, some regions of the early universe were ever so slightly more dense than other regions. As time passed, the attractive force of gravity caused more matter to move toward ...more
23%
Every galaxy, star, and planet owes its mass and position to quantum accidents of the early universe. But there’s more: these accidents are also the source of the universe’s minute details. Chance is a crucial element of the language of nature. Every roll of the quantum dice injects a few more bits of detail into the world. As these details accumulate, they form the seeds for all the variety of the universe. Every tree, branch, leaf, cell, and strand of DNA owes its particular form to some past toss of the quantum dice. Without the laws of quantum mechanics, the universe would still be ...more
26%
In the latter part of the nineteenth century, three physicists—James Clerk Maxwell, Ludwig Boltzmann, and Josiah Willard Gibbs—discovered that the thermodynamic quantity known as entropy was, as we’ve noted, a form of information: namely, information that isn’t known.
30%
At the end of the nineteenth century, the founders of statistical mechanics—Maxwell, Boltzmann, and Gibbs—realized that entropy was also a form of information: entropy is a measure of the number of bits of unavailable information registered by the atoms and molecules that make up the world. The second law of thermodynamics comes about, then, by combining this notion with the fact that the laws of physics preserve information, as we will soon discuss. Nature does not destroy bits. But surely it takes an infinite number of bits of entropy to specify the positions and velocities of even a single ...more
32%
The quantitative trade-off between mechanical energy and heat was even more firmly established by the mid-nineteenth century and enshrined as the first law of thermodynamics: energy is conserved when mechanical energy is converted to heat. Unlike mechanical energy, however, energy in the form of heat seemed to possess the mysterious property called entropy, which prevented some of the heat from being transformed into useful work. Like energy, entropy could be quantified experimentally: Whenever mechanical energy was turned into heat, an amount of entropy equal to the energy divided by the ...more
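The truncated sentence is pointing at the Clausius relation: heat Q delivered at temperature T carries entropy Q/T. A minimal worked example, assuming SI units:

# Entropy created when mechanical energy is degraded into heat:
Q = 1000.0    # joules of mechanical energy converted into heat
T = 300.0     # temperature in kelvin at which the heat is delivered
print(Q / T)  # ~3.33 J/K of entropy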
32%
Just what is this entropy stuff, anyway? The atomic hypothesis provides an answer. Heat is a form of energy, and entropy is associated with heat. If things are made out of atoms, then there is a simple explanation of heat: heat is just the energy in the jiggling of atoms. Entropy, then, has a simple interpretation, too: To describe the motion of atoms requires a large number of bits of information. The quantity called entropy is proportional to the number of bits required to describe the way atoms are jiggling.
32%
In the middle of the nineteenth century, James Clerk Maxwell developed a detailed theory of heat in terms of the motion of atoms. He figured out how fast the atoms were moving as a function of temperature: the kinetic energy of an atom is proportional to its temperature. The hotter something is, the faster its atoms are jiggling around. This jiggling is also associated with entropy: the faster the atoms jiggle, the more information is required to describe their jiggling, and thus, the more entropy they possess. Temperature is a measure of the trade-off between information and energy: atoms at ...more
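For a monatomic ideal gas (an assumption here; the quote states only the proportionality), Maxwell's result gives each atom a mean kinetic energy of (3/2) k_B T. A small sketch:

import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0            # room temperature, K
m = 6.64e-26         # mass of one argon atom, kg (illustrative choice)
mean_kinetic_energy = 1.5 * k_B * T
v_rms = math.sqrt(3 * k_B * T / m)   # typical jiggling speed
print(mean_kinetic_energy, v_rms)    # ~6.2e-21 J, ~430 m/s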
33%
This demon-mediated flow of heat from the cold side to the hot side apparently violates the second law of thermodynamics, which implies that heat flows from hot to cold but not from cold to hot. It is the demon’s ability to get information about the atoms that allows him to accomplish this apparent violation of physical law.
33%
As the nineteenth century wound on, Boltzmann, Gibbs, and the German physicist Max Planck refined their formulas describing the energy and entropy of systems made up of atoms. In particular, they discovered that the entropy of a system was proportional to the number of bits required to describe the microscopic state of the atoms. This result was so useful in describing the trade-offs between heat and energy that the formula that encompasses it is inscribed on Boltzmann’s tomb. Entropy is traditionally written S, and the number of different possible microscopic states (or “complexions,” as ...more
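The formula the passage is about to give is the one carved on Boltzmann's tomb, S = k log W, with W the number of microscopic states. Since log W is just ln 2 times the number of bits needed to single out one state, a short check:

import math

k_B = 1.380649e-23      # Boltzmann's constant, J/K
W = 2 ** 50             # a system with 2^50 possible microscopic states
S = k_B * math.log(W)   # the tombstone formula, S = k ln W
print(S, math.log2(W))  # entropy in J/K, and the same entropy as 50 bits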
34%
Information can be created but it can’t be destroyed. Consider flipping a bit. Flipping a bit transforms information: 0 goes to 1 and vice versa. It also preserves information: if you knew that the bit was 0 before the flip, then you know that it is 1 after the flip. By contrast, erasure is a process that destroys information. During erasure, a bit that is initially 0 stays 0, and a bit that is initially 1 goes to 0. Erasure destroys the information in the bit. But the laws of physics do not allow processes that do nothing but erase a bit. Any process that erases a bit in one place must ...more
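A tiny truth-table illustration (mine, not the book's) of the distinction: flipping is one-to-one and hence reversible, while erasure maps two inputs onto one output, so the original value cannot be recovered.

flip = {0: 1, 1: 0}    # distinct inputs -> distinct outputs: reversible
erase = {0: 0, 1: 0}   # both inputs -> 0: the input bit is unrecoverable
print(len(set(flip.values())), len(set(erase.values())))  # 2 versus 1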
34%
To see Landauer’s principle in action, look at how bits are erased in computers. As noted in chapter 2, in a contemporary electronic computer a bit is stored on a capacitor. A capacitor is a bucket for electrons. When you charge up the capacitor, you put electrons in the bucket; when you discharge it, you dump the electrons out of the bucket. In a computer, an uncharged capacitor registers a 0 and a charged capacitor registers a 1. To erase a bit in an electronic computer, just empty the bucket: close a switch and let the electrons on the capacitor flow out. When the capacitor has been ...more
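The passage is building toward Landauer's principle: erasing one bit must dissipate at least k_B T ln 2 of energy as heat into the surroundings. The bound at room temperature, as a quick calculation:

import math

k_B = 1.380649e-23            # Boltzmann's constant, J/K
T = 300.0                     # room temperature, K
print(k_B * T * math.log(2))  # ~2.9e-21 joules per erased bit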
34%
Another way to erase a bit is to swap it with another bit that reads 0. Swapping information between bits preserves information; to get back the original values of the bits, just swap them again. Before the swap, the first bit could read either 0 or 1; it has a bit’s worth of entropy. The second bit reads 0; it has no entropy. After the swap, the first bit reads 0; it has been restored to 0, or erased. The second bit reads 0 or 1; it has a bit’s worth of entropy—the same entropy that the first bit had before the swap. Swapping moves information and entropy from one place to another, but the ...more
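A three-line sketch of that swap, under the assumption that the first bit's value is unknown (one bit of entropy) and the second is a known 0:

import random

first, second = random.randint(0, 1), 0  # unknown bit, blank bit
first, second = second, first            # swap: first is now 0, "erased"
print(first, second)                     # the unknown value now sits in second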
35%
As originally conceived, entropy is a quantity that measures how useful energy is. Energy with a small amount of entropy is useful (free) energy; energy with lots of entropy is useless. It is perhaps easier to conceive of an increase in entropy in these terms: energy degrading from useful to useless forms.
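In standard thermodynamic notation (not used in the passage itself), this is the Helmholtz free energy, F = E − TS: at temperature T, each unit of entropy S makes T's worth of the energy E unavailable for useful work.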
39%
The collision of snooker balls is also a chaotic process. Suppose you make a small error in striking the cue ball, so that its initial speed and direction are a bit off. That error is amplified when the cue ball strikes the red ball. The direction in which the red ball now moves has a greater error than the error in the initial speed and direction of the cue ball. The more collisions that take place, the more the initial error is magnified. If you planned to knock the red ball off the pink ball and knock that off the third to pot the third ball, you will probably have failed: by the third ...more
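The same error amplification shows up in any chaotic system; here is a sketch using the logistic map (a deliberate substitute for the snooker table), where a one-in-a-million initial error grows to order one within a few dozen iterations.

# Two nearly identical starting points, iterated through a chaotic map:
x, y = 0.400000, 0.400001  # initial error: one part in a million
for _ in range(25):
    x = 3.9 * x * (1 - x)
    y = 3.9 * y * (1 - y)
print(abs(x - y))          # the error is now of order one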
43%
When two atoms in a gas collide, the information they register is transformed and processed. How does the information processing performed during atomic collision relate to the information processing performed by the logic gates described in the first part of this book? In fact, as pointed out by Edward Fredkin of Carnegie Mellon University and Tommaso Toffoli of Boston University, atomic collisions naturally perform AND, OR, NOT, and COPY logic operations. In the language of information processing, atomic collisions are computationally universal.
43%
In Fredkin and Toffoli’s model, each possible atomic collision performs AND, OR, NOT, or COPY operations on suitably defined input and output bits. By assigning the proper initial positions and velocities to atoms in a gas, it is a straightforward matter to “wire up” any desired logic circuit. Atoms bouncing in a gas are, in principle, capable of universal digital computation. In practice, of course, it is rather difficult to make a gas of atoms perform a computation. Even if we did have control over the position and velocity of individual atoms, quantum mechanics limits the accuracy to which ...more
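A sketch of the logical content of one such collision (my simplification of the Fredkin-Toffoli interaction gate, not their full construction): whether two incoming atoms collide depends on both being present, so the outgoing paths carry AND-like and AND-NOT-like signals.

def collision_gate(a, b):
    # a, b are 1 if an atom is present on that incoming path, else 0
    collide = a & b        # both atoms present -> they collide: an AND
    a_alone = a & (1 - b)  # a continues undeflected only if b is absent
    b_alone = b & (1 - a)
    return collide, a_alone, b_alone

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", collision_gate(a, b))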
43%
Although practical limitations prevent using collisions between atoms in a gas to compute, the fact that atomic collisions in principle allow computation implies that the long-term future of a gas of atoms is intrinsically unpredictable. The halting problem (see chapter 2) foils not only conventional digital computers but any system capable of performing digital logic. Since co...
47%
The bigger something is, however, the harder it is to coax it into existing in two places at once. (Big things tend to behave more “classically,” and less quantum-mechanically.) The reason lies not so much with the physical size of the object as with its visibility. The bigger something is, the more interactions it tends to have with its surroundings, thus the easier it is to detect. In order to go through both slits at once and produce an interference pattern, a particle must pass through the slits undetected.
47%
It is now clear why big things tend to show up in one place or another, but not both. Pebbles, people, and planets are constantly interacting with their surroundings. Each interaction with an electron, a molecule of air, a particle of light tends to localize a system. Big things interact with lots of little things, each of which gets information about the location of the big thing. As a result, big things tend to appear here or there instead of here and there at the same time.
48%
The state |0> + |1> has a definite value of spin along the sideways axis. If you measure which direction it is spinning about that axis, you always find that it is spinning clockwise. But when you take this same spin and try to determine its value of spin about the vertical axis, the result will be completely random; half the time you will find that it is clockwise (that is, you find the state spin-up, or |0>) and half the time you will find that it is counterclockwise (spin-down, or |1>). When the value of spin about the sideways axis is completely certain, the value of spin about the ...more
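A small numerical illustration (using numpy, with the convention that |0> and |1> are the vertical-axis states): the equal superposition is deterministic along the sideways axis but a fair coin along the vertical axis.

import numpy as np

state = np.array([1.0, 1.0]) / np.sqrt(2)     # the (normalized) state |0> + |1>
print(abs(state[0])**2, abs(state[1])**2)     # vertical axis: 0.5 and 0.5, random
sideways = np.array([1.0, 1.0]) / np.sqrt(2)  # the clockwise sideways state
print(abs(sideways @ state)**2)               # sideways axis: 1.0, certain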
49%
Apparently it is not possible to have a definite value of spin about two different axes at the same time. This intrinsically chancy nature of quantum mechanics was immortalized by Werner Heisenberg, one of the founders of quantum mechanics, as the “uncertainty principle.” The uncertainty principle states that if the value of some physical quantity is certain, then the value of a complementary quantity is uncertain. Spin about the vertical axis and spin about the sideways axis are just such complementary quantities: if you know one, you can’t know the other. Another pair of complementary ...more
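For the best-known complementary pair, position and momentum, the quantitative form of Heisenberg's principle is Δx · Δp ≥ ħ/2: the product of the two uncertainties can never fall below half the reduced Planck constant.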
50%
Now look at interactions between qubits. Consider a two-qubit transformation that is a quantum analog of the controlled-NOT logic operation described earlier. Recall that the controlled-NOT operation flips one bit if and only if the other bit is 1. That is, the controlled-NOT takes 00 to 00, 01 to 01, 10 to 11, and 11 to 10. The controlled-NOT operation is one-to-one and can be reversed simply by applying it twice. The quantum controlled-NOT takes the quantum states |00> to |00>, |01> to |01>, |10> to |11>, and |11> to |10>. Here, the state |00> corresponds to the “joint wave” of the two ...more
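The same mapping written as a matrix (a standard representation, not the book's notation), with the basis ordered |00>, |01>, |10>, |11>:

import numpy as np

CNOT = np.array([[1, 0, 0, 0],   # |00> -> |00>
                 [0, 1, 0, 0],   # |01> -> |01>
                 [0, 0, 0, 1],   # |10> -> |11>
                 [0, 0, 1, 0]])  # |11> -> |10>
print(CNOT @ CNOT)                    # the identity: applying it twice undoes it
print(CNOT @ np.array([0, 0, 1, 0]))  # |10> becomes |11>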
58%
The fact that atoms respond to light only at frequencies corresponding to their spectrum is useful if you want to send instructions to one kind of atom but not to another, as we will see.
61%
Recall the double-slit experiment. In that model, the electron does two things at once: it goes through both slits simultaneously. When you take a measurement to determine which slit the electron has gone through, it will show up at one slit or the other at random. Similarly, when you take a quantum computer that is doing two things at once and measure to see what it’s doing, you will find it doing one or the other of those things at random. If you want to see the interference pattern in the double-slit experiment, you must wait until the electron has hit the screen, so that the two waves—one ...more
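A sketch of the underlying arithmetic (illustrative numbers, not the book's): the two paths contribute amplitudes, the relative intensity at a point on the screen is the squared magnitude of their sum, and the relative phase decides between a bright and a dark fringe.

import numpy as np

amp_1 = 1 / np.sqrt(2)                   # amplitude via slit 1
for phase in (0.0, np.pi):               # in phase, then out of phase
    amp_2 = np.exp(1j * phase) / np.sqrt(2)
    print(phase, abs(amp_1 + amp_2)**2)  # relative intensity: 2.0, then ~0.0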
62%
Note that a full-blown measurement is not necessary to decohere a quantum computation. Any passing electron or atom that interacts with the quantum computer in such a way as to get information about what the quantum computer is doing can decohere the computer as effectively as a full-blown measurement using a macroscopic measuring device. Great care must be taken to insulate quantum computers from their surroundings while they are performing quantum computations.
76%
Imagine the quantum computation as embedded in space and time. Each logic gate now sits at a point in space and time, and the wires represent physical paths along which quantum bits flow from one point to another. The first feature to note is that there are many ways to embed the quantum computation in space and time. Each quantum logic gate can be put down at any point where there is not another quantum logic gate, and the wires can squiggle all over the place to connect the logic gates. What happens to quantum information in the computation is independent of how the quantum computation is ...more
81%
The medieval philosopher William of Occam was interested in finding the simplest explanation for observed phenomena. Pluralitas non est ponenda sine necessitate, he declared: “Plurality should not be posited without necessity.” Occam urged us to accept simple explanations for phenomena over complex ones.
85%
Effective complexity is a simple and elegant measure of complexity. Every physical system has associated with it a quantity of information—the amount required to describe the physical state of the system to the accuracy allowed by quantum mechanics. The basic way to measure something’s effective complexity is to divide that amount into two parts: information that describes the regular aspects of the thing and information that describes its random aspects. The amount of information required to describe a system’s regularities is its effective complexity.
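A rough operational stand-in for that split (my analogy, using compression, not Gell-Mann's formal definition): a compressor exploits regularities, so a highly regular string gets a short description while a purely random one does not.

import os, zlib

regular = b"01" * 5000                   # strong regularity, little randomness
random_bytes = os.urandom(10000)         # nothing but randomness
print(len(zlib.compress(regular)))       # tiny: the regularity is the whole message
print(len(zlib.compress(random_bytes)))  # ~10 kB: no regularities to capture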
87%
Even though the very early universe is simple, neither effectively complex nor logically deep, it has a glorious future ahead. The early universe is what Charles Bennett calls an “ambitious” system: even if it is not initially complex, it is intrinsically able to generate large amounts of complexity over time.