Kindle Notes & Highlights
Read between April 12 - April 17, 2020
Quantum computing is a beautiful fusion of quantum physics with computer science. It incorporates some of the most stunning ideas of physics from the twentieth century into an entirely new way of thinking about computation.
These three things—superposition, measurement, and entanglement—are the key quantum mechanical ideas.
The chapter concludes by showing that we can’t use entanglement to communicate faster than the speed of light.
Once we have defined something called query complexity, we study three quantum algorithms and show that they are faster with respect to this type of complexity than their classical counterparts.
Shor’s algorithm gives a way of factoring a large number into the product of its prime factors.
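To make the factoring claim concrete: Shor's algorithm reduces factoring to finding the order of a random number modulo N, and only that order-finding step needs a quantum computer. A minimal sketch of the reduction, with the order found by classical brute force and with N = 15 and the function names as my own illustration rather than the book's:

    from math import gcd
    import random

    def order(a, N):
        # Smallest r > 0 with a**r = 1 (mod N), found here by brute force.
        # This order-finding step is the part Shor's algorithm does quantumly.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def factor(N):
        # Find a nontrivial factor of an odd composite N via order finding.
        while True:
            a = random.randrange(2, N)
            d = gcd(a, N)
            if d > 1:            # lucky guess: a already shares a factor with N
                return d
            r = order(a, N)
            if r % 2 == 1:       # need an even order
                continue
            y = pow(a, r // 2, N)
            if y == N - 1:       # a**(r/2) = -1 (mod N) tells us nothing
                continue
            return gcd(y - 1, N)

    print(factor(15))            # prints 3 or 5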
The book concludes with the realization that quantum computation is not a new type of computation but is the discovery of the true nature of computation.
In classical mechanics, we might consider the path of a ball thrown into the air. The path can be calculated using calculus, but in order to perform the calculation we need to know certain quantities such as the mass of the ball and its initial velocity. How we measure these is not part of the theory. We just assume that they are known.
The implicit assumption is that the act of measuring is not important to the problem—that taking a measurement does not affect the system being modeled.
In quantum mechanics, we are often considering tiny particles like atoms or electrons. Here bouncing photons off them has an effect that is no longer negligible. In order to perform some measurement, we have to interact with the system. These interactions are going to perturb our system, so we can no longer ignore them. It should not seem surprising that measurement becomes a basic component of the theory, but what is surprising is how this is done.
Incorporating measurements into the theory is one of the differences between classical and quantum mechanics. Another difference concerns randomness.
There is no real randomness in classical mechanics, just what is often called sensitive dependence on initial conditions—a small change in the input can get amplified and produce an entirely different outcome.
Spin is quantized: When measured, it gives just two possible answers—not a continuous range of answers.
Consequently measuring spins of electrons first in the vertical and then in the horizontal direction gives a random string of 0s and 1s. This is probably the simplest thing that we can do with qubits, but surprisingly this is something that cannot be done with a classical computer.
Classical computers are deterministic. They can compute strings that pass various tests for randomness, but these are pseudorandom, not random. They are computed by some deterministic function, and if you know the function and the initial seed input, you can calculate exactly the same string. There are no classical computer algorithms that generate truly random strings. Thus, already we can see that quantum computations have some advantages over classical ones.
Dirac was one of the founders of quantum mechanics, and his notation is used extensively throughout both quantum mechanics and quantum computing. It is not widely used outside these disciplines, which is surprising given how elegant and useful it is.
We will associate the basis vector |0⟩ with the bit 0 and the basis vector |1⟩ with the bit 1. So when we measure the qubit a|0⟩ + b|1⟩ we will obtain 0 with probability |a|² and 1 with probability |b|².
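A small sketch of that measurement rule (the amplitudes a and b below are my own example; note that a classical simulation of the measurement can only draw on a pseudorandom generator, which is exactly the contrast drawn above):

    import numpy as np

    # A qubit a|0> + b|1> with |a|^2 + |b|^2 = 1; here the equal superposition.
    a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)

    p0, p1 = abs(a) ** 2, abs(b) ** 2        # measurement probabilities
    assert np.isclose(p0 + p1, 1.0)          # the amplitudes must be normalized

    rng = np.random.default_rng()
    bits = rng.choice([0, 1], size=20, p=[p0, p1])   # simulate 20 measurements
    print(bits)                              # e.g. [0 1 1 0 ...]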
Superluminal communication is communication faster than the speed of light.
The theory also tells us that as we approach the speed of light our mass increases without bound, which means that we can never reach that speed.
This book is about the mathematics that underlies quantum computing. It is not about how to physically create a quantum computer.
Newton’s law transformed physics. It can be used, for example, to show that a planet orbiting a star moves in an elliptical orbit. But though it tells us the value of the force, it does not tell us the mechanism that connects the planet to the sun.
Although Newton’s law of gravitation was useful for calculations, it did not explain how gravity worked. Newton himself was concerned about this.
Einstein’s theory was not only more precise, but it also gave a description of how gravity worked, and this description was local. A planet moves according to the shape of space in its vicinity.
All we can say is that no quantum mechanically generated string has failed a test for randomness.
In the general theory of quantum mechanics it is the solution of Schrödinger’s wave equation that collapses when a measurement is made. Erwin Schrödinger, of the eponymous equation, was very uncomfortable with this idea of waves collapsing to states given by probabilities.
The standard explanation is that the measurement involves an interaction with a macroscopic device. The measuring device is large enough that it can be described using classical physics and does not have to be incorporated into the quantum theoretical analysis: whenever we make a measurement we have to interact physically with the object being measured, and this interaction causes the jump. But this explanation is not entirely satisfactory. It seems a plausible description, but it lacks mathematical precision.
The many-worlds interpretation deals with the measurement problem by saying that it only appears that the state vector jumps to one of the possibilities, but in fact there are different universes and each of the possibilities is an actual occurrence in one of the many universes. The version of you in this universe sees one outcome, but there are other versions of you in other universes that see the other outcomes.
Feynman’s interest in computation was partly due to his interaction with Edward Fredkin and Fredkin’s idiosyncratic views of physics and computation. Fredkin believes that the universe is a computer, and that since the laws of physics are reversible we ought to study reversible computation and reversible gates.
The argument continued by showing that any Boolean function could be constructed using combinations of NOT and AND. Consequently, we can construct a circuit that computes any Boolean function using just NOT and AND gates.
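A short sketch of that completeness claim, using nothing beyond the two gates named above (the helper names are mine): OR follows from De Morgan's law, XOR from OR, and from these pieces any Boolean function can be assembled.

    def NOT(x):
        return 1 - x

    def AND(x, y):
        return x & y

    # De Morgan: x OR y = NOT(NOT x AND NOT y)
    def OR(x, y):
        return NOT(AND(NOT(x), NOT(y)))

    # XOR from the gates above: (x OR y) AND NOT(x AND y)
    def XOR(x, y):
        return AND(OR(x, y), NOT(AND(x, y)))

    for x in (0, 1):
        for y in (0, 1):
            print(x, y, OR(x, y), XOR(x, y))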
The study of reversible gates and reversible computation began by looking at the thermodynamics of computation. Shannon defined entropy for information. Entropy is also defined in thermodynamics. In fact, this is where Shannon got the idea.
Rolf Landauer proved the result and gave the minimum possible amount of energy needed to erase one bit of information.
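That minimum is Landauer's bound of kT ln 2 joules per erased bit; a quick sketch of the number at room temperature (the 300 K figure is my own illustrative choice):

    from math import log

    k_B = 1.380649e-23        # Boltzmann constant in J/K
    T = 300                   # room temperature in kelvin (illustrative)

    E_min = k_B * T * log(2)  # Landauer's bound: minimum energy to erase one bit
    print(f"{E_min:.2e} J")   # about 2.9e-21 joules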
Billiard Ball Computing
Alice needs to send Bob two classical bits, but notice that there are infinitely many possibilities for the initial state of her electron. It’s impressive that we can send one of an infinite number of possibilities using only two classical bits.
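This is the teleportation protocol; here is a minimal numpy sketch of the standard version (the qubit ordering, variable names, and the example amplitudes 0.6 and 0.8 are my own choices), showing an arbitrary state carried by a shared entangled pair plus just two classical bits:

    import numpy as np

    # Single-qubit gates and basis states
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    ket0 = np.array([1, 0])
    ket1 = np.array([0, 1])

    # Alice's arbitrary qubit a|0> + b|1> (infinitely many possible choices)
    a, b = 0.6, 0.8
    psi = a * ket0 + b * ket1

    # Shared Bell pair (|00> + |11>)/sqrt(2): Alice holds the first qubit, Bob the second
    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

    # Full 3-qubit state: qubit 0 = message, qubit 1 = Alice's half, qubit 2 = Bob's half
    state = np.kron(psi, bell)

    # Alice applies CNOT (control 0, target 1), then H on qubit 0
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    state = np.kron(CNOT, I) @ state
    state = np.kron(H, np.kron(I, I)) @ state

    # Alice measures qubits 0 and 1: pick an outcome (m0, m1) with the Born rule
    state = state.reshape(2, 2, 2)                 # indices: qubit 0, 1, 2
    probs = np.sum(np.abs(state) ** 2, axis=2)     # probability of each (m0, m1)
    rng = np.random.default_rng()
    outcome = rng.choice(4, p=probs.flatten())
    m0, m1 = divmod(outcome, 2)

    # Bob's qubit after the measurement (renormalized conditional state)
    bob = state[m0, m1] / np.sqrt(probs[m0, m1])

    # Alice sends the two classical bits (m0, m1); Bob applies the corrections
    if m1 == 1:
        bob = X @ bob
    if m0 == 1:
        bob = Z @ bob

    print("sent bits:", m0, m1)
    print("Bob now holds:", bob)    # equals (a, b) = (0.6, 0.8) for every outcome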
Quantum computation is the more fundamental form of computation.
Popular descriptions of quantum algorithms describe them as being much faster than regular algorithms. This speedup, it is explained, comes from being able to put the input into a superposition of all possible inputs and then performing the algorithm on the superposition.
Consequently, instead of running the algorithm on just one input, as you do classically, you can run the algorithm using “quantum parallelism” on all possible inputs at the same time.
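A tiny numpy sketch of that superposition step, with three qubits as my own choice of size: one Hadamard per qubit turns |000⟩ into an equal superposition of all eight inputs, which is the state the function is then applied to "in parallel".

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    ket0 = np.array([1, 0])

    n = 3
    state, H_all = ket0, H
    for _ in range(n - 1):        # build |00...0> and H (x) H (x) ... (x) H
        state = np.kron(state, ket0)
        H_all = np.kron(H_all, H)

    state = H_all @ state         # equal superposition over all 2**n inputs
    print(state)                  # every amplitude is 1/sqrt(8), about 0.3536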
David Deutsch published his algorithm in his landmark paper of 1985. This was the first quantum algorithm, and it showed that a quantum algorithm could be faster than a classical one.
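A minimal numpy simulation of Deutsch's algorithm as it is usually presented (the oracle construction and the example functions are mine): one query to f decides whether f is constant or balanced, where a classical algorithm needs two queries.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)

    def oracle(f):
        # U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(f):
        state = np.kron([1, 0], [0, 1])     # start in |0>|1>
        state = np.kron(H, H) @ state       # put both qubits into superposition
        state = oracle(f) @ state           # a single query to f
        state = np.kron(H, I) @ state       # interfere on the first qubit
        p_one = state[2] ** 2 + state[3] ** 2   # probability first qubit reads 1
        return "balanced" if np.isclose(p_one, 1) else "constant"

    print(deutsch(lambda x: 0))      # constant
    print(deutsch(lambda x: 1))      # constant
    print(deutsch(lambda x: x))      # balanced
    print(deutsch(lambda x: 1 - x))  # balanced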
It seems to be only a matter of time until the RSA encryption scheme is no longer secure.
The most impressive results, so far, involve a Chinese satellite that is devoted to quantum experiments. It’s named Micius after a Chinese philosopher who did work in optics. This is the satellite that was used for the quantum teleportation we mentioned in an earlier chapter.
In 1929, Paul Dirac wrote about quantum mechanics, saying, “The fundamental laws necessary for the mathematical treatment of a large part of physics and the whole of chemistry are thus completely known, and the difficulty lies only in the fact that the application of these laws leads to equations that are too complex to be solved.”
Feynman thought that one of the main applications of quantum computers would be to simulate quantum systems. Using quantum computers to study chemistry, which belongs to the quantum world, is a natural idea that has great potential.
What does quantum computation tell us about us, the universe, and what computation is at its most fundamental level?
With quantum computation the focus changes from how humans compute to how the universe computes.
Can quantum computers simulate classical computers? The answer is that they can. Any classical computation can be done on a quantum computer. Consequently, quantum computation is more general than classical computation.
Computation is really quantum computation. Classical computations are just special cases of quantum ones.
In this light, classical computation seems an anthropocentric version of what computation really is. Just as Copernicus showed that the Earth wasn’t the center of the universe and Darwin showed that humans evolved from other animals, we are now beginning to see that computations are not centered on humans. Quantum computing represents a true paradigm shift.
We are entering a new era, with a new way of thinking about what computation really is. What we are going to discover is impossible to say, but now is the time for exploration and innovation. The greatest years for quantum computation are ahead of us.

