Quantum Computing since Democritus
Read between December 25 - December 28, 2017
13%
There's another thing we appreciate now that people in Turing's time did...
13%
This is that, in trying to write programs to simulate human intelligence, we're competing against a billion years of evolution. And that's damn hard. One counterintuitive consequence is that it's much easier to program a computer to beat Garry Kasparov at chess than to program a computer to recognize faces under varied lighting conditions. Often the hardest tasks for AI are the ones that are trivial for a five...
13%
To a computer scientist, it seems perfectly reasonable to say that one computation (say, a LISP interpreter) can conjure into existence a different, unrelated computation (say, a spaceship game) just by dutifully executing rules.
13%
talking to a mathematician, you might not be talking to someone who fears the real world and who's therefore retreated into intellectual masturbation. You might be talking to someone for whom the real world was never especially real to begin with!
14%
David Chalmers.
14%
See David J. Chalmers, The Conscious Mind: In Search of a Fundamental Theory, Oxford University Press, 1997.
14%
By any objective standard, the theory of computational complexity ranks as one of the greatest intellectual achievements of humankind – along with fire, the wheel, and computability theory. That it isn't taught in high schools is really just an accident of history.
14%
It's useful to divide complexity theory into historical epochs:

1950s: Late Turingzoic
1960s: Dawn of the Asymptotic Age
1971: The Cook–Levin Asteroid; extinction of the Diagonalosaurs
Early 1970s: The Karpian Explosion
1978: Early Cryptozoic
1980s: Randomaceous Era
1993: Eruption of Mt Razborudich; extinction of the Combinataurs
1994: Invasion of the Quantodactyls
Mid-1990s to present: Derandomaceous
15%
So, these days we talk about complexity. (Or at least most of us do.) The idea is, you impose an upper bound on how much of some resource your computer can use. The most obvious resources are (1) amount of time and (2) amount of memory, but many others can be defined. (Indeed, if you visit my Complexity Zoo website, you'll find about 500 of them.)
15%
how much can be computed in an amount of time that grows linearly (or quadratically, or logarithmically) with the problem size? Asking this sort of question lets you ignore constant factors.
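To make the constant-factor point concrete, here is a minimal sketch (my own illustration, not from the book): a hypothetical linear-time algorithm that pays 100 steps per item still overtakes a quadratic one that pays a single step per pair, and the constants only move the crossover point.

```python
# A "slow" linear algorithm (100 steps per item) versus a "fast" quadratic one
# (1 step per pair). The constants only decide where the crossover happens;
# past it, the asymptotics dominate.
for n in [10, 100, 1_000, 10_000]:
    linear_steps = 100 * n       # hypothetical cost of a linear-time method
    quadratic_steps = n * n      # hypothetical cost of a quadratic-time method
    winner = "linear" if linear_steps < quadratic_steps else "quadratic"
    print(f"n={n:>6}: 100n={linear_steps:>9}, n^2={quadratic_steps:>12} -> {winner}")
```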
16%
could it be that, for every ε > 0, there exists an algorithm to multiply n-by-n matrices in time O(n^(2+ε)), but as ε approaches zero, these algorithms become more and more complicated without end?
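The first step below n³ was Strassen's 1969 algorithm, which runs in O(n^(log₂ 7)) ≈ O(n^2.81) by trading eight recursive block multiplications for seven. Here is a minimal sketch (assuming NumPy and a power-of-two matrix size; the cutoff of 64 is an arbitrary tuning choice, not anything canonical):

```python
import numpy as np

def strassen(A, B):
    """Multiply square matrices whose side length is a power of 2."""
    n = A.shape[0]
    if n <= 64:                       # small blocks: fall back to ordinary multiplication
        return A @ B
    k = n // 2
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]
    # Seven recursive products instead of eight:
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    return np.block([[M1 + M4 - M5 + M7, M3 + M5],
                     [M2 + M4,           M1 - M2 + M3 + M6]])

rng = np.random.default_rng(0)
A = rng.standard_normal((256, 256))
B = rng.standard_normal((256, 256))
assert np.allclose(strassen(A, B), A @ B)
```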
16%
Computational Complexity by Christos Papadimitriou (Addison-Wesley, 1994);
16%
Computational Complexity: A Modern Approach, by Sanjeev Arora and Boaz Barak (Cambridge University Press, 2009);
16%
Build a baby that acts on the following instructions, and also contains a copy of those instructions in its reproductive organs. “Build a baby that acts on the following instructions, and also contains a copy of those instructions in its reproductive organs.”
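This is von Neumann's self-reproduction trick, and it is the same trick behind a quine: a program that prints its own source code by carrying a blueprint and then both executing and copying it. A minimal Python example (any quine would do):

```python
# s is the "instructions"; printing s % s both acts on the instructions
# and reproduces a copy of them, just like the baby in the passage.
s = 's = %r\nprint(s %% s)'
print(s % s)
```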
17%
P vs. NP is one of the deepest questions that human beings have ever asked.
17%
If NP problems were feasible, then mathematical creativity could be automated. The ability to check a proof would entail the ability to find one. Every Apple II, every Commodore, would have the reasoning power of Archimedes or Gauss.
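The gap the passage points at is between checking and finding. A hedged sketch for Boolean satisfiability (the clause encoding here is my own illustration): verification runs in time linear in the formula, while the only general search we know how to write down tries all 2^n assignments.

```python
from itertools import product

def verify(clauses, assignment):
    """Check a proposed solution in time linear in the formula size.
    Each clause is a list of literals: +i means variable i, -i means NOT variable i."""
    return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses)

def search(clauses, n):
    """Find a solution by brute force over all 2^n assignments.
    If P = NP, something exponentially faster would exist for every such problem."""
    for bits in product([False, True], repeat=n):
        assignment = dict(enumerate(bits, start=1))
        if verify(clauses, assignment):
            return assignment
    return None

# (x1 OR NOT x2) AND (x2 OR x3)
print(search([[1, -2], [2, 3]], n=3))
```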
17%
But if that's the case, then why isn't it obvious that P doesn't equal NP? Surely, God wouldn't be so benign as to grant us these extravagant powers! Surely, our physicist-intuition tells us that brute-force search is unavoidable!
17%
(Leonid Levin told me that Feynman – the king, or possibly court jester, of physicist-intuition – had trouble even being convinced that P vs. NP was an open problem!)
18%
The discovery of Cook, Karp, and Levin was not that there exist NP-complete problems – that's obvious – but rather that many natural problems are NP-complete.
21%
Alright, so what is randomness? Well, that's a profound philosophical question, but I’m a simpleminded person. So, you've got some probability p, which is a real number in the unit interval [0, 1]. That's randomness.
21%
we've already got our hands full with 2^n choices. We need choices like we need a hole in the head.
21%
Third, an immediate consequence of the above, called the union bound: Pr[A ∨ B] ≤ Pr[A] + Pr[B].
21%
Even the ancients – Turing, Shannon, and von Neumann – understood that a random number source might be useful for writing programs.
21%
There are many, many reasons you might want randomness – for foiling an eavesdropper in cryptography, for avoiding deadlocks in communication protocols, and so on.
22%
since the fastest way to get big numbers is by repeated squaring.
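Each squaring doubles the number's bit-length, so you reach an n-bit exponent in about log₂ n squarings. The same idea, run modulo some number to keep the intermediate values small, is the standard square-and-multiply routine; a minimal sketch:

```python
def power(x, n, mod):
    """Compute x**n % mod by repeated squaring: O(log n) multiplications."""
    result = 1
    x %= mod
    while n > 0:
        if n & 1:                      # low bit of n set: fold in the current power
            result = result * x % mod
        x = x * x % mod                # square: x, x^2, x^4, x^8, ...
        n >>= 1
    return result

assert power(3, 1_000_003, 101) == pow(3, 1_000_003, 101)
```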
25%
The history of the primality-testing problem can only be seen as a spectacular success of this project. But with such success comes an obvious question: can every randomized algorithm be derandomized? In other words, does P equal BPP?
25%
If the seed is k bits long, then there are only 2^k possible output strings, regardless of how long those output strings are.
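You can see the counting bound directly by stretching every possible k-bit seed and counting distinct outputs. The stretcher below is a throwaway illustration (SHA-256 in counter mode, my own toy, not a construction from the text), but the bound holds for any deterministic generator.

```python
import hashlib

def stretch(seed: bytes, out_len: int) -> bytes:
    """A toy deterministic 'generator': expand a seed to out_len bytes."""
    out, counter = b"", 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:out_len]

k = 8                                            # 8-bit seeds
outputs = {stretch(bytes([s]), 1024) for s in range(2 ** k)}
# However long each output is (1024 bytes here, so 2^8192 possible strings),
# a k-bit seed can only ever reach 2^k of them.
print(len(outputs), "<=", 2 ** k)
```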
26%
However, the probability that the number of red hats differs from the number of blue hats by less than 2 is small –
26%
The Codebreakers by David Kahn,
26%
Like pretty much all of Shannon's results, this one is trivial in retrospect.
27%
“Rule 110” cellular automaton, as advocated by Stephen Wolfram in his groundbreaking, revolutionary, paradigm-smashing book.
28%
It's amazing, if you think about it, that so basic an idea had to wait until the 1970s to be discovered. Physicists were tidying up the Standard Model while cryptographers were still at the Copernicus stage!
28%
Do any of you know how RSA was first revealed to the world? Right: as a puzzle in Martin Gardner's Mathematical Games column for Scientific American!
28%
Classically, however, the best-known factoring algorithm is the Number Field Sieve, which takes about
29%
Fully homomorphic cryptography is a possibility that was first raised in the 1970s, but no one knew how to do it until a few years ago. So, this is one of the biggest developments in theoretical cryptography for decades.
30%
In the usual “hierarchy of sciences” – with biology at the top, then chemistry, then physics, then math – quantum mechanics sits at a level between math and physics that I don't know a good name for.
30%
Basically, quantum mechanics is the operating system that other physical theories run on as application software (with the exception of general relativity, which hasn't yet been successfully ported to this particular OS).
30%
Quantum mechanics is what you would inevitably come up with if you started from probability theory, and then said, let's try to generalize it so that the numbers we used to call “probabilities” can be negative numbers. As such, the theory could have been invented by mathematicians in the nineteenth century without any input from experiment. It wasn't, but it could have been.
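A two-line demonstration of what "negative probabilities" buy you, using the Hadamard matrix (a standard example, assuming NumPy): one application spreads |0⟩ into an even mixture, but a second application makes the two paths to |1⟩ cancel, something no stochastic matrix can do.

```python
import numpy as np

# A norm-preserving matrix with a negative entry: amplitudes, not probabilities.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # the state |0>
once = H @ ket0               # amplitudes (1/sqrt(2), 1/sqrt(2))
twice = H @ once              # the two paths to |1> carry +1/2 and -1/2 and cancel

print(once ** 2)              # measurement probabilities: [0.5, 0.5]
print(twice ** 2)             # [1.0, 0.0] -- back to |0> deterministically
```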
30%
More often than not, the only reason we need experiments is that we're not smart enough. After the experiment has been done, if we've learned anything worth knowing at all, then we hope we've learned why the experiment wasn't necessary to begin with – why it wouldn't have made sense for the world to be any other way. But we're too dumb to figure it out ourselves!
32%
complex numbers were seen for centuries as fictitious entities that human beings made up, in order that every quadratic equation should have a root.
33%
we don't know whether NP-complete problems are efficiently solvable in the physical world.
33%
But in a survey I wrote years ago, I explained why the ability to solve NP-complete problems would give us “godlike” powers – arguably, even more so than the ability to transmit superluminal signals or reverse the Second Law of Thermodynamics.
33%
NP-complete problems are not efficiently solvable by physical means, and that if a theory suggests otherwise, more likely than not that indicates a problem with the theory.
34%
In order for |ψ⟩ to be teleported, two classical bits need to get from Alice to Bob, and those bits can only travel at the speed of light.
35%
BPP is the class of computational problems that are efficiently solvable in the physical world if classical physics is true.
35%
It makes you wonder, what similarly obvious questions are there today that no one's asking?
36%
BQP is the class of languages L ⊆ {0, 1}* for which there exists a uniform family of polynomial-size quantum circuits, {C_n}, such that for all x ∈ {0, 1}^n: if x ∈ L, then C_n accepts input |x⟩|0...0⟩ with probability at least 2/3; if x ∉ L, then C_n accepts input |x⟩|0...0⟩ with probability at most 1/3.
36%
quantum computers can provide at most an exponential advantage over classical computers. Why is that?
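The standard answer is the brute-force simulation argument: a classical machine can always track all 2^n amplitudes of an n-qubit state, paying an exponential but finite overhead (so BQP ⊆ EXP, and in fact BQP ⊆ PSPACE). A minimal state-vector simulator sketch, assuming NumPy:

```python
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` of an n-qubit state vector.
    Classical cost is O(2^n) per gate: exponential, but finite, which is
    why the quantum advantage can be at most exponential."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, target, 0)       # bring the target qubit's axis to the front
    psi = np.tensordot(gate, psi, axes=1)   # contract the gate with that axis
    psi = np.moveaxis(psi, 0, target)       # restore the axis order
    return psi.reshape(-1)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                  # start in |000>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for q in range(n):
    state = apply_gate(state, H, q, n)          # uniform superposition
print(np.abs(state) ** 2)                       # eight outcomes, each 1/8
```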
37%
(This was also the first complexity result I proved. Had I known that Adleman et al. had proved it a year before, I might never have gotten started in this business! Occasionally, it's better to have a small academic light cone.)
37%
that is, whether quantum computing is more powerful than classical. Today, we have evidence that this is indeed the case, most notably Shor's algorithm for factoring and discrete log.