3.1.2 The Quantum World
Today’s computers perform calculations and process information using the standard (or, as a physicist would say, “classical”) model of computation, which dates back to Turing and von Neumann. In this model, all information is reducible to bits, which can take the values of either 0 or 1 -- and all processing can be performed via simple logic gates (AND, OR, NOT, NAND) acting on one or two bits at a time. At any point in its computation, a classical computer’s state is entirely determined by the states of all its bits, so that a computer with n bits can exist in one of 2^n possible states, ranging from 00...0 to 11...1.
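To make the counting concrete, here is a minimal Python sketch (illustrative only, not drawn from the text) that enumerates the 2^n possible states of an n-bit register and defines the one- and two-bit gates mentioned above.

```python
# Illustrative sketch: the 2**n possible states of an n-bit classical register,
# and the basic one- and two-bit logic gates that act on them.
from itertools import product

n = 3
states = list(product([0, 1], repeat=n))   # every definite assignment of n bits
print(len(states))                         # 2**n = 8, from (0, 0, 0) to (1, 1, 1)

AND  = lambda a, b: a & b
OR   = lambda a, b: a | b
NOT  = lambda a: 1 - a
NAND = lambda a, b: 1 - (a & b)

# NAND alone is universal: e.g. NOT(a) == NAND(a, a) for every bit value a.
assert all(NOT(a) == NAND(a, a) for a in (0, 1))
```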
The power of the quantum computer, meanwhile, lies in its much richer repertoire of states. A quantum computer also has bits, just like any computer. But instead of 0 and 1, its quantum bits, or qubits, can represent a 0, 1, or both at once, a property known as superposition. This by itself is not much help, since a computer whose bits can be intermediate between 0 and 1 is just an analog computer, scarcely more powerful than an ordinary digital computer. A quantum computer takes advantage of a special kind of superposition that allows for exponentially many logical states at once, all the states from |00...0⟩ to |11...1⟩. This is a powerful feat, and no classical computer can achieve it. The vast majority of these quantum superpositions, and the ones most useful for quantum computation, are entangled—they are states of the whole computer that do not correspond to any assignment of digital or analog states of the individual qubits. While not as powerful as exponentially many classical computers, a quantum computer is significantly more powerful than any one classical computer -- whether it be deterministic, probabilistic, or analog. For a few famous problems (such as factoring large numbers), a quantum computer is clearly the winner over a classical computer. A working quantum computer could factor numbers in a day that would take a classical computer millions of years.
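As a concrete illustration (a minimal sketch, not taken from the text), the state of n qubits can be written as a vector of 2^n complex amplitudes, one per logical state. The snippet below builds the two-qubit entangled Bell state (|00⟩ + |11⟩)/√2 by applying a Hadamard gate and a CNOT gate to |00⟩; the gate matrices are standard, but the example itself is only illustrative.

```python
# Illustrative sketch: a 2-qubit state is a vector of 2**2 = 4 complex
# amplitudes over the logical states |00>, |01>, |10>, |11>.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)                 # controlled-NOT

# Start in |00>, put the first qubit in superposition, then entangle with CNOT.
state = np.kron(ket0, ket0)                 # |00>
state = np.kron(H, np.eye(2)) @ state       # (|00> + |10>) / sqrt(2)
state = CNOT @ state                        # (|00> + |11>) / sqrt(2), a Bell state

print(np.round(state, 3))
# [0.707 0. 0. 0.707] -- equal amplitude on |00> and |11>, none on |01> or |10>.
# No assignment of separate single-qubit states (digital or analog) reproduces this,
# which is what makes the state entangled.
```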
One might think that the difficulty in understanding quantum computing or quantum physics lies in “hard math,” but in fact, mathematically, quantum concepts are only a bit more complex than high school algebra. Quantum physics is hard because, like Einstein’s theory of relativity, it requires internalizing ideas that are simple but very counterintuitive. With relativity, the strange idea is that time and space are interconnected, when common sense tells us they should act independently. If you try to explain relativity to someone by starting with time and space, you may be greeted by blank stares. A better way to start is as Einstein did, explaining that relativity follows from a simple physical principle: the speed of light is the same for all uniformly moving observers. This one modest idea then becomes extremely profound and leads, by inexorable logic, to Einsteinian spacetime.
With quantum physics, the counterintuitive ideas one must accept are 1) a physical system in a perfectly definite state can still behave randomly, and 2) two systems that are too far apart to influence each other can nevertheless behave in ways that, though individually random, are somehow strongly correlated. Unfortunately, unlike relativity, there is no single simple physical principle from which these conclusions follow. The best we can do is to distill quantum mechanics down to a few abstract-sounding mathematical laws, from which all the observed behavior of quantum particles (and qubits in a quantum computer) can be deduced and predicted. And, as with relativity, we must guard against attempting to describe quantum concepts in classical terms.
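To make idea (2) concrete, the sketch below (an illustration, not taken from the text) samples repeated measurements of the Bell state built above: each qubit’s outcome on its own looks like a fair coin flip, yet the two outcomes always agree. Perfect agreement alone could be mimicked classically; the genuinely quantum character of the correlations only shows up when the two qubits are measured along different axes, which is beyond this sketch.

```python
# Illustrative sketch: sampling measurements of the Bell state (|00> + |11>)/sqrt(2).
# Each qubit alone is individually random, yet the two outcomes are perfectly correlated.
import numpy as np

rng = np.random.default_rng(0)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # amplitudes over 00, 01, 10, 11
probs = np.abs(bell) ** 2                                    # Born rule: |amplitude|**2
probs = probs / probs.sum()                                  # guard against rounding

outcomes = rng.choice(4, size=10_000, p=probs)               # joint outcomes as integers 0..3
q1, q2 = outcomes // 2, outcomes % 2                         # split into the two qubits

print(q1.mean())           # ~0.5: the first qubit's outcome is random
print(q2.mean())           # ~0.5: so is the second's
print((q1 == q2).mean())   # 1.0: yet the two outcomes always agree
```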