Quantum computing is largely a mystery to many, and this post is an attempt to shed some light on how it works. Beyond the Heisenberg uncertainty principle, there are four other important principles or phenomena used in quantum computing: superposition, entanglement, no-cloning, and no-signaling. We will describe each of these and its applications in short order.
Quantum computing is one of the primary areas of research and development in computing today, and something of a holy grail of the computing revolution. Unlike classical computing, it promises to solve problems that are intractable with current technology, with direct implications for the global economy and for scientific breakthroughs. Considerable research effort is therefore being invested in applying this technology to real-world problems and exploring new possibilities. This article focuses on quantum computing theories, concepts, algorithms, and applications.
The quantum world is rife with surprises and non-intuitive behavior, such as quantum superposition and entanglement. The thought experiment of Schrödinger's cat being alive and dead at the same time is an example of quantum superposition: a quantum particle is in a superposition of multiple states at the same time until it is observed (measured). On observation, the superposition "collapses," and the particle takes on one state from the superposition based on some probability, like the cat being found alive (or dead) with a 50% chance of each outcome. Consider the spin of an electron, which can be either up or down when measured. Until then, a pristine electron remains in a superposition of spinning up and down at the same time; on observation it settles into either an up spin or a down spin, each with some probability. Such a "collapse" is often called wave function collapse, because a particle's state is described mathematically by a wave function, which expresses the probabilities of the different possible outcomes of an observation.
The "collapse," then, refers to the particle settling into one of those states with some probability. This probabilistic behavior is built into quantum mechanics: the outcomes of an observation (measurement) can only be stated alongside their probabilities. Nature appears to be surprisingly probabilistic and non-deterministic at its deepest level.
Quantum entanglement is the scenario where two particles are deeply correlated in the strangest of ways, such that a measurement of one particle instantly affects the other, even if the particles are separated across galaxies or the whole universe! For example, a beam of light sent through a BBO crystal can generate entangled pairs of photons that may be sent in different directions and have complementary attributes (like horizontal versus vertical polarization). Measuring one photon's polarization (collapsing its wave function, so to speak) instantly collapses the state of the complementary photon as well. This "spooky action at a distance," as Einstein called it, was first brought to attention in the now well-known EPR (Einstein-Podolsky-Rosen) paper, which attempted to attack the foundations of quantum physics, with little success.
Superposition and entanglement are quite real phenomena, experimentally verified many times, and are today incorporated into quantum computing systems. Indeed, these mysterious phenomena hold the keys that unlock the power and potential of quantum computing.
The Quantum Bit or Qubit
“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical!”
This quote is associated with the celebrated physicist Richard Feynman, who is credited with starting the ball rolling on quantum computing. Any computing machinery needs to store and manipulate information in the form of bits. Quantum computing uses quantum bits, called qubits. Where a classical bit is in either a '0' or a '1' state, a qubit is in a superposition of both states: it is both '0' and '1' at the same time until it is measured. On measurement (reading), the qubit "collapses" to either a '0' or a '1' with some probability.
A 2-qubit register can be in a superposition of four values, viz. 00, 01, 10, and 11, all at the same time: when the two qubits are entangled, the resulting state space becomes a superposition of the individual qubit states. By the same logic, a 64-qubit register can be in a superposition of 2^64 values at once, while a 64-bit classical register can hold only a single 64-bit value. As the number of entangled qubits increases, the value space occupied simultaneously grows exponentially. This has tremendous implications for computing power, since basic computing operations can work with these exponentially large value sets as if they were but a single quantum register.
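The exponential growth above can be made concrete with a minimal sketch in plain Python (no quantum hardware or library assumed): an n-qubit register is described by a vector of 2^n complex amplitudes, one per basis value.

```python
# Sketch: the state of an n-qubit register is a vector of 2**n complex
# amplitudes, one per basis value. This is why the state space grows
# exponentially with the number of qubits.

def register(n):
    """Return an n-qubit register initialized to the |00...0> state."""
    state = [0j] * (2 ** n)
    state[0] = 1 + 0j  # all amplitude on the all-zeros basis state
    return state

two_qubits = register(2)   # 4 amplitudes: |00>, |01>, |10>, |11>
print(len(two_qubits))     # 4
print(len(register(10)))   # 1024, doubling with every added qubit
```

Simulating even a 64-qubit register this way would need 2^64 amplitudes, which is exactly why classical simulation of quantum systems becomes infeasible.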
Quantum algorithms make use of superposition and entanglement to operate on the totality of qubit values all at once. The hitch is that the superposition collapses when the qubits are measured (i.e., read): instead of a large set of values, the result is a single value drawn from the superposition with some probability. The underlying question in quantum computing is how to produce meaningful results when all the superpositions, along with all the computed values, vanish on measurement. As it turns out, there are smart algorithms that make unwanted superpositions cancel each other out so that only the desired results remain, or that increase the probability amplitudes of specific states of interest so that the measured outcome is the required result with high probability. Shor's algorithm for factorization and Grover's search algorithm are examples of such quantum algorithms. The discovery of such algorithms changed the landscape of quantum computing.
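The amplitude-boosting idea can be illustrated with a minimal sketch of one Grover-style round over four basis states, again in plain Python on a classical amplitude vector (the choice of marked index is arbitrary, for illustration only).

```python
# Sketch: one round of Grover-style amplitude amplification over N = 4
# basis states. The "oracle" flips the sign of the marked state, and the
# "diffusion" step reflects every amplitude about the mean, boosting the
# marked state's probability.

N = 4
marked = 2  # arbitrary illustrative choice of the sought-after state

# Start in a uniform superposition: every amplitude is 1/sqrt(N).
amps = [1 / N ** 0.5] * N

# Oracle: flip the sign of the marked state's amplitude.
amps[marked] = -amps[marked]

# Diffusion: reflect each amplitude about the mean amplitude.
mean = sum(amps) / N
amps = [2 * mean - a for a in amps]

probs = [round(a * a, 6) for a in amps]
print(probs)  # [0.0, 0.0, 1.0, 0.0]: the marked state has probability 1
```

For N = 4 a single round happens to leave the marked state with probability exactly 1; for larger N, roughly sqrt(N) rounds are needed, which is the source of Grover's quadratic speedup.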
How do you construct a qubit? Quantum phenomena are observed at microscopic scales, where a system's interactions with its macroscopic environment are limited. When isolated from the classical world, photons, electrons, and even atoms and molecules can exhibit quantum behavior with all its spookiness. An electron has a spin attribute (up or down), which can be used as the qubit values '0' and '1' respectively; until measured, the electron in a superposition of both spins constitutes a qubit. Photon polarization is another example, where horizontal and vertical polarizations can stand in for '0' and '1.' Likewise, an atom in its ground and excited states can represent a qubit. The moment such a system interacts with the macroscopic 'classical' environment, or its state is read out by a device (a 'measurement'), all quantum phenomena collapse into the classical behavior of everyday objects. It is therefore necessary to create qubits at the microscopic scale in a highly controlled fashion, so that superposition and entanglement survive long enough to be exploited by algorithms before the computed qubits are read out. This prerequisite of near-complete isolation for the qubits and quantum circuitry (the quantum gates) implies extreme engineering and monumental challenges.
Here are some examples of evolving technologies. Trapped atomic ions (such as Ca+), stabilized by the Coulomb forces between them, can function as qubits; an RF electric field confines the ions, and lasers or microwave-pumping electrodes manipulate them. Another approach manipulates electrons in a superconducting electronic circuit, which requires the system to be cooled to near absolute zero to achieve superconductivity. There are also approaches that adapt traditional silicon manufacturing processes to create qubits. All these and more are under development and refinement as of this writing.
Creating qubits is not enough; computational logic is also needed. Just as classical computing has gates like XOR, NAND, and NOT, quantum computing has the concept of logic gates. Any classical computation can be reduced to operations on a logic circuit built from classical gates (such as the NAND gate); quantum computations are likewise composed of quantum gates. There are a number of these, such as the Hadamard gate, the CNOT gate, and the CSWAP gate. However, the differences from classical logic circuits are quite striking.
A quantum computation using a quantum logic circuit typically ends with a measurement, which collapses the wave function and produces a classical output value. Up to that point, however, quantum gates and circuits are reversible: for any quantum gate, there exists another gate that reverses its operation. This is not true of classical gates. An AND gate takes two inputs and yields one output; given only the output, it is not possible to reconstruct the original inputs. Quantum gates, by contrast, are fully reversible.
A gate that destroys information (like the AND gate) is tantamount to a quantum measurement, which causes a wave function collapse; short of measurement, there is no destruction or loss of information in the quantum world. In a quantum computing system of logic gates, one would not, in general, perform a measurement until the data has passed through the gates of the circuit and we are ready to observe the result. Therefore, quantum gates need to be reversible, which further implies that every gate has an equal number of outputs and inputs. This raises the question of how one can implement something like an AND gate. The solution is a more complex gate with an equal number of inputs and outputs, one of whose outputs provides the required AND operation; the unwanted outputs can be discarded using clever tricks. The bottom line is that quantum computing circuits are reversible, which follows from a principle of quantum mechanics known as unitary evolution. Even so, all classical computing logic can be implemented using quantum logic gates and circuits.
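The reversible-AND trick above is exactly what the Toffoli (CCNOT) gate provides. A minimal classical sketch of its action on basis states, in plain Python:

```python
# Sketch: the Toffoli (CCNOT) gate acting on classical basis states.
# It flips the target bit only when both control bits are 1, so with the
# target initialized to 0 the target output equals AND(a, b). Applying
# the gate twice restores the inputs, demonstrating reversibility.

def toffoli(a, b, t):
    """Two control bits pass through; target is flipped iff a and b are 1."""
    return a, b, t ^ (a & b)

for a in (0, 1):
    for b in (0, 1):
        _, _, out = toffoli(a, b, 0)
        assert out == (a & b)                            # reversible AND
        assert toffoli(*toffoli(a, b, 0)) == (a, b, 0)   # its own inverse
print("Toffoli implements a reversible AND")
```

The real quantum gate also acts on superpositions of these basis states, but its behavior on each basis state is exactly this permutation, which is why it is its own inverse.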
Quantum logic circuits can also exploit superposition and entanglement. Hadamard gates can be used to create a superposition of the values of one or more qubits, and a Hadamard gate combined with a CNOT gate produces entangled qubits or quantum registers. These techniques are widely used in Shor's algorithm for prime factorization.
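The Hadamard-then-CNOT recipe can be traced by hand on a 4-entry amplitude vector. A minimal sketch in plain Python (basis order |00>, |01>, |10>, |11>; no quantum library assumed):

```python
import math

# Sketch: preparing a Bell (entangled) pair from |00> by applying a
# Hadamard to the first qubit and then a CNOT controlled by it.

h = 1 / math.sqrt(2)
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# Hadamard on the first qubit: mixes each |0b> amplitude with |1b>.
state = [h * (state[0] + state[2]),
         h * (state[1] + state[3]),
         h * (state[0] - state[2]),
         h * (state[1] - state[3])]

# CNOT with the first qubit as control: swaps the |10> and |11> amplitudes.
state[2], state[3] = state[3], state[2]

probs = [round(a * a, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]: only |00> and |11> remain
```

The final state has amplitude only on |00> and |11>: measuring either qubit instantly determines the other, which is the entanglement the article describes.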
Since the outcome of any quantum measurement is probabilistic, it may be necessary to ascertain the probability of the measured result by repeating the computation multiple times and tallying the outcomes. However, there also exist quantum algorithms with fully deterministic outcomes: the unwanted superpositions cancel each other out, producing a result that has a probability of 1.
One of the unique aspects of quantum entanglement (that odd connection-at-a-distance between two particles) is that measuring one particle and observing a property (say, an electron's spin direction) instantly determines the corresponding property of the entangled particle. So if Alice were to create a set of entangled qubit pairs, send one of each pair to Bob, and then measure each of her qubits to obtain a random series of bits, her measurements also determine what Bob will see when he measures his qubits. This mechanism can be used to establish a shared secret key of random bits.
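A minimal classical model of that key-establishment idea, in plain Python: each entangled pair is stood in for by one shared random bit, so Alice's and Bob's measured bit strings come out perfectly correlated. This sketches only the correlation, not the quantum security guarantees discussed below.

```python
import random

# Sketch: entangled pairs give Alice and Bob perfectly correlated random
# bits. The correlation is modeled classically here as one shared random
# bit per pair; a real protocol would measure entangled qubits.

random.seed(1)  # fixed seed so the run is repeatable

pairs = [random.randint(0, 1) for _ in range(16)]
alice_bits = list(pairs)  # Alice measures her half of each pair
bob_bits = list(pairs)    # Bob's measurements match Alice's exactly

assert alice_bits == bob_bits
print("shared key:", "".join(map(str, alice_bits)))
```

Neither party chose the key; it emerged randomly from the measurements, which is precisely why (per the no-signaling discussion next) this cannot be used to transmit a chosen message.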
Note that entanglement cannot be used to send information from Alice to Bob. Doing so would violate Einstein's principle of relativity, which prohibits faster-than-light communication. Alice can only reveal a random property of her particle through measurement, and thereby know what Bob will measure on his entangled particle; she cannot transmit a message of her choosing through this mechanism. This prohibition on sending signals through entanglement is known as the quantum no-signaling principle.
Another unusual thing about entanglement is that the correlation between two entangled particles is much like a pair-bond; to invoke a classical analogy, "quantum entanglement is monogamous." If particles A and B are entangled, the relationship remains unique to the pair. If a third particle C comes along and attempts to interfere, the connection is corrupted, and this becomes apparent in measurements of A or B. This characteristic of entanglement can be used to establish secure encryption keys between Alice and Bob, with both coming to know if Charlie or any other third party has intercepted the keys. Charlie cannot copy the qubits in transit between Alice and Bob and send the originals on their way: the quantum no-cloning theorem states that it is impossible to clone or copy an unknown quantum state. And if Charlie measures the qubits before forwarding them to Bob, the entanglement is destroyed and the key establishment fails. The key establishment is therefore guaranteed either to be secure or to fail, by the very nature of quantum phenomena.
Quantum computing has the potential both to yield near-perfect cryptography and to decimate many current classical cryptographic techniques. Shor's algorithm for prime factorization is a good example. Public-key cryptography today (think RSA) relies on the fact that a key is built from a product of prime numbers: multiplying large primes together is trivial, but determining the primes from the resulting large number becomes computationally infeasible as the number grows. This asymmetry is essential to deployed classical cryptography. Shor's quantum algorithm, however, can perform this prime decomposition in polynomial time rather than the (sub-)exponential time of the best known classical methods, which implies that, given a quantum computer with sufficient qubits, it could be used to recover the primes behind a key and decrypt encrypted messages quickly.
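The number theory at the heart of Shor's algorithm can be sketched classically on a toy case. The quantum computer's job is only the period finding, done here by brute force; everything else is ordinary arithmetic. (N = 15 and a = 7 are illustrative choices.)

```python
from math import gcd

# Sketch: the classical skeleton of Shor's algorithm on a toy case.
# Factor N = 15 by finding the period r of f(x) = a**x mod N. On a
# quantum computer the period finding is the exponentially faster part.

N, a = 15, 7
assert gcd(a, N) == 1  # a must share no factor with N

# Find the period r: the smallest r > 0 with a**r mod N == 1.
r = 1
while pow(a, r, N) != 1:
    r += 1
print("period:", r)  # period: 4

# An even period yields factors via gcd(a**(r//2) +/- 1, N).
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print("factors:", p, q)  # factors: 3 5
```

The brute-force loop above takes time exponential in the bit length of N; Shor's insight was that a quantum Fourier transform over a superposition of all x values finds r in polynomial time.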
A quantum computer cannot accelerate all classical algorithms and operations; some algorithms are known to be quantum-proof (i.e., essentially as difficult on a quantum computer). However, several current classical cryptography algorithms will need to be replaced or hardened to become quantum-proof, including the RSA public-key cryptography used across the Internet, which also underpins digital signatures in electronic commerce. The vulnerability extends to existing stored data, depending on the algorithms used for encryption and authentication. Rethinking cryptography, so-called post-quantum cryptography, is an active research area. Imagine a nation-state with ample resources investing in quantum computing and engaging in active espionage on the rest of the world. That scenario is not yet realistic as of this writing, and we shall see why below.
Quantum Supremacy and its Achilles Heel
Quantum supremacy refers to the stage in the development of quantum computing at which a quantum computing system can outperform the best prevailing classical computer at some task. Though estimates vary, Google suggests a 50+ qubit system would be required to reach this stage today. However, not all qubits are equal.
The most challenging task at this point is keeping qubits pristine. The environment (the classical world) is at constant "war" with quantum systems, causing decoherence, the loss of superposition and entanglement: at some point a quantum system, say an electron, interacts with larger-scale systems or particles, its wave function collapses, and classical behavior results. This kind of interference may be called noise, and handling noise in quantum systems is the real challenge in achieving quantum supremacy. In most implementations, a qubit can be kept pristine for perhaps a few microseconds before decoherence occurs, so any quantum computing algorithm needs to execute within this timeframe.
To make matters worse, each additional qubit compounds the noise problem: the noise grows with the number of qubits, making each next qubit harder to add. Indeed, there are mathematicians, such as Gil Kalai, who argue that this rise of noise will keep any quantum computer mediocre; in their view, quantum computers cannot work, even in principle.
A solution comes from quantum error correction, analogous to adding extra bits to a message as error-correcting codes. Quantum error correction uses additional qubits to verify and restore the fidelity of quantum information, spreading the information in each logical qubit across multiple physical qubits using entanglement. However, since noise grows with the number of qubits, many more qubits are needed for error correction: estimates of the physical qubits required to implement the 50-70 logical qubits needed for quantum supremacy range from thousands to millions.
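The redundancy idea behind this can be sketched with its classical ancestor, the three-bit repetition code, in plain Python. (The real quantum bit-flip code spreads the state across qubits via entanglement and syndrome measurements rather than copying, since no-cloning forbids copies, but the logical-vs-physical-bit overhead is the same intuition.)

```python
import random
from collections import Counter

# Sketch: the classical intuition behind quantum error correction.
# Encode one logical bit into three physical bits; decode by majority
# vote, so any single bit-flip error is corrected.

def encode(bit):
    return [bit] * 3  # one logical bit -> three physical bits

def decode(bits):
    return Counter(bits).most_common(1)[0][0]  # majority vote

random.seed(0)
logical = 1
physical = encode(logical)
physical[random.randrange(3)] ^= 1  # noise: flip one random physical bit
assert decode(physical) == logical  # the single error is corrected
print("recovered:", decode(physical))
```

Correcting stronger noise requires more redundancy, which is why estimates for fault-tolerant logical qubits run to thousands of physical qubits each.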
This task of building a sufficient number of stable logical qubits is the holy grail of quantum computing today.
When IBM announces a 50-qubit processor, these are physical qubits, not logical ones; the number of logical qubits in this case may be a handful. Companies racing to build quantum computing chips and systems include IBM, Microsoft, Google, Intel, Alibaba, Fujitsu, Lockheed Martin, and many others, alongside a whole host of universities and government-sponsored programs. The hope is that this combined onslaught will solve the problem and move quantum computing into practical reality. Even so, it would not be an exaggeration to say that a decade or more may pass before someone achieves quantum supremacy.
Applications of Quantum Computing
Quantum algorithms are an area of active research. Traditional approaches to algorithm design generally do not yield the promised computational power of quantum systems; creative use of superposition and entanglement is a prerequisite for the phenomenal speedups. This means that rather than merely porting classical algorithms to quantum computers, one should look to develop new and novel quantum algorithms. The number of such algorithms is still small compared to the catalog of classical algorithms, but it is growing.
Beyond cryptography, quantum computing promises significant strides in areas including AI, big data (data science), weather simulation, astronomical phenomena, and the science of synthesizing materials at the molecular level. Even noisy quantum systems may help in simulating molecular systems (such as for drug synthesis), since the latter are themselves constantly beset by noise. The objective is to reach the stage Feynman envisioned: simulating nature as it actually is, quantum mechanically.
Research into quantum algorithms does not necessarily require quantum computers to be available; much of computational theory (including Alan Turing's work on universal computing machines) predated actual computing machines. IBM has made a 5-qubit quantum computer available on the web for free experimentation with quantum computing. Algorithm development is a rapidly growing area.
Hybrid approaches also prevail: a computing system can straddle the classical and quantum worlds, using quantum elements to accelerate specific operations. D-Wave Systems' QPU targets a particular set of problems using a quantum analogue of simulated annealing, a technique used in optimization problems; the quantum system seeks a minimal-energy (or low-energy) state corresponding to a solution. D-Wave's 2000-qubit quantum annealer QPU has over 150 customers today (in 2019), and the number is still growing. These specialized systems find optimizations in areas as diverse as transportation, healthcare, automotive, and e-commerce. The approach does not require quantum logic gates or universal computation capabilities and is more tolerant of qubit noise.
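For readers unfamiliar with the classical technique that quantum annealers mimic, here is a minimal sketch of simulated annealing on a toy one-dimensional energy landscape (the energy function and all parameters are illustrative choices, not anything D-Wave-specific).

```python
import math
import random

# Sketch: classical simulated annealing on a toy energy landscape.
# A quantum annealer pursues the same goal, a low-energy state, but can
# tunnel through barriers rather than hopping over them thermally.

def energy(x):
    # A bumpy illustrative function with its global minimum near x = 2.
    return (x - 2) ** 2 + math.sin(5 * x)

random.seed(4)  # fixed seed so the run is repeatable
x, temp = 10.0, 5.0
while temp > 1e-3:
    candidate = x + random.uniform(-1, 1)
    delta = energy(candidate) - energy(x)
    # Accept improvements always; accept uphill moves with probability
    # exp(-delta / temp), which shrinks as the temperature cools.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = candidate
    temp *= 0.99
print("found minimum near x =", round(x, 2))
```

The early high-temperature phase lets the search escape local minima; the slow cooling then freezes it into a low-energy state, which is the "annealing" metaphor.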
What if Quantum Theory is wrong?
Now, this raises an interesting question: given that physicists cannot even agree on the various interpretations of quantum mechanics, what if these theories are wrong? Schumacher offers his view on this question from the perspective of cryptography and quantum information theory.
“When we showed why entanglement had to be monogamous, we did not use the theory of quantum mechanics. We used the observed correlations between entangled photons. Quantum mechanics predicts these, but they are also directly observable statistical facts. We used the no-signaling principle, which is likewise a directly observable fact. That means quantum key distribution is secure even if quantum theory is wrong. Logically, quantum cryptography rests on a firmer foundation than quantum theory itself. The monogamy of entanglement will be an enduring part of physics, even beyond quantum theory.”