User:Qcomp/QC-History
A Brief History of Quantum Computing
To be proposed as a replacement for the Timeline section in the Quantum computing article.
First ideas: In 1980, Paul Benioff described the first quantum mechanical model of a computer.[1] In this work, Benioff showed that a computer could operate under the laws of quantum mechanics by giving a Schrödinger-equation description of Turing machines, laying a foundation for further work in quantum computing. Around the same time, the Russian mathematician Yuri Manin also put forward the idea of quantum computation.[2]
In 1981, at the First Conference on the Physics of Computation held at MIT, Paul Benioff and Richard Feynman gave talks on quantum computing. Benioff built on his earlier 1980 work showing that a computer can operate under the laws of quantum mechanics; his talk was titled “Quantum mechanical Hamiltonian models of discrete processes that erase their own histories: application to Turing machines”. In his talk, Feynman raised for the first time the idea that quantum mechanical computers might be more efficient than classical ones for certain tasks. He observed that it appeared to be impossible to efficiently simulate the evolution of a quantum system on a classical computer, and he proposed a basic model for a quantum computer.[3] Urging the world to build a quantum computer, he said, "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly, it's a wonderful problem because it doesn't look so easy."[4] Some years later, in 1985, David Deutsch formally described the first universal quantum computer.[5] Just as a universal Turing machine can simulate any other Turing machine efficiently (Church–Turing thesis), the universal quantum computer can simulate any other quantum computer with at most a polynomial slowdown.
In the following decade, quantum algorithms of increasing sophistication were developed that demonstrated the advantages of a quantum computer over its classical counterpart in different computational settings. Among the most influential developments of this period are the Deutsch algorithm and its generalization by Deutsch and Jozsa, which show an advantage in the setting of exact, deterministic computation. This was perhaps the earliest result in the computational complexity of quantum computers, proving that they were capable of performing some well-defined computational task more efficiently than any classical computer. The Bernstein–Vazirani algorithm was the first to show a quantum advantage even in the presence of small errors, and it introduced a central computational primitive, the quantum Fourier transform. Building on these, the Simon algorithm was the first to promise an exponential quantum advantage. Like its predecessors it is an oracle algorithm, i.e., it provides an advantage given a device (the so-called "oracle") that implements a particular function; the advantage lies in determining properties of the implemented function, without counting the computational cost of the oracle itself.
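As an illustration of this oracle setting, here is a minimal sketch of the simplest case, Deutsch's algorithm, written as a small NumPy state-vector simulation (the code and function names are illustrative, not taken from any of the cited works): a single query to the oracle U_f decides whether f: {0,1} → {0,1} is constant or balanced, which classically requires evaluating f twice.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def oracle(f):
    """Oracle unitary U_f |x>|y> = |x>|y XOR f(x)>, basis order |00>,|01>,|10>,|11>."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """One oracle query decides whether f is constant or balanced."""
    state = np.kron([1, 0], [0, 1])          # start in |0>|1>
    state = np.kron(H, H) @ state            # Hadamards on both qubits
    state = oracle(f) @ state                # single call to the oracle
    state = np.kron(H, np.eye(2)) @ state    # Hadamard on the query qubit
    p1 = abs(state[2])**2 + abs(state[3])**2  # probability the query qubit reads 1
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant function  -> "constant"
print(deutsch(lambda x: x))      # balanced function  -> "balanced"
print(deutsch(lambda x: 1 - x))  # balanced function  -> "balanced"
```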
A breakthrough was the algorithm discovered in 1994 by Peter Shor, then at AT&T's Bell Labs, which allows a quantum computer to factor large integers exponentially faster than the best known classical algorithm. Shor's algorithm can in principle break many of the public-key cryptography systems in use today,[6] which sparked tremendous interest in quantum computers.
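For context, the quantum speed-up in Shor's algorithm comes from finding the multiplicative order r of a random a modulo N efficiently via the quantum Fourier transform; the remaining steps are classical. Below is a minimal sketch of that classical reduction, with the order found by brute force standing in for the quantum subroutine (function names are illustrative).

```python
from math import gcd

def order(a, N):
    """Multiplicative order of a mod N, found by brute force here;
    Shor's algorithm finds it efficiently on a quantum computer."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Classical reduction: a nontrivial factor of N from the order r of a mod N
    (succeeds when r is even and a**(r//2) is not -1 mod N)."""
    if gcd(a, N) > 1:            # lucky guess: a already shares a factor with N
        return gcd(a, N)
    r = order(a, N)
    if r % 2 == 1:
        return None              # odd order: try another a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None              # trivial case: try another a
    return gcd(x - 1, N)

print(shor_factor(15, 7))   # 3  (order of 7 mod 15 is 4; 7**2 mod 15 = 4; gcd(3, 15) = 3)
```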
The development of quantum error correction by Shor, Steane, Kitaev and others[7] addressed and refuted a persistent criticism that quantum computing was practically impossible because errors would necessarily accumulate, making long computations inviable. (cite Haroche)
These results led to a great increase in interest in the field of quantum computation and information, with scientists from many different backgrounds starting to work on it. Most notable were a number of detailed proposals for how to implement the requirements for a quantum computer (qubits, universal gates, read-out) in concrete physical systems. Among the early and most influential proposals were Lloyd 1993, cold trapped ions (Cirac–Zoller 1995), electron spins in quantum dots (Loss–DiVincenzo 1998), spins of donors in Si (Kane 1998), nuclear spins (Gershenfeld, Cory), and superconducting circuits (Shnirman et al. 1997). These, in turn, motivated a broad experimental effort to isolate, prepare, and measure qubits, to demonstrate the feasibility of quantum gates, and to perform first proof-of-principle experiments demonstrating quantum algorithms on small (few-qubit) systems. An important influence for many of these developments were the DiVincenzo criteria, a set of requirements deemed necessary for the physical implementation of quantum computation.[8][9]
Throughout the 1990s and the first decade of the 2000s, the field progressed on several fronts: more algorithms were found; quantum complexity theory was developed, giving indications of the kinds of problems for which quantum computers might bring advantages (and for which not); quantum registers were implemented better and at larger scale; and the sources of noise and errors became better understood theoretically and better controlled experimentally, together with increasingly powerful ways to deal with them (the threshold error rate per gate operation that error correction can tolerate was pushed substantially higher).
Architectures: after the first small-scale demonstrations, considerable theoretical effort was invested in the design of potentially scalable architectures for quantum computers that take into account the practical problems of building systems with large numbers of qubits. This remains an ongoing line of research.
More recently, companies became involved and the first small-scale quantum computers became commercially available (D-Wave and the controversy around its devices; ~50-qubit devices from IBM, Google, Intel, IonQ, ...).
The annealing-based approach pursued by D-Wave goes back to 1989, when Bikas K. Chakrabarti and collaborators proposed the idea that quantum fluctuations could help explore rough energy landscapes: by tunneling through tall but thin barriers (instead of climbing over them using thermal excitations), a system can escape the local minima of glassy energy landscapes, suggesting the effectiveness of quantum annealing over classical simulated annealing.[10][11]
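Schematically, quantum annealing is commonly written in terms of a transverse-field Ising Hamiltonian (a standard textbook form, not the notation of the cited papers): the rough energy landscape is encoded in the couplings J_ij, while the transverse field Γ(t) drives tunneling and is slowly reduced to zero.

```latex
% Transverse-field Ising Hamiltonian used in quantum annealing (schematic form):
% the classical optimization problem is encoded in the couplings J_ij, and the
% transverse field Gamma(t) induces tunneling; it starts large and ends at zero.
\[
  H(t) \;=\; -\sum_{i<j} J_{ij}\,\sigma^{z}_{i}\sigma^{z}_{j}
             \;-\; \Gamma(t)\sum_{i}\sigma^{x}_{i},
  \qquad \Gamma(0) \gg |J_{ij}|, \quad \Gamma(t_{\mathrm{final}}) = 0 .
\]
```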
- ^ Benioff, Paul (1980). "The computer as a physical system: A microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines". Journal of Statistical Physics. 22 (5): 563–591. Bibcode:1980JSP....22..563B. doi:10.1007/bf01011339. The paper was submitted in June 1979 and published in April 1980.
- ^ Manin, Yu I (1980). "Vychislimoe i nevychislimoe" [Computable and Noncomputable]. Sov. Radio (in Russian). pp. 13–15. Archived from the original on May 10, 2013. Retrieved October 20, 2017.
- ^ Feynman, Richard (1982). "Simulating physics with computers" (PDF). International Journal of Theoretical Physics. 21: 467–488. doi:10.1007/BF02650179.
- ^ Gil, Dario (May 4, 2016). "The Dawn of Quantum Computing is Upon Us". Retrieved May 4, 2016.
- ^ Deutsch, David (1985). "Quantum Theory, the Church-Turing principle and the universal quantum computer". Proc. Roy. Soc. A. 400: 96. doi:10.1098/rspa.1985.0070.
- ^ Ekert, Artur; Jozsa, Richard (1996). "Quantum computation and Shor's factoring algorithm". Rev. Mod. Phys. 68 (3): 733–753. Bibcode:1996RvMP...68..733E. doi:10.1103/RevModPhys.68.733.
- ^ Shor, P. W. (1995). "Scheme for reducing decoherence in quantum computer memory". Phys. Rev. A. 52: R2493. doi:10.1103/PhysRevA.52.R2493.
- ^ DiVincenzo, D. P. (2000). "The Physical Implementation of Quantum Computation". Fortschritte der Physik. 48: 771. arXiv:quant-ph/0002077. doi:10.1002/1521-3978(200009)48:9/11<771::AID-PROP771>3.0.CO;2-E.
- ^ In fact, they are not strictly necessary, as evidenced by quantum-computing proposals that do not satisfy all DiVincenzo criteria (e.g., dissipative quantum computing).
- ^ Ray, P.; Chakrabarti, B. K.; Chakrabarti, Arunava (1989). "Sherrington-Kirkpatrick model in a transverse field: Absence of replica symmetry breaking due to quantum fluctuations". Physical Review B. 39 (16): 11828–11832. doi:10.1103/PhysRevB.39.11828.
- ^ Das, Arnab; Chakrabarti, Bikas K. (2008). "Colloquium: Quantum annealing and analog quantum computation". Reviews of Modern Physics. 80: 1061–1081. doi:10.1103/RevModPhys.80.1061.