Journal ArticleDOI

Quantum theory, the Church-Turing principle and the universal quantum computer

TL;DR: In this paper, it is argued that underlying the Church-Turing hypothesis there is an implicit physical assertion: every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means.
Abstract: It is argued that underlying the Church-Turing hypothesis there is an implicit physical assertion. Here, this assertion is presented explicitly as a physical principle: ‘every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means’. Classical physics and the universal Turing machine, because the former is continuous and the latter discrete, do not obey the principle, at least in the strong form above. A class of model computing machines that is the quantum generalization of the class of Turing machines is described, and it is shown that quantum theory and the ‘universal quantum computer’ are compatible with the principle. Computing machines resembling the universal quantum computer could, in principle, be built and would have many remarkable properties not reproducible by any Turing machine. These do not include the computation of non-recursive functions, but they do include ‘quantum parallelism’, a method by which certain probabilistic tasks can be performed faster by a universal quantum computer than by any classical restriction of it. The intuitive explanation of these properties places an intolerable strain on all interpretations of quantum theory other than Everett’s. Some of the numerous connections between the quantum theory of computation and the rest of physics are explored. Quantum complexity theory allows a physically more reasonable definition of the ‘complexity’ or ‘knowledge’ in a physical system than does classical complexity theory.
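The 'quantum parallelism' mentioned above can be made concrete with a small simulation. The sketch below is our own illustration (not Deutsch's notation): a NumPy simulation of the two-qubit Deutsch algorithm, in which a single oracle query decides whether f: {0,1} → {0,1} is constant or balanced, a task that classically requires two evaluations of f.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant, querying the oracle once.

    Two-qubit basis ordering |q0 q1>, amplitude index 2*q0 + q1.
    """
    # Oracle U_f |x, y> = |x, y XOR f(x)>
    U_f = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U_f[2 * x + (y ^ f(x)), 2 * x + y] = 1
    # Prepare |0>|1>, apply H to both qubits, the oracle, then H on qubit 0.
    state = np.kron(H @ np.array([1.0, 0.0]), H @ np.array([0.0, 1.0]))
    state = U_f @ state
    state = np.kron(H, np.eye(2)) @ state
    # Qubit 0 is measured as |0> with certainty iff f is constant.
    p0 = state[0] ** 2 + state[1] ** 2
    return bool(np.isclose(p0, 1.0))

print(deutsch(lambda x: 0))   # constant -> True
print(deutsch(lambda x: x))   # balanced -> False
```

The interference step (the final Hadamard) is where the single query pays off: both values of f enter the amplitude of qubit 0 at once.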

Content maybe subject to copyright    Report

Citations
More filters
Journal ArticleDOI
TL;DR: In this article, the basic aspects of entanglement, including its characterization, detection, distillation, and quantification, are reviewed, and the basic role of entanglement in quantum communication within the distant-labs paradigm is discussed.
Abstract: All our former experience with application of quantum theory seems to say: what is predicted by the quantum formalism must occur in the laboratory. But the essence of the quantum formalism, entanglement, recognized by Einstein, Podolsky, Rosen and Schrödinger, waited over 70 years to enter laboratories as a new resource as real as energy. This holistic property of compound quantum systems, which involves nonclassical correlations between subsystems, is a potential for many quantum processes, including 'canonical' ones: quantum cryptography, quantum teleportation and dense coding. However, it appeared that this new resource is very complex and difficult to detect. Being usually fragile to the environment, it is robust against conceptual and mathematical tools, the task of which is to decipher its rich structure. This article reviews basic aspects of entanglement, including its characterization, detection, distillation and quantification. In particular, the authors discuss various manifestations of entanglement via Bell inequalities, entropic inequalities, entanglement witnesses and quantum cryptography, and point out some interrelations. They also discuss the basic role of entanglement in quantum communication within the distant-labs paradigm and stress some peculiarities, such as the irreversibility of entanglement manipulations, including its extremal form, the bound-entanglement phenomenon. The basic role of entanglement witnesses in the detection of entanglement is emphasized.
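One detection tool this review covers, the Peres-Horodecki positive-partial-transpose (PPT) criterion, fits in a few lines of NumPy. The code below is our own illustration (function names are ours): a negative eigenvalue of the partial transpose certifies entanglement.

```python
import numpy as np

def partial_transpose(rho):
    """Transpose the second qubit of a two-qubit density matrix.

    rho[(i,j),(k,l)] -> rho[(i,l),(k,j)], done via an axis permutation.
    """
    return rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

# Bell state (|00> + |11>)/sqrt(2): maximally entangled.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(phi, phi)
print(np.linalg.eigvalsh(partial_transpose(rho)).min())   # -0.5 -> entangled

# Maximally mixed state: separable, so its partial transpose stays positive.
sep = np.diag([0.25, 0.25, 0.25, 0.25])
print(np.linalg.eigvalsh(partial_transpose(sep)).min())   # 0.25 -> PPT
```

For two qubits the PPT test is both necessary and sufficient; in higher dimensions PPT-but-entangled states exist, which is exactly the bound entanglement the review discusses.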

6,980 citations

Proceedings ArticleDOI
Peter W. Shor1
20 Nov 1994
TL;DR: Las Vegas algorithms are given for finding discrete logarithms and factoring integers on a quantum computer, taking a number of steps polynomial in the input size, e.g., the number of digits of the integer to be factored.
Abstract: A computer is generally considered to be a universal computational device; i.e., it is believed able to simulate any physical computational device with a cost in computation time of at most a polynomial factor. It is not clear whether this is still true when quantum mechanics is taken into consideration. Several researchers, starting with David Deutsch, have developed models for quantum mechanical computers and have investigated their computational properties. This paper gives Las Vegas algorithms for finding discrete logarithms and factoring integers on a quantum computer that take a number of steps which is polynomial in the input size, e.g., the number of digits of the integer to be factored. These two problems are generally considered hard on a classical computer and have been used as the basis of several proposed cryptosystems. We thus give the first examples of quantum cryptanalysis.
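The classical reduction at the heart of Shor's factoring algorithm, from factoring N to finding the period of a^x mod N, can be sketched as follows. This is our own illustrative code: the quantum computer's contribution is finding the period efficiently, which the snippet below does by brute force (exponential classically).

```python
from math import gcd

def factor_via_period(N, a):
    """Shor-style reduction: extract a factor of N from the period of a^x mod N."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky guess: a already shares a factor with N
    # Find the order r of a modulo N: smallest r > 0 with a^r = 1 (mod N).
    # (This is the step the quantum period-finding routine does in poly time.)
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root of 1: retry with another a
    return gcd(y - 1, N)          # nontrivial factor, since y^2 = 1 (mod N)

print(factor_via_period(15, 7))   # period of 7 mod 15 is 4 -> factor 3
```

Since y² ≡ 1 (mod N) with y ≢ ±1, N divides (y − 1)(y + 1) without dividing either factor, so the gcd is a nontrivial divisor.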

6,961 citations

Proceedings ArticleDOI
Lov K. Grover1
01 Jul 1996
TL;DR: In this paper, it is recalled that a quantum mechanical computer can efficiently solve the integer factorization problem, for which no efficient classical algorithm is known, in a number of steps polynomial in log N.
Abstract: [Quantum mechanical computers] were proposed in the early 1980's [Benioff80] and shown to be at least as powerful as classical computers, an important but not surprising result, since classical computers, at the deepest level, ultimately follow the laws of quantum mechanics. The description of quantum mechanical computers was formalized in the late 80's and early 90's [Deutsch85] [BB92] [BV93] [Yao93], and they were shown to be more powerful than classical computers on various specialized problems. In early 1994, [Shor94] demonstrated that a quantum mechanical computer could efficiently solve a well-known problem for which there was no known efficient algorithm using classical computers: the problem of integer factorization, i.e. finding the prime factors of a given integer N, in a time which is polynomial in log N.
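Grover's own contribution in this paper, a quadratic quantum speedup for unstructured search, is easy to simulate classically at small sizes. The sketch below is our own illustration of the oracle-plus-inversion-about-the-mean iteration on N = 2^n amplitudes:

```python
import numpy as np

def grover(n_qubits, marked):
    """Amplitude amplification over N = 2**n_qubits items with one marked index."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~ (pi/4) sqrt(N)
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip the marked amplitude
        state = 2 * state.mean() - state      # diffusion: inversion about the mean
    return int(np.argmax(state ** 2))         # most probable measurement outcome

print(grover(4, marked=6))   # locates index 6 among 16 items with 3 oracle queries
```

Each iteration rotates the state a fixed angle toward the marked item, so about √N oracle queries suffice, versus N/2 on average for classical exhaustive search.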

6,335 citations

Journal ArticleDOI
TL;DR: In this article, the authors address both the mathematical underpinnings of topological quantum computation and the physics of the subject, using the ν = 5/2 fractional quantum Hall state as the archetype of a non-Abelian topological state enabling fault-tolerant quantum computation.
Abstract: Topological quantum computation has emerged as one of the most exciting approaches to constructing a fault-tolerant quantum computer. The proposal relies on the existence of topological states of matter whose quasiparticle excitations are neither bosons nor fermions, but are particles known as non-Abelian anyons, meaning that they obey non-Abelian braiding statistics. Quantum information is stored in states with multiple quasiparticles, which have a topological degeneracy. The unitary gate operations that are necessary for quantum computation are carried out by braiding quasiparticles and then measuring the multiquasiparticle states. The fault tolerance of a topological quantum computer arises from the nonlocal encoding of the quasiparticle states, which makes them immune to errors caused by local perturbations. To date, the only such topological states thought to have been found in nature are fractional quantum Hall states, most prominently the ν = 5/2 state, although several other prospective candidates have been proposed in systems as disparate as ultracold atoms in optical lattices and thin-film superconductors. In this review article, current research in this field is described, focusing on the general theoretical concepts of non-Abelian statistics as it relates to topological quantum computation, on understanding non-Abelian quantum Hall states, on proposed experiments to detect non-Abelian anyons, and on proposed architectures for a topological quantum computer. Both the mathematical underpinnings of topological quantum computation and the physics of the subject are addressed, using the ν = 5/2 fractional quantum Hall state as the archetype of a non-Abelian topological state enabling fault-tolerant quantum computation.

4,457 citations

Journal ArticleDOI
TL;DR: Upper and lower bounds are derived on the exact number of elementary gates required to build up a variety of two- and three-bit quantum gates, along with the asymptotic number required for n-bit Deutsch-Toffoli gates, and some observations are made about the number required for arbitrary n-bit unitary operations.
Abstract: We show that a set of gates that consists of all one-bit quantum gates (U(2)) and the two-bit exclusive-or gate (that maps Boolean values (x, y) to (x, x ⊕ y)) is universal in the sense that all unitary operations on arbitrarily many bits n (U(2^n)) can be expressed as compositions of these gates. We investigate the number of the above gates required to implement other gates, such as generalized Deutsch-Toffoli gates, that apply a specific U(2) transformation to one input bit if and only if the logical AND of all remaining input bits is satisfied. These gates play a central role in many proposed constructions of quantum computational networks. We derive upper and lower bounds on the exact number of elementary gates required to build up a variety of two- and three-bit quantum gates, the asymptotic number required for n-bit Deutsch-Toffoli gates, and make some observations about the number required for arbitrary n-bit unitary operations.
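Two tiny instances of the paper's theme, two-bit gates composed from CNOT and one-bit U(2) gates, can be checked numerically. This is our own illustrative code, not the authors' constructions: Hadamards on the target turn a CNOT into a controlled-Z, and three CNOTs make a SWAP.

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],   # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Hadamard conjugation on both qubits reverses control and target.
CNOT_rev = np.kron(H, H) @ CNOT @ np.kron(H, H)

# Controlled-Z from one CNOT plus two one-bit gates on the target.
CZ = np.kron(I, H) @ CNOT @ np.kron(I, H)
print(np.allclose(CZ, np.diag([1, 1, 1, -1])))   # True

# SWAP from three CNOTs with alternating orientation.
SWAP = CNOT @ CNOT_rev @ CNOT
print(np.allclose(SWAP, np.array([[1, 0, 0, 0],
                                  [0, 0, 1, 0],
                                  [0, 1, 0, 0],
                                  [0, 0, 0, 1]])))  # True
```

The paper's results concern exactly this kind of accounting: how many CNOT and U(2) gates such compositions must cost, in the best case, for a given target unitary.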

3,731 citations

References
Journal ArticleDOI
01 Nov 1964-Physics
TL;DR: In this article, it is shown that any hidden-variable interpretation which reproduces exactly the quantum mechanical predictions must have a grossly nonlocal structure; earlier attempts to rule out hidden variables even without a separability or locality requirement are examined and found wanting.
Abstract: The paradox of Einstein, Podolsky and Rosen [1] was advanced as an argument that quantum mechanics could not be a complete theory but should be supplemented by additional variables. These additional variables were to restore to the theory causality and locality [2]. In this note that idea will be formulated mathematically and shown to be incompatible with the statistical predictions of quantum mechanics. It is the requirement of locality, or more precisely that the result of a measurement on one system be unaffected by operations on a distant system with which it has interacted in the past, that creates the essential difficulty. There have been attempts [3] to show that even without such a separability or locality requirement no "hidden variable" interpretation of quantum mechanics is possible. These attempts have been examined elsewhere [4] and found wanting. Moreover, a hidden variable interpretation of elementary quantum theory [5] has been explicitly constructed. That particular interpretation has indeed a grossly nonlocal structure. This is characteristic, according to the result to be proved here, of any such theory which reproduces exactly the quantum mechanical predictions.
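Bell's incompatibility result is often quoted via the CHSH form of the inequality: local hidden-variable theories bound |S| ≤ 2, while the singlet-state correlation E(a, b) = −cos(a − b) reaches 2√2 at suitably chosen angles. A quick numeric check (our own illustration, with the standard optimal angles):

```python
import numpy as np

def correlation(a, b):
    """Quantum correlation E(a, b) for spin measurements on the singlet state."""
    return -np.cos(a - b)

# Standard CHSH settings: Alice at 0 and pi/2, Bob at pi/4 and 3*pi/4.
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

S = (correlation(a1, b1) - correlation(a1, b2)
     + correlation(a2, b1) + correlation(a2, b2))
print(abs(S))   # 2.828... = 2*sqrt(2), exceeding the local bound of 2
```

Any assignment of predetermined ±1 outcomes to the four settings satisfies |S| ≤ 2, so the value 2√2 is exactly the "essential difficulty" with locality that the abstract describes.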

10,253 citations

Book
01 Jan 1934
TL;DR: This book, which revolutionized contemporary thinking on science and knowledge, ranks alongside The Open Society and Its Enemies as one of Popper's most enduring works and contains insights and arguments that demand to be read to this day.
Abstract: Described by the philosopher A.J. Ayer as a work of 'great originality and power', this book revolutionized contemporary thinking on science and knowledge. Ideas such as the now legendary doctrine of 'falsificationism' electrified the scientific community, influencing even working scientists, as well as post-war philosophy. This astonishing work ranks alongside The Open Society and Its Enemies as one of Popper's most enduring books and contains insights and arguments that demand to be read to this day.

7,904 citations

Journal ArticleDOI
TL;DR: In this paper, the concept of black-hole entropy was introduced as a measure of information about a black hole interior which is inaccessible to an exterior observer, and it was shown that the entropy is equal to the ratio of the black hole area to the square of the Planck length times a dimensionless constant of order unity.
Abstract: There are a number of similarities between black-hole physics and thermodynamics. Most striking is the similarity in the behaviors of black-hole area and of entropy: Both quantities tend to increase irreversibly. In this paper we make this similarity the basis of a thermodynamic approach to black-hole physics. After a brief review of the elements of the theory of information, we discuss black-hole physics from the point of view of information theory. We show that it is natural to introduce the concept of black-hole entropy as the measure of information about a black-hole interior which is inaccessible to an exterior observer. Considerations of simplicity and consistency, and dimensional arguments indicate that the black-hole entropy is equal to the ratio of the black-hole area to the square of the Planck length times a dimensionless constant of order unity. A different approach making use of the specific properties of Kerr black holes and of concepts from information theory leads to the same conclusion, and suggests a definite value for the constant. The physical content of the concept of black-hole entropy derives from the following generalized version of the second law: When common entropy goes down a black hole, the common entropy in the black-hole exterior plus the black-hole entropy never decreases. The validity of this version of the second law is supported by an argument from information theory as well as by several examples.

6,591 citations

Journal ArticleDOI
TL;DR: Winner of the Pulitzer Prize, this book applies Gödel's seminal contribution to modern mathematics to the study of the human mind and the development of artificial intelligence.
Abstract: From the Publisher: Winner of the Pulitzer Prize, this book applies Godel's seminal contribution to modern mathematics to the study of the human mind and the development of artificial intelligence.

1,983 citations

Journal ArticleDOI
TL;DR: For systems with negligible self-gravity, the bound follows from application of the second law of thermodynamics to a gedanken experiment involving a black hole; black holes themselves comply with the bound, and in fact attain it, so they have the maximum entropy for given mass and size allowed by quantum theory and general relativity.
Abstract: We present evidence for the existence of a universal upper bound of magnitude 2πR/ħc to the entropy-to-energy ratio S/E of an arbitrary system of effective radius R. For systems with negligible self-gravity, the bound follows from application of the second law of thermodynamics to a gedanken experiment involving a black hole. Direct statistical arguments are also discussed. A microcanonical approach of Gibbons illustrates for simple systems (gravitating and not) the reason behind the bound, and the connection of R with the longest dimension of the system. A more general approach establishes the bound for a relativistic field system contained in a cavity of arbitrary shape, or in a closed universe. Black holes also comply with the bound; in fact they actually attain it. Thus, as long suspected, black holes have the maximum entropy for given mass and size which is allowed by quantum theory and general relativity.
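To get a feel for the scale of the bound, one can evaluate S ≤ 2πkRE/ħc (with Boltzmann's constant k restored) for an everyday system. The numbers below are our own illustration, not from the paper: 1 kg of matter, taking E = mc², inside a sphere of radius 1 m.

```python
import math

# CODATA values of the physical constants.
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m / s
k = 1.380649e-23         # J / K

R = 1.0                  # effective radius in m (illustrative choice)
m = 1.0                  # mass in kg (illustrative choice)
E = m * c ** 2           # total energy, including rest mass

S_max = 2 * math.pi * k * R * E / (hbar * c)
print(S_max / k)         # bound in units of k: about 1.8e43
```

The actual thermodynamic entropy of a kilogram of ordinary matter is of order 10^25 k, so everyday systems sit far below the bound; only a black hole of the same mass and size would saturate it.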

1,079 citations