Journal ArticleDOI

Adiabatic quantum computation

29 Jan 2018-Reviews of Modern Physics (American Physical Society)-Vol. 90, Iss: 1, pp 015002
TL;DR: This review proves the closed-system equivalence of the adiabatic and circuit models of quantum computation and discusses the placement of adiabatic quantum computation within the broader classification of computational complexity theory.
Abstract: The simple act of slowly varying the parameters of a quantum system so that it remains always in its ground state is extremely rich from an information processing point of view. For an ideal, closed system, this adiabatic evolution is equivalent to full quantum computation, and it is convenient for establishing quantum algorithms for optimization. This review presents adiabatic quantum algorithms, proves the closed-system equivalence of the adiabatic and circuit models of quantum computation, reviews the placement of adiabatic quantum computation in the more general classification of computational complexity theory, and discusses the case of "stoquastic" quantum evolutions.
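
As a worked illustration of the paradigm summarized above, the standard construction interpolates between a driver Hamiltonian with an easily prepared ground state and a problem Hamiltonian whose ground state encodes the answer. The LaTeX sketch below gives the generic interpolation and a rough version of the adiabatic runtime condition; the symbols H_B, H_P, and Delta_min are generic textbook notation, not taken verbatim from this review.

% Generic adiabatic interpolation (illustrative notation)
% H_B: driver Hamiltonian with an easily prepared ground state
% H_P: problem Hamiltonian whose ground state encodes the solution
H(s) = (1-s)\,H_B + s\,H_P, \qquad s = t/T \in [0,1].

% A rough sufficient condition for remaining in the instantaneous ground state:
T \;\gg\; \frac{\max_{s\in[0,1]} \lVert \partial_s H(s) \rVert}{\Delta_{\min}^{2}},
\qquad \Delta_{\min} \equiv \min_{s\in[0,1]} \bigl[ E_1(s) - E_0(s) \bigr].

The quadratic dependence of the required runtime on the inverse minimum spectral gap is what makes gap analysis central to the complexity questions the review discusses.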
Citations
Journal ArticleDOI
TL;DR: Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future; NISQ devices will be useful tools for exploring many-body quantum physics and may have other useful applications.
Abstract: Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away --- we should regard it as a significant step toward the more powerful quantum technologies of the future. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.

3,898 citations
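
A crude independent-error estimate, not taken from the paper itself, makes the "noise limits circuit size" point quantitative: with per-gate error probability p, a circuit of G gates runs fault-free with probability roughly

F \approx (1-p)^{G} \approx e^{-pG}, \qquad
p = 10^{-3}:\ G = 10^{3} \Rightarrow F \approx e^{-1} \approx 0.37, \quad
G = 10^{4} \Rightarrow F \approx e^{-10} \approx 4.5\times 10^{-5},

so without error correction, circuits much larger than roughly 1/p gates rarely complete without a fault. The estimate ignores error correlations and measurement errors.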

Journal ArticleDOI
TL;DR: This review presents strategies employed to construct quantum algorithms for quantum chemistry, with the goal that quantum computers will eventually answer presently inaccessible questions, for example, in transition metal catalysis or important biochemical reactions.
Abstract: One of the most promising suggested applications of quantum computing is solving classically intractable chemistry problems. This may help to answer unresolved questions about phenomena such as high temperature superconductivity, solid-state physics, transition metal catalysis, and certain biochemical reactions. In turn, this increased understanding may help us to refine, and perhaps even one day design, new compounds of scientific and industrial importance. However, building a sufficiently large quantum computer will be a difficult scientific challenge. As a result, developments that enable these problems to be tackled with fewer quantum resources should be considered important. Driven by this potential utility, quantum computational chemistry is rapidly emerging as an interdisciplinary field requiring knowledge of both quantum computing and computational chemistry. This review provides a comprehensive introduction to both computational chemistry and quantum computing, bridging the current knowledge gap. Major developments in this area are reviewed, with a particular focus on near-term quantum computation. Illustrations of key methods are provided, explicitly demonstrating how to map chemical problems onto a quantum computer, and how to solve them. The review concludes with an outlook on this nascent field.

954 citations
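
The review above describes how to map chemical (fermionic) problems onto qubits; one standard ingredient is the Jordan-Wigner encoding of fermionic modes into Pauli operators. The NumPy sketch below is an illustrative toy with assumed function names and a three-mode consistency check, not code from the review.

# Minimal, illustrative Jordan-Wigner mapping in plain NumPy.
import numpy as np

I2 = np.eye(2)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
# Single-mode annihilation operator in the {|0>, |1>} occupation basis.
a_single = np.array([[0, 1], [0, 0]], dtype=complex)

def kron_all(ops):
    """Kronecker product of a list of 2x2 operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def annihilation(j, n_modes):
    """Jordan-Wigner image of the fermionic annihilation operator a_j:
    Z strings on modes 0..j-1, the local lowering operator on mode j,
    identities on the remaining modes."""
    ops = [Z] * j + [a_single] + [I2] * (n_modes - j - 1)
    return kron_all(ops)

if __name__ == "__main__":
    n = 3
    a = [annihilation(j, n) for j in range(n)]
    # Check the canonical anticommutation relations {a_i, a_j^dag} = delta_ij.
    for i in range(n):
        for j in range(n):
            anti = a[i] @ a[j].conj().T + a[j].conj().T @ a[i]
            expected = np.eye(2 ** n) if i == j else np.zeros((2 ** n, 2 ** n))
            assert np.allclose(anti, expected)
    print("Jordan-Wigner operators satisfy the anticommutation relations.")

With the fermionic operators expressed as qubit operators, a molecular Hamiltonian written in second quantization becomes a sum of Pauli strings whose expectation values a quantum device can estimate.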

Journal ArticleDOI
TL;DR: This Review provides an overview of the algorithms and results that are relevant for quantum chemistry and aims to help quantum chemists who seek to learn more about quantum computing and quantum computing researchers who would like to explore applications in quantum chemistry.
Abstract: Practical challenges in simulating quantum systems on classical computers have been widely recognized in the quantum physics and quantum chemistry communities over the past century. Although many approximation methods have been introduced, the complexity of quantum mechanics remains hard to appease. The advent of quantum computation brings new pathways to navigate this challenging and complex landscape. By manipulating quantum states of matter and taking advantage of their unique features such as superposition and entanglement, quantum computers promise to efficiently deliver accurate results for many important problems in quantum chemistry, such as the electronic structure of molecules. In the past two decades, significant advances have been made in developing algorithms and physical hardware for quantum computing, heralding a revolution in simulation of quantum systems. This Review provides an overview of the algorithms and results that are relevant for quantum chemistry. The intended audience is both quantum chemists who seek to learn more about quantum computing and quantum computing researchers who would like to explore applications in quantum chemistry.

910 citations

Journal ArticleDOI
TL;DR: Over the past 20 years, impressive experimental and theoretical progress has been made in superconducting quantum circuits, which provide a platform for manipulating microwave photons; many higher-order effects, unusual and less familiar in traditional cavity quantum electrodynamics with natural atoms, have been observed experimentally.

909 citations

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.

41,772 citations
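
The annealing analogy described above is what the simulated annealing algorithm implements; below is a minimal Python sketch on a toy one-dimensional cost function. The cost function, proposal step, and cooling schedule are illustrative choices, not taken from the paper.

# Minimal simulated annealing sketch for a toy 1-D cost function.
import math
import random

def cost(x):
    """A rugged 1-D landscape with many local minima (global minimum near x = -0.5)."""
    return x * x + 10.0 * math.sin(3.0 * x)

def simulated_annealing(x0, t_initial=10.0, t_final=1e-3, cooling=0.995, step=0.5):
    x, t = x0, t_initial
    best_x, best_c = x, cost(x)
    while t > t_final:
        candidate = x + random.uniform(-step, step)
        delta = cost(candidate) - cost(x)
        # Accept downhill moves always; accept uphill moves with Boltzmann
        # probability, mirroring thermal fluctuations at "temperature" t.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
            if cost(x) < best_c:
                best_x, best_c = x, cost(x)
        t *= cooling  # slow cooling, analogous to careful annealing of a solid
    return best_x, best_c

if __name__ == "__main__":
    random.seed(0)
    x_best, c_best = simulated_annealing(x0=8.0)
    print(f"best x = {x_best:.3f}, cost = {c_best:.3f}")

The acceptance rule is the Metropolis criterion; the slow decrease of the temperature is the direct analog of the careful cooling that lets a solid settle into a low-energy configuration.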

Book
01 Jan 1979
TL;DR: The second edition of a quarterly column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and D. S. Johnson in their book "Computers and Intractability: A Guide to the Theory of NP-Completeness" (W. H. Freeman & Co., San Francisco, 1979).
Abstract: This is the second edition of a quarterly column the purpose of which is to provide a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., San Francisco, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed. Readers having results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time-solvability, etc.), or open problems they would like publicized, should send them to David S. Johnson, Room 2C355, Bell Laboratories, Murray Hill, NJ 07974, including details, or at least sketches, of any new proofs (full papers are preferred). In the case of unpublished results, please state explicitly that you would like the results mentioned in the column. Comments and corrections are also welcome. For more details on the nature of the column and the form of desired submissions, see the December 1981 issue of this journal.

40,020 citations

Journal ArticleDOI
15 Oct 1999-Science
TL;DR: A model based on two generic mechanisms, continuous growth by the addition of new vertices and preferential attachment to well-connected sites, reproduces the observed stationary scale-free distributions, indicating that the development of large networks is governed by robust self-organizing phenomena that go beyond the particulars of the individual systems.
Abstract: Systems as diverse as genetic networks or the World Wide Web are best described as networks with complex topology. A common property of many large networks is that the vertex connectivities follow a scale-free power-law distribution. This feature was found to be a consequence of two generic mechanisms: (i) networks expand continuously by the addition of new vertices, and (ii) new vertices attach preferentially to sites that are already well connected. A model based on these two ingredients reproduces the observed stationary scale-free distributions, which indicates that the development of large networks is governed by robust self-organizing phenomena that go beyond the particulars of the individual systems.

33,771 citations
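
The two mechanisms in the abstract above, continuous growth and preferential attachment, translate into a short generative procedure. The Python sketch below uses an illustrative seed network, edge count per new node, and a degree-proportional sampling trick; these are standard implementation choices, not code from the paper.

# Minimal growth + preferential-attachment (Barabasi-Albert style) network sketch.
import random
from collections import Counter

def preferential_attachment_graph(n_nodes, m=2, seed=0):
    """Grow a network by adding nodes one at a time; each new node attaches
    m edges to existing nodes chosen with probability proportional to degree."""
    rng = random.Random(seed)
    edges = [(0, 1)]          # small seed network
    targets_pool = [0, 1]     # each node appears once per unit of degree
    for new_node in range(2, n_nodes):
        chosen = set()
        while len(chosen) < min(m, new_node):
            chosen.add(rng.choice(targets_pool))   # degree-proportional choice
        for target in chosen:
            edges.append((new_node, target))
            targets_pool.extend([new_node, target])
    return edges

if __name__ == "__main__":
    edges = preferential_attachment_graph(10_000)
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    # A heavy-tailed (approximately power-law) degree distribution emerges:
    # a few early, well-connected hubs accumulate most of the links.
    print("five largest degrees:", sorted(degree.values(), reverse=True)[:5])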

Book
01 Jan 1985
TL;DR: This book presents results of both classic and recent matrix analysis using canonical forms as a unifying theme and demonstrates their importance in a variety of applications across the mathematical and physical sciences.
Abstract: Linear algebra and matrix theory are fundamental tools in mathematical and physical science, as well as fertile fields for research. This new edition of the acclaimed text presents results of both classic and recent matrix analyses using canonical forms as a unifying theme, and demonstrates their importance in a variety of applications. The authors have thoroughly revised, updated, and expanded on the first edition. The book opens with an extended summary of useful concepts and facts and includes numerous new topics and features, such as:
- New sections on the singular value and CS decompositions
- New applications of the Jordan canonical form
- A new section on the Weyr canonical form
- Expanded treatments of inverse problems and of block matrices
- A central role for the Von Neumann trace theorem
- A new appendix with a modern list of canonical forms for a pair of Hermitian matrices and for a symmetric-skew symmetric pair
- Expanded index with more than 3,500 entries for easy reference
- More than 1,100 problems and exercises, many with hints, to reinforce understanding and develop auxiliary themes such as finite-dimensional quantum systems, the compound and adjugate matrices, and the Loewner ellipsoid
- A new appendix provides a collection of problem-solving hints

23,986 citations