
Quantum Computation and Quantum Information

TL;DR: This book discusses quantum information theory, public-key cryptography and the RSA cryptosystem, and the proof of Lieb's theorem.
Abstract: Part I. Fundamental Concepts: 1. Introduction and overview. 2. Introduction to quantum mechanics. 3. Introduction to computer science. Part II. Quantum Computation: 4. Quantum circuits. 5. The quantum Fourier transform and its applications. 6. Quantum search algorithms. 7. Quantum computers: physical realization. Part III. Quantum Information: 8. Quantum noise and quantum operations. 9. Distance measures for quantum information. 10. Quantum error-correction. 11. Entropy and information. 12. Quantum information theory. Appendices. References. Index.
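As a minimal illustration of the quantum Fourier transform covered in chapter 5 (my own sketch, not taken from the book), the n-qubit QFT is the unitary matrix with entries omega^(j*k)/sqrt(N), where N = 2^n and omega = exp(2*pi*i/N):

```python
import numpy as np

# Illustrative sketch: build the N x N quantum Fourier transform matrix
# directly from its definition and check that it is unitary.
def qft_matrix(n):
    N = 2 ** n
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)
# Unitarity: F @ F^dagger equals the identity.
print(np.allclose(F @ F.conj().T, np.eye(8)))  # True
```

The circuit decomposition in the book implements this same matrix with O(n^2) Hadamard and controlled-phase gates, rather than building it explicitly.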
Citations
Journal ArticleDOI
TL;DR: In this paper, the authors consider the atomic dynamics and the optical response of the medium to a continuous-wave laser and show how coherently prepared media can be used to improve frequency conversion in nonlinear optical mixing experiments.
Abstract: Coherent preparation by laser light of quantum states of atoms and molecules can lead to quantum interference in the amplitudes of optical transitions. In this way the optical properties of a medium can be dramatically modified, leading to electromagnetically induced transparency and related effects, which have placed gas-phase systems at the center of recent advances in the development of media with radically new optical properties. This article reviews these advances and the new possibilities they offer for nonlinear optics and quantum information science. As a basis for the theory of electromagnetically induced transparency the authors consider the atomic dynamics and the optical response of the medium to a continuous-wave laser. They then discuss pulse propagation and the adiabatic evolution of field-coupled states and show how coherently prepared media can be used to improve frequency conversion in nonlinear optical mixing experiments. The extension of these concepts to very weak optical fields in the few-photon limit is then examined. The review concludes with a discussion of future prospects and potential new applications.
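The transparency effect described above can be illustrated numerically with a standard textbook form of the Lambda-system susceptibility (a toy model of my own, not taken verbatim from the review; rates and the control Rabi frequency are illustrative):

```python
import numpy as np

# Toy steady-state linear susceptibility of a three-level Lambda system.
# gamma_ge: optical coherence decay; gamma_gs: ground-state coherence decay;
# omega_c: control-field Rabi frequency; delta: probe detuning (same units).
def chi(delta, omega_c, gamma_ge=1.0, gamma_gs=1e-4):
    return 1j / (gamma_ge - 1j * delta
                 + omega_c**2 / (4 * (gamma_gs - 1j * delta)))

absorption_off = chi(0.0, omega_c=0.0).imag  # bare two-level absorption peak
absorption_on = chi(0.0, omega_c=1.0).imag   # control field opens a window
print(absorption_on < 0.01 * absorption_off)  # True
```

With the control field on, the imaginary part of the susceptibility (absorption) collapses on resonance, which is the electromagnetically induced transparency window.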

4,218 citations


Cites background or methods from "Quantum Computation and Quantum Inf..."

  • ...A large single-photon nonlinear phase shift that exceeds π rad can be used to implement two-qubit quantum logic gates (Nielsen and Chuang, 2000)....


  • ...It is well known in quantum optics that the presence of high-finesse cavities can be used to enhance photon-photon interactions, for example, for the purpose of quantum computation (Nielsen and Chuang, 2000)....


Journal ArticleDOI
TL;DR: A measure of entanglement that can be computed effectively for any mixed state of an arbitrary bipartite system is presented and it is shown that it does not increase under local manipulations of the system.
Abstract: We present a measure of entanglement that can be computed effectively for any mixed state of an arbitrary bipartite system. We show that it does not increase under local manipulations of the system, and use it to obtain a bound on the teleportation capacity and on the distillable entanglement of mixed states.
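The computable measure described here is the negativity: the absolute sum of the negative eigenvalues of the partially transposed density matrix. A minimal numerical sketch, assuming that definition for the two-qubit case:

```python
import numpy as np

# Negativity N(rho) = (||rho^T_B||_1 - 1) / 2, equivalently the absolute sum
# of the negative eigenvalues of the partial transpose over subsystem B.
def partial_transpose(rho, d=2):
    r = rho.reshape(d, d, d, d)                 # indices (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(d * d, d * d)  # transpose B

def negativity(rho):
    lam = np.linalg.eigvalsh(partial_transpose(rho))
    return float(-lam[lam < 0].sum())

# Maximally entangled two-qubit state |phi+> = (|00> + |11>)/sqrt(2).
phi = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)
rho = np.outer(phi, phi)
print(round(negativity(rho), 3))  # 0.5
```

Because it needs only an eigenvalue decomposition, the measure is computable for any mixed state, unlike variational measures such as the entanglement of formation.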

3,889 citations

Journal ArticleDOI
09 Sep 2004 - Nature
TL;DR: It is shown that the strong coupling regime can be attained in a solid-state system, and the concept of circuit quantum electrodynamics opens many new possibilities for studying the strong interaction of light and matter.
Abstract: The interaction of matter and light is one of the fundamental processes occurring in nature, and its most elementary form is realized when a single atom interacts with a single photon. Reaching this regime has been a major focus of research in atomic physics and quantum optics [1] for several decades and has generated the field of cavity quantum electrodynamics [2, 3]. Here we perform an experiment in which a superconducting two-level system, playing the role of an artificial atom, is coupled to an on-chip cavity consisting of a superconducting transmission line resonator. We show that the strong coupling regime can be attained in a solid-state system, and we experimentally observe the coherent interaction of a superconducting two-level system with a single microwave photon. The concept of circuit quantum electrodynamics opens many new possibilities for studying the strong interaction of light and matter. This system can also be exploited for quantum information processing and quantum communication and may lead to new approaches for single photon generation and detection.
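The strong-coupling signature reported here is the vacuum Rabi splitting, which follows from the standard Jaynes-Cummings model (a generic cavity-QED sketch; the coupling value below is illustrative, not the paper's measured number):

```python
import numpy as np

# On resonance, the one-excitation manifold {|e,0>, |g,1>} of the
# Jaynes-Cummings Hamiltonian reduces to a 2x2 block coupled by g, and the
# dressed states split by 2g -- the vacuum Rabi splitting.
g = 5.8  # atom-cavity coupling, arbitrary frequency units (illustrative)
H = np.array([[0.0, g],
              [g, 0.0]])  # resonant one-excitation block in the rotating frame
e = np.linalg.eigvalsh(H)
print(np.isclose(e[1] - e[0], 2 * g))  # True
```

Observing two resolved peaks separated by 2g, wider than the atomic and cavity linewidths, is what "strong coupling" means operationally.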

3,452 citations

MonographDOI
20 Apr 2009
TL;DR: This beginning graduate textbook describes both recent achievements and classical results of computational complexity theory and can be used as a reference for self-study for anyone interested in complexity.
Abstract: This beginning graduate textbook describes both recent achievements and classical results of computational complexity theory. Requiring essentially no background apart from mathematical maturity, the book can be used as a reference for self-study for anyone interested in complexity, including physicists, mathematicians, and other scientists, as well as a textbook for a variety of courses and seminars. More than 300 exercises are included with a selected hint set.
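A central definition from complexity theory can be made concrete with a tiny verifier (my own sketch, not from the book): NP is the class of problems whose "yes" instances have certificates checkable in polynomial time. For satisfiability, a certificate is a truth assignment, and checking it is a single linear scan:

```python
# clauses: list of clauses, each a list of signed ints (3 means x3, -3 means
# NOT x3); assignment: dict mapping variable index -> bool. Verification is
# O(total number of literals), i.e. polynomial in the input size.
def check_sat(clauses, assignment):
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (x1 OR NOT x2) AND (x2 OR x3)
clauses = [[1, -2], [2, 3]]
print(check_sat(clauses, {1: True, 2: False, 3: True}))    # True
print(check_sat(clauses, {1: False, 2: False, 3: False}))  # False
```

Finding a satisfying assignment, by contrast, has no known polynomial-time algorithm; that asymmetry is the P versus NP question.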

2,965 citations

Journal ArticleDOI
TL;DR: This edited volume surveys quantum information processing with continuous variables, spanning quantum computation, entanglement, teleportation, and key distribution, and includes a Deutsch-Jozsa algorithm for continuous variables.
Abstract: Preface. About the Editors. Part I: Quantum Computing. 1. Quantum computing with qubits S.L. Braunstein, A.K. Pati. 2. Quantum computation over continuous variables S. Lloyd, S.L. Braunstein. 3. Error correction for continuous quantum variables S.L. Braunstein. 4. Deutsch-Jozsa algorithm for continuous variables A.K. Pati, S.L. Braunstein. 5. Hybrid quantum computing S. Lloyd. 6. Efficient classical simulation of continuous variable quantum information processes S.D. Bartlett, B.C. Sanders, S.L. Braunstein, K. Nemoto. Part II: Quantum Entanglement. 7. Introduction to entanglement-based protocols S.L. Braunstein, A.K. Pati. 8. Teleportation of continuous quantum variables S.L. Braunstein, H.J. Kimble. 9. Experimental realization of continuous variable teleportation A. Furusawa, H.J. Kimble. 10. Dense coding for continuous variables S.L. Braunstein, H.J. Kimble. 11. Multipartite Greenberger-Horne-Zeilinger paradoxes for continuous variables S. Massar, S. Pironio. 12. Multipartite entanglement for continuous variables P. van Loock, S.L. Braunstein. 13. Inseparability criterion for continuous variable systems Lu-Ming Duan, G. Giedke, J.I. Cirac, P. Zoller. 14. Separability criterion for Gaussian states R. Simon. 15. Distillability and entanglement purification for Gaussian states G. Giedke, Lu-Ming Duan, J.I. Cirac, P. Zoller. 16. Entanglement purification via entanglement swapping S. Parker, S. Bose, M.B. Plenio. 17. Bound entanglement for continuous variables is a rare phenomenon P. Horodecki, J.I. Cirac, M. Lewenstein. Part III: Continuous Variable Optical-Atomic Interfacing. 18. Atomic continuous variable processing and light-atoms quantum interface A. Kuzmich, E.S. Polzik. Part IV: Limits on Quantum Information and Cryptography. 19. Limitations on discrete quantum information and cryptography S.L. Braunstein, A.K. Pati. 20. Quantum cloning with continuous variables N.J. Cerf. 21. Quantum key distribution with continuous variables in optics T.C. Ralph. 22. Secure quantum key distribution using squeezed states D. Gottesman, J. Preskill. 23. Experimental demonstration of dense coding and quantum cryptography with continuous variables Kunchi Peng, Qing Pan, Jing Zhang, Changde Xie. 24. Quantum solitons in optical fibres: basic requisites for experimental quantum communication G. Leuchs, Ch. Silberhorn, E. Konig, P.K. Lam, A. Sizmann, N. Korolkova. Index.
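The separability criterion for Gaussian states (chapter 14 above) can be sketched numerically: in the convention where the vacuum covariance matrix is the identity, a two-mode Gaussian state is entangled if its partially transposed covariance matrix has a symplectic eigenvalue below 1. The squeezing value below is illustrative:

```python
import numpy as np

# Two-mode squeezed vacuum, quadrature ordering (x1, p1, x2, p2).
r = 0.5                                   # squeezing parameter (illustrative)
c, s = np.cosh(2 * r), np.sinh(2 * r)
Z = np.diag([1.0, -1.0])
sigma = np.block([[c * np.eye(2), s * Z],
                  [s * Z, c * np.eye(2)]])

Omega = np.kron(np.eye(2), np.array([[0.0, 1.0], [-1.0, 0.0]]))  # symplectic form
P = np.diag([1.0, 1.0, 1.0, -1.0])        # partial transpose: p2 -> -p2

def symplectic_eigs(cm):
    # Symplectic eigenvalues are the moduli of the eigenvalues of i*Omega*cm,
    # which come in +/- pairs; keep one of each pair.
    return np.sort(np.abs(np.linalg.eigvals(1j * Omega @ cm)))[::2]

nu_min = symplectic_eigs(P @ sigma @ P).min()
print(np.isclose(nu_min, np.exp(-2 * r)), nu_min < 1)  # True True
```

For any squeezing r > 0 the smallest symplectic eigenvalue after partial transposition is e^(-2r) < 1, so the state is detected as entangled.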

2,940 citations


Cites background from "Quantum Computation and Quantum Inf..."

  • ...An entanglement of E = 0.4 means that asymptotically 1000 copies of the state can be transformed into 400 maximally entangled states via deterministic state transformations using local operations and classical communication (LOCC; Nielsen and Chuang, 2000)....


References
Book
01 Jan 1991
TL;DR: The authors examine the role of entropy, inequalities, and randomness in data compression and channel coding.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 
7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation.
12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov Complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types.
17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.
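The basic quantities from chapter 2 above can be checked numerically; here is a small sketch (my own example distribution) verifying the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import numpy as np

# Shannon entropy in bits of a (possibly multi-dimensional) distribution.
def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                       # 0 * log 0 is taken as 0
    return float(-(p * np.log2(p)).sum())

pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])           # joint distribution of (X, Y)
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
I = H(px) + H(py) - H(pxy)             # mutual information
print(round(I, 4))
```

Because the joint distribution is correlated, the mutual information comes out strictly positive; for a product distribution it would be exactly zero.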

45,034 citations

Journal ArticleDOI
TL;DR: This paper suggests ways to solve currently open problems in cryptography, and discusses how the theories of communication and computation are beginning to provide the tools to solve cryptographic problems of long standing.
Abstract: Two kinds of contemporary developments in cryptography are examined. Widening applications of teleprocessing have given rise to a need for new types of cryptographic systems, which minimize the need for secure key distribution channels and supply the equivalent of a written signature. This paper suggests ways to solve these currently open problems. It also discusses how the theories of communication and computation are beginning to provide the tools to solve cryptographic problems of long standing.
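The key-exchange idea introduced in this paper can be sketched with toy numbers (real deployments require large safe primes and authenticated channels): both parties derive g^(ab) mod p from the publicly exchanged values g^a and g^b.

```python
# Toy Diffie-Hellman exchange; all values are illustrative and far too small
# for security.
p, g = 23, 5             # public modulus and generator
a, b = 6, 15             # private exponents of the two parties
A = pow(g, a, p)         # sent in the clear by the first party
B = pow(g, b, p)         # sent in the clear by the second party
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob = pow(A, b, p)     # (g^a)^b mod p
print(shared_alice == shared_bob)  # True
```

An eavesdropper sees only p, g, A, and B; recovering the shared secret from them is the discrete-logarithm problem, for which no efficient classical algorithm is known.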

14,980 citations

Journal ArticleDOI
TL;DR: Considering predictions about one system made on the basis of measurements of another system that previously interacted with it leads to the conclusion that the description of reality given by a wave function is not complete.
Abstract: In a complete theory there is an element corresponding to each element of reality. A sufficient condition for the reality of a physical quantity is the possibility of predicting it with certainty, without disturbing the system. In quantum mechanics in the case of two physical quantities described by non-commuting operators, the knowledge of one precludes the knowledge of the other. Then either (1) the description of reality given by the wave function in quantum mechanics is not complete or (2) these two quantities cannot have simultaneous reality. Consideration of the problem of making predictions concerning a system on the basis of measurements made on another system that had previously interacted with it leads to the result that if (1) is false then (2) is also false. One is thus led to conclude that the description of reality as given by a wave function is not complete.
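The perfect correlations the argument relies on can be illustrated numerically (my own sketch): for the two-qubit singlet state, the outcome of a spin measurement on one particle predicts the other particle's outcome with certainty along any common axis.

```python
import numpy as np

# Singlet state (|01> - |10>)/sqrt(2) and two non-commuting observables.
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
Z = np.diag([1.0, -1.0])                     # sigma_z
X = np.array([[0.0, 1.0], [1.0, 0.0]])       # sigma_x

def correlation(op):
    O = np.kron(op, op)                      # measure op on both particles
    return float(singlet @ O @ singlet)

# Perfect anticorrelation for both observables.
print(np.isclose(correlation(Z), -1.0), np.isclose(correlation(X), -1.0))
```

By the paper's sufficiency criterion, both quantities can thus be "predicted with certainty" from the distant measurement, even though sigma_x and sigma_z do not commute.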

13,778 citations

Journal ArticleDOI
E. T. Jaynes
TL;DR: In this article, the authors consider statistical mechanics as a form of statistical inference rather than as a physical theory, and show that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle.
Abstract: Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
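One of the "usual computational rules" the paper derives from maximum entropy can be checked numerically (energy levels and beta below are illustrative): the mean energy of the Boltzmann distribution equals -d ln Z / d beta.

```python
import numpy as np

# Discrete energy levels and inverse temperature (arbitrary units).
E = np.array([0.0, 1.0, 2.0, 3.5])
beta = 0.7

def lnZ(b):
    return np.log(np.exp(-b * E).sum())       # log partition function

# Maximum-entropy distribution subject to a fixed mean energy is Boltzmann.
p = np.exp(-beta * E) / np.exp(-beta * E).sum()
mean_direct = float((p * E).sum())

h = 1e-6                                      # central finite difference
mean_from_Z = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)
print(np.isclose(mean_direct, mean_from_Z, atol=1e-5))  # True
```

This is the sense in which "the determination of the partition function" generates the thermodynamic averages: all of them follow from derivatives of ln Z.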

12,099 citations

Book
01 Dec 1989
TL;DR: This best-selling title, considered for over a decade to be essential reading for every serious student and practitioner of computer design, has been updated throughout to address the most important trends facing computer designers today.
Abstract: This best-selling title, considered for over a decade to be essential reading for every serious student and practitioner of computer design, has been updated throughout to address the most important trends facing computer designers today. In this edition, the authors bring their trademark method of quantitative analysis not only to high-performance desktop machine design, but also to the design of embedded and server systems. They have illustrated their principles with designs from all three of these domains, including examples from consumer electronics, multimedia and Web technologies, and high-performance computing.
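A staple of the book's quantitative method is Amdahl's law: if a fraction f of execution time is improved by factor s, the overall speedup is 1 / ((1 - f) + f / s). The workload numbers below are illustrative:

```python
# Amdahl's law: overall speedup when a fraction f of the execution time is
# accelerated by a factor s; the unimproved (1 - f) fraction caps the gain.
def amdahl(f, s):
    return 1.0 / ((1.0 - f) + f / s)

# Speeding up 80% of a workload by 10x yields well under 10x overall:
print(round(amdahl(0.8, 10.0), 2))  # 3.57
```

Even as s grows without bound, the speedup here can never exceed 1 / (1 - f) = 5x, which is why the book stresses optimizing the common case.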

11,671 citations