scispace - formally typeset
Author

Linghang Kong

Bio: Linghang Kong is an academic researcher from the Massachusetts Institute of Technology. The author has contributed to research in topics: Quantum gravity & Quantum entanglement. The author has an h-index of 2 and has co-authored 5 publications receiving 22 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, it was shown that the corresponding notions of entanglement and scrambling can be fundamentally different, by proving an asymptotic separation between the time scales of the saturation of the out-of-time-ordered correlation (OTOC) and of the entanglement entropy in a random quantum circuit model defined on graphs with a tight bottleneck, such as tree graphs.
Abstract: The out-of-time-ordered correlation (OTOC) and entanglement are two physically motivated and widely used probes of the "scrambling" of quantum information, a phenomenon that has drawn great interest recently in quantum gravity and many-body physics. We argue that the corresponding notions of scrambling can be fundamentally different, by proving an asymptotic separation between the time scales of the saturation of OTOC and that of entanglement entropy in a random quantum circuit model defined on graphs with a tight bottleneck, such as tree graphs. Our result counters the intuition that a random quantum circuit mixes in time proportional to the diameter of the underlying graph of interactions. It also provides a more rigorous justification for an argument in our previous work arXiv:1807.04363, that black holes may be slow information scramblers, which in turn relates to the black hole information problem. The bounds we obtained for OTOC are interesting in their own right in that they generalize previous studies of OTOC on lattices to the geometries on graphs in a rigorous and general fashion.

27 citations

Journal ArticleDOI
11 Jun 2021
TL;DR: The difference between out-of-time-ordered correlation and entanglement in a general setting is rigorously demonstrated in this article, providing a deeper understanding of scrambling measures, as well as their implications for quantum gravity and many-body physics.
Abstract: The difference between out-of-time-ordered correlation and entanglement in a general setting is rigorously demonstrated, providing a deeper understanding of scrambling measures, as well as their implications for quantum gravity and many-body physics.

10 citations

Posted Content
TL;DR: In this paper, the authors consider the quantum error correction capability of uniformly random covariant codes, analytically study the most essential cases of U(1) and SU(d) symmetries, and show that for both symmetry groups the error of the covariant code generated by Haar-random symmetric unitaries, i.e. unitaries that commute with the group actions, typically scales as O(n^(-1)) in terms of both the average- and worst-case purified distances against erasure noise.
Abstract: Quantum error correction and symmetries play central roles in quantum information science and physics. It is known that quantum error-correcting codes that obey (i.e. are covariant with respect to) continuous symmetries cannot correct erasure errors perfectly (a well-known result in this regard being the Eastin-Knill theorem in the context of fault-tolerant quantum computing), in contrast to the case without symmetry constraints. Furthermore, several quantitative fundamental limits on the accuracy of such covariant codes for approximate quantum error correction are known. Here, we consider the quantum error correction capability of uniformly random covariant codes. In particular, we analytically study the most essential cases of U(1) and SU(d) symmetries, and show that for both symmetry groups the error of the covariant codes generated by Haar-random symmetric unitaries, i.e. unitaries that commute with the group actions, typically scales as O(n^(-1)) in terms of both the average- and worst-case purified distances against erasure noise, saturating the fundamental limits to leading order. We note that the results hold for symmetric variants of unitary 2-designs, and comment on the convergence problem of symmetric random circuits. Our results not only indicate (potentially efficient) randomized constructions of optimal U(1)- and SU(d)-covariant codes, but also reveal fundamental properties of random symmetric unitaries, which underlie important models of complex quantum systems in wide-ranging physical scenarios with symmetries, such as black holes and many-body spin systems. Our study and techniques may have broad relevance for both physics and quantum computing.

3 citations

Posted Content
TL;DR: In this paper, the authors consider the quantum error correction capability of random covariant codes and show that such codes typically saturate the fundamental limits to leading order in terms of both the average and worst-case purified distances against erasure noise.
Abstract: Quantum error correction and symmetries play central roles in quantum information science and physics. It is known that quantum error-correcting codes covariant with respect to continuous symmetries cannot correct erasure errors perfectly (an important case being the Eastin-Knill theorem), in contrast to the case without symmetry constraints. Furthermore, there are fundamental limits on the accuracy of such covariant codes for approximate quantum error correction. Here, we consider the quantum error correction capability of random covariant codes. In particular, we show that $U(1)$-covariant codes generated by Haar random $U(1)$-symmetric unitaries, i.e. unitaries that commute with the charge operator (or conserve the charge), typically saturate the fundamental limits to leading order in terms of both the average- and worst-case purified distances against erasure noise. We note that the results hold for symmetric variants of unitary 2-designs, and comment on the convergence problem of charge-conserving random circuits. Our results not only indicate (potentially efficient) randomized constructions of optimal $U(1)$-covariant codes, but also reveal fundamental properties of random charge-conserving unitaries, which may underlie important models of complex quantum systems in wide-ranging physical scenarios where conservation laws are present, such as black holes and many-body spin systems.

3 citations

Posted Content
TL;DR: In this paper, the authors generalize the Randomized Benchmarking (RB) framework to continuous groups of gates and show that as long as the noise level is reasonably small, the output can be approximated as a linear combination of matrix exponential decays.
Abstract: Characterization of experimental systems is an essential step in developing and improving quantum hardware. A collection of protocols known as Randomized Benchmarking (RB) was developed in the past decade, which provides an efficient way to measure error rates in quantum systems. In a recent paper (arXiv:2010.07974), a general framework for RB was proposed, which encompassed most of the known RB protocols and overcame the limitation on error models in previous works. However, even this general framework has a restriction: it can only be applied to a finite group of gates. This does not meet the need posed by experiments, in particular the demand for benchmarking non-Clifford gates and continuous gate sets on quantum devices. In this work we generalize the RB framework to continuous groups of gates and show that as long as the noise level is reasonably small, the output can be approximated as a linear combination of matrix exponential decays. As an application, we numerically study the fully randomized benchmarking protocol (i.e. RB with the entire unitary group as the gate set) enabled by our proof. This provides a unified way to estimate the gate fidelity for any quantum gate in an experiment.
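As an illustrative sketch (not the paper's protocol), the simplest special case of the output model described above is a single exponential decay p(m) = A·f^m + B in the sequence length m; the general framework of the paper replaces the scalar f^m with a linear combination of matrix exponential decays. All names and constants below are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Simplest RB signal model: survival probability after a length-m random
# sequence decays as p(m) = A * f**m + B, where f encodes the average
# gate fidelity. (The paper's general framework generalizes this scalar
# decay to a linear combination of matrix exponential decays.)
def rb_model(m, A, f, B):
    return A * f**m + B

rng = np.random.default_rng(0)
true_A, true_f, true_B = 0.5, 0.98, 0.5
lengths = np.arange(1, 200, 10)
data = rb_model(lengths, true_A, true_f, true_B)
data = data + rng.normal(0, 0.005, size=lengths.shape)  # simulated shot noise

# Fit the decay curve to recover f.
popt, _ = curve_fit(rb_model, lengths, data, p0=[0.4, 0.95, 0.5])
A_fit, f_fit, B_fit = popt

# For a single qubit (dimension d = 2), the average error rate per gate
# is conventionally r = (d - 1) * (1 - f) / d.
r = (2 - 1) * (1 - f_fit) / 2
```

The fit recovers f (and hence the error rate) without tomography, which is what makes RB-style protocols efficient in practice.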

Cited by
Journal Article
TL;DR: In this article, information retrieval from evaporating black holes is studied under the assumptions that the internal dynamics of a black hole are unitary and rapidly mixing, and that the retriever has unlimited control over the emitted Hawking radiation.
Abstract: We study information retrieval from evaporating black holes, assuming that the internal dynamics of a black hole is unitary and rapidly mixing, and assuming that the retriever has unlimited control over the emitted Hawking radiation. If the evaporation of the black hole has already proceeded past the "half-way" point, where half of the initial entropy has been radiated away, then additional quantum information deposited in the black hole is revealed in the Hawking radiation very rapidly. Information deposited prior to the half-way point remains concealed until the half-way point, and then emerges quickly. These conclusions hold because typical local quantum circuits are efficient encoders for quantum error-correcting codes that nearly achieve the capacity of the quantum erasure channel. Our estimate of a black hole's information retention time, based on speculative dynamical assumptions, is just barely compatible with the black hole complementarity hypothesis.

752 citations

Journal ArticleDOI
04 May 2021
TL;DR: It is shown that, for a quantum circuit to simulate quantum chaotic behavior, it is both necessary and sufficient that k = O(N); this result implies the impossibility of simulating quantum chaos on a classical computer.
Abstract: It is well known that a quantum circuit on $N$ qubits composed of Clifford gates with the addition of $k$ non-Clifford gates can be simulated on a classical computer by an algorithm scaling as $\text{poly}(N)\exp(k)$ [1]. We show that, for a quantum circuit to simulate quantum chaotic behavior, it is both necessary and sufficient that $k=O(N)$. This result implies the impossibility of simulating quantum chaos on a classical computer.

43 citations
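The poly(N)·exp(k) scaling quoted above can be made concrete with a toy cost model. The polynomial degree and the constant in the exponent below are illustrative placeholders, not values from the paper.

```python
# Toy cost model for stabilizer-based classical simulation of a circuit on
# N qubits containing k non-Clifford gates: roughly poly(N) * 2**(c*k).
# poly_degree and c are hypothetical placeholders chosen for illustration.
def sim_cost(N, k, poly_degree=3, c=0.5):
    return (N ** poly_degree) * 2 ** (c * k)

# Constant (or slowly growing) k: the cost stays polynomial in N.
cost_few_t = sim_cost(N=100, k=10)

# k = O(N): the 2**(c*k) factor dominates -- the classically hard regime
# that the paper identifies with quantum chaotic behavior.
cost_many_t = sim_cost(N=100, k=100)
```

The point of the comparison is that the qubit count N enters only polynomially, while every additional non-Clifford gate multiplies the cost by a constant factor greater than one.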

Journal ArticleDOI
TL;DR: This study rigorously demonstrates that the OTOC shows a polynomial growth over time as long as α>D and that the necessary scrambling time over a distance R is larger than t≳R^{[(2α-2D)/(2α-D+1)]}.
Abstract: In this study, we investigate out-of-time-order correlators (OTOCs) in systems with power-law decaying interactions such as $R^{-\alpha}$, where $R$ is the distance. In such systems, the fast scrambling of quantum information or the exponential growth of information propagation can potentially occur according to the decay rate $\alpha$. In this regard, a crucial open challenge is to identify the optimal condition for $\alpha$ such that fast scrambling cannot occur. In this study, we disprove fast scrambling in generic long-range interacting systems with $\alpha > D$ ($D$: spatial dimension), where the total energy is extensive in terms of system size and the thermodynamic limit is well defined. We rigorously demonstrate that the OTOC shows a polynomial growth over time as long as $\alpha > D$ and the necessary scrambling time over a distance $R$ is larger than $t \gtrsim R^{(2\alpha-2D)/(2\alpha-D+1)}$.

43 citations
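The exponent in the scrambling-time bound above has two instructive limits, which a few lines of arithmetic make explicit. The function name below is hypothetical; the formula is the one stated in the abstract.

```python
# The paper's lower bound on the scrambling time over distance R reads
# t >~ R**((2*alpha - 2*D) / (2*alpha - D + 1)), valid for alpha > D.
def scrambling_exponent(alpha, D):
    assert alpha > D, "the bound applies only for alpha > D"
    return (2 * alpha - 2 * D) / (2 * alpha - D + 1)

# Short-range limit alpha -> infinity: the exponent approaches 1,
# recovering a linear light cone t >~ R.
e_short = scrambling_exponent(alpha=1000.0, D=1)

# Near alpha = D the exponent approaches 0: the bound becomes trivial,
# consistent with fast scrambling being possible for alpha <= D.
e_long = scrambling_exponent(alpha=1.01, D=1)
```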

Journal Article
TL;DR: In this paper, the authors measured entanglement in a system of itinerant particles using quantum interference of many-body twins in optical lattices, making use of their single-site-resolved control of ultracold bosonic atoms.
Abstract: Entanglement is one of the most intriguing features of quantum mechanics. It describes non-local correlations between quantum objects, and is at the heart of quantum information sciences. Entanglement is now being studied in diverse fields ranging from condensed matter to quantum gravity. However, measuring entanglement remains a challenge. This is especially so in systems of interacting delocalized particles, for which a direct experimental measurement of spatial entanglement has been elusive. Here, we measure entanglement in such a system of itinerant particles using quantum interference of many-body twins. Making use of our single-site-resolved control of ultracold bosonic atoms in optical lattices, we prepare two identical copies of a many-body state and interfere them. This enables us to directly measure quantum purity, Rényi entanglement entropy, and mutual information. These experiments pave the way for using entanglement to characterize quantum phases and dynamics of strongly correlated many-body systems.

39 citations

Posted Content
TL;DR: Evidence is given for the existence of computationally pseudorandom states in the CFT, and it is argued that wormhole volume is measurable in a non-physical but computational sense, by amalgamating the experiences of multiple observers in the wormhole.
Abstract: A fundamental issue in the AdS/CFT correspondence is the wormhole growth paradox. Susskind's conjectured resolution of the paradox was to equate the volume of the wormhole with the circuit complexity of its dual quantum state in the CFT. We study the ramifications of this conjecture from a complexity-theoretic perspective. Specifically, we give evidence for the existence of computationally pseudorandom states in the CFT, and argue that wormhole volume is measurable in a non-physical but computational sense, by amalgamating the experiences of multiple observers in the wormhole. In other words, the conjecture equates a quantity which is difficult to compute with one which is easy to compute. The pseudorandomness argument further implies that this is a necessary feature of any resolution of the wormhole growth paradox, not just of Susskind's Complexity=Volume conjecture. As a corollary we conclude that either the AdS/CFT dictionary map must be exponentially complex, or the quantum Extended Church-Turing thesis must be false in quantum gravity.

39 citations