Author

K. K. Likharev

Bio: K. K. Likharev is an academic researcher from Moscow State University. The author has contributed to research in topics: Josephson effect & Quantum statistical mechanics. The author has an h-index of 1 and has co-authored 1 publication receiving 131 citations.

Papers
Journal ArticleDOI
TL;DR: In this article, a model of a real physical device (parametric quantron) based on the Josephson effect in superconductors is used throughout the discussion. The device is shown to be physically reversible and, moreover, able to serve as the elementary cell of a logically reversible computer, both properties being necessary to reach the fundamental limits of energy dissipation.
Abstract: Fundamental limitations on the energy dissipated during one elementary logical operation are discussed. A model of a real physical device (parametric quantron) based on the Josephson effect in superconductors is used throughout the discussion. This device is shown to be physically reversible, and moreover it can serve as the elementary cell of a logically reversible computer, both these properties being necessary to achieve the fundamental limits of energy dissipation. These limits, due to classical and quantum statistics, are shown to lie well below the earlier estimates of kBT and ℏ/τ, respectively.
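To put these estimates on a numerical footing, the short Python sketch below evaluates the classical thermal scale kBT and the quantum scale ℏ/τ obtained from the energy-time uncertainty relation for a switching time τ; the 300 K temperature and the 1 ps switching time are illustrative assumptions, not values taken from the paper.

# Illustrative evaluation of the "earlier estimates" of the dissipation per
# logical operation: the classical scale kB*T and the quantum scale hbar/tau.
# Temperature and switching time are assumed values.
kB = 1.380649e-23       # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J*s

T = 300.0    # assumed operating temperature, K
tau = 1e-12  # assumed switching time, s (1 ps)

classical_estimate = kB * T    # about 4.1e-21 J
quantum_estimate = hbar / tau  # about 1.1e-22 J

print(f"kB*T at {T} K        = {classical_estimate:.2e} J")
print(f"hbar/tau at tau=1 ps = {quantum_estimate:.2e} J")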

136 citations


Cited by
Journal ArticleDOI
TL;DR: In this paper, the rapid single-flux-quantum (RSFQ) circuit family is reviewed, together with possible future developments and applications of this novel, ultrafast digital technology.
Abstract: Recent developments concerning the rapid single-flux-quantum (RSFQ) circuit family are reviewed. Elementary cells in this circuit family can generate, pass, memorize, and reproduce picosecond voltage pulses with a nominally quantized area corresponding to the transfer of a single magnetic flux quantum across a Josephson junction. Functionally, each cell can be viewed as a combination of a logic gate and an output latch (register) controlled by clock pulses, which are physically similar to the signal pulses. A hand-shaking style of local exchange by the clock pulses enables one to increase the complexity of LSI RSFQ systems without loss of operating speed. The simplest components of the RSFQ circuitry have been experimentally tested at clock frequencies exceeding 100 GHz, and an increase of the speed beyond 300 GHz is expected as a result of using an up-to-date fabrication technology. This review includes a discussion of possible future developments and applications of this novel, ultrafast digital technology.
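The "nominally quantized area" of such a pulse is the single flux quantum Φ0 = h/2e. As a minimal sketch (not taken from the paper), the Python lines below compute Φ0 and express it in the mV·ps units natural for picosecond voltage pulses.

# Magnetic flux quantum Phi_0 = h / (2e): the quantized area (integral of
# V dt) carried by a single-flux-quantum voltage pulse.
h = 6.62607015e-34   # Planck constant, J*s
e = 1.602176634e-19  # elementary charge, C

phi_0 = h / (2 * e)                   # ~2.07e-15 Wb (i.e. V*s)
phi_0_mV_ps = phi_0 / (1e-3 * 1e-12)  # the same value in mV*ps

print(f"Phi_0 = {phi_0:.3e} Wb = {phi_0_mV_ps:.2f} mV*ps")
# A pulse about 1 ps wide therefore has an amplitude on the order of 2 mV.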

2,013 citations

Journal ArticleDOI
Charles H. Bennett
TL;DR: In this paper, the author discusses the mathematical means of rendering a computation logically reversible (e.g., creation and annihilation of a history file) in a Brownian computer, and shows that it is not the making of a measurement that prevents Maxwell's demon from breaking the second law but rather the logically irreversible act of erasing the record of one measurement to make room for the next.
Abstract: Computers may be thought of as engines for transforming free energy into waste heat and mathematical work. Existing electronic computers dissipate energy vastly in excess of the mean thermal energy kT, for purposes such as maintaining volatile storage devices in a bistable condition, synchronizing and standardizing signals, and maximizing switching speed. On the other hand, recent models due to Fredkin and Toffoli show that in principle a computer could compute at finite speed with zero energy dissipation and zero error. In these models, a simple assemblage of simple but idealized mechanical parts (e.g., hard spheres and flat plates) determines a ballistic trajectory isomorphic with the desired computation, a trajectory therefore not foreseen in detail by the builder of the computer. In a classical or semiclassical setting, ballistic models are unrealistic because they require the parts to be assembled with perfect precision and isolated from thermal noise, which would eventually randomize the trajectory and lead to errors. Possibly quantum effects could be exploited to prevent this undesired equipartition of the kinetic energy. Another family of models may be called Brownian computers, because they allow thermal noise to influence the trajectory so strongly that it becomes a random walk through the entire accessible (low-potential-energy) portion of the computer's configuration space. In these computers, a simple assemblage of simple parts determines a low-energy labyrinth isomorphic to the desired computation, through which the system executes its random walk, with a slight drift velocity due to a weak driving force in the direction of forward computation. In return for their greater realism, Brownian models are more dissipative than ballistic ones: the drift velocity is proportional to the driving force, and hence the energy dissipated approaches zero only in the limit of zero speed. In this regard Brownian models resemble the traditional apparatus of thermodynamic thought experiments, where reversibility is also typically only attainable in the limit of zero speed. The enzymatic apparatus of DNA replication, transcription, and translation appears to be nature's closest approach to a Brownian computer, dissipating 20–100 kT per step. Both the ballistic and Brownian computers require a change in programming style: computations must be rendered logically reversible, so that no machine state has more than one logical predecessor. In a ballistic computer, the merging of two trajectories clearly cannot be brought about by purely conservative forces; in a Brownian computer, any extensive amount of merging of computation paths would cause the Brownian computer to spend most of its time bogged down in extraneous predecessors of states on the intended path, unless an extra driving force of kT ln2 were applied (and dissipated) at each merge point. The mathematical means of rendering a computation logically reversible (e.g., creation and annihilation of a history file) will be discussed. The old Maxwell's demon problem is discussed in the light of the relation between logical and thermodynamic reversibility: the essential irreversible step, which prevents the demon from breaking the second law, is not the making of a measurement (which in principle can be done reversibly) but rather the logically irreversible act of erasing the record of one measurement to make room for the next.
Converse to the rule that logically irreversible operations on data require an entropy increase elsewhere in the computer is the fact that a tape full of zeros, or one containing some computable pseudorandom sequence such as pi, has fuel value and can be made to do useful thermodynamic work as it randomizes itself. A tape containing an algorithmically random sequence lacks this ability.
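As a rough numerical anchor for the costs quoted above, the sketch below evaluates kT ln2 at an assumed room temperature of 300 K and compares it with the 20–100 kT per step attributed to the DNA machinery; the temperature is an assumption chosen only for illustration.

import math

# kT*ln2 dissipated at each merge point of computation paths, compared with
# the 20-100 kT per step quoted for DNA replication, transcription, and
# translation. T = 300 K is an assumed illustrative value.
k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # assumed temperature, K

kT = k * T
merge_cost = kT * math.log(2)          # about 2.9e-21 J
dna_low, dna_high = 20 * kT, 100 * kT  # about 8.3e-20 J to 4.1e-19 J

print(f"kT ln2 at {T} K : {merge_cost:.2e} J")
print(f"20-100 kT       : {dna_low:.2e} J to {dna_high:.2e} J per step")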

1,637 citations

Journal ArticleDOI
31 Aug 2000-Nature
TL;DR: The physical limits of computation as determined by the speed of light c, the quantum scale ℏ and the gravitational constant G are explored.
Abstract: Computers are physical systems: the laws of physics dictate what they can and cannot do. In particular, the speed with which a physical device can process information is limited by its energy and the amount of information that it can process is limited by the number of degrees of freedom it possesses. Here I explore the physical limits of computation as determined by the speed of light c, the quantum scale ℏ and the gravitational constant G. As an example, I put quantitative bounds on the computational power of an 'ultimate laptop' with a mass of one kilogram confined to a volume of one litre.
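One of those quantitative bounds follows from the Margolus–Levitin-type limit that a system with average energy E can perform at most about 2E/πℏ elementary operations per second. The sketch below reproduces that order-of-magnitude arithmetic for a 1 kg mass with all of its rest energy E = mc² counted as available, the idealization the abstract describes.

import math

# Order-of-magnitude bound on operations per second for a 1 kg computer,
# taking E = m*c^2 and the rate limit ~ 2E / (pi * hbar).
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
m = 1.0                 # mass, kg

E = m * c**2                               # ~9.0e16 J
ops_per_second = 2 * E / (math.pi * hbar)  # ~5.4e50 operations per second

print(f"E = mc^2        = {E:.2e} J")
print(f"max ops per sec = {ops_per_second:.2e}")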

1,020 citations

Journal ArticleDOI
TL;DR: As discussed in this article, it was not hard to associate a logic gate with a degree of freedom, to associate an energy kT with it, and to presume that this energy has to be dissipated at every step.
Abstract: Thermodynamics arose in the 19th century out of the attempt to understand the performance limits of steam engines in a way that would anticipate all further inventions. Claude Shannon, after World War II, analyzed the limits of the communications channel. It is no surprise, then, that shortly after the emergence of modern digital computing, similar questions appeared in that field. It was not hard to associate a logic gate with a degree of freedom, then to associate kT with that, and presume that this energy has to be dissipated at every step. Similarly, it seemed obvious to many that the uncertainty principle, ΔEΔt∼ℏ, could be used to calculate a required minimal energy involvement, and therefore energy loss, for very short Δt.

611 citations

Journal ArticleDOI
17 Sep 1993-Science
TL;DR: Pulsed arrays are true quantum computers: bits can be placed in superpositions of 0 and 1, logical operations take place coherently, and dissipation is required only for error correction.
Abstract: Arrays of weakly coupled quantum systems might compute if subjected to a sequence of electromagnetic pulses of well-defined frequency and length. Such pulsed arrays are true quantum computers: Bits can be placed in superpositions of 0 and 1, logical operations take place coherently, and dissipation is required only for error correction. Operated with frequent error correction, such a system functions as a parallel digital computer. Operated in a quantum-mechanically coherent manner, such a device functions as a general-purpose quantum-mechanical micromanipulator, capable of both creating any desired quantum state of the array and transforming that state in any desired way.
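To make "superpositions of 0 and 1" and coherent logical operations concrete at the state-vector level, the generic sketch below (not a model of the pulsed arrays themselves) represents one two-level element as a complex two-component vector and applies a norm-preserving unitary rotation, the kind of operation that in principle requires no dissipation.

import numpy as np

# A single two-level element: |0> and |1> as basis vectors, plus a coherent
# (unitary, norm-preserving) logical operation producing a superposition.
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard-like rotation: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0             # equal superposition of 0 and 1
print("amplitudes:", state)  # [0.707..., 0.707...]
print("norm preserved:", np.isclose(np.linalg.norm(state), 1.0))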

572 citations