Author

Ken Sekimoto

Bio: Ken Sekimoto is an academic researcher. The author has contributed to research in the topic of energetics, has an h-index of 1, and has co-authored 1 publication receiving 474 citations.
Topics: Energetics

Papers
BookDOI
01 Jan 2010

562 citations


Cited by
Journal ArticleDOI
TL;DR: Efficiency and, in particular, efficiency at maximum power can be discussed systematically beyond the linear response regime for two classes of molecular machines, isothermal ones such as molecular motors, and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.
Abstract: Stochastic thermodynamics as reviewed here systematically provides a framework for extending the notions of classical thermodynamics such as work, heat and entropy production to the level of individual trajectories of well-defined non-equilibrium ensembles. It applies whenever a non-equilibrium process is still coupled to one (or several) heat bath(s) of constant temperature. Paradigmatic systems are single colloidal particles in time-dependent laser traps, polymers in external flow, enzymes and molecular motors in single molecule assays, small biochemical networks and thermoelectric devices involving single electron transport. For such systems, a first-law-like energy balance can be identified along fluctuating trajectories. For a basic Markovian dynamics implemented either on the continuum level with Langevin equations or on a discrete set of states as a master equation, thermodynamic consistency imposes a local-detailed-balance constraint on noise and rates, respectively. Various integral and detailed fluctuation theorems, which are derived here in a unifying approach from one master theorem, constrain the probability distributions for work, heat and entropy production depending on the nature of the system and the choice of non-equilibrium conditions. For non-equilibrium steady states, particularly strong results hold, such as a generalized fluctuation–dissipation theorem involving entropy production. Ramifications and applications of these concepts include optimal driving between specified states in finite time, the role of measurement-based feedback processes and the relation between dissipation and irreversibility. Efficiency and, in particular, efficiency at maximum power can be discussed systematically beyond the linear response regime for two classes of molecular machines, isothermal ones such as molecular motors, and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.

2,834 citations
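
As a pointer to the formal backbone of the framework reviewed above, its two central relations can be sketched as follows. This is a minimal sketch in standard notation; the symbols w, u, q and Δs_tot are generic placeholders, not quoted from the article.

```latex
% First-law-like balance along a single fluctuating trajectory: the work
% supplied to the system splits into a change of internal energy and heat
% delivered to the bath (sign conventions differ between authors).
w = \Delta u + q

% Integral fluctuation theorem for the total entropy production
% \Delta s_{\mathrm{tot}} (system plus medium), valid for Markovian
% dynamics obeying local detailed balance:
\left\langle e^{-\Delta s_{\mathrm{tot}}/k_B} \right\rangle = 1

% Jensen's inequality then yields the average second-law statement,
% \langle \Delta s_{\mathrm{tot}} \rangle \ge 0.
```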

Journal ArticleDOI
08 Mar 2012-Nature
TL;DR: It is established that the mean dissipated heat saturates at the Landauer bound in the limit of long erasure cycles, demonstrating the intimate link between information theory and thermodynamics and highlighting the ultimate physical limit of irreversible computation.
Abstract: In 1961, Rolf Landauer argued that the erasure of information is a dissipative process. A minimal quantity of heat, proportional to the thermal energy and called the Landauer bound, is necessarily produced when a classical bit of information is deleted. A direct consequence of this logically irreversible transformation is that the entropy of the environment increases by a finite amount. Despite its fundamental importance for information theory and computer science, the erasure principle has not been verified experimentally so far, the main obstacle being the difficulty of doing single-particle experiments in the low-dissipation regime. Here we experimentally show the existence of the Landauer bound in a generic model of a one-bit memory. Using a system of a single colloidal particle trapped in a modulated double-well potential, we establish that the mean dissipated heat saturates at the Landauer bound in the limit of long erasure cycles. This result demonstrates the intimate link between information theory and thermodynamics. It further highlights the ultimate physical limit of irreversible computation.

1,019 citations
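
For a sense of scale, the Landauer bound k_B T ln 2 per erased bit discussed above can be evaluated numerically. The short Python sketch below does this; the temperature of 300 K is an illustrative choice, not a parameter taken from the experiment.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI value)
T = 300.0            # illustrative room temperature in kelvin (assumed, not from the paper)

# Landauer bound: minimum mean heat that must be dissipated to erase one bit.
landauer_bound_J = k_B * T * math.log(2)
landauer_bound_eV = landauer_bound_J / 1.602176634e-19  # convert joules to electronvolts

print(f"k_B T ln 2 at {T:.0f} K: {landauer_bound_J:.2e} J (~{landauer_bound_eV:.3f} eV)")
# Prints roughly 2.87e-21 J, i.e. about 0.018 eV per erased bit.
```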

Journal ArticleDOI
TL;DR: The reason we never observe violations of the second law of thermodynamics is in part a matter of statistics: when ∼10^23 degrees of freedom are involved, the odds are overwhelmingly stacked against the possibility of seeing significant deviations away from the mean behavior.
Abstract: The reason we never observe violations of the second law of thermodynamics is in part a matter of statistics: When ∼10^23 degrees of freedom are involved, the odds are overwhelmingly stacked against the possibility of seeing significant deviations away from the mean behavior. As we turn our attention to smaller systems, however, statistical fluctuations become more prominent. In recent years it has become apparent that the fluctuations of systems far from thermal equilibrium are not mere background noise, but satisfy strong, useful, and unexpected properties. In particular, a proper accounting of fluctuations allows us to rewrite familiar inequalities of macroscopic thermodynamics as equalities. This review describes some of this progress, and argues that it has refined our understanding of irreversibility and the second law.

1,008 citations
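
A canonical example of an inequality of macroscopic thermodynamics rewritten as an equality, as described in the abstract above, is the Jarzynski relation. It is sketched here in standard notation (W is the work performed on the driven system, ΔF the free-energy difference, β = 1/k_B T), not quoted from the article.

```latex
% Jarzynski equality: the exponential average of the fluctuating work W over
% many realizations of a driving protocol equals the exponential of the
% equilibrium free-energy difference between the initial and final states.
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}

% Applying Jensen's inequality, \langle e^{-\beta W} \rangle \ge e^{-\beta \langle W \rangle},
% recovers the familiar macroscopic statement of the second law:
\langle W \rangle \ge \Delta F
```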

Journal ArticleDOI
TL;DR: In this article, the authors present a theoretical framework for the thermodynamics of information based on stochastic thermodynamics and fluctuation theorems, review some recent experimental results, and present an overview of the state of the art in the field.
Abstract: By its very nature, the second law of thermodynamics is probabilistic, in that its formulation requires a probabilistic description of the state of a system. This raises questions about the objectivity of the second law: does it depend, for example, on what we know about the system? For over a century, much effort has been devoted to incorporating information into thermodynamics and assessing the entropic and energetic costs of manipulating information. More recently, this historically theoretical pursuit has become relevant in practical situations where information is manipulated at small scales, such as in molecular and cell biology, artificial nano-devices or quantum computation. Here we give an introduction to a novel theoretical framework for the thermodynamics of information based on stochastic thermodynamics and fluctuation theorems, review some recent experimental results, and present an overview of the state of the art in the field. The task of integrating information into the framework of thermodynamics dates back to Maxwell and his infamous demon. Recent advances have made these ideas rigorous—and brought them into the laboratory.

879 citations
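
A compact relation from this body of work, written here as a sketch in standard notation rather than quoted from the article, is the second law extended to measurement and feedback, where I denotes the mutual information acquired about the system:

```latex
% Second law with feedback (Sagawa-Ueda form): the average work performed
% on the system is bounded below by the free-energy change minus k_B T
% times the mutual information I gained by the measurement.
\langle W \rangle \ge \Delta F - k_B T \, I

% Read the other way, at most k_B T I of additional work can be extracted
% by exploiting the measurement outcome, so information acts as a
% thermodynamic resource without violating the second law once the cost
% of acquiring and erasing it is accounted for.
```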

Journal ArticleDOI
TL;DR: In this paper, the authors experimentally demonstrate information-to-energy conversion by feedback control, showing that feedback can enable the transformation of information into energy without violating the second law of thermodynamics.
Abstract: Feedback mechanisms such as the ‘demon’ in Maxwell’s well-known thought experiment can, in principle, enable the transformation of information into energy, without violating the second law of thermodynamics. Such information-to-energy conversion by feedback control has now been demonstrated experimentally.

806 citations
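
One relation closely associated with this kind of feedback experiment in the literature, though not spelled out in the abstract above, is a generalized Jarzynski equality. The form below is a sketch in the standard Sagawa–Ueda notation, with γ the feedback efficacy (γ = 1 when no feedback is applied).

```latex
% Generalized Jarzynski equality under feedback control: the exponential
% work average equals the feedback efficacy \gamma instead of 1.
\left\langle e^{-\beta (W - \Delta F)} \right\rangle = \gamma

% Consequently \langle W \rangle \ge \Delta F - k_B T \ln \gamma, so with
% effective feedback (\gamma > 1) the average work can drop below \Delta F
% without contradicting the second law.
```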