Journal ArticleDOI

Feedback for physicists: A tutorial essay on control

31 Aug 2005-Reviews of Modern Physics (American Physical Society)-Vol. 77, Iss: 3, pp 783-836
TL;DR: This tutorial essay aims to give enough of the formal elements of control theory to satisfy the experimentalist designing or running a typical physics experiment, and enough to satisfy the theorist wishing to understand its broader intellectual context.
Abstract: Feedback and control theory are important ideas that should form part of the education of a physicist but rarely do. This tutorial essay aims to give enough of the formal elements of control theory to satisfy the experimentalist designing or running a typical physics experiment and enough to satisfy the theorist wishing to understand its broader intellectual context. The level is generally simple, although more advanced methods are also introduced. Several types of applications are discussed, as the practical uses of feedback extend far beyond the simple regulation problems where it is most often employed. Sketches are then provided of some of the broader implications and applications of control theory, especially in biology, which are topics of active research.
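To make the essay's subject concrete, here is a minimal sketch, not taken from the paper, of the simplest scheme it formalizes: proportional-integral (PI) feedback regulating a first-order system toward a setpoint. The plant model, gains, and time constant are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the essay): PI feedback regulating
# a first-order plant dx/dt = (-x + u) / tau toward a setpoint. The gains
# kp, ki and time constant tau are arbitrary example values.

def simulate_pi(setpoint=1.0, tau=1.0, kp=2.0, ki=1.0, dt=0.01, t_end=10.0):
    x, integral, t = 0.0, 0.0, 0.0
    while t < t_end:
        error = setpoint - x            # feedback: compare measured output to goal
        integral += error * dt          # integral term eliminates steady-state error
        u = kp * error + ki * integral  # PI control law
        x += dt * (-x + u) / tau        # Euler step of the plant dynamics
        t += dt
    return x

print(f"output after 10 s: {simulate_pi():.4f}")  # converges toward the setpoint 1.0
```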


Citations
Journal ArticleDOI
TL;DR: This article gives an introductory overview of the various ways in which feedback may be implemented in quantum systems, the theoretical methods currently used to treat it, the experiments in which it has been demonstrated to date, and its applications.

239 citations

Journal ArticleDOI
TL;DR: This work directly investigates flight control and stability by applying torque impulses to freely flying fruit flies (Drosophila melanogaster) and measuring their behavioral response, and discovers that flies respond to gentle disturbances by accurately returning to their original orientation.
Abstract: Just as the Wright brothers implemented controls to achieve stable airplane flight, flying insects have evolved behavioral strategies that ensure recovery from flight disturbances. Pioneering studies performed on tethered and dissected insects demonstrate that the sensory, neurological, and musculoskeletal systems play important roles in flight control. Such studies, however, cannot produce an integrative model of insect flight stability because they do not incorporate the interaction of these systems with free-flight aerodynamics. We directly investigate control and stability through the application of torque impulses to freely flying fruit flies (Drosophila melanogaster) and measurement of their behavioral response. High-speed video and a new motion tracking method capture the aerial “stumble,” and we discover that flies respond to gentle disturbances by accurately returning to their original orientation. These insects take advantage of a stabilizing aerodynamic influence and active torque generation to recover their heading to within 2° in < 60 ms. To explain this recovery behavior, we form a feedback control model that includes the fly’s ability to sense body rotations, process this information, and actuate the wing motions that generate corrective aerodynamic torque. Thus, like early man-made aircraft and modern fighter jets, the fruit fly employs an automatic stabilization scheme that reacts to short time-scale disturbances.
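The recovery scheme described in this abstract can be caricatured as proportional-derivative (PD) feedback on the heading error. The sketch below is a hypothetical illustration with made-up parameters, not the fly's measured dynamics: a torque impulse perturbs a damped rotating body, and the sensed rotation drives a corrective torque.

```python
# Hypothetical sketch of the feedback loop described above: sensed body
# rotation -> corrective torque, with passive damping standing in for the
# stabilizing aerodynamic influence. All parameters are illustrative.

def recover_heading(inertia=1.0, damping=0.5, kp=8.0, kd=3.0,
                    impulse=2.0, dt=1e-3, t_end=5.0):
    theta, omega, t = 0.0, impulse / inertia, 0.0  # impulse sets the initial spin
    while t < t_end:
        torque = -kp * theta - kd * omega          # PD law on the heading error
        omega += dt * (torque - damping * omega) / inertia
        theta += dt * omega
        t += dt
    return theta

print(f"residual heading error: {recover_heading():+.6f} rad")  # returns near zero
```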

197 citations

Journal ArticleDOI
TL;DR: This report reviews the impact of network patterns on the local and global dynamics of coupled phase oscillators and discusses recent developments on variations of the Kuramoto model in networks, including the presence of noise and inertia.
Abstract: Synchronization of an ensemble of oscillators is an emergent phenomenon present in several complex systems, ranging from social and physical to biological and technological systems. The most successful approach to describe how coherent behavior emerges in these complex systems is given by the paradigmatic Kuramoto model. This model has been traditionally studied in complete graphs. However, besides being intrinsically dynamical, complex systems present very heterogeneous structure, which can be represented as complex networks. This report is dedicated to review main contributions in the field of synchronization in networks of Kuramoto oscillators. In particular, we provide an overview of the impact of network patterns on the local and global dynamics of coupled phase oscillators. We cover many relevant topics, which encompass a description of the most used analytical approaches and the analysis of several numerical results. Furthermore, we discuss recent developments on variations of the Kuramoto model in networks, including the presence of noise and inertia. The rich potential for applications is discussed for special fields in engineering, neuroscience, physics and Earth science. Finally, we conclude by discussing problems that remain open after the last decade of intensive research on the Kuramoto model and point out some promising directions for future research.
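As an illustration of the model under review (a sketch under assumed parameters, not code from the report), the snippet below integrates the Kuramoto equations dθ_i/dt = ω_i + (K/N) Σ_j A_ij sin(θ_j - θ_i) on a random coupling graph and reports the order parameter r = |⟨e^{iθ}⟩|, which approaches 1 at full synchrony. The network size, coupling strength, and graph density are arbitrary choices.

```python
import numpy as np

# Illustrative sketch of the Kuramoto model on a network: N phase oscillators
# coupled along the edges of an adjacency matrix A. All parameters are
# arbitrary example values.

rng = np.random.default_rng(0)
N, K, dt, steps = 50, 4.0, 0.01, 5000
omega = rng.normal(0.0, 1.0, N)                  # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)           # initial phases
A = (rng.random((N, N)) < 0.2).astype(float)     # Erdos-Renyi-style coupling graph
A = np.triu(A, 1)
A = A + A.T                                      # symmetric, no self-loops

for _ in range(steps):
    # d(theta_i)/dt = omega_i + (K/N) * sum_j A_ij * sin(theta_j - theta_i)
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + (K / N) * coupling)

r = abs(np.exp(1j * theta).mean())               # order parameter: r -> 1 at full synchrony
print(f"order parameter r = {r:.3f}")
```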

181 citations


Cites background from "Feedback for physicists: A tutorial..."

  • ...Feedback theory is commonly used for time-varying coupling, and it is one of the most important ideas developed in the last century, with fundamental implications especially for biological neural systems [210]....


  • ...Adaptive mechanisms can be found in biological systems [209] and neural systems [210], e....


Journal ArticleDOI
TL;DR: On the basis of applied neuropharmacology, prevention of anesthetic-drug-related seizures would include avoiding sevoflurane and etomidate, considering prophylaxis with adjunctive benzodiazepines (α-subunit GABA(A) agonists) or drugs that impair calcium entry into neurons, and using electroencephalogram monitoring to detect early signs of cortical instability and epileptiform activity.
Abstract: The true incidence of seizures caused by general anesthetic drugs is unknown. Abnormal movements are common during induction of anesthesia, but they may not be indicative of true seizures. Conversely, epileptiform electrocortical activity is commonly induced by enflurane, etomidate, sevoflurane and, to a lesser extent, propofol, but it rarely progresses to generalized tonic-clonic seizures. Even "nonconvulsant" anesthetic drugs occasionally cause seizures in subjects with preexisting epilepsy. These seizures most commonly occur during induction or emergence from anesthesia, when the anesthetic drug concentration is relatively low. There is no unifying neural mechanism of anesthetic drug-related seizurogenesis. However, there is a growing body of experimental work suggesting that seizures are not caused simply by "too much excitation," but rather by excitation applied to a mass of neurons which are primed to react to the excitation by going into an oscillatory seizure state. Increased gamma-amino-butyric acid (GABA)ergic inhibition can sensitize the cortex so that only a small amount of excitation is required to cause seizures. This has been postulated to occur 1) at the network level by increasing the propensity for reverberation (e.g., by prolongation of the "inhibitory lag"), or 2) via different effects on subpopulations of interneurons ("inhibiting-the-inhibitors") or 3) at the synaptic level by changing the chloride reversal potential ("excitatory GABA"). On the basis of applied neuropharmacology, prevention of anesthetic-drug related seizures would include 1) avoiding sevoflurane and etomidate, 2) considering prophylaxis with adjunctive benzodiazepines (alpha-subunit GABA(A) agonists), or drugs that impair calcium entry into neurons, and 3) using electroencephalogram monitoring to detect early signs of cortical instability and epileptiform activity. Seizures may falsely elevate electroencephalogram indices of depth of anesthesia.

167 citations

Journal Article

121 citations


Cites background from "Feedback for physicists: A tutorial..."

  • ...[136] John Bechhoefer, “Feedback for physicists: A tutorial essay on control,” Rev. Mod. Phys. 77, 783-836 (2005).


References
Book
01 Jan 1991
TL;DR: The authors develop the central quantities of information theory, such as entropy, relative entropy, and mutual information, and apply them to data compression, channel capacity, and related problems.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 
11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation. 12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov Complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types. 17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

45,034 citations


"Feedback for physicists: A tutorial..." refers background in this paper

  • ...Recent papers by Touchette and Lloyd (2000, 2004) begin to explore more formally these links and derive a fundamental relationship between the amount of control achievable (“decrease of entropy” in their formulation) and the “mutual information” (Cover and Thomas, 1991) between the dynamical system and the controller created by an initial interaction....

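For readers unfamiliar with the quantity invoked in the excerpt above, the mutual information of Cover and Thomas is I(X;Y) = Σ_{x,y} p(x,y) log₂[p(x,y) / (p(x)p(y))]. The joint distribution in the sketch below is an arbitrary example, not data from either paper.

```python
import math

# Worked example of mutual information for a small joint distribution
# (arbitrary numbers, for illustration only).

p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
info = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())
print(f"I(X;Y) = {info:.4f} bits")  # positive: knowing Y reduces uncertainty about X
```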

Book
01 Jan 1987
TL;DR: The book treats system identification in the theoretical domain that bears directly on the understanding and practical application of the various identification methods.
Abstract: The book treats system identification in the theoretical domain that bears directly on the understanding and practical application of the various identification methods. ...

20,436 citations


"Feedback for physicists: A tutorial..." refers methods in this paper

  • ...For an introduction, see Dutton et al. (1997); for full details, see Ljung (1999)....


  • ...Alternatively, there are a number of methods that avoid the transfer function completely: from a given input u(t) and measured response y(t), they directly fit to the coefficients of a time-domain model or directly give pole and zero positions (Ljung, 1999)....

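As a hedged sketch of the direct time-domain fitting mentioned in the excerpt above (an assumed first-order model, not a recipe from the book), the snippet below estimates the coefficients of an ARX model y[k] = a*y[k-1] + b*u[k-1] + noise by least squares from a recorded input/output pair; Ljung (1999) develops the full methodology.

```python
import numpy as np

# Sketch of system identification by direct time-domain fitting: estimate
# the coefficients of y[k] = a*y[k-1] + b*u[k-1] + noise from input/output
# data by least squares. The "true" values are invented for illustration.

rng = np.random.default_rng(1)
a_true, b_true, n = 0.8, 0.5, 500
u = rng.normal(size=n)                       # known input u(t)
y = np.zeros(n)
for k in range(1, n):                        # simulated measured response y(t)
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.05 * rng.normal()

phi = np.column_stack([y[:-1], u[:-1]])      # regressors [y[k-1], u[k-1]]
a_hat, b_hat = np.linalg.lstsq(phi, y[1:], rcond=None)[0]
print(f"a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")
```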

Journal ArticleDOI
TL;DR: In this paper, a simple model based on network growth and preferential attachment was proposed that reproduces the power-law degree distribution of real networks and captures the evolution of networks, not just their static topology.
Abstract: The emergence of order in natural systems is a constant source of inspiration for both physical and biological sciences. While the spatial order characterizing, for example, crystals has been the basis of many advances in contemporary physics, most complex systems in nature do not offer such a high degree of order. Many of these systems form complex networks whose nodes are the elements of the system and edges represent the interactions between them. Traditionally complex networks have been described by the random graph theory founded in 1959 by Paul Erdős and Alfréd Rényi. One of the defining features of random graphs is that they are statistically homogeneous, and their degree distribution (characterizing the spread in the number of edges starting from a node) is a Poisson distribution. In contrast, recent empirical studies, including the work of our group, indicate that the topology of real networks is much richer than that of random graphs. In particular, the degree distribution of real networks is a power law, indicating a heterogeneous topology in which the majority of the nodes have a small degree, but there is a significant fraction of highly connected nodes that play an important role in the connectivity of the network. The scale-free topology of real networks has very important consequences for their functioning. For example, we have discovered that scale-free networks are extremely resilient to the random disruption of their nodes. On the other hand, the selective removal of the nodes with the highest degree induces a rapid breakdown of the network to isolated subparts that cannot communicate with each other. The non-trivial scaling of the degree distribution of real networks is also an indication of their assembly and evolution. Indeed, our modeling studies have shown us that there are general principles governing the evolution of networks. Most networks start from a small seed and grow by the addition of new nodes which attach to the nodes already in the system. This process obeys preferential attachment: the new nodes are more likely to connect to nodes with already high degree. We have proposed a simple model based on these two principles which was able to reproduce the power-law degree distribution of real networks. Perhaps even more importantly, this model paved the way to a new paradigm of network modeling, trying to capture the evolution of networks, not just their static topology.
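The two principles described above, growth and preferential attachment, are simple to state in code. The sketch below is an illustrative implementation, not the authors' code: sampling uniformly from the list of edge endpoints selects existing nodes with probability proportional to their degree, and the resulting network develops the heavy-tailed degree distribution discussed in the abstract.

```python
import random
from collections import Counter

# Illustrative growth + preferential-attachment model: each new node links
# to m existing nodes; choosing uniformly from the list of edge endpoints
# is equivalent to choosing nodes with probability proportional to degree.

def grow_network(n=2000, m=2, seed=0):
    rng = random.Random(seed)
    ends = []                            # every node appears once per edge endpoint
    for i in range(m + 1):               # small complete seed graph on m+1 nodes
        for j in range(i + 1, m + 1):
            ends += [i, j]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:          # degree-biased sampling of m distinct targets
            targets.add(rng.choice(ends))
        for t in targets:
            ends += [new, t]
    return Counter(ends)                 # node -> degree

deg = grow_network()
mean = sum(deg.values()) / len(deg)
print(f"mean degree = {mean:.1f}, max degree = {max(deg.values())}")  # heavy tail: max >> mean
```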

18,415 citations

Journal ArticleDOI
TL;DR: Developments in this field are reviewed, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.
Abstract: Inspired by empirical studies of networked systems such as the Internet, social networks, and biological networks, researchers have in recent years developed a variety of techniques and models to help us understand or predict the behavior of these systems. Here we review developments in this field, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.

17,647 citations


"Feedback for physicists: A tutorial..." refers background in this paper

  • ...The structure of such networks is a topic of intense current interest (Albert and Barabási, 2002; Newman, 2003)....
