Journal ArticleDOI

Feedback for physicists: A tutorial essay on control

31 Aug 2005-Reviews of Modern Physics (American Physical Society)-Vol. 77, Iss: 3, pp 783-836
TL;DR: This tutorial essay aims to give enough of the formal elements of control theory to satisfy the experimentalist designing or running a typical physics experiment, and enough to satisfy the theorist wishing to understand its broader intellectual context.
Abstract: Feedback and control theory are important ideas that should form part of the education of a physicist but rarely do. This tutorial essay aims to give enough of the formal elements of control theory to satisfy the experimentalist designing or running a typical physics experiment and enough to satisfy the theorist wishing to understand its broader intellectual context. The level is generally simple, although more advanced methods are also introduced. Several types of applications are discussed, as the practical uses of feedback extend far beyond the simple regulation problems where it is most often employed. Sketches are then provided of some of the broader implications and applications of control theory, especially in biology, which are topics of active research.
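As a hedged illustration of the regulation problems the essay discusses, the sketch below simulates a first-order system driven to a setpoint by proportional-integral (PI) feedback; the plant model, gains, time constant, and setpoint are all invented for the example, not taken from the essay.

```python
# Minimal sketch of proportional-integral (PI) regulation of a first-order
# system; the plant model and all gains are illustrative assumptions.

def simulate_pi(setpoint=1.0, tau=1.0, Kp=4.0, Ki=2.0, dt=0.01, steps=2000):
    """Drive dx/dt = (-x + u) / tau toward `setpoint` with u = Kp*e + Ki*int(e)."""
    x, integral = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - x            # error signal fed back to the controller
        integral += e * dt          # integral term removes steady-state error
        u = Kp * e + Ki * integral  # controller output
        x += dt * (-x + u) / tau    # Euler step of the plant dynamics
    return x
```

After twenty plant time constants the output settles at the setpoint; proportional action alone (Ki = 0) would leave a steady-state offset of Kp/(1+Kp), which the integral term eliminates.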


Citations
Journal ArticleDOI
TL;DR: The authors classify the common underlying difficulties faced by novices in the field of process control and provide specific remedies, finding that a narrative style, linking content to learners' prior knowledge, and the use of metaphors help overcome the cognitive barriers.

5 citations

Journal ArticleDOI
TL;DR: In this article, the authors investigate the effects of different types of agents' behavioral responses on the dynamics of hybrid stochastic SIR outbreak models, focusing on the structure of the response function and the form of the temporal distribution of such responses.
Abstract: In the behavioral epidemiology (BE) of infectious diseases, little theoretical effort seems to have been devoted to understand the possible effects of individuals' behavioral responses during an epidemic outbreak in small populations. To fill this gap, here we first build general, behavior implicit, SIR epidemic models including behavioral responses and set them within the framework of nonlinear feedback control theory. Second, we provide a thorough investigation of the effects of different types of agents' behavioral responses on the dynamics of hybrid stochastic SIR outbreak models. In the proposed model, the stochastic discrete dynamics of infection spread is combined with a continuous model describing the agents' delayed behavioral response. The delay reflects the memory mechanisms with which individuals enact protective behavior based on past data on the epidemic course. This results in a stochastic hybrid system with time-varying transition probabilities. To simulate such a system, we extend Gillespie's classic stochastic simulation algorithm by developing analytical formulas valid for our classes of models. The algorithm is used to simulate a number of stochastic behavioral models and to classify the effects of different types of agents' behavioral responses. In particular, this work focuses on the effects of the structure of the response function and of the form of the temporal distribution of such response. Among the various results, we stress the appearance of multiple, stochastic epidemic waves triggered by the delayed behavioral response of individuals.
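The paper's simulation engine builds on Gillespie's stochastic simulation algorithm. A hedged sketch of the plain, constant-rate version for an SIR outbreak follows; the paper's extension to time-varying, behavior-dependent transition probabilities is not reproduced here, and the rates and population sizes are illustrative.

```python
import random

# Plain Gillespie stochastic simulation of a constant-rate SIR outbreak.
# beta, gamma, and the population sizes are illustrative; the cited paper
# extends this algorithm to time-varying, behavior-dependent rates.

def gillespie_sir(S=990, I=10, R=0, beta=0.3, gamma=0.1, seed=1):
    random.seed(seed)
    N = S + I + R
    t = 0.0
    while I > 0:
        a_inf = beta * S * I / N              # infection propensity
        a_rec = gamma * I                     # recovery propensity
        a_tot = a_inf + a_rec
        t += random.expovariate(a_tot)        # exponentially distributed wait
        if random.random() * a_tot < a_inf:   # pick which event fires
            S, I = S - 1, I + 1
        else:
            I, R = I - 1, R + 1
    return t, R

t_end, total_recovered = gillespie_sir()
```

With these rates the basic reproduction number is beta/gamma = 3, so a run started with 10 infected typically infects most of the population before the outbreak dies out.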

5 citations

Journal ArticleDOI
TL;DR: In this article, the authors present a method to facilitate Monte Carlo simulations in the grand canonical ensemble given a target mean particle number; the method imposes a fictitious dynamics on the chemical potential, to be run concurrently with the Monte Carlo sampling of the physical system.
Abstract: We present a method to facilitate Monte Carlo simulations in the grand canonical ensemble given a target mean particle number. The method imposes a fictitious dynamics on the chemical potential, to be run concurrently with the Monte Carlo sampling of the physical system. Corrections to the chemical potential are made according to time-averaged estimates of the mean and variance of the particle number, with the latter being proportional to thermodynamic compressibility. We perform a variety of tests, and in all cases find rapid convergence of the chemical potential; inexactness of the tuning algorithm contributes only a minor part of the total measurement error for realistic simulations.
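A minimal sketch of the feedback rule described above, assuming an idealized grand canonical "simulation" in which the particle number is Poisson-distributed with mean exp(mu), so that d&lt;N&gt;/dmu equals the particle-number variance; the target, sweep counts, and step clamp are illustrative choices, not the paper's settings.

```python
import math, random

# Feedback tuning of the chemical potential mu toward a target mean particle
# number.  Toy model: N is Poisson-distributed with mean exp(mu), mimicking
# an ideal gas in the grand canonical ensemble, so d<N>/dmu = var(N).
# Target, sweep counts, and the step clamp are illustrative assumptions.

def poisson_sample(lam, rng):
    # Knuth's method; adequate for the modest means used here
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= L:
            return k - 1

def tune_mu(N_target=50.0, mu=0.0, sweeps=40, samples=200, seed=2):
    rng = random.Random(seed)
    for _ in range(sweeps):
        lam = math.exp(mu)
        data = [poisson_sample(lam, rng) for _ in range(samples)]
        mean = sum(data) / samples
        var = sum((n - mean) ** 2 for n in data) / samples
        step = (N_target - mean) / max(var, 1.0)   # Newton-like correction
        mu += max(-1.0, min(1.0, step))            # clamp for stability
    return mu

mu_star = tune_mu()
```

Dividing the error by the measured variance makes the update a Newton-like step, since the variance estimates the slope d&lt;N&gt;/dmu; the clamp keeps early, poorly estimated steps from overshooting.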

5 citations

01 Jan 2008
Abstract: This dissertation describes the development and use of an experimental technique, Magnetic Resonance Force Microscopy (MRFM), to study electron spin resonance at low temperature with a sensitivity as high as two electron spins. MRFM detects magnetic resonance by sensing the small force exerted on a cantilever, through magnetic coupling, by the paramagnetic electron spins in the sample. I have applied this technique to measure the fluctuating magnetic moments of few-electron spin ensembles, known as the statistical polarization or the spin noise. In this dissertation, I describe the basic principles and setup of the MRFM experiments. I have used the MRFM experiment to verify that applying negative feedback to the cantilever can reduce the cantilever response time without sacrificing the signal-to-noise ratio in the force detection. Using the new spin manipulation scheme and the microwave resonator I designed for low-temperature MRFM experiments, MRFM force spectra are measured and understood by modeling the spins undergoing magnetic resonance in an inhomogeneous magnetic field. I have used the high-sensitivity MRFM experiment to observe the real-time fluctuation of the electron spin magnetic moments. From the statistics of this fluctuation, the number of resonating spins and the correlation time of the statistical polarization are measured. I have shown that the spin correlation time is due to the one- and two-phonon relaxation processes in the silicon dioxide sample by measuring the spin ...

5 citations


Cites background from "Feedback for physicists: A tutorial..."

  • ...Feedback is an important concept in science and technology [80] because it improves overall system performance; examples include glucose-insulin regulation in the human body, temperature control by a thermostat, aircraft flight control, and automobile cruise control, to name a few....


Journal ArticleDOI
TL;DR: In this article, the authors demonstrate a frequency control method relying on tracking over a wide range and stabilizing the beat note between the laser and the optical parametric oscillator (OPO).
Abstract: Optical frequency combs (OFCs) provide a convenient reference for the frequency stabilization of continuous-wave lasers. We demonstrate a frequency control method relying on tracking over a wide range and stabilizing the beat note between the laser and the OFC. The approach combines fast frequency ramps on a millisecond timescale in the entire mode-hop free tuning range of the laser and precise stabilization to single frequencies. We apply it to a commercially available optical parametric oscillator (OPO) and demonstrate tuning over more than 60 GHz with a ramping speed up to 3 GHz/ms. Frequency ramps spanning 15 GHz are performed in less than 10 ms, with the OPO instantly relocked to the OFC after the ramp at any desired frequency. The developed control hardware and software are able to stabilize the OPO to sub-MHz precision and to perform sequences of fast frequency ramps automatically.
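A hedged sketch of the ramp-and-relock idea: the setpoint for the beat note is ramped linearly and then held, while an integral controller steers a toy laser frequency. The single-integrator loop model, gain, and ramp span are invented for the illustration, not the paper's hardware or firmware.

```python
# Toy model of ramping a beat-note setpoint and re-locking with integral
# feedback.  Loop model, gain, and ramp span are illustrative assumptions.

def ramp_and_lock(target=15.0, ramp_steps=100, hold_steps=400, Ki=0.2):
    """Setpoint ramps linearly to `target` (e.g. GHz), then holds."""
    freq, correction = 0.0, 0.0
    error = target
    for k in range(ramp_steps + hold_steps):
        setpoint = target * min(1.0, k / ramp_steps)  # linear ramp, then hold
        error = setpoint - freq                       # measured beat-note error
        correction += Ki * error                      # integral feedback
        freq = correction                             # actuator sets frequency
    return abs(error)
```

The integrator tracks the ramp with a constant lag and then relocks to the held setpoint, with the residual error decaying geometrically by a factor (1 - Ki) per step.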

4 citations

References
Book
01 Jan 1991
TL;DR: The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 
7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation. 
12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Compression. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types. 
17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

45,034 citations


"Feedback for physicists: A tutorial..." refers background in this paper

  • ...Recent papers by Touchette and Lloyd (2000, 2004) begin to explore more formally these links and derive a fundamental relationship between the amount of control achievable (“decrease of entropy” in their formulation) and the “mutual information” (Cover and Thomas, 1991) between the dynamical system and the controller created by an initial interaction....

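Since the excerpt turns on the mutual information of Cover and Thomas, a small sketch may help; the joint distributions below are illustrative examples, not anything from the cited works.

```python
import math

# Mutual information I(X;Y), in bits, for a discrete joint distribution
# given as a matrix p_xy[i][j] = P(X=i, Y=j).  Example inputs illustrative.

def mutual_information(p_xy):
    """I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    px = [sum(row) for row in p_xy]          # marginal of X
    py = [sum(col) for col in zip(*p_xy)]    # marginal of Y
    return sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(p_xy)
               for j, p in enumerate(row) if p > 0)

# Perfectly correlated bits share one bit; independent bits share none.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # → 1.0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```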

Book
01 Jan 1987
TL;DR: The book treats system identification in the theoretical domain that bears directly on the understanding and practical application of the various identification methods.
Abstract: The book treats system identification in the theoretical domain that bears directly on the understanding and practical application of the various identification methods. ...

20,436 citations


"Feedback for physicists: A tutorial..." refers methods in this paper

  • ...For an introduction, see Dutton et al. (1997); for full details, see Ljung (1999)....


  • ...Alternatively, there are a number of methods that avoid the transfer function completely: from a given input u(t) and measured response y(t), they directly fit to the coefficients of a time-domain model or directly give pole and zero positions (Ljung, 1999)....


Journal ArticleDOI
TL;DR: In this paper, a simple model based on growth and preferential attachment was proposed, which reproduces the power-law degree distribution of real networks and captures the evolution of networks, not just their static topology.
Abstract: The emergence of order in natural systems is a constant source of inspiration for both the physical and biological sciences. While the spatial order characterizing, for example, crystals has been the basis of many advances in contemporary physics, most complex systems in nature do not offer such a high degree of order. Many of these systems form complex networks whose nodes are the elements of the system and whose edges represent the interactions between them. Traditionally, complex networks have been described by the random graph theory founded in 1959 by Paul Erdős and Alfréd Rényi. One of the defining features of random graphs is that they are statistically homogeneous, and their degree distribution (characterizing the spread in the number of edges starting from a node) is a Poisson distribution. In contrast, recent empirical studies, including the work of our group, indicate that the topology of real networks is much richer than that of random graphs. In particular, the degree distribution of real networks is a power law, indicating a heterogeneous topology in which the majority of the nodes have a small degree, but there is a significant fraction of highly connected nodes that play an important role in the connectivity of the network. The scale-free topology of real networks has very important consequences for their functioning. For example, we have discovered that scale-free networks are extremely resilient to the random disruption of their nodes. On the other hand, the selective removal of the nodes with the highest degree induces a rapid breakdown of the network into isolated subparts that cannot communicate with each other. The non-trivial scaling of the degree distribution of real networks is also an indication of their assembly and evolution. Indeed, our modeling studies have shown us that there are general principles governing the evolution of networks. 
Most networks start from a small seed and grow by the addition of new nodes which attach to the nodes already in the system. This process obeys preferential attachment: the new nodes are more likely to connect to nodes with an already high degree. We have proposed a simple model based on these two principles which was able to reproduce the power-law degree distribution of real networks. Perhaps even more importantly, this model paved the way to a new paradigm of network modeling, one that tries to capture the evolution of networks, not just their static topology.
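A minimal sketch of the two principles described above, growth plus preferential attachment; the network size, seed graph, and the stub-list sampling trick are illustrative implementation choices, not the authors' original code.

```python
import random

# Minimal growth + preferential-attachment model (Barabási-Albert style).
# Each new node attaches m edges to existing nodes chosen with probability
# proportional to their current degree.  Sizes and seed graph illustrative.

def preferential_attachment(n=1000, m=2, seed=3):
    random.seed(seed)
    nodes = list(range(m + 1))
    edges = [(i, j) for i in nodes for j in nodes if i < j]  # complete seed
    stubs = [v for e in edges for v in e]  # one entry per incident edge end
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:             # m distinct degree-weighted picks
            chosen.add(random.choice(stubs))
        for tgt in chosen:
            edges.append((new, tgt))
            stubs.extend((new, tgt))
    return edges
```

Sampling uniformly from `stubs` implements "rich get richer": a node's chance of gaining a new edge is proportional to its degree, which is what produces the power-law tail.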

18,415 citations

Journal ArticleDOI
TL;DR: Developments in this field are reviewed, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.
Abstract: Inspired by empirical studies of networked systems such as the Internet, social networks, and biological networks, researchers have in recent years developed a variety of techniques and models to help us understand or predict the behavior of these systems. Here we review developments in this field, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.

17,647 citations


"Feedback for physicists: A tutorial..." refers background in this paper

  • ...The structure of such networks is a topic of intense current interest (Albert and Barabási, 2002; Newman, 2003)....
