
Sensitivity (control systems)

About: Sensitivity (control systems) is a research topic. Over its lifetime, 32,786 publications have been published within this topic, receiving 434,147 citations.


Papers
Book
04 Feb 2008
TL;DR: In this book, the authors show how to set up uncertainty and sensitivity analyses and survey the main global methods - sigma-normalized derivatives, Monte Carlo and linear regression, variance-based indices, the elementary effects test, and Monte Carlo filtering (MCF) - together with experimental designs, metamodelling, and worked applications.
Abstract: Preface.
1. Introduction to Sensitivity Analysis. 1.1 Models and Sensitivity Analysis. 1.1.1 Definition. 1.1.2 Models. 1.1.3 Models and Uncertainty. 1.1.4 How to Set Up Uncertainty and Sensitivity Analyses. 1.1.5 Implications for Model Quality. 1.2 Methods and Settings for Sensitivity Analysis - An Introduction. 1.2.1 Local versus Global. 1.2.2 A Test Model. 1.2.3 Scatterplots versus Derivatives. 1.2.4 Sigma-normalized Derivatives. 1.2.5 Monte Carlo and Linear Regression. 1.2.6 Conditional Variances - First Path. 1.2.7 Conditional Variances - Second Path. 1.2.8 Application to Model (1.3). 1.2.9 A First Setting: 'Factor Prioritization'. 1.2.10 Nonadditive Models. 1.2.11 Higher-order Sensitivity Indices. 1.2.12 Total Effects. 1.2.13 A Second Setting: 'Factor Fixing'. 1.2.14 Rationale for Sensitivity Analysis. 1.2.15 Treating Sets. 1.2.16 Further Methods. 1.2.17 Elementary Effect Test. 1.2.18 Monte Carlo Filtering. 1.3 Nonindependent Input Factors. 1.4 Possible Pitfalls for a Sensitivity Analysis. 1.5 Concluding Remarks. 1.6 Exercises. 1.7 Answers. 1.8 Additional Exercises. 1.9 Solutions to Additional Exercises.
2. Experimental Designs. 2.1 Introduction. 2.2 Dependency on a Single Parameter. 2.3 Sensitivity Analysis of a Single Parameter. 2.3.1 Random Values. 2.3.2 Stratified Sampling. 2.3.3 Mean and Variance Estimates for Stratified Sampling. 2.4 Sensitivity Analysis of Multiple Parameters. 2.4.1 Linear Models. 2.4.2 One-at-a-time (OAT) Sampling. 2.4.3 Limits on the Number of Influential Parameters. 2.4.4 Fractional Factorial Sampling. 2.4.5 Latin Hypercube Sampling. 2.4.6 Multivariate Stratified Sampling. 2.4.7 Quasi-random Sampling with Low-discrepancy Sequences. 2.5 Group Sampling. 2.6 Exercises. 2.7 Exercise Solutions.
3. Elementary Effects Method. 3.1 Introduction. 3.2 The Elementary Effects Method. 3.3 The Sampling Strategy and its Optimization. 3.4 The Computation of the Sensitivity Measures. 3.5 Working with Groups. 3.6 The EE Method Step by Step. 3.7 Conclusions. 3.8 Exercises. 3.9 Solutions.
4. Variance-based Methods. 4.1 Different Tests for Different Settings. 4.2 Why Variance? 4.3 Variance-based Methods. A Brief History. 4.4 Interaction Effects. 4.5 Total Effects. 4.6 How to Compute the Sensitivity Indices. 4.7 FAST and Random Balance Designs. 4.8 Putting the Method to Work: the Infection Dynamics Model. 4.9 Caveats. 4.10 Exercises.
5. Factor Mapping and Metamodelling. 5.1 Introduction. 5.2 Monte Carlo Filtering (MCF). 5.2.1 Implementation of Monte Carlo Filtering. 5.2.2 Pros and Cons. 5.2.3 Exercises. 5.2.4 Solutions. 5.2.5 Examples. 5.3 Metamodelling and the High-Dimensional Model Representation. 5.3.1 Estimating HDMRs and Metamodels. 5.3.2 A Simple Example. 5.3.3 Another Simple Example. 5.3.4 Exercises. 5.3.5 Solutions to Exercises. 5.4 Conclusions.
6. Sensitivity Analysis: from Theory to Practice. 6.1 Example 1: a Composite Indicator. 6.1.1 Setting the Problem. 6.1.2 A Composite Indicator Measuring Countries' Performance in Environmental Sustainability. 6.1.3 Selecting the Sensitivity Analysis Method. 6.1.4 The Sensitivity Analysis Experiment and its Results. 6.1.5 Conclusions. 6.2 Example 2: Importance of Jumps in Pricing Options. 6.2.1 Setting the Problem. 6.2.2 The Heston Stochastic Volatility Model with Jumps. 6.2.3 Selecting a Suitable Sensitivity Analysis Method. 6.2.4 The Sensitivity Analysis Experiment. 6.2.5 Conclusions. 6.3 Example 3: a Chemical Reactor. 6.3.1 Setting the Problem. 6.3.2 Thermal Runaway Analysis of a Batch Reactor. 6.3.3 Selecting the Sensitivity Analysis Method. 6.3.4 The Sensitivity Analysis Experiment and its Results. 6.3.5 Conclusions. 6.4 Example 4: a Mixed Uncertainty-Sensitivity Plot. 6.4.1 In Brief. 6.5 When to use What?
Afterword. Bibliography. Index.

4,306 citations
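
The variance-based first-order and total-effect indices covered in Chapters 1 and 4 of the contents above can be estimated with a plain Monte Carlo scheme. The Python sketch below uses the standard Saltelli/Jansen estimators on a small made-up test function; the model, sample size, and uniform factor ranges are illustrative assumptions, not examples taken from the book.

```python
import numpy as np

def test_model(x):
    # Hypothetical test function (illustrative only): two additive terms
    # plus an interaction between factors 0 and 2. x has shape (n, 3).
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 2]

def sobol_indices(model, k, n=100_000, seed=0):
    """First-order (S_i) and total-effect (ST_i) indices by plain Monte Carlo."""
    rng = np.random.default_rng(seed)
    a = rng.random((n, k))                     # sample matrix A on [0, 1]^k
    b = rng.random((n, k))                     # independent sample matrix B
    f_a, f_b = model(a), model(b)
    var_y = np.var(np.concatenate([f_a, f_b]))
    s, st = np.empty(k), np.empty(k)
    for i in range(k):
        ab = a.copy()
        ab[:, i] = b[:, i]                     # resample only factor i
        f_ab = model(ab)
        s[i] = np.mean(f_b * (f_ab - f_a)) / var_y        # Saltelli first-order estimator
        st[i] = 0.5 * np.mean((f_a - f_ab) ** 2) / var_y  # Jansen total-effect estimator
    return s, st

s_i, st_i = sobol_indices(test_model, k=3)
print("first-order S_i  :", np.round(s_i, 3))
print("total effect ST_i:", np.round(st_i, 3))
```

For the purely additive factor the two indices roughly coincide, while a gap between S_i and ST_i for the interacting factors is the diagnostic behind the 'Factor Fixing' setting listed in the contents.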

Book
01 Apr 2004
TL;DR: In this book, the authors present global sensitivity analysis methods - the method of Morris, variance-based methods, and Monte Carlo filtering with regionalised sensitivity analysis - together with Bayesian uncertainty estimation (GLUE), applied to test cases ranging from a fish population model and a financial hedging problem to the Level E radionuclide migration model, with the SIMLAB software.
Abstract: PREFACE.
1. A WORKED EXAMPLE. 1.1 A simple model. 1.2 Modulus version of the simple model. 1.3 Six-factor version of the simple model. 1.4 The simple model 'by groups'. 1.5 The (less) simple correlated-input model. 1.6 Conclusions.
2. GLOBAL SENSITIVITY ANALYSIS FOR IMPORTANCE ASSESSMENT. 2.1 Examples at a glance. 2.2 What is sensitivity analysis? 2.3 Properties of an ideal sensitivity analysis method. 2.4 Defensible settings for sensitivity analysis. 2.5 Caveats.
3. TEST CASES. 3.1 The jumping man. Applying variance-based methods. 3.2 Handling the risk of a financial portfolio: the problem of hedging. Applying Monte Carlo filtering and variance-based methods. 3.3 A model of fish population dynamics. Applying the method of Morris. 3.4 The Level E model. Radionuclide migration in the geosphere. Applying variance-based methods and Monte Carlo filtering. 3.5 Two spheres. Applying variance-based methods in estimation/calibration problems. 3.6 A chemical experiment. Applying variance-based methods in estimation/calibration problems. 3.7 An analytical example. Applying the method of Morris.
4. THE SCREENING EXERCISE. 4.1 Introduction. 4.2 The method of Morris. 4.3 Implementing the method. 4.4 Putting the method to work: an analytical example. 4.5 Putting the method to work: sensitivity analysis of a fish population model. 4.6 Conclusions.
5. METHODS BASED ON DECOMPOSING THE VARIANCE OF THE OUTPUT. 5.1 The settings. 5.2 Factors Prioritisation Setting. 5.3 First-order effects and interactions. 5.4 Application of Si to Setting 'Factors Prioritisation'. 5.5 More on variance decompositions. 5.6 Factors Fixing (FF) Setting. 5.7 Variance Cutting (VC) Setting. 5.8 Properties of the variance-based methods. 5.9 How to compute the sensitivity indices: the case of orthogonal input. 5.9.1 A digression on the Fourier Amplitude Sensitivity Test (FAST). 5.10 How to compute the sensitivity indices: the case of non-orthogonal input. 5.11 Putting the method to work: the Level E model. 5.11.1 Case of orthogonal input factors. 5.11.2 Case of correlated input factors. 5.12 Putting the method to work: the bungee jumping model. 5.13 Caveats.
6. SENSITIVITY ANALYSIS IN DIAGNOSTIC MODELLING: MONTE CARLO FILTERING AND REGIONALISED SENSITIVITY ANALYSIS, BAYESIAN UNCERTAINTY ESTIMATION AND GLOBAL SENSITIVITY ANALYSIS. 6.1 Model calibration and Factors Mapping Setting. 6.2 Monte Carlo filtering and regionalised sensitivity analysis. 6.2.1 Caveats. 6.3 Putting MC filtering and RSA to work: the problem of hedging a financial portfolio. 6.4 Putting MC filtering and RSA to work: the Level E test case. 6.5 Bayesian uncertainty estimation and global sensitivity analysis. 6.5.1 Bayesian uncertainty estimation. 6.5.2 The GLUE case. 6.5.3 Using global sensitivity analysis in the Bayesian uncertainty estimation. 6.5.4 Implementation of the method. 6.6 Putting Bayesian analysis and global SA to work: two spheres. 6.7 Putting Bayesian analysis and global SA to work: a chemical experiment. 6.7.1 Bayesian uncertainty analysis (GLUE case). 6.7.2 Global sensitivity analysis. 6.7.3 Correlation analysis. 6.7.4 Further analysis by varying temperature in the data set: fewer interactions in the model. 6.8 Caveats.
7. HOW TO USE SIMLAB. 7.1 Introduction. 7.2 How to obtain and install SIMLAB. 7.3 SIMLAB main panel. 7.4 Sample generation. 7.4.1 FAST. 7.4.2 Fixed sampling. 7.4.3 Latin hypercube sampling (LHS). 7.4.4 The method of Morris. 7.4.5 Quasi-Random LpTau. 7.4.6 Random. 7.4.7 Replicated Latin Hypercube (r-LHS). 7.4.8 The method of Sobol'. 7.4.9 How to induce dependencies in the input factors. 7.5 How to execute models. 7.6 Sensitivity analysis.
8. FAMOUS QUOTES: SENSITIVITY ANALYSIS IN THE SCIENTIFIC DISCOURSE.
REFERENCES. INDEX.

2,297 citations
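
Chapter 6 of this book treats Monte Carlo filtering and regionalised sensitivity analysis: model runs are split into 'behavioural' and 'non-behavioural' sets by a criterion on the output, and the two conditional distributions of each input factor are compared, typically with a two-sample Kolmogorov-Smirnov statistic. A minimal Python sketch follows; the model, acceptance criterion, and sample size are illustrative assumptions rather than any of the book's test cases.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
n, k = 10_000, 3
x = rng.random((n, k))                   # uniform input factors on [0, 1]
y = 4.0 * x[:, 0] + x[:, 1] * x[:, 2]    # hypothetical model output (illustrative)

# "Behavioural" runs: those meeting an acceptance criterion on the output.
behavioural = y < np.median(y)

for i in range(k):
    # Compare the factor's distribution in the behavioural vs. non-behavioural sets.
    res = ks_2samp(x[behavioural, i], x[~behavioural, i])
    print(f"factor {i}: KS statistic = {res.statistic:.3f}, p-value = {res.pvalue:.2e}")
```

A large KS statistic (small p-value) indicates that acceptance of a run depends strongly on where that factor falls in its range; factors with small statistics could in principle be fixed without affecting the mapping.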

Journal Article
George Zames
TL;DR: In this article, the problem of sensitivity reduction by feedback is formulated as an optimization problem and separated from the problem of stabilization, and the stable feedback schemes obtainable from a given plant are parameterized.
Abstract: In this paper, the problem of sensitivity reduction by feedback is formulated as an optimization problem and separated from the problem of stabilization. Stable feedback schemes obtainable from a given plant are parameterized. Salient properties of sensitivity reducing schemes are derived, and it is shown that plant uncertainty reduces the ability of feedback to reduce sensitivity. The theory is developed for input-output systems in a general setting of Banach algebras, and then specialized to a class of multivariable, time-invariant systems characterized by n \times n matrices of H^{\infty} frequency response functions, either with or without zeros in the right half-plane. The approach is based on the use of a weighted seminorm on the algebra of operators to measure sensitivity, and on the concept of an approximate inverse. Approximate invertibility of the plant is shown to be a necessary and sufficient condition for sensitivity reduction. An indicator of approximate invertibility, called a measure of singularity, is introduced. The measure of singularity of a linear time-invariant plant is shown to be determined by the location of its right half-plane zeros. In the absence of plant uncertainty, the sensitivity to output disturbances can be reduced to an optimal value approaching the singularity measure. In particular, if there are no right half-plane zeros, sensitivity can be made arbitrarily small. The feedback schemes used in the optimization of sensitivity resemble the lead-lag networks of classical control design. Some of their properties, and methods of constructing them in special cases, are presented.

2,203 citations
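
The object at the centre of the paper is the sensitivity function S = (1 + PC)^{-1}, and the obstruction it identifies comes from right half-plane plant zeros: with any internally stabilizing controller, S equals 1 at such a zero, so sensitivity cannot be pushed down everywhere. The short numerical sketch below, in Python, illustrates this with a hypothetical non-minimum-phase plant and a proportional controller; the plant, the gains, and the frequency grid are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def plant(s):
    # Hypothetical non-minimum-phase plant with a right half-plane zero at s = 1.
    return (1.0 - s) / ((s + 2.0) * (s + 3.0))

def sensitivity(s, gain):
    # Sensitivity function S = 1 / (1 + P*C) with a proportional controller C = gain.
    return 1.0 / (1.0 + gain * plant(s))

w = np.logspace(-2, 2, 500)               # frequency grid, rad/s
for gain in (0.5, 2.0, 4.0):              # kept below this loop's stability limit (gain < 5)
    peak = np.max(np.abs(sensitivity(1j * w, gain)))
    at_zero = abs(sensitivity(1.0, gain))  # pinned to 1 regardless of the controller gain
    print(f"gain = {gain:3.1f}: low-freq |S| = {abs(sensitivity(0.0, gain)):.2f}, "
          f"peak |S(jw)| = {peak:.2f}, |S| at RHP zero = {at_zero:.2f}")
```

Raising the gain lowers |S| at low frequency, but |S| at the right half-plane zero stays at 1 and the peak of |S(jw)| grows; this is the trade-off the paper quantifies through its weighted seminorm and measure of singularity.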


Network Information
Related Topics (5)
Robustness (computer science): 94.7K papers, 1.6M citations, 79% related
Control theory: 299.6K papers, 3.1M citations, 79% related
Optimal control: 68K papers, 1.2M citations, 79% related
Linear system: 59.5K papers, 1.4M citations, 77% related
Control system: 129K papers, 1.5M citations, 77% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2024    2
2023    6,123
2022    9,388
2021    2,085
2020    1,764
2019    1,722