
Sensitivity analysis

About: Sensitivity analysis is a research topic. Over its lifetime, 4,235 publications have been published within this topic, receiving 144,759 citations.


Papers
Journal ArticleDOI
TL;DR: This paper covers a method for describing the uncertainties in an engineering experiment, the necessary background material, and a technique for numerically executing uncertainty analyses when computerized data interpretation is involved.

6,868 citations
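As a rough illustration of the kind of numerical uncertainty analysis described above, the sketch below propagates input uncertainties through a result by sequential perturbation: perturb each input by its uncertainty, record the change in the result, and combine the changes in quadrature. The function and measurement values are hypothetical, and this is one common numerical scheme rather than the paper's exact procedure.

```python
import math

def propagate_uncertainty(f, x, u):
    """Estimate the uncertainty of r = f(x) by sequential perturbation:
    perturb each input by its uncertainty, record the change in the
    result, and combine the changes in quadrature (root-sum-square)."""
    r0 = f(x)
    contributions = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += u[i]
        contributions.append(f(xp) - r0)
    return math.sqrt(sum(c * c for c in contributions))

# Hypothetical example: power P = V * I from measured voltage and current.
power = lambda x: x[0] * x[1]
print(propagate_uncertainty(power, [12.0, 2.0], [0.1, 0.05]))
```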

Book
01 Jan 1990
TL;DR: In this book, the authors present a guide to treating uncertainty in quantitative risk and policy analysis, covering probability assessment, the propagation and analysis of uncertainty, and Analytica, a software tool for uncertainty analysis.
Abstract: Preface
1. Introduction
2. Recent milestones
3. An overview of quantitative policy analysis
4. The nature and sources of uncertainty
5. Probability distributions and statistical estimation
6. Human judgement about and with uncertainty
7. Performing probability assessment
8. The propagation and analysis of uncertainty
9. The graphic communication of uncertainty
10. Analytica: a software tool for uncertainty analysis
11. Large and complex models
12. The value of knowing how little you know
Index.

2,666 citations
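The propagation of uncertainty the book discusses (chapter 8) is commonly carried out by Monte Carlo simulation: draw samples of the uncertain inputs from their probability distributions, run the model on each sample, and summarize the resulting output distribution. A minimal sketch, with an invented two-input model and illustrative distributions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical model: cost = price * quantity, with uncertain inputs
# described by probability distributions (names and numbers are illustrative).
price = rng.normal(loc=10.0, scale=1.5, size=n)        # symmetric uncertainty
quantity = rng.lognormal(mean=3.0, sigma=0.3, size=n)  # skewed uncertainty

cost = price * quantity  # propagate by evaluating the model on each sample

print(f"mean = {cost.mean():.1f}")
print(f"90% interval = [{np.percentile(cost, 5):.1f}, {np.percentile(cost, 95):.1f}]")
```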

Posted Content
TL;DR: In this article, the authors develop a new theoretical framework that casts dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, mitigating the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
Abstract: Deep learning tools have gained tremendous attention in applied machine learning. However such tools for regression and classification do not capture model uncertainty. In comparison, Bayesian models offer a mathematically grounded framework to reason about model uncertainty, but usually come with a prohibitive computational cost. In this paper we develop a new theoretical framework casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes. A direct result of this theory gives us tools to model uncertainty with dropout NNs -- extracting information from existing models that has been thrown away so far. This mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy. We perform an extensive study of the properties of dropout's uncertainty. Various network architectures and non-linearities are assessed on tasks of regression and classification, using MNIST as an example. We show a considerable improvement in predictive log-likelihood and RMSE compared to existing state-of-the-art methods, and finish by using dropout's uncertainty in deep reinforcement learning.

2,261 citations
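A minimal sketch of the Monte Carlo dropout idea from this paper, assuming PyTorch: dropout is left active at prediction time, and repeated stochastic forward passes are treated as samples from the approximate predictive distribution. The architecture, sizes, and dropout rate here are illustrative, not the paper's experimental setup.

```python
import torch
import torch.nn as nn

# Illustrative regression network with dropout after each hidden layer.
model = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=100):
    model.train()  # keep dropout layers sampling, even at prediction time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    # Mean and spread of the stochastic passes approximate the
    # predictive mean and uncertainty.
    return samples.mean(dim=0), samples.std(dim=0)

x = torch.linspace(-1, 1, 50).unsqueeze(1)
mean, std = mc_dropout_predict(model, x)
```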

Journal ArticleDOI
TL;DR: This work develops methods for applying existing analytical tools to perform uncertainty and sensitivity analyses on a variety of mathematical and computer models. It provides a complete methodology for performing these analyses, in both deterministic and stochastic settings, and proposes novel techniques for handling problems encountered during such analyses.

2,014 citations
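The paper's full methodology is not reproduced here, but a common global sensitivity-analysis pattern in this literature combines Latin hypercube sampling of the parameter space with rank-correlation measures of each parameter's influence on the output. A sketch under those assumptions, using an invented three-parameter model:

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Illustrative model: output depends strongly on k1, weakly on k2, not on k3.
def model(k1, k2, k3):
    return k1**2 + 0.1 * k2 + 0.0 * k3

# Latin hypercube sample of the 3-dimensional parameter space on [0, 1]^3.
sampler = qmc.LatinHypercube(d=3, seed=0)
params = sampler.random(n=1000)
output = model(params[:, 0], params[:, 1], params[:, 2])

# Rank each parameter by the strength of its monotonic association with
# the output (Spearman rank correlation as a simple sensitivity index).
for i, name in enumerate(["k1", "k2", "k3"]):
    rho, _ = spearmanr(params[:, i], output)
    print(f"{name}: rho = {rho:+.2f}")
```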

Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of accounting for model uncertainty in linear regression models and propose two alternative approaches: the Occam's window approach and the Markov chain Monte Carlo approach.
Abstract: We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the underestimation of uncertainty when making inferences about quantities of interest. A Bayesian solution to this problem involves averaging over all possible models (i.e., combinations of predictors) when making inferences about quantities of interest. This approach is often not practical. In this article we offer two alternative approaches. First, we describe an ad hoc procedure, “Occam's window,” which indicates a small set of models over which a model average can be computed. Second, we describe a Markov chain Monte Carlo approach that directly approximates the exact solution. In the presence of model uncertainty, both of these model averaging procedures provide better predictive performance than any single model that might reasonably have been selected. In the extreme case where there are many candidate predictors but ...

1,804 citations
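For a small number of candidate predictors, the average over all possible models that the abstract calls impractical in general can be computed directly. The sketch below enumerates every subset of predictors, weights each model by exp(-BIC/2) as a standard approximation to its posterior probability, and reports posterior inclusion probabilities. The data are synthetic, and BIC weighting is a simplification of the paper's exact Bayesian treatment.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 4
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)  # only x1, x2 matter

def fit_bic(X_sub, y):
    """OLS fit; return the BIC of the Gaussian linear model."""
    A = np.column_stack([np.ones(len(y)), X_sub])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = resid @ resid / len(y)
    k = A.shape[1] + 1  # coefficients plus noise variance
    return len(y) * np.log(sigma2) + k * np.log(len(y))

# Enumerate every subset of predictors and weight each model by
# exp(-BIC/2), normalized to approximate posterior model probabilities.
models, bics = [], []
for r in range(p + 1):
    for subset in itertools.combinations(range(p), r):
        models.append(subset)
        bics.append(fit_bic(X[:, list(subset)], y))

bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# Posterior inclusion probability of each predictor under model averaging.
for j in range(p):
    prob = sum(w for m, w in zip(models, weights) if j in m)
    print(f"x{j + 1}: P(included) = {prob:.2f}")
```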


Network Information
Related Topics (5)
Topic                            Papers   Citations   Related
Sampling (statistics)            65.3K    1.2M        78%
Monte Carlo method               95.9K    2.1M        76%
Probabilistic logic              56K      1.3M        76%
Robustness (computer science)    94.7K    1.6M        74%
Regression analysis              31K      1.7M        73%
Performance Metrics
No. of papers in the topic in previous years
Year   Papers
2023   28
2022   42
2021   2
2020   1
2019   5
2018   31