Journal ArticleDOI

A Methodology For Performing Global Uncertainty And Sensitivity Analysis In Systems Biology

07 Sep 2008 - Journal of Theoretical Biology (NIH Public Access) - Vol. 254, Iss. 1, pp. 178-196
TL;DR: This work develops methods for applying existing analytical tools to uncertainty and sensitivity analysis of a variety of mathematical and computer models, provides a complete methodology for performing these analyses in both deterministic and stochastic settings, and proposes novel techniques for handling problems encountered during such analyses.
About: This article is published in the Journal of Theoretical Biology. The article was published on 2008-09-07 and is currently open access. It has received 2014 citations to date. The article focuses on the topics: Uncertainty analysis & Sensitivity analysis.
Citations
Journal ArticleDOI
TL;DR: The model indicates that decreasing the incarceration rate in people who inject drugs and providing opioid agonist therapy could reduce the burden of HIV in this population of prisoners.

455 citations

Journal ArticleDOI
TL;DR: This work developed and experimentally validated a comprehensive mathematical model for T. reesei/E. coli consortia, providing insights into key determinants of the system's performance.
Abstract: Synergistic microbial communities are ubiquitous in nature and exhibit appealing features, such as sophisticated metabolic capabilities and robustness. This has inspired fast-growing interest in engineering synthetic microbial consortia for biotechnology development. However, there are relatively few reports of their use in real-world applications, and achieving population stability and regulation has proven to be challenging. In this work, we bridge ecology theory with engineering principles to develop robust synthetic fungal-bacterial consortia for efficient biosynthesis of valuable products from lignocellulosic feedstocks. The required biological functions are divided between two specialists: the fungus Trichoderma reesei, which secretes cellulase enzymes to hydrolyze lignocellulosic biomass into soluble saccharides, and the bacterium Escherichia coli, which metabolizes soluble saccharides into desired products. We developed and experimentally validated a comprehensive mathematical model for T. reesei/E. coli consortia, providing insights on key determinants of the system’s performance. To illustrate the bioprocessing potential of this consortium, we demonstrate direct conversion of microcrystalline cellulose and pretreated corn stover to isobutanol. Without costly nutrient supplementation, we achieved titers up to 1.88 g/L and yields up to 62% of theoretical maximum. In addition, we show that cooperator–cheater dynamics within T. reesei/E. coli consortia lead to stable population equilibria and provide a mechanism for tuning composition. Although we offer isobutanol production as a proof-of-concept application, our modular system could be readily adapted for production of many other valuable biochemicals.

374 citations


Cites methods from "A Methodology For Performing Global..."

  • ...The ODEs were numerically integrated with 1,000 sets of parameter values and initial conditions (ICs) sampled from appropriate statistical distributions (SI Appendix, Table S1), with Latin hypercube (20) selection....


  • ...For each parameter or IC, partial rank correlation coefficients (PRCCs) (20) were calculated with a set of output metrics....

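The two excerpts above summarize the paper's core recipe: draw parameter sets by Latin hypercube sampling (LHS), run the model once per set, and measure monotone input-output association with partial rank correlation coefficients (PRCCs). A minimal sketch of that workflow, assuming SciPy's stats.qmc module; the toy model, parameter names and ranges are invented for the example, not taken from the paper.

import numpy as np
from scipy.stats import qmc, rankdata

# 1. Latin hypercube sample: n parameter sets over k parameters.
k, n = 3, 1000
sampler = qmc.LatinHypercube(d=k, seed=0)
unit = sampler.random(n)                      # n x k points in [0, 1)
lo = np.array([0.1, 0.01, 0.5])               # illustrative lower bounds
hi = np.array([1.0, 0.10, 2.0])               # illustrative upper bounds
params = qmc.scale(unit, lo, hi)

# 2. One model run per sampled parameter set (toy model here).
def model(p):
    return p[0] * np.exp(-p[1]) + 0.1 * p[2] ** 2

y = np.apply_along_axis(model, 1, params)

# 3. PRCC: rank-transform inputs and output, regress out the other
#    inputs, then correlate the residuals for each parameter.
def prcc(X, y):
    R = np.column_stack([rankdata(c) for c in X.T])
    ry = rankdata(y)
    out = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Z = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
        res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]
    return out

print(prcc(params, y))                        # one PRCC per parameter, in [-1, 1]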

Journal ArticleDOI
24 Feb 2020 - eLife
TL;DR: It is found that most cases missed by screening are fundamentally undetectable, because they have not yet developed symptoms and are unaware they were exposed, which underscores the need for measures to limit transmission by individuals who become ill after being missed by a screening program.
Abstract: Traveller screening is being used to limit further spread of COVID-19 following its recent emergence, and symptom screening has become a ubiquitous tool in the global response. Previously, we developed a mathematical model to understand factors governing the effectiveness of traveller screening to prevent spread of emerging pathogens (Gostic et al., 2015). Here, we estimate the impact of different screening programs given current knowledge of key COVID-19 life history and epidemiological parameters. Even under best-case assumptions, we estimate that screening will miss more than half of infected people. Breaking down the factors leading to screening successes and failures, we find that most cases missed by screening are fundamentally undetectable, because they have not yet developed symptoms and are unaware they were exposed. Our work underscores the need for measures to limit transmission by individuals who become ill after being missed by a screening program. These findings can support evidence-based policy to combat the spread of COVID-19, and prospective planning to mitigate future emerging pathogens.

323 citations


Cites methods from "A Methodology For Performing Global..."

  • ...In the context of a growing epidemic, sensitivity analysis using the method of Latin hypercube sampling and partial rank correlation (Marino et al., 2008) showed that the fraction of travellers detected was moderately sensitive to all parameters considered – most coefficient estimates fell between…...


  • ...Finally, we analysed the sensitivity of screening effectiveness (fraction of travellers detected) to each parameter, as measured by the partial rank correlation coefficient (PRCC) (Marino et al., 2008)....


Journal ArticleDOI
TL;DR: The model simulations demonstrate that elimination of the ongoing SARS-CoV-2 pandemic is possible by combining restrictive social distancing with contact tracing, and the results reveal that reducing the contact rate between uninfected and infected individuals, by quarantining susceptible individuals, can effectively reduce the basic reproduction number.
Abstract: In India, 100,340 confirmed cases and 3155 confirmed deaths due to COVID-19 were reported as of May 18, 2020. In the absence of a specific vaccine or therapy, non-pharmacological interventions, including social distancing and contact tracing, are essential to ending the worldwide COVID-19 pandemic. We propose a mathematical model that predicts the dynamics of COVID-19 in 17 provinces of India and in India overall. A complete scenario is given to demonstrate the estimated pandemic life cycle along with the real data or history to date, which in turn divulges the predicted inflection point and ending phase of SARS-CoV-2. The proposed model monitors the dynamics of six compartments, namely susceptible (S), asymptomatic (A), recovered (R), infected (I), isolated infected (Iq) and quarantined susceptible (Sq), collectively expressed as SARIIqSq. A sensitivity analysis is conducted to determine the robustness of model predictions to parameter values, and the sensitive parameters are estimated from the real data on the COVID-19 pandemic in India. Our results reveal that reducing the contact rate between uninfected and infected individuals, by quarantining susceptible individuals, can effectively reduce the basic reproduction number. Our model simulations demonstrate that elimination of the ongoing SARS-CoV-2 pandemic is possible by combining restrictive social distancing with contact tracing. Our predictions are based on real data with reasonable assumptions, whereas the actual course of the epidemic depends heavily on how and when quarantine, isolation and precautionary measures are enforced.
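The page does not reproduce the SARIIqSq equations, so the sketch below is only a generic illustration of how a six-compartment ODE model of this shape is typically coded and integrated; the flow structure and all rates (beta, sigma, gamma, q) are invented for the example and are not the authors' model.

import numpy as np
from scipy.integrate import solve_ivp

# Made-up rates: transmission, progression, recovery, quarantine/isolation.
beta, sigma, gamma, q = 0.4, 0.2, 0.1, 0.05

def rhs(t, x):
    S, A, I, Iq, Sq, R = x
    N = x.sum()
    new_inf = beta * S * (A + I) / N          # illustrative force of infection
    return [-new_inf - q * S,                 # S: susceptible
            new_inf - sigma * A,              # A: asymptomatic
            sigma * A - (gamma + q) * I,      # I: symptomatic infected
            q * I - gamma * Iq,               # Iq: isolated infected
            q * S,                            # Sq: quarantined susceptible
            gamma * (I + Iq)]                 # R: recovered

sol = solve_ivp(rhs, (0, 180), [0.99, 0.0, 0.01, 0.0, 0.0, 0.0])
print(sol.y[:, -1])                           # compartment fractions at day 180

A sensitivity analysis like the one described in the abstract would then wrap this integration in a sampling loop over (beta, sigma, gamma, q), as in the LHS/PRCC sketch earlier on this page.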

308 citations


Cites background or methods from "A Methodology For Performing Global..."

  • ...[43], we conducted Latin hypercube sampling and generated 3200 samples to identify which parameters consistently affect each model individual in time....


  • ...[43] Marino S, Hogue IB, Ray CJ, Kirschner DE....


Journal ArticleDOI
Zhike Zi
TL;DR: The author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
Abstract: With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust the biological responses are with respect to changes in biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
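To make the local-versus-global distinction concrete, a local analysis perturbs one parameter at a time around a nominal point and is often reported as the normalized coefficient S_j = (p_j / y) * dy/dp_j. A minimal sketch using central finite differences (the model and nominal values below are invented for illustration):

import numpy as np

def local_sensitivity(model, p, h=1e-6):
    # Normalized local sensitivity S_j = (p_j / y) * dy/dp_j,
    # approximated by central finite differences around the nominal point p.
    y0 = model(p)
    S = np.empty(len(p))
    for j in range(len(p)):
        dp = h * max(abs(p[j]), 1.0)
        up, dn = p.copy(), p.copy()
        up[j] += dp
        dn[j] -= dp
        S[j] = (p[j] / y0) * (model(up) - model(dn)) / (2 * dp)
    return S

# Toy model y = p0 * exp(-p1): the exact coefficients are 1.0 and -p1.
print(local_sensitivity(lambda p: p[0] * np.exp(-p[1]), np.array([2.0, 0.5])))

Global approaches such as LHS/PRCC (sketched earlier on this page) instead vary all parameters simultaneously over their full ranges.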

304 citations

References
Book
14 Sep 1984
TL;DR: This book covers the multivariate normal distribution, estimation of the mean vector and the covariance matrix, the generalized T2-statistic, classification of observations, and tests of independence of sets of variates, among other topics in multivariate analysis.
Abstract: Preface to the Third Edition. Preface to the Second Edition. Preface to the First Edition. 1. Introduction. 2. The Multivariate Normal Distribution. 3. Estimation of the Mean Vector and the Covariance Matrix. 4. The Distributions and Uses of Sample Correlation Coefficients. 5. The Generalized T2-Statistic. 6. Classification of Observations. 7. The Distribution of the Sample Covariance Matrix and the Sample Generalized Variance. 8. Testing the General Linear Hypothesis: Multivariate Analysis of Variance. 9. Testing Independence of Sets of Variates. 10. Testing Hypotheses of Equality of Covariance Matrices and Equality of Mean Vectors and Covariance Matrices. 11. Principal Components. 12. Canonical Correlations and Canonical Variables. 13. The Distributions of Characteristic Roots and Vectors. 14. Factor Analysis. 15. Patterns of Dependence; Graphical Models. Appendix A: Matrix Theory. Appendix B: Tables. References. Index.

9,693 citations

Journal ArticleDOI
TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies and are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
Abstract: Two types of sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies. These plans are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.

8,328 citations


"A Methodology For Performing Global..." refers background or methods in this paper

  • ...UA, such as random sampling, importance sampling, or LHS (Helton and Davis, 2003; Mckay et al., 1979)....


  • ...LHS belongs to the MC class of sampling methods, and was introduced by Mckay et al. (1979)....


  • ...LHS allows an unbiased estimate of the average model output, with the advantage that it requires fewer samples than simple random sampling to achieve the same accuracy (Mckay et al., 1979)....

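The variance claim quoted above is easy to check empirically. A toy sketch, assuming SciPy's stats.qmc module: estimate E[g(U)] for g(u) = u^2 (true value 1/3) from n = 50 points, comparing simple random sampling against LHS over many replicates; both estimators are unbiased, but the LHS estimates scatter far less.

import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(1)
n, reps = 50, 2000
g = lambda u: u ** 2                      # true mean is 1/3

# Repeat both estimators `reps` times and compare their variances.
srs = np.array([g(rng.random(n)).mean() for _ in range(reps)])
lhs = np.array([g(qmc.LatinHypercube(d=1, seed=rng).random(n).ravel()).mean()
                for _ in range(reps)])

print(srs.var(), lhs.var())               # LHS variance is markedly smaller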

Journal Article
TL;DR: The proposed experimental plans are composed of individually randomized one-factor-at-a-time designs, and data analysis is based on the resulting random sample of observed elementary effects, those changes in an output due solely to changes in a particular input.
Abstract: A computational model is a representation of some physical or other system of interest, first expressed mathematically and then implemented in the form of a computer program; it may be viewed as a function of inputs that, when evaluated, produces outputs. Motivation for this article comes from computational models that are deterministic, complicated enough to make classical mathematical analysis impractical and that have a moderate-to-large number of inputs. The problem of designing computational experiments to determine which inputs have important effects on an output is considered. The proposed experimental plans are composed of individually randomized one-factor-at-a-time designs, and data analysis is based on the resulting random sample of observed elementary effects, those changes in an output due solely to changes in a particular input. Advantages of this approach include a lack of reliance on assumptions of relative sparsity of important inputs, monotonicity of outputs with respect to inputs, or ad...

3,396 citations


"A Methodology For Performing Global..." refers methods in this paper

  • ...Within the class of screening methods, Morris (1991) (or elementary effects) is the most popular: it is global, computationally efficient, and should be implemented as a first preliminary US analysis when the execution time of the model is prohibitive (several hours or days)....


  • ...Screening methods, such as those of Morris (1991), are global and computationally compatible: they represent adequate available tools to efficiently address the problem, if the model is very large and the execution time is prohibitive (several hours or days), as it is usually the case for ABMs (see…...

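Both excerpts refer to Morris's elementary-effects screening. A compact sketch of the randomized one-factor-at-a-time trajectories it builds on, in a simplified variant that only steps +delta from admissible grid levels (the test function and settings are illustrative):

import numpy as np

def morris_screening(f, k, r=10, p=4, rng=None):
    # r randomized one-factor-at-a-time trajectories on a p-level grid
    # in the unit hypercube, each factor stepped once per trajectory.
    rng = np.random.default_rng(rng)
    delta = p / (2 * (p - 1))                  # standard Morris step size
    base = np.arange(p // 2) / (p - 1)         # levels with x + delta <= 1
    ee = np.empty((r, k))
    for t in range(r):
        x = rng.choice(base, size=k)           # random base point
        y0 = f(x)
        for i in rng.permutation(k):           # random factor order
            x = x.copy()
            x[i] += delta
            y1 = f(x)
            ee[t, i] = (y1 - y0) / delta       # elementary effect of factor i
            y0 = y1
    mu_star = np.abs(ee).mean(axis=0)          # overall influence
    sigma = ee.std(axis=0, ddof=1)             # nonlinearity / interactions
    return mu_star, sigma

# x0 enters linearly, x1 nonlinearly, x2 not at all.
mu_star, sigma = morris_screening(lambda x: x[0] + 2 * x[1] ** 2, k=3, r=20)
print(mu_star, sigma)

The appeal for expensive models, as the excerpts note, is cost: r * (k + 1) model runs suffice to rank all k inputs.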

Journal ArticleDOI

2,657 citations


"A Methodology For Performing Global..." refers background or methods in this paper

  • ...LHS belongs to the MC class of sampling methods, and was introduced by Mckay et al. (1979). LHS allows an unbiased estimate of the average model output, with the advantage that it requires fewer samples than simple random sampling to achieve the same accuracy (Mckay et al....


  • ...interactions and was developed independently by Lotka (1925) and Volterra (1926):...

