Journal ArticleDOI

Parameter uncertainty in biochemical models described by ordinary differential equations.

TL;DR: This review provides an introduction to some of the available techniques and gives an overview of state-of-the-art methods for parameter uncertainty analysis.
Abstract: Improved mechanistic understanding of biochemical networks is one of the driving ambitions of Systems Biology. Computational modeling allows the integration of various sources of experimental data in order to put this conceptual understanding to the test in a quantitative manner. The aim of computational modeling is to obtain both predictive and explanatory models for complex phenomena, thereby providing useful approximations of reality with varying levels of detail. As the complexity required to describe different systems increases, so does the need for determining how well such predictions can be made. Despite efforts to make tools for uncertainty analysis available to the field, these methods have not yet found widespread use in Systems Biology. Additionally, the suitability of the different methods strongly depends on the problem and system under investigation. This review provides an introduction to some of the available techniques and gives an overview of state-of-the-art methods for parameter uncertainty analysis.
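
To make the notion of parameter uncertainty in an ODE model concrete, the sketch below fits a single-parameter decay model dx/dt = -k*x to synthetic data by least squares and uses a residual bootstrap to put a rough confidence interval on the estimate. The model, the noise level, and the choice of bootstrap are illustrative assumptions for this sketch, not methods prescribed by the review.

```python
# Minimal illustration (not from the review itself) of one common approach to
# parameter uncertainty in an ODE model: least-squares fit, then a residual
# bootstrap to obtain an approximate confidence interval for the parameter.
# The model dx/dt = -k*x and the "true" k = 0.5 are made up for this sketch.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 10.0, 20)
x0 = 1.0

def simulate(k):
    sol = solve_ivp(lambda t, x: -k * x, (0.0, 10.0), [x0], t_eval=t_obs)
    return sol.y[0]

# Synthetic noisy observations around the "true" parameter value.
data = simulate(0.5) + rng.normal(0.0, 0.02, size=t_obs.size)

def fit(y):
    res = least_squares(lambda k: simulate(k[0]) - y, x0=[1.0], bounds=(0.0, np.inf))
    return res.x[0]

k_hat = fit(data)
residuals = data - simulate(k_hat)

# Residual bootstrap: refit on resampled residuals added to the fitted curve.
boot = [fit(simulate(k_hat) + rng.choice(residuals, size=residuals.size, replace=True))
        for _ in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"k_hat = {k_hat:.3f}, approx. 95% CI = [{lo:.3f}, {hi:.3f}]")
```
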
Citations
Journal Article
TL;DR: The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.
Abstract: The paper proposes Metropolis adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlations. The methods provide fully automated adaptation mechanisms that circumvent the costly pilot runs that are required to tune proposal densities for Metropolis-Hastings or indeed Hamiltonian Monte Carlo and Metropolis adjusted Langevin algorithms. This allows for highly efficient sampling even in very high dimensions where different scalings may be required for the transient and stationary phases of the Markov chain. The methodology proposed exploits the Riemann geometry of the parameter space of statistical models and thus automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density. The performance of these Riemann manifold Monte Carlo methods is rigorously assessed by performing inference on logistic regression models, log-Gaussian Cox point processes, stochastic volatility models and Bayesian estimation of dynamic systems described by non-linear differential equations. Substantial improvements in the time-normalized effective sample size are reported when compared with alternative sampling approaches. MATLAB code that is available from http://www.ucl.ac.uk/statistics/research/rmhmc allows replication of all the results reported.
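
As a point of reference for the sampling mechanism discussed above, the following is a minimal sketch of the standard (non-manifold) Metropolis-adjusted Langevin algorithm on a correlated Gaussian target: a gradient-informed proposal followed by a Metropolis-Hastings correction. The Riemann manifold machinery of the paper (position-dependent metrics) is deliberately omitted; the target density and step size are illustrative choices.

```python
# A minimal sketch of standard MALA: Langevin proposal plus accept/reject step.
import numpy as np

rng = np.random.default_rng(1)
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
prec = np.linalg.inv(cov)

def log_target(x):
    return -0.5 * x @ prec @ x

def grad_log_target(x):
    return -prec @ x

def mala(n_samples, eps=0.3):
    x = np.zeros(2)
    samples, accepted = [], 0
    for _ in range(n_samples):
        # Langevin proposal: drift along the gradient plus Gaussian noise.
        mean_fwd = x + 0.5 * eps**2 * grad_log_target(x)
        prop = mean_fwd + eps * rng.standard_normal(2)
        mean_bwd = prop + 0.5 * eps**2 * grad_log_target(prop)
        # Metropolis-Hastings correction for the asymmetric proposal density.
        log_q_fwd = -np.sum((prop - mean_fwd) ** 2) / (2 * eps**2)
        log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * eps**2)
        log_alpha = log_target(prop) - log_target(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x, accepted = prop, accepted + 1
        samples.append(x)
    return np.array(samples), accepted / n_samples

samples, acc_rate = mala(5000)
print("acceptance rate:", acc_rate, "sample mean:", samples.mean(axis=0))
```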

1,031 citations

Journal ArticleDOI
01 Feb 2015
TL;DR: Global sensitivity analysis provides an innovative tool that can meet the challenge of reliably identifying and estimating respective model parameters in systems pharmacology models.
Abstract: A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015
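
The sketch below illustrates what variance-based global sensitivity analysis computes in practice: first-order Sobol indices estimated with a common Saltelli-style Monte Carlo scheme. The three-input toy function is a stand-in assumption; in a systems pharmacology setting the model output would be a scalar summary of the ODE simulation (for example an AUC or Cmax).

```python
# Minimal variance-based global sensitivity analysis (first-order Sobol indices).
import numpy as np

rng = np.random.default_rng(2)
d, N = 3, 20000

def model(x):
    # Illustrative nonlinear test function with three inputs on [0, 1].
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.1 * np.sin(2 * np.pi * x[:, 2])

A = rng.uniform(size=(N, d))
B = rng.uniform(size=(N, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

first_order = []
for i in range(d):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]          # replace column i of A with column i of B
    fAB = model(AB_i)
    first_order.append(np.mean(fB * (fAB - fA)) / var_y)

print("first-order Sobol indices:", np.round(first_order, 3))
```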

272 citations

Journal ArticleDOI
TL;DR: This contribution aims to compare and highlight the different perspectives and contributions from these fields, with emphasis on two key questions: why are reverse engineering problems so hard to solve, and what methods are available for the particular problems arising from systems biology.
Abstract: The interplay of mathematical modelling with experiments is one of the central elements in systems biology. The aim of reverse engineering is to infer, analyse and understand, through this interplay, the functional and regulatory mechanisms of biological systems. Reverse engineering is not exclusive of systems biology and has been studied in different areas, such as inverse problem theory, machine learning, nonlinear physics, (bio)chemical kinetics, control theory and optimization, among others. However, it seems that many of these areas have been relatively closed to outsiders. In this contribution, we aim to compare and highlight the different perspectives and contributions from these fields, with emphasis on two key questions: (i) why are reverse engineering problems so hard to solve, and (ii) what methods are available for the particular problems arising from systems biology?

238 citations

Journal ArticleDOI
TL;DR: This review summarizes the recent development of metabolic engineering approaches to modulate yeast metabolism with representative examples and highlights new tools for biosynthetic pathway optimization and genome engineering to advance metabolic engineering in yeast.

203 citations

Journal ArticleDOI
TL;DR: The new parallel cooperative method presented here allows the solution of medium- and large-scale parameter estimation problems in reasonable computation times and with small hardware requirements, and is believed to play a key role in the development of large-scale and even whole-cell dynamic models.
Abstract: The development of large-scale kinetic models is one of the current key issues in computational systems biology and bioinformatics. Here we consider the problem of parameter estimation in nonlinear dynamic models. Global optimization methods can be used to solve this type of problem, but the associated computational cost is very large. Moreover, many of these methods need the tuning of a number of adjustable search parameters, requiring a number of initial exploratory runs and therefore further increasing the computation times. Here we present a novel parallel method, self-adaptive cooperative enhanced scatter search (saCeSS), to accelerate the solution of this class of problems. The method is based on the scatter search optimization metaheuristic and incorporates several key new mechanisms: (i) asynchronous cooperation between parallel processes, (ii) coarse- and fine-grained parallelism, and (iii) self-tuning strategies. The performance and robustness of saCeSS is illustrated by solving a set of challenging parameter estimation problems, including medium- and large-scale kinetic models of the bacterium E. coli, baker's yeast S. cerevisiae, the vinegar fly D. melanogaster, Chinese hamster ovary cells, and a generic signal transduction network. The results consistently show that saCeSS is a robust and efficient method, allowing a very significant reduction of computation times with respect to several previous state-of-the-art methods (from days to minutes, in several cases) even when only a small number of processors is used. The new parallel cooperative method presented here allows the solution of medium- and large-scale parameter estimation problems in reasonable computation times and with small hardware requirements. Further, the method includes self-tuning mechanisms which facilitate its use by non-experts. We believe that this new method can play a key role in the development of large-scale and even whole-cell dynamic models.
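
To convey the core idea behind the scatter search metaheuristic that saCeSS builds on, here is a deliberately simplified, serial sketch: a small, diverse reference set of parameter vectors is repeatedly combined, locally improved, and updated. The asynchronous cooperation, coarse- and fine-grained parallelism, and self-tuning of the actual saCeSS method are not reproduced here, and the quadratic-plus-ripple objective is a made-up stand-in for an ODE-fitting cost function.

```python
# A highly simplified, serial sketch of the scatter search idea.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
dim, ref_size, n_iters = 5, 10, 30
lb, ub = -5.0 * np.ones(dim), 5.0 * np.ones(dim)
target = rng.uniform(lb, ub)

def cost(p):
    # Stand-in objective; in practice: sum of squared residuals of an ODE fit.
    return np.sum((p - target) ** 2) + 0.1 * np.sum(np.sin(5.0 * p) ** 2)

# Initial diverse reference set, sorted by cost.
ref_set = rng.uniform(lb, ub, size=(ref_size, dim))
ref_set = ref_set[np.argsort([cost(p) for p in ref_set])]

for _ in range(n_iters):
    candidates = []
    # Combine pairs of reference solutions along the line joining them.
    for i in range(ref_size):
        for j in range(i + 1, ref_size):
            w = rng.uniform(-0.5, 1.5)
            child = np.clip(ref_set[i] + w * (ref_set[j] - ref_set[i]), lb, ub)
            candidates.append(child)
    # Local improvement of the most promising candidate.
    best_child = min(candidates, key=cost)
    improved = minimize(cost, best_child, method="Nelder-Mead").x
    candidates.append(np.clip(improved, lb, ub))
    # Update the reference set with the best solutions found so far.
    pool = np.vstack([ref_set, np.array(candidates)])
    ref_set = pool[np.argsort([cost(p) for p in pool])][:ref_size]

print("best cost found:", cost(ref_set[0]))
```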

158 citations

References
Book
23 Nov 2005
TL;DR: The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Abstract: A comprehensive and self-contained introduction to Gaussian processes, which provide a principled, practical, probabilistic approach to learning in kernel machines. Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
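
For readers unfamiliar with the method, the following is a minimal numpy sketch of Gaussian process regression with a squared-exponential kernel, using the standard Cholesky-based formulation of the GP posterior; the fixed hyperparameters and the toy sine data are assumptions of this sketch rather than anything taken from the book.

```python
# Minimal Gaussian process regression: posterior mean and covariance.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    sq_dists = (np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise_var=0.01):
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                      # posterior mean at test inputs
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                      # posterior covariance
    return mean, cov

# Toy 1-D example: noisy observations of a sine function.
rng = np.random.default_rng(4)
X_train = rng.uniform(-3, 3, size=(8, 1))
y_train = np.sin(X_train).ravel() + 0.1 * rng.standard_normal(8)
X_test = np.linspace(-3, 3, 50)[:, None]
mean, cov = gp_posterior(X_train, y_train, X_test)
print("posterior mean at first test point:", mean[0], "+/-", np.sqrt(cov[0, 0]))
```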

11,357 citations

Book
01 Jan 1964
TL;DR: This book on ordinary differential equations covers existence and uniqueness theory, linear systems, dependence on initial conditions and parameters, the Poincaré-Bendixson theory, invariant manifolds, and the use of implicit function and fixed point theorems.
Abstract: Foreword to the Classics Edition Preface to the First Edition Preface to the Second Edition Errata I: Preliminaries II: Existence III: Differential Inequalities and Uniqueness IV: Linear Differential Equations V: Dependence on Initial Conditions and Parameters VI: Total and Partial Differential Equations VII: The Poincaré-Bendixson Theory VIII: Plane Stationary Points IX: Invariant Manifolds and Linearizations X: Perturbed Linear Systems XI: Linear Second Order Equations XII: Use of Implicit Function and Fixed Point Theorems XIII: Dichotomies for Solutions of Linear Equations XIV: Miscellany on Monotony Hints for Exercises References Index.

9,036 citations

Book
01 Jan 2004
TL;DR: This handbook is the fourth volume in a series of self-contained and up-to-date surveys in the theory of ordinary differential equations, written with an additional effort to achieve readability for mathematicians and scientists from related fields.
Abstract: This handbook is the fourth volume in a series of volumes devoted to self-contained and up-to-date surveys in the theory of ordinary differential equations, with an additional effort to achieve readability for mathematicians and scientists from other related fields so that the chapters have been made accessible to a wider audience. It covers a variety of problems in ordinary differential equations. It provides pure mathematical and real-world applications. It is written for mathematicians and scientists of many related fields.

7,749 citations

Journal ArticleDOI
TL;DR: This work states that rapid advances in network biology indicate that cellular networks are governed by universal laws and offer a new conceptual framework that could potentially revolutionize the view of biology and disease pathologies in the twenty-first century.
Abstract: A key aim of postgenomic biomedical research is to systematically catalogue all molecules and their interactions within a living cell. There is a clear need to understand how these molecules and the interactions between them determine the function of this enormously complex machinery, both in isolation and when surrounded by other cells. Rapid advances in network biology indicate that cellular networks are governed by universal laws and offer a new conceptual framework that could potentially revolutionize our view of biology and disease pathologies in the twenty-first century.

7,475 citations


"Parameter uncertainty in biochemica..." refers methods in this paper

  • ...Though methods for discovering interactions are well established [32–34], techniques for accurately determining biochemical parameters remain limited [12]....


Book
01 Jan 1987
TL;DR: This monograph surveys resampling methods, including the jackknife estimates of bias and variance, the bootstrap, cross-validation, the delta method and the influence function, balanced repeated replications (half-sampling), random subsampling, and nonparametric confidence intervals.
Abstract: The Jackknife Estimate of Bias; The Jackknife Estimate of Variance; Bias of the Jackknife Variance Estimate; The Bootstrap; The Infinitesimal Jackknife; The Delta Method and the Influence Function; Cross-Validation, Jackknife and Bootstrap; Balanced Repeated Replications (Half-Sampling); Random Subsampling; Nonparametric Confidence Intervals.
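
As a small numerical illustration of two of the resampling ideas catalogued here, the sketch below computes jackknife and bootstrap estimates of the standard error of a sample mean on made-up data; for the mean, the jackknife estimate coincides with the classical s/sqrt(n) formula, which makes the output easy to check.

```python
# Jackknife and bootstrap standard-error estimates for the sample mean.
import numpy as np

rng = np.random.default_rng(5)
data = rng.exponential(scale=2.0, size=50)   # illustrative sample
stat = np.mean
n = data.size

# Jackknife: recompute the statistic leaving out one observation at a time.
jack = np.array([stat(np.delete(data, i)) for i in range(n)])
jack_se = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))

# Bootstrap: recompute the statistic on resamples drawn with replacement.
boot = np.array([stat(rng.choice(data, size=n, replace=True))
                 for _ in range(2000)])
boot_se = boot.std(ddof=1)

print(f"jackknife SE = {jack_se:.3f}, bootstrap SE = {boot_se:.3f}, "
      f"classical SE = {data.std(ddof=1) / np.sqrt(n):.3f}")
```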

7,007 citations