
Showing papers presented at "Computational Methods in Systems Biology in 2014"


Book ChapterDOI
17 Nov 2014
TL;DR: Algorithms based on the computation of lower and upper bounds of the probability, in conjunction with refinement and sampling, which yield answers that are precise to within an arbitrarily small tolerance value are developed.
Abstract: We consider the problem of synthesising rate parameters for stochastic biochemical networks so that a given time-bounded CSL property is guaranteed to hold, or, in the case of quantitative properties, the probability of satisfying the property is maximised/minimised. We develop algorithms based on the computation of lower and upper bounds of the probability, in conjunction with refinement and sampling, which yield answers that are precise to within an arbitrarily small tolerance value. Our methods are efficient and improve on existing approximate techniques that employ discretisation and refinement. We evaluate the usefulness of the methods by synthesising rates for two biologically motivated case studies, including the reliability analysis of a DNA walker.
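The bound-and-refine loop can be pictured with a small sketch. The snippet below is only illustrative: min/max of Monte Carlo estimates over a few sampled rates stand in for the paper's rigorous lower and upper probability bounds, and a pure birth process with an invented threshold property replaces a real CSL query.

```python
import numpy as np

rng = np.random.default_rng(0)

def satisfies(rate, T=5.0, threshold=20, x0=5, runs=200):
    """Estimate P(population reaches `threshold` before time T) for a pure
    birth process with per-individual birth rate `rate` (a toy stand-in for a
    time-bounded CSL property)."""
    hits = 0
    for _ in range(runs):
        t, x = 0.0, x0
        while t < T and x < threshold:
            t += rng.exponential(1.0 / (rate * x))   # time to next birth event
            x += 1
        hits += (x >= threshold)
    return hits / runs

def synthesise(lo=0.05, hi=1.0, tol=0.1, p_min=0.9):
    """Return sub-intervals of [lo, hi] on which the estimated satisfaction
    probability is at least p_min, refining undecided intervals."""
    accepted, undecided = [], [(lo, hi)]
    while undecided:
        a, b = undecided.pop()
        estimates = [satisfies(r) for r in np.linspace(a, b, 5)]
        lower, upper = min(estimates), max(estimates)
        if lower >= p_min:
            accepted.append((a, b))        # property deemed to hold on the whole box
        elif upper < p_min or (b - a) < tol:
            continue                       # reject, or give up on a tiny box
        else:
            m = 0.5 * (a + b)
            undecided += [(a, m), (m, b)]  # refine the undecided box
    return accepted

print(synthesise())
```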

57 citations


Book ChapterDOI
17 Nov 2014
TL;DR: It is shown that bootstrapping, multi-start optimisation, and Fisher information matrix based approaches yield misleading results for parameters which are structurally non-identifiable.
Abstract: Dynamical systems are widely used to describe the behaviour of biological systems. When estimating parameters of dynamical systems, noise and limited availability of measurements can lead to uncertainties. These uncertainties have to be studied to understand the limitations and the predictive power of a model. Several methods for uncertainty analysis are available. In this paper we analysed and compared bootstrapping, profile likelihood, Fisher information matrix, and multi-start based approaches for uncertainty analysis. The analysis was carried out on two models which contain structurally non-identifiable parameters. We showed that bootstrapping, multi-start optimisation, and Fisher information matrix based approaches yield misleading results for parameters which are structurally non-identifiable. We provide a simple and intuitive explanation for this, using geometric arguments.
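The geometric argument can be made concrete with a two-parameter toy model (mine, not one of the two models analysed in the paper): when only the sum of two rate constants enters the model output, the sensitivity vectors are collinear, the Fisher information matrix is rank-deficient, and confidence intervals derived from it are meaningless, whereas a profile likelihood would be flat along the non-identifiable direction.

```python
import numpy as np

# Toy example: in y(t) = exp(-(k1 + k2) t) only the sum k1 + k2 is structurally
# identifiable, so the two parameter sensitivities coincide and the Fisher
# information matrix (FIM) is singular.
t = np.linspace(0, 2, 20)
k1, k2, sigma = 0.3, 0.7, 0.05

y = np.exp(-(k1 + k2) * t)
s1 = -t * y                      # dy/dk1
s2 = -t * y                      # dy/dk2 -- identical to s1
S = np.column_stack([s1, s2])
fim = S.T @ S / sigma**2

print("FIM rank:", np.linalg.matrix_rank(fim))      # 1 instead of 2
print("FIM eigenvalues:", np.linalg.eigvalsh(fim))  # one eigenvalue is ~0
# Inverting this (near-)singular FIM yields arbitrarily large variances, i.e.
# misleading confidence intervals, whereas a profile likelihood over k1 would
# be flat along the k1 + k2 = const direction and expose the non-identifiability.
```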

50 citations


Book ChapterDOI
17 Nov 2014
TL;DR: In the scope of qualitative models of interaction networks, the computation of attractors reachable from a given state of the network faces combinatorial issues due to the state space explosion.
Abstract: Attractors of network dynamics represent the long-term behaviours of the modelled system. Their characterization is therefore crucial for understanding the response and differentiation capabilities of a dynamical system. In the scope of qualitative models of interaction networks, the computation of attractors reachable from a given state of the network faces combinatorial issues due to the state space explosion.
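For intuition only, the sketch below computes the attractor reachable from a given state of a small Boolean network under synchronous updates; the three update rules are invented, and the combinatorial issue the paper tackles appears as soon as the number of components grows, since the state space has 2^n states.

```python
def update(state):
    """Synchronous update of an invented 3-gene Boolean network."""
    a, b, c = state
    return (b and not c,   # a' = b AND NOT c
            a,             # b' = a
            not a)         # c' = NOT a

def reachable_attractor(state):
    """Iterate the update map from `state` until a state repeats; the cycle
    from the first repeated state onward is the reachable attractor
    (a fixed point or a limit cycle)."""
    seen, trajectory = {}, []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = update(state)
    return trajectory[seen[state]:]

print(reachable_attractor((True, False, False)))
```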

32 citations


Book ChapterDOI
17 Nov 2014
TL;DR: This paper presents a parameter synthesis framework based on δ-complete decision procedures that sidesteps undecidability, and demonstrates the method on two highly nonlinear hybrid models of the cardiac cell action potential.
Abstract: A central problem in systems biology is to identify parameter values such that a biological model satisfies some behavioral constraints (e.g., time series). In this paper we focus on parameter synthesis for hybrid (continuous/discrete) models, as many biological systems can possess multiple operational modes with specific continuous dynamics in each mode. These biological systems are naturally modeled as hybrid automata, most often with nonlinear continuous dynamics. However, hybrid automata are notoriously hard to analyze — even simple reachability for hybrid systems with linear differential dynamics is an undecidable problem. In this paper we present a parameter synthesis framework based on δ-complete decision procedures that sidesteps undecidability. We demonstrate our method on two highly nonlinear hybrid models of the cardiac cell action potential. The results show that our parameter synthesis framework is convenient and efficient, and it enabled us to select a suitable model to study and identify crucial parameter ranges related to cardiac disorders.
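Purely to convey the flavour of the problem (the paper relies on δ-complete decision procedures via dedicated solvers, not brute-force simulation; the two-mode model, guards and behavioural constraint below are invented), the sketch simulates a toy "charge/discharge" hybrid system and scans a parameter grid for values meeting the constraint.

```python
import numpy as np

def simulate(k, T=40.0, dt=0.01):
    """Two-mode hybrid toy: mode 0 charges v towards 1 at rate k, mode 1
    discharges it; guard crossings at v > 0.9 and v < 0.1 switch modes."""
    v, mode, trace = 0.0, 0, []
    for _ in range(int(T / dt)):
        dv = k * (1.0 - v) if mode == 0 else -2.0 * v
        v += dt * dv
        if mode == 0 and v > 0.9:
            mode = 1
        elif mode == 1 and v < 0.1:
            mode = 0
        trace.append(v)
    return np.array(trace)

def meets_constraints(k, min_upstrokes=3):
    """Behavioural constraint standing in for time-series data: at least
    `min_upstrokes` full excitations within the time window."""
    tr = simulate(k)
    upstrokes = np.sum((tr[1:] > 0.9) & (tr[:-1] <= 0.9))
    return upstrokes >= min_upstrokes

feasible = [k for k in np.linspace(0.05, 2.0, 40) if meets_constraints(k)]
if feasible:
    print(f"constraint met for k roughly in [{feasible[0]:.2f}, {feasible[-1]:.2f}]")
else:
    print("no feasible parameter value found on the grid")
```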

18 citations


Book ChapterDOI
17 Nov 2014
TL;DR: A model of the Base Excision Repair pathway is designed in Kappa, a rule-based formalism for modeling protein-protein and protein-DNA interactions, to shed light on the key role of the scaffolding protein XRCC1 in coordinating the repair process.
Abstract: There are ongoing debates in the DNA repair community on whether the coordination of DNA repair is achieved by means of direct protein-protein interactions or whether substrate specificity is sufficient to explain how DNA intermediates are channeled from one repair enzyme to the other. In order to address these questions we designed a model of the Base Excision Repair pathway in Kappa, a rule-based formalism for modeling protein-protein and protein-DNA interactions. We use this model to shed light on the key role of the scaffolding protein XRCC1 in coordinating the repair process.

13 citations


Book ChapterDOI
17 Nov 2014
TL;DR: The analysis showed that the novel method can yield an expected L2 approximation error in marginals that is several orders of magnitude lower compared to classical approximations, which facilitates the analysis of uncertainty for problems with high computational complexity.
Abstract: Dynamical models are widely used in systems biology to describe biological processes ranging from single cell transcription of genes to the tissue scale formation of gradients for cell guidance. One of the key issues for this class of models is the estimation of kinetic parameters from given measurement data, the so-called parameter estimation. Measurement noise and the limited amount of data give rise to uncertainty in estimates, which can be captured in a probability density over the parameter space. Unfortunately, studying this probability density, using e.g. Markov chain Monte Carlo, is often computationally demanding as it requires the repeated simulation of the underlying model. In the case of highly complex models, such as PDE models, this can render the study intractable. In this paper, we present novel methods for the analysis of such probability densities using networks of radial basis functions. We employed lattice generation algorithms, adaptive interacting particle sampling schemes as well as classical sampling schemes for the generation of approximation nodes, coupled to the respective weighting scheme, and compared their efficiency on different application examples. Our analysis showed that the novel method can yield an expected L2 approximation error in marginals that is several orders of magnitude lower compared to classical approximations. This allows for a drastic reduction of the number of model evaluations and facilitates the analysis of uncertainty for problems with high computational complexity. Finally, we successfully applied our method to a complex partial differential equation model for guided cell migration of dendritic cells.
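A bare-bones version of the surrogate idea is sketched below: a lattice of Gaussian radial basis functions is fitted to a (here, analytically known) two-dimensional density, and a marginal is then read off by integrating the cheap surrogate instead of re-simulating a model. The adaptive node placement, particle schemes and weighting schemes from the paper are not reproduced; all numbers are invented.

```python
import numpy as np

def target_density(x, y, rho=0.6):
    """Correlated bivariate normal standing in for a posterior over two parameters."""
    z = (x**2 + y**2 - 2 * rho * x * y) / (1 - rho**2)
    return np.exp(-0.5 * z) / (2 * np.pi * np.sqrt(1 - rho**2))

# Lattice of RBF centres and a fixed kernel width.
centres = np.array([(cx, cy) for cx in np.linspace(-3, 3, 9)
                              for cy in np.linspace(-3, 3, 9)])
eps = 1.0

def phi(points):
    """Gaussian RBF design matrix: one column per centre."""
    d2 = ((points[:, None, :] - centres[None, :, :])**2).sum(-1)
    return np.exp(-eps * d2)

# Fit the RBF weights by least squares on a coarse grid of density evaluations
# (in the real setting each evaluation would be an expensive model run).
gx, gy = np.meshgrid(np.linspace(-3, 3, 25), np.linspace(-3, 3, 25))
pts = np.column_stack([gx.ravel(), gy.ravel()])
w, *_ = np.linalg.lstsq(phi(pts), target_density(gx, gy).ravel(), rcond=None)

# Marginal density of the first parameter: integrate the surrogate over y.
ys = np.linspace(-4, 4, 201)
for x in [-1.0, 0.0, 1.0]:
    line = np.column_stack([np.full_like(ys, x), ys])
    marginal = np.trapz(phi(line) @ w, ys)
    exact = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)   # true marginal is N(0, 1)
    print(f"p(x={x:+.1f}) ~ {marginal:.4f}  (exact {exact:.4f})")
```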

8 citations


Book ChapterDOI
17 Nov 2014
TL;DR: The paper presents how the framework has been applied to model the intracellular signalling network controlling integrin activation mediating leukocyte recruitment from the blood into the tissues, by handling the solution space complexity through different levels of simulation accuracy.
Abstract: Model development and analysis of biological systems is recognized as a key requirement for integrating in-vitro and in-vivo experimental data. In-silico simulations of a biochemical model allow one to test different experimental conditions, helping in the discovery of the dynamics that regulate the system. Several characteristics and issues of biological system modeling are shared with electronic system modeling, such as concurrency, reactivity, abstraction levels, as well as state space explosion during verification. This paper proposes a modeling and simulation framework for discrete event-based execution of biochemical systems based on SystemC. SystemC is the reference language in the electronic design automation (EDA) field for modeling and verifying complex systems at different abstraction levels. SystemC-based verification is the de facto alternative to model checking when such a formal verification technique cannot deal with the state space complexity of the model. The paper presents how the framework has been applied to model the intracellular signalling network controlling integrin activation mediating leukocyte recruitment from the blood into the tissues, by handling the solution space complexity through different levels of simulation accuracy.
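SystemC itself is a C++ library; as a language-neutral illustration of the discrete event-based execution style the framework builds on, the few lines below run a minimal event queue for an invented two-step signalling chain. The event names, delays and chaining are assumptions chosen only to show how a coarser abstraction level maps onto a single delayed event rather than detailed kinetics.

```python
import heapq

# Each queued event is (time, description, follow_up_event).
events = [(0.0, "ligand binds receptor", "receptor activates kinase")]

delays = {"receptor activates kinase": 1.5,       # the abstraction level decides
          "kinase activates integrin": 2.0}       # how coarse these delays are
follow_ups = {"receptor activates kinase": "kinase activates integrin",
              "kinase activates integrin": None}

while events:
    t, what, nxt = heapq.heappop(events)          # fire the next scheduled event
    print(f"t = {t:4.1f}  {what}")
    if nxt is not None:
        heapq.heappush(events, (t + delays[nxt], nxt, follow_ups[nxt]))
```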

8 citations


Book ChapterDOI
17 Nov 2014
TL;DR: The results for several models emphasize the fact that the selection of timing implementation can have both qualitative and quantitative effects on the model’s transient behavior and its steady state.
Abstract: We describe our approach to modeling timing of cell signaling systems in which existing information about the system spans from detailed mechanistic knowledge to much coarser observations about cause and effect. The results for several models emphasize the fact that the selection of timing implementation can have both qualitative and quantitative effects on the model’s transient behavior and its steady state.

7 citations


Book ChapterDOI
17 Nov 2014
TL;DR: A Bio-PEPA model is presented which builds on previous modelling work in this field to predict: the surviving fraction of cells in response to radiation, the relative proportion of cell death caused by bystander signalling, the risk of non-lethal damage and the probability of observing bystander signalling for a given dose.
Abstract: Radiation induced bystander effects are secondary effects caused by the production of chemical signals by cells in response to radiation. We present a Bio-PEPA model which builds on previous modelling work in this field to predict: the surviving fraction of cells in response to radiation, the relative proportion of cell death caused by bystander signalling, the risk of non-lethal damage and the probability of observing bystander signalling for a given dose. This work provides the foundation for modelling bystander effects caused by biologically realistic dose distributions, with implications for cancer therapies.

7 citations


Book ChapterDOI
17 Nov 2014
TL;DR: This work introduces an approach for coupling models taking uncertainties concerning the crosstalk into account, and a pool of possible integrated models is generated in agreement with previously validated behavior of the isolated models as well as additional experimental observations.
Abstract: Methods for model integration have become increasingly popular for understanding the interplay between biological processes. In this work, we introduce an approach for coupling models that takes uncertainties concerning the crosstalk into account. Using constraint-based modeling and formal verification techniques, a pool of possible integrated models is generated in agreement with previously validated behavior of the isolated models as well as additional experimental observations. Correlation- and causality-based analysis allows us to uncover the importance of particular crosstalk connections for specific functionalities, leading to new biological insights and starting points for experimental design. We illustrate our approach by studying crosstalk between the MAPK and mTor signaling pathways.

7 citations


Book ChapterDOI
17 Nov 2014
TL;DR: This work focuses on simultaneously dealing with ill-conditioning by making use of proper regularization methods and presents a critical comparison of several methods, and guidelines for properly tuning them.
Abstract: Kinetic models are being increasingly used as a systematic framework to understand function in biological systems. Calibration of these nonlinear dynamic models remains challenging due to the nonconvexity and ill-conditioning of the associated inverse problems. Nonconvexity can be dealt with using suitable global optimization methods. Here, we focus on simultaneously dealing with ill-conditioning by making use of proper regularization methods. Regularized calibrations ensure the best trade-offs between bias and variance, thus reducing over-fitting. We present a critical comparison of several methods, and guidelines for properly tuning them. The performance of this procedure and its advantages are illustrated with a well-known benchmark problem considering several scenarios of data availability and measurement noise.
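The regularisation mechanics can be sketched generically (this is not the benchmark or the tuning rules studied in the paper; the toy model, penalty and hold-out rule are mine): an exponential-decay model is calibrated by least squares with a Tikhonov-style penalty towards a reference parameter vector, and the regularisation weight is picked by a crude hold-out score.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = np.linspace(0, 5, 15)
theta_true = np.array([1.0, 0.8])

def model(theta):
    return theta[0] * np.exp(-theta[1] * t)

data = model(theta_true) + 0.1 * rng.standard_normal(t.size)

train = np.arange(t.size) % 3 != 0         # two thirds of the points for fitting
test = ~train                              # the rest for scoring lambda
theta_ref = np.array([0.5, 0.5])           # reference values used by the penalty

def objective(theta, lam):
    res = model(theta)[train] - data[train]
    return np.sum(res**2) + lam * np.sum((theta - theta_ref)**2)   # Tikhonov-style penalty

def calibrate(lam):
    return minimize(objective, x0=[0.2, 0.2], args=(lam,), method="Nelder-Mead").x

scores = {lam: np.sum((model(calibrate(lam))[test] - data[test])**2)
          for lam in [0.0, 1e-3, 1e-2, 1e-1, 1.0]}
best_lam = min(scores, key=scores.get)
print("selected lambda:", best_lam, "calibrated parameters:", calibrate(best_lam))
```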

Book ChapterDOI
17 Nov 2014
TL;DR: A new framework for EFM analysis under steady-state conditions is illustrated, relying on a state-of-the-art SAT solver working on a propositional encoding of EFMs, enriched with a simple SMT-like solver that ensures EFM consistency with stoichiometric constraints.
Abstract: Elementary flux modes (EFMs) are commonly accepted tools for metabolic network analysis under steady state conditions. They can be defined as the smallest sub-networks enabling the metabolic system to operate in steady state with all irreversible reactions proceeding in the appropriate direction. However, when networks are complex, the number of EFMs quickly leads to a combinatorial explosion, preventing one from drawing even simple conclusions from their analysis. Since the concept of EFM analysis was introduced in 1994, there has been an important and ongoing effort to develop more efficient algorithms. However, these methods share a common bottleneck: they enumerate all the EFMs, which makes the computation impossible when the metabolic network is large, and only a few works try to search only for EFMs with specific properties. As we show in this paper, enumerating all the EFMs is not necessary in many cases, and it is possible to directly query the network instead with an appropriate tool. To ensure good query times, we rely on a state-of-the-art SAT solver, working on a propositional encoding of EFMs, enriched with a simple SMT-like solver ensuring EFM consistency with stoichiometric constraints. We illustrate our new framework by providing experimental evidence of almost immediate answer times on a non-trivial metabolic network.
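The "query instead of enumerate" style can be conveyed with a much weaker stand-in: the feasibility check below asks a typical question, "is there a steady-state flux that uses one reaction while avoiding another?", as a linear program on an invented five-reaction network. An LP reasons about flux vectors rather than minimal-support EFMs and is not the paper's SAT/SMT encoding, so treat this only as an illustration of querying.

```python
import numpy as np
from scipy.optimize import linprog

# Invented toy network (rows: metabolites A, B, C; columns: reactions r0..r4):
# r0: -> A,  r1: A -> B,  r2: A -> C,  r3: B ->,  r4: C ->   (all irreversible)
S = np.array([[ 1, -1, -1,  0,  0],
              [ 0,  1,  0, -1,  0],
              [ 0,  0,  1,  0, -1]])

def query(use, avoid):
    """Is there a steady-state flux v >= 0 with v[use] >= 1 and v[avoid] = 0?"""
    bounds = [(0, None)] * S.shape[1]
    bounds[use] = (1, None)           # force the reaction of interest to carry flux
    bounds[avoid] = (0, 0)            # forbid the avoided reaction
    res = linprog(np.zeros(S.shape[1]), A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds)
    return res.status == 0            # status 0 = a feasible solution was found

print(query(use=1, avoid=2))   # True: flux can route A -> B without touching r2
print(query(use=1, avoid=3))   # False: r1 produces B, which only r3 can consume
```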

Book ChapterDOI
17 Nov 2014
TL;DR: In this paper, a stochastic control framework for real-time gene expression control based on Model Predictive Control (MPC) is proposed. Single-cell dynamics are stochastic in nature, and a control framework explicitly accounting for this variability was previously lacking.
Abstract: Recent works have demonstrated the experimental feasibility of real-time gene expression control based on deterministic controllers. By taking control of the level of intracellular proteins, one can probe single-cell dynamics with unprecedented flexibility. However, single-cell dynamics are stochastic in nature, and a control framework explicitly accounting for this variability is presently lacking. Here we devise a stochastic control framework, based on Model Predictive Control, which fills this gap. Based on a stochastic modelling of the gene response dynamics, our approach combines a full state-feedback receding-horizon controller with a real-time estimation method that compensates for unobserved state variables. Using previously developed models of osmostress-inducible gene expression in yeast, we show in silico that our stochastic control approach outperforms deterministic control design in the regulation of single cells. This new contribution paves the way for applying the proposed framework to wet-lab experiments on yeast.
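The receding-horizon loop at the core of such a controller can be sketched on a one-species gene expression model with additive noise. Everything below is an assumption for illustration (constant-input horizon, mean-only prediction, invented rates); the paper's controller additionally performs full stochastic prediction and real-time estimation of unobserved state variables.

```python
import numpy as np

rng = np.random.default_rng(2)
k, g, dt = 2.0, 0.5, 0.1                      # production gain, degradation, time step
target, horizon = 5.0, 10                     # set-point and prediction horizon
candidate_inputs = np.linspace(0.0, 2.0, 11)  # admissible induction levels

def predicted_cost(x, u):
    """Cost of holding input u over the horizon, predicted with the mean dynamics."""
    cost = 0.0
    for _ in range(horizon):
        x = x + dt * (k * u - g * x)
        cost += (x - target)**2
    return cost

def mpc_step(x):
    """Receding-horizon decision: the best constant input over the horizon."""
    return min(candidate_inputs, key=lambda u: predicted_cost(x, u))

x, trace = 0.0, []
for _ in range(200):
    u = mpc_step(x)                                     # decide
    noise = 0.3 * np.sqrt(dt) * rng.standard_normal()   # intrinsic fluctuation
    x = max(0.0, x + dt * (k * u - g * x) + noise)      # apply to the "true" stochastic cell
    trace.append(x)

print(f"mean protein level over the last 50 steps: {np.mean(trace[-50:]):.2f}")
```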

Book ChapterDOI
17 Nov 2014
TL;DR: This article presents a greedy algorithm computing a sparsest set of conservation laws equivalent to a given one, with benchmarks on curated models taken from the BioModels database.
Abstract: Conservation laws are a key tool to study systems of chemical reactions in biology. We address the problem of defining and computing "good" sets of conservation laws. In this article, we chose to focus on sparsest sets of conservation laws. We present a greedy algorithm computing a sparsest set of conservation laws equivalent to a given set of conservation laws. Benchmarks over a subset of the curated models taken from the BioModels database are given.
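The flavour of the greedy step can be shown on a toy network. The sketch below is a simplified variant of the idea rather than the paper's algorithm: starting from any basis of the left null space of the stoichiometric matrix S (vectors c with c^T S = 0), each law is repeatedly replaced by a combination with another law whenever that strictly lowers its number of non-zero entries.

```python
import numpy as np

TOL = 1e-9

# Toy network with reactions A + B -> C and C -> A + B (rows: species A, B, C).
S = np.array([[-1.0,  1.0],
              [-1.0,  1.0],
              [ 1.0, -1.0]])

# A basis of the left null space of S (here via SVD of S^T); the rows of C
# are conservation laws, but generically dense ones.
_, s, vt = np.linalg.svd(S.T)
rank = int(np.sum(s > TOL))
C = vt[rank:].copy()

def nnz(v):
    return int(np.sum(np.abs(v) > TOL))

improved = True
while improved:                      # greedy passes until no law gets sparser
    improved = False
    for i in range(len(C)):
        for k in range(len(C)):
            if i == k:
                continue
            for j in np.flatnonzero(np.abs(C[k]) > TOL):
                if abs(C[i, j]) < TOL:
                    continue
                candidate = C[i] - (C[i, j] / C[k, j]) * C[k]
                if nnz(candidate) < nnz(C[i]):
                    C[i] = candidate
                    improved = True

print(np.round(C, 3))                # a sparser, equivalent set of conservation laws
```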

Book ChapterDOI
17 Nov 2014
TL;DR: The aim is, given a set of standard biological parts and some pre-specified performance requirements, to automatically find the circuit configuration and its tuning so that self-sustained oscillations meeting the requirements are produced.
Abstract: We consider the problem of optimal design of synthetic biological oscillators. Our aim is, given a set of standard biological parts and some pre-specified performance requirements, to automatically find the circuit configuration and its tuning so that self-sustained oscillations meeting the requirements are produced. To solve this design problem, we present a methodology based on mixed-integer nonlinear optimization. This method also takes into account the possibility of including more than one design objective and of handling both deterministic and stochastic descriptions of the dynamics. Further, it is capable of handling significant levels of circuit complexity. We illustrate the performance of this method with several challenging case studies.

Book ChapterDOI
17 Nov 2014
TL;DR: In this paper, a series of trace simplifications for some common temporal logic formulae are proposed to compute the relevant characteristics of the experimental traces, and to measure the adequacy of the model to its specification on simulation traces.
Abstract: Calibrating dynamical models on experimental data time series is a central task in computational systems biology. When numerical values for model parameters can be found to fit the data, the model can be used to make predictions, whereas the absence of any good fit may suggest revisiting the structure of the model and gaining new insights into the biology of the system. Temporal logic provides a formal framework to deal with imprecise data and specify a wide variety of dynamical behaviors. It can be used to extract information from numerical traces coming from either experimental data or model simulations, and to specify the expected behaviors for model calibration. The computation time of the different methods depends on the number of points in the trace, so trace simplification is important to improve their performance. In this paper we study this problem and provide a series of trace simplifications which are correct to perform for some common temporal logic formulae. We give some general soundness theorems, and apply this approach to period and phase constraints on the circadian clock and the cell cycle. In this application, temporal logic patterns are used to compute the relevant characteristics of the experimental traces, and to measure the adequacy of the model to its specification on simulation traces. Speed-ups by several orders of magnitude are obtained by trace simplification, even when the traces are produced by smart numerical integration methods.
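A toy version of the simplification idea: for period-style constraints it can be enough to keep only the local extrema of a dense numerical trace (the conditions under which this is sound are the subject of the paper's theorems; here it is simply assumed), shrinking the trace by orders of magnitude before the temporal logic pattern is evaluated.

```python
import numpy as np

t = np.linspace(0, 100, 20001)            # dense simulation trace (hours)
x = np.sin(2 * np.pi * t / 24.0)          # invented ~24 h "circadian" oscillation

def keep_extrema(t, x):
    """Keep the endpoints and the strict local minima/maxima of the sampled trace."""
    dx = np.diff(x)
    idx = np.flatnonzero(dx[:-1] * dx[1:] < 0) + 1
    idx = np.concatenate(([0], idx, [len(x) - 1]))
    return t[idx], x[idx]

ts, xs = keep_extrema(t, x)
peak_times = ts[xs > 0.9]                 # times of the retained maxima
period = np.mean(np.diff(peak_times))     # period constraint checked on the reduced trace
print(f"{len(t)} points reduced to {len(ts)}; estimated period = {period:.2f} h")
```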

Book ChapterDOI
17 Nov 2014
TL;DR: Genome-scale reconstructions are usually stoichiometric and analyzed under steady-state assumptions using constraint-based modelling with flux balance analysis with possible additional physico-chemical constraints to predict the set of resulting flux distributions of an organism.
Abstract: Genome-scale reconstructions are usually stoichiometric and analyzed under steady-state assumptions using constraint-based modelling with flux balance analysis (FBA). FBA requires not only the stoichiometry of the network, but also an appropriate cellular objective function and possible additional physico-chemical constraints to predict the set of resulting flux distributions of an organism.
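For readers unfamiliar with FBA, the whole computation is a linear program; the sketch below runs it on an invented three-reaction toy network (not a genome-scale reconstruction), with a "biomass" export flux as the cellular objective.

```python
import numpy as np
from scipy.optimize import linprog

# Rows: metabolites A, B.  Columns: r0 (uptake -> A), r1 (A -> B), r2 (B -> biomass).
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])
bounds = [(0, 10), (0, 8), (0, None)]   # irreversibility plus capped uptake/conversion
c = np.array([0, 0, -1])                # maximise the biomass flux (linprog minimises)

# Steady-state constraint S v = 0 with flux bounds, objective = biomass flux.
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
print("optimal flux distribution:", res.x, "-> biomass flux:", -res.fun)
```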

Book ChapterDOI
17 Nov 2014
TL;DR: Time-series fluorescence microscopy is used to visualise the mechanics of synaptic vesicle recycling at the presynaptic terminal of neurons; predictive in silico modelling of potential experimental outcomes would be highly informative for these time-consuming and expensive studies.
Abstract: Synaptic vesicle recycling at the presynaptic terminal of neurons is essential for the maintenance of neurotransmission at central synapses. Among the tools used to visualise the mechanics of this process is time-series fluorescence microscopy. Fluorescent dyes such as FM1-43, or engineered fluorescent versions of synaptic vesicle proteins such as pHluorins, have been employed to reveal different steps of this key process [3,7]. Predictive in silico modelling of potential experimental outcomes would be highly informative for these time-consuming and expensive studies.

Book ChapterDOI
17 Nov 2014
TL;DR: An existing computational model of fusing and splitting mitochondria is expanded by representations of fission protein 1 (Fis1) and dynamin-related protein 1 (Drp1), and parameter scans are performed on simulations of it to show the effect of lower Fis1 and Drp1 recruitment rates, i.e. lower availability, on network structure and overall health.
Abstract: Mitochondria are mobile cellular organelles that form networks by fusion and fission. These events lead to an exchange of components responsible for maintaining membrane potential, i.e. mitochondrial health. Membrane potential can be disturbed by an imbalance of fission-triggering proteins. We expand an existing computational model of fusing and splitting mitochondria by representations of fission protein 1 (Fis1) and dynamin-related protein 1 (Drp1) and perform parameter scans on simulations of it. Our relatively basic model already shows an effect of lower Fis1 and Drp1 recruitment rates, i.e. lower availability, on network structure and overall health. Various aspects of the real system can be incorporated into the model, e.g. further regulatory proteins, a varying spatial distribution of Fis1 and Drp1, or consequences of changed mitochondrial network structure and health on their behaviour, e.g. under oxidative stress.
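A drastically reduced stand-in for such a parameter scan (my own toy, not the published model, which also tracks membrane potential/health and represents Fis1 explicitly): a single fragment count evolves under fusion and fission events, with the fission propensity scaled by a "Drp1 recruitment" parameter, and the scan shows how lower recruitment shifts the network towards fewer, more fused fragments.

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_fragments(recruitment, k_fus=0.01, steps=20000):
    """Mean number of mitochondrial fragments for a given fission recruitment rate."""
    n, counts = 50, []
    for _ in range(steps):
        fusion = k_fus * n * (n - 1)      # propensity of two fragments fusing
        fission = recruitment * n         # propensity of one fragment splitting
        if rng.random() < fusion / (fusion + fission) and n > 1:
            n -= 1                        # fusion event
        else:
            n += 1                        # fission event
        counts.append(n)
    return np.mean(counts[steps // 2:])   # discard the burn-in

for r in [0.1, 0.3, 1.0, 3.0]:
    print(f"Drp1 recruitment {r:>4}: mean fragment count {mean_fragments(r):6.1f}")
```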

Book ChapterDOI
17 Nov 2014
TL;DR: Based on the anisotropic displacement distribution observed at the border between the mid-cell and poles, it is concluded that the original hypothesis, that the observed long-term behavior is the result of macromolecular crowding, holds.
Abstract: The cytoplasm of Escherichia coli is a crowded, heterogeneous environment. The spatial kinetics and heterogeneities of synthetic RNA-protein complexes have been recently studied using single-cell live imaging. A strong polar retention of these complexes due to the presence of the nucleoid has been suggested based on their history of positions and long-term spatial distribution. Here, using stochastic modelling, we examine likely sources, which can reproduce the reported long-term spatial distribution of the complexes. Based on the anisotropic displacement distribution observed at the border between the mid-cell and poles, we conclude that the original hypothesis that the observed long-term behavior is the result of macromolecular crowding holds.
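The kind of hypothesis being tested can be sketched with a one-dimensional random walk along the normalised cell axis in which displacements crossing from a pole into the nucleoid-occupied mid-cell are accepted with reduced probability; the border position, step size and acceptance probability below are invented, and the actual study uses far more detailed stochastic models of the cell space.

```python
import numpy as np

rng = np.random.default_rng(4)
border = 0.6              # |x| > border marks the pole regions
p_enter_midcell = 0.3     # reduced acceptance for pole -> mid-cell moves (crowding)

def simulate(steps=200000, sigma=0.02):
    x, positions = 0.0, []
    for _ in range(steps):
        proposal = np.clip(x + sigma * rng.standard_normal(), -1.0, 1.0)
        leaving_pole = abs(x) > border and abs(proposal) <= border
        if not leaving_pole or rng.random() < p_enter_midcell:
            x = proposal                  # accept the displacement
        positions.append(x)
    return np.array(positions)

pos = simulate()
print(f"fraction of time spent at the poles: {np.mean(np.abs(pos) > border):.2f}")
print(f"uniform expectation would be:        {1 - border:.2f}")
```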

Book ChapterDOI
17 Nov 2014
TL;DR: The XTMS CAD tool for pathway design is showcased, which exploits the ability for pathway ranking in the RetroPath retrosynthetic algorithm within an extended metabolic space that considers putative routes through enzyme promiscuity.
Abstract: Despite the increase in recent years in the portfolio of added-value chemicals that can be microbially produced, the design process still remains complex, costly and rather slow. To overcome such limitations, the development of Computer-Aided Design (CAD) tools is necessary to design production pathways that systematically screen metabolic databases to select the best genes to import into chassis organisms. Here, we showcase the XTMS CAD tool for pathway design, which exploits the ability for pathway ranking in our RetroPath retrosynthetic algorithm within an extended metabolic space that considers putative routes through enzyme promiscuity. The validity of the ranking function for the production of malonyl-CoA, an important precursor for added-value compounds, is shown.

Book ChapterDOI
17 Nov 2014
TL;DR: A reduced model for the core fatty acid synthesis and elongation process with a regulatory mechanism is demonstrated, which allows us to explore fatty acid profiles from lipid metabolomics data.
Abstract: Stochastic reaction-centric views are suitable for exploring hybrid minimal mechanism-statistical models of fatty acid and lipid metabolism, the basis of de novo lipogenesis. In this work, we demonstrate a reduced model for the core fatty acid synthesis and elongation process with a regulatory mechanism. This allows us to explore fatty acid profiles from lipid metabolomics data. This is part of a current study to assess programming languages for capturing the inherent probabilistic behaviour of the hierarchical chemical transformations of complex lipid species.