
Showing papers on "Stochastic simulation published in 2018"


Journal ArticleDOI
TL;DR: In this article, the authors present an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing spatial and temporal occupancy diversity; each occupant and each space in the building is explicitly simulated as an agent with its own profile of stochastic behaviors.
Abstract: Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This paper presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.
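A minimal sketch of the movement component described above, assuming a homogeneous Markov chain over a handful of illustrative space states; the state list, transition matrix, and 15-minute time step are invented for illustration, not the Simulator's calibrated inputs:

```python
import numpy as np

# Hypothetical space states and transition matrix (15-minute time step).
states = ["own_office", "other_office", "meeting_room", "outside"]
P = np.array([
    [0.90, 0.04, 0.03, 0.03],   # from own_office
    [0.30, 0.60, 0.05, 0.05],   # from other_office
    [0.20, 0.05, 0.70, 0.05],   # from meeting_room
    [0.10, 0.02, 0.03, 0.85],   # from outside
])
rng = np.random.default_rng(0)

def simulate_day(start=3, steps=96):
    """One occupant's day at 15-minute resolution (96 steps)."""
    path, s = [], start
    for _ in range(steps):
        s = rng.choice(len(states), p=P[s])   # homogeneous Markov transition
        path.append(states[s])
    return path

print(simulate_day()[32:40])                  # mid-morning snapshot
```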

94 citations


Journal ArticleDOI
TL;DR: This paper asks how much Monte Carlo sampling needs to be performed within simulation-based recursions in order that the resulting iterates remain consistent and, more importantly, efficient, where “efficient” implies convergence at the fastest possible rate.
Abstract: We consider the context of “simulation-based recursions,” that is, recursions that involve quantities needing to be estimated using a stochastic simulation. Examples include stochastic adaptations of fixed-point and gradient descent recursions obtained by replacing function and derivative values appearing within the recursion by their Monte Carlo counterparts. The primary motivating settings are simulation optimization and stochastic root finding problems, where the minimum and the zero of a function are sought, respectively, with only Monte Carlo estimates of the functions appearing within the problem. We ask how much Monte Carlo sampling needs to be performed within simulation-based recursions in order that the resulting iterates remain consistent and, more importantly, efficient, where “efficient” implies convergence at the fastest possible rate. Answering these questions involves trading off two types of error inherent in the iterates: the deterministic error due to recursion and the “stochastic” er...
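As a concrete illustration of such a recursion, the hedged sketch below runs Monte Carlo gradient descent on f(x) = E[(x − Z)²] with a geometrically growing per-iteration sample size; the toy objective, step size, and the specific 1.1^k schedule are assumptions, with the schedule standing in for the sampling-rate choices the paper analyzes:

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_estimate(x, m):
    """Monte Carlo estimate of f'(x) = 2(x - 1) for f(x) = E[(x - Z)^2], Z ~ N(1, 1)."""
    z = rng.normal(1.0, 1.0, size=m)
    return np.mean(2.0 * (x - z))

x, step = 5.0, 0.1
for k in range(1, 60):
    m_k = int(np.ceil(1.1 ** k))       # geometrically growing sampling effort
    x -= step * grad_estimate(x, m_k)  # simulation-based recursion step

print(f"iterate = {x:.4f} (true minimum at 1.0)")
```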

46 citations


Journal ArticleDOI
30 Mar 2018
TL;DR: A fully calibrated stochastic scenario generator that is based on real non-life insurance data and allows everyone to simulate their own synthetic insurance portfolio of individual claims histories and back-test their preferred claims reserving method.
Abstract: The aim of this project is to develop a stochastic simulation machine that generates individual claims histories of non-life insurance claims. This simulation machine is based on neural networks to incorporate individual claims feature information. We provide a fully calibrated stochastic scenario generator that is based on real non-life insurance data. This stochastic simulation machine allows everyone to simulate their own synthetic insurance portfolio of individual claims histories and back-test their preferred claims reserving method.

44 citations


Journal ArticleDOI
05 Nov 2018
TL;DR: A systems-biology modeling method, stochastic random circuit perturbation (sRACIPE), is developed and tested; it takes the GRC topology as the only input and simulates an ensemble of models with random kinetic parameters at multiple noise levels.
Abstract: Stochasticity in gene expression impacts the dynamics and functions of gene regulatory circuits. Intrinsic noises, including those that are caused by low copy number of molecules and transcriptional bursting, are usually studied by stochastic simulations. However, the role of extrinsic factors, such as cell-to-cell variability and heterogeneity in the microenvironment, is still elusive. To evaluate the effects of both the intrinsic and extrinsic noises, we develop a method, named sRACIPE, by integrating stochastic analysis with random circuit perturbation (RACIPE) method. RACIPE uniquely generates and analyzes an ensemble of models with random kinetic parameters. Previously, we have shown that the gene expression from random models form robust and functionally related clusters. In sRACIPE we further develop two stochastic simulation schemes, aiming to reduce the computational cost without sacrificing the convergence of statistics. One scheme uses constant noise to capture the basins of attraction, and the other one uses simulated annealing to detect the stability of states. By testing the methods on several synthetic gene regulatory circuits and an epithelial–mesenchymal transition network in squamous cell carcinoma, we demonstrate that sRACIPE can interpret the experimental observations from single-cell gene expression data. We observe that parametric variation (the spread of parameters around a median value) increases the spread of the gene expression clusters, whereas high noise merges the states. Our approach quantifies the robustness of a gene circuit in the presence of noise and sheds light on a new mechanism of noise-induced hybrid states. We have implemented sRACIPE as an R package. Gene regulatory circuits (GRCs) regulate many biological processes including cell cycle, cell differentiation, and phenotypic switching. Stochasticity in the gene expression impacts the dynamics and functions of such GRCs. Vivek Kohar and Mingyang Lu from Jackson Laboratory have developed a systems-biology modeling method stochastic random circuit perturbation (sRACIPE), which takes the GRC topology as the only input, and simulates an ensemble of models with random kinetic parameters at multiple noise levels. Statistical analysis of the generated gene expressions reveals the basin of attraction and stability of various phenotypic states and their changes associated with intrinsic and extrinsic noises. Application of the method to single cell expression data from synthetic circuits and epithelial-mesenchymal transition in squamous cell carcinoma shows its potential in yielding new insights on the structure and function of gene regulatory networks.
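The sketch below illustrates the randomization-plus-noise idea on a two-gene mutual-repression circuit: each ensemble member draws random kinetic parameters and is integrated with Euler–Maruyama (Langevin) noise. The circuit, parameter ranges, and noise level are illustrative assumptions, not sRACIPE's calibrated settings:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_model(noise=5.0, dt=0.01, steps=5000):
    """One random-parameter model of a two-gene toggle under Langevin noise."""
    g = rng.uniform(20, 100, size=2)    # production rates (assumed range)
    k = rng.uniform(0.1, 1.0, size=2)   # degradation rates
    h = rng.uniform(20, 100, size=2)    # repression thresholds
    n = rng.integers(1, 5, size=2)      # Hill coefficients
    x = rng.uniform(0, 100, size=2)
    for _ in range(steps):
        rep = 1.0 / (1.0 + (x[::-1] / h) ** n)   # mutual repression
        x = x + (g * rep - k * x) * dt + noise * np.sqrt(dt) * rng.normal(size=2)
        x = np.maximum(x, 0.0)
    return x

# Ensemble over random models at one noise level; cluster the log expressions.
ensemble = np.log1p([simulate_model() for _ in range(100)])
print(ensemble.mean(axis=0), ensemble.std(axis=0))
```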

43 citations


Journal ArticleDOI
TL;DR: The continuous and discrete deterministic and discrete stochastic formulations of the SIR dynamical systems models are reviewed, and it is outlined how they can be easily and rapidly constructed using Numerus Model Builder, a graphically driven coding platform. A chain-binomial sketch of the stochastic formulation follows below.
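A hedged chain-binomial sketch of the discrete stochastic SIR formulation the review covers; the rates and population size are invented, and the code is plain Python rather than Numerus Model Builder:

```python
import numpy as np

rng = np.random.default_rng(16)
N, beta, gamma = 1000, 0.3, 0.1          # assumed population and rates

def stochastic_sir(i0=5, steps=200):
    """Discrete-time chain-binomial SIR: binomial draws replace ODE flows."""
    s, i, r = N - i0, i0, 0
    out = [(s, i, r)]
    for _ in range(steps):
        p_inf = 1 - np.exp(-beta * i / N)          # per-susceptible infection prob
        new_i = rng.binomial(s, p_inf)
        new_r = rng.binomial(i, 1 - np.exp(-gamma))
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
        out.append((s, i, r))
    return np.array(out)

final_sizes = [stochastic_sir()[-1, 2] for _ in range(50)]
print(np.mean(final_sizes), np.std(final_sizes))   # epidemic size varies by run
```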

41 citations


Journal ArticleDOI
TL;DR: In this paper, an oscillator-based true random number generator (TRNG) was demonstrated by exploiting inherently stochastic threshold switching in the insulator-to-metal transition (IMT) in vanadium dioxide.
Abstract: An oscillator-based true random number generator (TRNG) is experimentally demonstrated by exploiting inherently stochastic threshold switching in the insulator-to-metal transition (IMT) in vanadium dioxide. Through experimentation and modeling, we show that the origin of stochasticity arises from small perturbations in the nanoscale domain structure, which are then subsequently amplified through a positive feedback process. Within a 1T1R oscillator, the stochastic cycle-to-cycle variations in the IMT trigger voltage result in random timing jitter, which is harnessed for a TRNG. The randomness of the IMT TRNG output is validated using the NIST SP800-22 statistical test.
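A toy software model of the mechanism, assuming Gaussian cycle-to-cycle period jitter and a fast reference counter sampled at each noisy edge; all device parameters are invented, and the paper's entropy source is the physical IMT process, not this model:

```python
import numpy as np

rng = np.random.default_rng(3)

nominal, jitter = 1.0e-6, 3.0e-8          # 1 MHz oscillator, ~3% period jitter
periods = nominal + jitter * rng.standard_normal(20000)
edges = np.cumsum(periods)                # noisy switching times

fast_clock = 1.0e9                        # 1 GHz reference counter (assumed)
raw_bits = np.floor(edges * fast_clock).astype(np.int64) & 1

# Von Neumann debiasing: 01 -> 0, 10 -> 1, 00/11 discarded.
pairs = raw_bits[: len(raw_bits) // 2 * 2].reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
bits = pairs[keep, 0]
print(len(bits), bits.mean())             # roughly unbiased output stream
```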

39 citations


Journal ArticleDOI
TL;DR: In this paper, a general platform for the simulation of occupants' presence and behaviours is introduced, called No-MASS (Nottingham Multi-Agent Stochastic Simulation), which generates a synthetic popu...
Abstract: This paper introduces a new general platform for the simulation of occupants' presence and behaviours. Called No-MASS (Nottingham Multi-Agent Stochastic Simulation), this generates a synthetic popu...

38 citations


Journal ArticleDOI
TL;DR: This work constructs a finite-state abstraction on which a control policy is synthesized and refined back to the original belief model, and introduces a new notion of label-based approximate stochastic simulation to quantify the deviation between belief models.

31 citations


Posted Content
TL;DR: In this paper, robust dynamic programming mappings over the abstract system are introduced to solve the control synthesis and verification problem for stochastic control systems that are evolving over continuous spaces.
Abstract: Discrete-time stochastic systems are an essential modelling tool for many engineering systems. We consider stochastic control systems that are evolving over continuous spaces. For this class of models, methods for the formal verification and synthesis of control strategies are computationally hard and generally rely on the use of approximate abstractions. Building on approximate abstractions, we compute control strategies with lower- and upper-bounds for satisfying unbounded temporal logic specifications. Firstly, robust dynamic programming mappings over the abstract system are introduced to solve the control synthesis and verification problem. These mappings yield a control strategy and a unique lower bound on the satisfaction probability for temporal logic specifications that is robust to the incurred approximation errors. Secondly, upper-bounds on the satisfaction probability are quantified, and properties of the mappings are analysed and discussed. Finally, we show the implications of these results for linear stochastic dynamic systems with a continuous state space. This abstraction-based synthesis framework is shown to be able to handle infinite-horizon properties. Approximation errors expressed as deviations in the outputs of the models and as deviations in the probabilistic transitions are allowed and are quantified using approximate stochastic simulation relations.
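A compact sketch of a robust backup of this kind on a 3-state abstraction: transition probabilities are known only up to intervals that absorb the approximation error, and value iteration against the worst case yields a lower bound on the reachability probability. The model here is uncontrolled (a Markov chain rather than an MDP) and entirely invented, to keep the inner minimization visible:

```python
import numpy as np

# Interval transition bounds [lo, hi] per row; state 2 is an absorbing goal.
lo = np.array([[0.1, 0.2, 0.3],
               [0.0, 0.4, 0.2],
               [0.0, 0.0, 1.0]])
hi = np.array([[0.5, 0.6, 0.7],
               [0.3, 0.7, 0.5],
               [0.0, 0.0, 1.0]])
goal = np.array([0.0, 0.0, 1.0])

def worst_case_expectation(v, lo_row, hi_row):
    """Minimize sum(p * v) over p in [lo, hi] with sum(p) = 1."""
    p = lo_row.copy()
    budget = 1.0 - p.sum()
    for i in np.argsort(v):              # pour remaining mass onto low-value states
        add = min(hi_row[i] - lo_row[i], budget)
        p[i] += add
        budget -= add
    return float(p @ v)

v = goal.copy()
for _ in range(100):                     # robust value iteration to a fixed point
    v = np.array([max(goal[s], worst_case_expectation(v, lo[s], hi[s]))
                  for s in range(3)])
print(v)                                 # lower bounds on reachability per state
```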

29 citations


Journal ArticleDOI
Sharif Rahman
TL;DR: The use of NURBS for random field discretization enriches the isogeometric paradigm and an uncertainty quantification pipeline of the future can be envisioned where geometric modeling, stress analysis, and stochastic simulation are all integrated using the same building blocks of NURBS.

29 citations


Journal ArticleDOI
TL;DR: A novel two-dimensional geometry-based stochastic model for multiple-input multiple-output (MIMO) vehicle-to-vehicle (V2V) wideband fading channels; the close agreement between the simulation models and the reference model demonstrates not only the utility of the simulation models, but also the correctness of the theoretical derivations and simulations.
Abstract: In this paper, we consider a novel two-dimensional (2D) geometry-based stochastic model (GBSM) for multiple-input multiple-output (MIMO) vehicle-to-vehicle (V2V) wideband fading channels. The proposed model employs the combination of a two-ring model and a multiple confocal ellipses model, where the signal is the sum of the line-of-sight (LoS) component, single-bounced (SB) rays, and double-bounced (DB) rays. Based on the reference model, we derive expressions for several channel statistical properties, including the space-time correlation function (STCF), Doppler power spectral density (DPSD), envelope level crossing rate (LCR), and average fade duration (AFD). In addition, corresponding deterministic and stochastic simulation models are developed based on the reference model. Moreover, we compare the statistical properties of the reference model and the two simulation models in different scenarios and investigate the impact of different vehicular traffic densities (VTDs) on the channel statistical properties of the proposed model. Finally, the close agreement between the simulation models and the reference model demonstrates not only the utility of the simulation models, but also the correctness of the theoretical derivations and simulations.
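A minimal sum-of-scatterers sketch in the spirit of such geometry-based models, with double Doppler shifts from both terminals' motion; the ring geometry is collapsed to random angles and all parameters are invented, so this conveys the flavor of the simulation models rather than the paper's reference model:

```python
import numpy as np

rng = np.random.default_rng(4)

N = 64                                  # scatterers (assumed)
fT, fR = 80.0, 120.0                    # max Doppler at Tx and Rx (Hz, assumed)
alpha = rng.uniform(0, 2 * np.pi, N)    # angles of departure
beta = rng.uniform(0, 2 * np.pi, N)     # angles of arrival
phi = rng.uniform(0, 2 * np.pi, N)      # random phases

t = np.arange(0, 0.2, 1e-4)
doppler = fT * np.cos(alpha) + fR * np.cos(beta)   # double Doppler shifts
h = np.exp(1j * (2 * np.pi * np.outer(t, doppler) + phi)).sum(axis=1) / np.sqrt(N)

# Empirical time-autocorrelation of the complex channel gain.
acf = np.correlate(h, h, mode="full")[len(h) - 1:] / len(h)
print(np.abs(h[:5]), np.abs(acf[:3]))
```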

Journal ArticleDOI
TL;DR: A new extension where a conditional probability density function is approximated using Legendre-like orthogonal splines, and the coefficients of spline approximation are estimated using high-order spatial statistics inferred from the available sample data, additionally complemented by a training image.
Abstract: High-order sequential simulation techniques for complex non-Gaussian spatially distributed variables have been developed over the last few years. The high-order simulation approach does not require any transformation of initial data and makes no assumptions about any probability distribution function, while it introduces complex spatial relations to the simulated realizations via high-order spatial statistics. This paper presents a new extension where a conditional probability density function (cpdf) is approximated using Legendre-like orthogonal splines. The coefficients of spline approximation are estimated using high-order spatial statistics inferred from the available sample data, additionally complemented by a training image. The advantages of using orthogonal splines with respect to the previously used Legendre polynomials include their ability to better approximate a multidimensional probability density function, reproduce the high-order spatial statistics, and provide a generalization of high-order simulations using Legendre polynomials. The performance of the new method is first tested with a completely known image and compared to both the high-order simulation approach using Legendre polynomials and the conventional sequential Gaussian simulation method. Then, an application in a gold deposit demonstrates the advantages of the proposed method in terms of the reproduction of histograms, variograms, and high-order spatial statistics, including connectivity measures. The C++ source code of the high-order simulation implementation presented herein, along with an example demonstrating its utilization, is provided online as supplementary material.
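The sketch below shows only the Legendre-polynomial baseline that the spline method generalizes: on [−1, 1] a density expands as f(x) ≈ Σ_k ((2k+1)/2) E[P_k(X)] P_k(x), with the moments E[P_k(X)] estimated from sample data. The sample distribution and truncation order are assumptions:

```python
import numpy as np
from numpy.polynomial import legendre
from scipy import stats

rng = np.random.default_rng(5)
x = 2 * rng.beta(2.0, 5.0, size=20000) - 1       # assumed sample data on [-1, 1]

K = 12                                           # truncation order (assumed)
coefs = np.array([(2 * k + 1) / 2 * legendre.Legendre.basis(k)(x).mean()
                  for k in range(K + 1)])        # moment-based coefficients

grid = np.linspace(-0.9, 0.9, 7)
f_hat = legendre.legval(grid, coefs)             # truncated series estimate
f_true = stats.beta.pdf((grid + 1) / 2, 2.0, 5.0) / 2
print(np.round(f_hat, 3))
print(np.round(f_true, 3))                       # compare against the truth
```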

Journal ArticleDOI
TL;DR: SPARTA (Stochastic Periodic AutoRegressive To Anything) offers an alternative and novel approach that allows the explicit representation of each process of interest with any distribution model, while simultaneously establishing dependence patterns that cannot be fully captured by typical linear stochastic schemes.

Journal ArticleDOI
TL;DR: This paper derives sufficient small-gain type conditions for the compositional quantification of the distance in probability between the interconnection of stochastic control subsystems and that of their (finite or infinite) abstractions.

Journal ArticleDOI
TL;DR: In this article, a rare-event simulation-based method for computing epigenetic landscapes and phenotype-transitions in metastable gene networks is presented, inspired by studies of metastability and barrier-crossing in protein folding, and provides an automated means of computing and visualizing essential stationary and dynamic information that is generally inaccessible to conventional simulation.
Abstract: Stochastic simulation has been a powerful tool for studying the dynamics of gene regulatory networks, particularly in terms of understanding how cell-phenotype stability and fate-transitions are impacted by noisy gene expression. However, gene networks often have dynamics characterized by multiple attractors. Stochastic simulation is often inefficient for such systems, because most of the simulation time is spent waiting for rare, barrier-crossing events to occur. We present a rare-event simulation-based method for computing epigenetic landscapes and phenotype-transitions in metastable gene networks. Our computational pipeline was inspired by studies of metastability and barrier-crossing in protein folding, and provides an automated means of computing and visualizing essential stationary and dynamic information that is generally inaccessible to conventional simulation. Applied to a network model of pluripotency in Embryonic Stem Cells, our simulations revealed rare phenotypes and approximately Markovian transitions among phenotype-states, occurring with a broad range of timescales. The relative probabilities of phenotypes and the transition paths linking pluripotency and differentiation are sensitive to global kinetic parameters governing transcription factor-DNA binding kinetics. Our approach significantly expands the capability of stochastic simulation to investigate gene regulatory network dynamics, which may help guide rational cell reprogramming strategies. Our approach is also generalizable to other types of molecular networks and stochastic dynamics frameworks.
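To see why conventional simulation struggles here, consider the standard metastable toy problem: an overdamped particle in the double well U(x) = (x² − 1)², simulated by Euler–Maruyama. Barrier crossings become exponentially rare as the noise shrinks, which is exactly the regime rare-event pipelines target (parameters are illustrative, and this is not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(6)

def crossings(noise=0.5, dt=1e-3, steps=500_000):
    """Count basin-to-basin transitions; drift is -U'(x) = -4x(x^2 - 1)."""
    x, basin, n_cross = -1.0, -1, 0
    for _ in range(steps):
        x += -4 * x * (x * x - 1) * dt + noise * np.sqrt(dt) * rng.standard_normal()
        if basin == -1 and x > 1.0:
            basin, n_cross = 1, n_cross + 1
        elif basin == 1 and x < -1.0:
            basin, n_cross = -1, n_cross + 1
    return n_cross

for sigma in (0.7, 0.5):
    print(sigma, crossings(noise=sigma))   # counts collapse as noise decreases
```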

Journal ArticleDOI
TL;DR: A Kriging metamodel, a common surrogate model, is fitted to simulation input–output data produced by a Latin hypercube sampling experimental design, as part of a comprehensive methodology for black-box stochastic simulation models under uncertainty.
Abstract: In spite of the wide improvement in computer simulation packages, analyzing and optimizing a simulation model, particularly under uncertainty, can still be computationally expensive and time-consuming. This paper tackles these issues by proposing a comprehensive methodology for black-box stochastic simulation models under uncertainty. For this purpose, a Kriging metamodel, a common surrogate model, is fitted to the simulation input–output data produced by a Latin hypercube sampling experimental design. Taguchi's robust design framework enables the optimization methods to control uncertainty and uncontrollable environmental factors. To formulate the robust counterpart optimization, three different models in the dual-response-surface class are integrated with the metamodel and robust design. Leave-one-out cross-validation is applied to validate the Kriging metamodel. Finally, a numerical case study, direct speed control of a DC motor under uncertainty, demonstrates the applicability of the proposed method to real engineering problems. This simplified and practical mechatronics case illustrates how the proposed procedure can be extended for analyzing and optimizing real complex systems.
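A hedged sketch of the design-and-metamodel step: a Latin hypercube design probes an assumed noisy black box, and an ordinary Kriging (Gaussian-process) surrogate with a fixed RBF kernel interpolates the response. In practice the kernel hyperparameters would be estimated (e.g., by maximum likelihood) rather than fixed:

```python
import numpy as np

rng = np.random.default_rng(7)

def lhs(n, d):
    """Latin hypercube sample of n points in [0, 1]^d."""
    cols = [(rng.permutation(n) + rng.uniform(size=n)) / n for _ in range(d)]
    return np.column_stack(cols)

def simulator(x):
    """Assumed noisy black box standing in for the real simulation model."""
    return np.sin(6 * x[:, 0]) + 0.5 * x[:, 1] + 0.05 * rng.standard_normal(len(x))

def rbf(a, b, ell=0.2):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

X = lhs(40, 2)                            # experimental design
y = simulator(X)
K = rbf(X, X) + 1e-4 * np.eye(len(X))     # nugget absorbs simulation noise
alpha = np.linalg.solve(K, y)

X_new = rng.uniform(size=(3, 2))
print(rbf(X_new, X) @ alpha)              # Kriging predictions at new points
```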

Journal ArticleDOI
TL;DR: This work proposes a methodology based on vine copulas for the stochastic simulation of periodic streamflow scenarios that incorporates lags that are greater than one and is a non-linear periodic autoregressive model.
Abstract: Synthetic streamflow data is vital for the energy sector, as it feeds stochastic optimisation models that determine operational policies. Considered scenarios should differ from each other, but be the same from a statistical point of view, i.e., the scenarios must preserve features of the original time series such as the mean, variance, and temporal dependence structures. Traditionally, linear models are applied for this task. Recently, the advent of copulas has led to the emergence of an alternative that overcomes the drawbacks of linear models. In this context, we propose a methodology based on vine copulas for the stochastic simulation of periodic streamflow scenarios. Copula-based models that focus on single-site inflow simulation only consider lag-one time dependence. Therefore, we suggest an approach that incorporates lags that are greater than one. Furthermore, the proposed model deals with the strong periodicity that is commonly present in monthly streamflow time series. The resulting model is a non-linear periodic autoregressive model. Our results indicate that this model successfully simulates scenarios, preserving features that are observed in historical data.
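A deliberately simplified sketch of the copula idea with a single lag: a Gaussian copula (an AR(1) process in the normal domain) carries month-to-month dependence, while per-month gamma marginals are imposed exactly through the probability integral transform. The vine construction and higher lags of the actual model are omitted, and all parameters are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

shapes = np.linspace(2.0, 5.0, 12)          # assumed monthly gamma shapes
scales = np.linspace(30.0, 80.0, 12)        # assumed monthly gamma scales
rho = 0.6                                   # lag-one copula correlation

years, z = 50, rng.standard_normal()
flows = []
for t in range(years * 12):
    z = rho * z + np.sqrt(1 - rho ** 2) * rng.standard_normal()
    u = stats.norm.cdf(z)                   # uniform copula level
    m = t % 12                              # calendar month (periodicity)
    flows.append(stats.gamma.ppf(u, a=shapes[m], scale=scales[m]))

flows = np.array(flows).reshape(years, 12)
print(flows.mean(axis=0).round(1))          # monthly marginals preserved
```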

Proceedings ArticleDOI
27 Jun 2018
TL;DR: A method based on vine copulas for stochastic simulation of evaluation results where the true system distributions are known upfront is proposed and shown in two sample applications replicating typical experiments found in the literature.
Abstract: Part of Information Retrieval evaluation research is limited by the fact that we do not know the distributions of system effectiveness over the populations of topics and, by extension, their true mean scores. The workaround usually consists in resampling topics from an existing collection and approximating the statistics of interest with the observations made between random subsamples, as if one represented the population and the other a random sample. However, this methodology is clearly limited by the availability of data, the impossibility to control the properties of these data, and the fact that we do not really measure what we intend to. To overcome these limitations, we propose a method based on vine copulas for stochastic simulation of evaluation results where the true system distributions are known upfront. In the basic use case, it takes the scores from an existing collection to build a semi-parametric model representing the set of systems and the population of topics, which can then be used to make realistic simulations of the scores by the same systems but on random new topics. Our ability to simulate this kind of data not only eliminates the current limitations, but also offers new opportunities for research. As an example, we show the benefits of this approach in two sample applications replicating typical experiments found in the literature. We provide a full R package to simulate new data following the proposed method, which can also be used to fully reproduce the results in this paper.

Journal ArticleDOI
TL;DR: The proposed computational homogenization approach focuses on empirically determining the autocorrelation functions of output response quantities through post processing finite element analyses to improve upon the current established approach by circumventing the need to analyze numerous successively larger domains in order to determine convergence of apparent properties.

Posted ContentDOI
28 Mar 2018-bioRxiv
TL;DR: The approach quantifies the robustness of a gene circuit in the presence of noise and sheds light on a new mechanism of noise induced hybrid states, which is implemented into a freely available R package.
Abstract: Stochasticity in gene expression impacts the dynamics and functions of gene regulatory circuits. Intrinsic noises, including those that are caused by low copy number of molecules and transcriptional bursting, are usually studied by stochastic analysis methods, such as Gillespie algorithm and Langevin simulation. However, the role of extrinsic factors, such as cell-to-cell variability and heterogeneity in the microenvironment, is still elusive. To evaluate the effects of both intrinsic and extrinsic noises, we develop a new method, named sRACIPE, by integrating stochastic analysis with random circuit perturbation (RACIPE) method. Unlike traditional methods, RACIPE generates and analyzes an ensemble of mathematical models with random kinetic parameters. Previously, we have shown that the gene expression from random models form robust and functionally related clusters. Under the framework of this randomization-based approach, here we develop two stochastic simulation schemes, aiming to reduce the computational cost without sacrificing the convergence of statistics. One scheme uses constant noise to capture the basins of attraction, and the other one uses simulated annealing to detect the stability of states. By testing the methods on several gene regulatory circuits, we found that high noise, but not large parameter variation, merges clusters together. Our approach quantifies the robustness of a gene circuit in the presence of noise and sheds light on a new mechanism of noise-induced hybrid states. We have implemented sRACIPE into a freely available R package.

Posted Content
TL;DR: This paper proposes a subsampling framework to bypass this computational bottleneck, by leveraging the form of the output variance and its estimation error in terms of data size and sampling effort to reduce the sampling complexity of the two-layer bootstrap required in simulation uncertainty quantification.
Abstract: In stochastic simulation, input uncertainty refers to the output variability arising from the statistical noise in specifying the input models. This uncertainty can be measured by a variance contribution in the output, which, in the nonparametric setting, is commonly estimated via the bootstrap. However, due to the convolution of the simulation noise and the input noise, the bootstrap consists of a two-layer sampling and typically requires substantial simulation effort. This paper investigates a subsampling framework to reduce the required effort, by leveraging the form of the variance and its estimation error in terms of the data size and the sampling requirement in each layer. We show how the total required effort can be reduced from an order bigger than the data size in the conventional approach to an order independent of the data size in subsampling. We explicitly identify the procedural specifications in our framework that guarantee relative consistency in the estimation, and the corresponding optimal simulation budget allocations. We substantiate our theoretical results with numerical examples.
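A toy rendering of the two layers and of the subsampling trick, under the simplifying assumption that the input-uncertainty variance scales like 1/(data size), so a size-s resample's variance can be rescaled by s/n. The simulator, statistic, and budgets are all invented, and inner replications are averaged to suppress simulation noise:

```python
import numpy as np

rng = np.random.default_rng(9)
data = rng.exponential(2.0, size=500)             # observed input data (size n)

def simulate(sample, reps=5000):
    """Stochastic simulation driven by an input model fitted to `sample`."""
    lam = sample.mean()                           # fitted exponential mean
    return rng.exponential(lam, size=reps).mean() # inner replications averaged

def input_variance(sample, B=200):
    """Outer bootstrap layer over the input data."""
    outs = [simulate(rng.choice(sample, size=len(sample))) for _ in range(B)]
    return np.var(outs, ddof=1)

n, s = len(data), 100
full = input_variance(data)                                  # conventional cost
sub = input_variance(rng.choice(data, size=s, replace=False)) * (s / n)
print(f"full-size bootstrap: {full:.5f}, subsampled estimate: {sub:.5f}")
```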

Journal ArticleDOI
TL;DR: This paper focuses on studying the convergence in distribution for a sequence of uncertain random variables without a common chance distribution.
Abstract: A random variable is a measurable function from a probability space to the set of real numbers, which is used to model randomness. An uncertain variable is a measurable function from an uncertainty space to the set of real numbers, which is used to describe uncertainty. However, randomness and uncertainty often simultaneously appear in a complex system. The uncertain random variable provides a useful tool to handle such a hybrid case. This concept integrates random variables and uncertain variables into a broader view. For uncertain random variables, a basic and important topic is to discuss the convergence of their sequences. Specifically, this paper focuses on studying the convergence in distribution for a sequence of uncertain random variables without a common chance distribution.
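For reference, the notion studied can be written as follows (standard chance-theory notation, assumed rather than quoted from the paper), where Ch is the chance measure and Φ_n, Φ are the chance distributions of ξ_n, ξ:

```latex
\Phi_n(x) = \mathrm{Ch}\{\xi_n \le x\}, \qquad
\xi_n \xrightarrow{\,d\,} \xi
\;\Longleftrightarrow\;
\lim_{n\to\infty} \Phi_n(x) = \Phi(x)
\quad \text{at every continuity point } x \text{ of } \Phi .
```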

Journal ArticleDOI
TL;DR: The selected-node stochastic simulation algorithm (snSSA), which allows us to exclusively simulate an arbitrary, selected subset of molecular species of a possibly large and complex reaction network, based on an analytical elimination of chemical species, thereby avoiding explicit simulation of the associated chemical events.
Abstract: Stochastic simulations of biochemical networks are of vital importance for understanding complex dynamics in cells and tissues. However, existing methods to perform such simulations are associated with computational difficulties, and addressing them remains a daunting challenge. Here we introduce the selected-node stochastic simulation algorithm (snSSA), which allows us to exclusively simulate an arbitrary, selected subset of molecular species of a possibly large and complex reaction network. The algorithm is based on an analytical elimination of chemical species, thereby avoiding explicit simulation of the associated chemical events. These species are instead described continuously in terms of statistical moments derived from a stochastic filtering equation, resulting in a substantial speedup when compared to Gillespie’s stochastic simulation algorithm (SSA). Moreover, we show that statistics obtained via snSSA profit from a variance reduction, which can significantly lower the number of M...
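For contrast with snSSA, here is the baseline it accelerates: Gillespie's direct-method SSA on a birth-death process (production at rate k1, degradation at rate k2·x). Rates and horizon are illustrative:

```python
import numpy as np

rng = np.random.default_rng(10)

def ssa(k1=10.0, k2=0.1, x0=0, t_end=100.0):
    """Gillespie direct method for a birth-death process."""
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while t < t_end:
        a = np.array([k1, k2 * x])          # reaction propensities
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)      # time to next reaction
        if rng.uniform() < a[0] / a0:       # choose which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = ssa()
print(states[-5:], states.mean())           # stationary mean near k1/k2 = 100
```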

Proceedings ArticleDOI
TL;DR: It is shown that rollout fused with OCBA performs competitively with respect to rollout with total equal allocation (TEA) while using only 5-10% of its simulation budget, which is a crucial step towards addressing large-scale community recovery problems following natural disasters.
Abstract: Computation of optimal recovery decisions for community resilience assurance post-hazard is a combinatorial decision-making problem under uncertainty. It involves solving a large-scale optimization problem, which is significantly aggravated by the introduction of uncertainty. In this paper, we draw upon established tools from multiple research communities to provide an effective solution to this challenging problem. We provide a stochastic model of damage to the water network (WN) within a testbed community following a severe earthquake and compute near-optimal recovery actions for restoration of the water network. We formulate this stochastic decision-making problem as a Markov Decision Process (MDP), and solve it using a popular class of heuristic algorithms known as rollout. A simulation-based representation of MDPs is utilized in conjunction with rollout and the Optimal Computing Budget Allocation (OCBA) algorithm to address the resulting stochastic simulation optimization problem. Our method employs non-myopic planning with efficient use of the simulation budget. We show, through simulation results, that rollout fused with OCBA performs competitively with respect to rollout with total equal allocation (TEA) while using a meagre simulation budget of 5-10% of that of rollout with TEA, which is a crucial step towards addressing large-scale community recovery problems following natural disasters.
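A minimal rollout sketch on an invented repair toy problem: each candidate action is scored by simulating a greedy base policy forward M times and averaging; the fixed per-action budget M stands in for the OCBA-driven allocation the paper uses:

```python
import numpy as np

rng = np.random.default_rng(11)
N_COMPONENTS, P_REPAIR = 5, 0.7              # repairs succeed stochastically

def step(state, action):
    s = state.copy()
    if s[action] == 0 and rng.uniform() < P_REPAIR:
        s[action] = 1
    return s, -float((s == 0).sum())         # reward: minus components still damaged

def base_policy(state):
    damaged = np.flatnonzero(state == 0)
    return damaged[0] if len(damaged) else 0

def rollout_value(state, action, horizon=10, M=30):
    total = 0.0
    for _ in range(M):                        # M simulated trajectories per action
        s, r = step(state, action)
        total += r
        for _ in range(horizon - 1):
            s, rr = step(s, base_policy(s))
            total += rr
    return total / M

state = np.zeros(N_COMPONENTS, dtype=int)     # everything damaged
values = [rollout_value(state, a) for a in range(N_COMPONENTS)]
print(np.round(values, 2), "-> choose action", int(np.argmax(values)))
```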

Journal ArticleDOI
31 Jan 2018
TL;DR: This study compares the efficiency and limitations of several available implementations of network-based and -free stochastic simulation approaches, to allow for an informed selection of the implementation and methodology for specific biochemical modeling applications.
Abstract: Stochastic simulation has been widely used to model the dynamics of biochemical reaction networks. Several algorithms have been proposed that are exact solutions of the chemical master equation, following the work of Gillespie. These stochastic simulation approaches can be broadly classified into two categories: network-based and -free simulation. The network-based approach requires that the full network of reactions be established at the start, while the network-free approach is based on reaction rules that encode classes of reactions, and by applying rule transformations, it generates reaction events as they are needed without ever having to derive the entire network. In this study, we compare the efficiency and limitations of several available implementations of these two approaches. The results allow for an informed selection of the implementation and methodology for specific biochemical modeling applications.

Journal ArticleDOI
TL;DR: In this paper, a framework of spectral representation-based dimension reduction for simulating multivariate non-stationary stochastic ground motion processes is addressed, by means of introducing random functions serving as constraints correlating with the orthogonal random variables in the original spectral representation scheme.
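For orientation, the stationary single-variate spectral representation that such schemes extend reads X(t) = Σ_k √(2 S(ω_k) Δω) cos(ω_k t + φ_k) with i.i.d. uniform phases; the sketch below uses an assumed Kanai–Tajimi-type spectrum, whereas the paper treats the multivariate non-stationary case with constrained random functions:

```python
import numpy as np

rng = np.random.default_rng(12)

w = np.linspace(0.1, 30.0, 300)              # frequency grid (rad/s)
dw = w[1] - w[0]
wg, zg, S0 = 5 * np.pi, 0.6, 0.01            # assumed ground-filter parameters
S = S0 * (wg**4 + (2 * zg * wg * w) ** 2) / ((wg**2 - w**2) ** 2 + (2 * zg * wg * w) ** 2)

t = np.arange(0.0, 20.0, 0.01)
phi = rng.uniform(0, 2 * np.pi, size=len(w))  # i.i.d. random phases
X = (np.sqrt(2 * S * dw) * np.cos(np.outer(t, w) + phi)).sum(axis=1)

print(X.std(), np.sqrt(S.sum() * dw))         # sample vs target standard deviation
```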

Journal ArticleDOI
TL;DR: In this article, the authors proposed an indirect method for estimating the cross-correlations when simulating precipitation at multiple stations and used the full relationship between the correlation of the observed data and the normally transformed data.
Abstract: Multisite stochastic simulations of daily precipitation have been widely employed in hydrologic analyses for climate change assessment and agricultural model inputs. Recently, a copula model with a gamma marginal distribution has become one of the common approaches for simulating precipitation at multiple sites. Here, we tested the correlation structure of the copula modeling. The results indicate that there is a significant underestimation of the correlation in the simulated data compared to the observed data. Therefore, we proposed an indirect method for estimating the cross-correlations when simulating precipitation at multiple stations. We used the full relationship between the correlation of the observed data and the normally transformed data. Although this indirect method offers certain improvements in preserving the cross-correlations between sites in the original domain, the method was not reliable in application. Therefore, we further improved a simulation-based method (SBM) that was developed to model the multisite precipitation occurrence. The SBM preserved well the cross-correlations of the original domain. The SBM method provides around 0.2 better cross-correlation than the direct method and around 0.1 degree better than the indirect method. The three models were applied to the stations in the Nakdong River basin, and the SBM was the best alternative for reproducing the historical cross-correlation. The direct method significantly underestimates the correlations among the observed data, and the indirect method appeared to be unreliable.
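A sketch of the indirect estimation step, assuming gamma marginals: because imposing correlation in the normal domain and back-transforming deflates it, Monte Carlo maps each Gaussian-domain ρ_z to the resulting original-domain ρ_x, and the map is inverted to find the ρ_z to impose. Marginal parameters and the grid are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)

def rho_x(rho_z, a=0.8, scale=10.0, n=100_000):
    """Original-domain correlation produced by Gaussian-domain rho_z."""
    z1 = rng.standard_normal(n)
    z2 = rho_z * z1 + np.sqrt(1 - rho_z ** 2) * rng.standard_normal(n)
    x1 = stats.gamma.ppf(stats.norm.cdf(z1), a=a, scale=scale)
    x2 = stats.gamma.ppf(stats.norm.cdf(z2), a=a, scale=scale)
    return np.corrcoef(x1, x2)[0, 1]

grid = np.linspace(0.0, 0.99, 11)
curve = np.array([rho_x(r) for r in grid])    # rho_z -> rho_x relationship
target = 0.5
rho_z_needed = np.interp(target, curve, grid) # invert the relationship
print(f"impose rho_z = {rho_z_needed:.3f} in the normal domain for rho_x = {target}")
```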

Journal ArticleDOI
TL;DR: A simple and flexible package, BioSimulator.jl, for implementing the Gillespie algorithm, τ-leaping, and related stochastic simulation algorithms based on Markov chain theory, to provide scientists across domains with fast, user-friendly simulation tools.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed to combine adaptive filters with an overlap strategy along a raster path and an efficient conditioning method to develop an algorithm for reservoir simulation with high accuracy and continuity.
Abstract: Multiple-point geostatistical simulation is used to simulate the spatial structures of geological phenomena. In contrast to conventional two-point variogram-based geostatistical methods, the multiple-point approach is capable of simulating complex spatial patterns, shapes, and structures normally observed in geological media. A commonly used pattern-based multiple-point geostatistical simulation algorithm is called FILTERSIM. In the conventional FILTERSIM algorithm, the patterns identified in training images are transformed into filter score space using fixed filters that are neither dependent on the training images nor on the characteristics of the patterns extracted from them. In this paper, we introduce two new methods, one for geostatistical simulation and another for conditioning the results. First, new filters are designed using principal component analysis in such a way as to include most structural information specific to the governing training images, resulting in the selection of closer patterns in the filter score space. We then propose to combine adaptive filters with an overlap strategy along a raster path and an efficient conditioning method to develop an algorithm for reservoir simulation with high accuracy and continuity. We also combine image quilting with this algorithm to substantially improve connectivity. The proposed method, which we call the random partitioning with adaptive filters simulation method, can be used both for continuous and discrete variables. The results of the proposed method show a significant improvement in recovering the expected shapes and structural continuity in the final simulated realizations as compared to those of the conventional FILTERSIM algorithm, and the algorithm is more than ten times faster than FILTERSIM because of the raster path and the small overlap, especially when image quilting is used.
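A hedged sketch of the adaptive-filter construction: principal components of the training-image patterns themselves serve as filters, so the filter scores retain the most pattern variance (in contrast to FILTERSIM's fixed filters). The synthetic training image and patch settings are invented:

```python
import numpy as np

rng = np.random.default_rng(14)

# Crude synthetic binary training image with horizontal "channel" streaks.
ti = (rng.uniform(size=(200, 200)) < 0.3).astype(float)
ti = np.round((ti + np.roll(ti, 1, axis=1) + np.roll(ti, 2, axis=1)) / 3)

P = 9                                            # patch side -> 81-dim patterns
patches = np.array([ti[i:i + P, j:j + P].ravel()
                    for i in range(0, 190, 5) for j in range(0, 190, 5)])
patches -= patches.mean(axis=0)

# PCA via SVD: rows of Vt are the adaptive filters, ordered by variance.
U, s, Vt = np.linalg.svd(patches, full_matrices=False)
filters = Vt[:6]                                 # keep 6 filters (vs 6 fixed ones)
scores = patches @ filters.T                     # pattern coordinates for search
print(filters.shape, scores.shape)
```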

Journal ArticleDOI
Seon Han Choi, Tag Gon Kim
TL;DR: The proposed algorithm evaluates an uncertainty to assess whether the observed best design is truly optimal, based on a hypothesis test, and conservatively allocates additional simulation resources to reduce uncertainty with an intuitive allocation rule in each iteration of a sequential procedure.
Abstract: This paper proposes an efficient ranking and selection algorithm for a stochastic simulation model. The proposed algorithm evaluates an uncertainty to assess whether the observed best design is truly optimal, based on a hypothesis test. Then, it conservatively allocates additional simulation resources to reduce uncertainty with an intuitive allocation rule in each iteration of a sequential procedure. This conservative allocation provides a high robustness to noise for the algorithm. The results of several experiments demonstrated its improved performance compared to the other algorithms in the literature. The algorithm can be an efficient way to solve optimization problems in real-world systems where significant noise exists.
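A toy sequential procedure in this spirit, using a Welch t-test between the observed best design and each rival and allocating extra replications only where the comparison remains statistically uncertain; the allocation rule and thresholds are plain illustrations, not the paper's algorithm:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(15)
true_means = np.array([1.0, 1.2, 1.25, 0.9])     # unknown to the algorithm

def replicate(design, n):
    return rng.normal(true_means[design], 1.0, size=n)

samples = [list(replicate(d, 10)) for d in range(4)]
for _ in range(20):                               # sequential iterations
    means = [np.mean(s) for s in samples]
    best = int(np.argmax(means))
    for d in range(4):
        if d == best:
            continue
        _, p = stats.ttest_ind(samples[best], samples[d], equal_var=False)
        if p > 0.05:                              # comparison still uncertain
            samples[d].extend(replicate(d, 5))    # allocate more replications
            samples[best].extend(replicate(best, 5))

print([len(s) for s in samples],
      "selected:", int(np.argmax([np.mean(s) for s in samples])))
```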