Topic

Stochastic process

About: Stochastic process is a research topic. Over the lifetime, 31,227 publications have been published within this topic, receiving 898,736 citations. The topic is also known as: random process & stochastic processes.


Papers
Book
01 Jan 1968
TL;DR: This book covers filtering theory and its applications, including filter theory and modeling techniques for free-flight and powered-flight navigation, error analyses, and sub-optimal modeling.
Abstract: Part I. Theory: Ordinary differential equations and stability; Random processes and stochastic models; Observability and controllability; Filtering theory; Global theory of filtering; Stochastic stability; Optimal filtering for correlated noise processes; Approximate optimal non-linear filtering; Optimum filtering for discrete time random processes; Stochastic control; Open questions and historical comments. Part II. Applications: Application to navigation; Applications of filter theory and modeling techniques; Free flight and powered flight navigation; Error analyses and sub-optimal modeling; Errors in the filtering process. Appendix A. Least squares curve fitting; Appendix B. Probability review; References; Appendix C. The Riccati equation and its bounds; Appendix D. Further references; Index.

445 citations
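
The book above treats, among other subjects, optimum filtering for discrete-time random processes. Purely as an illustration of that topic (the code is not from the book), the following Python sketch runs a scalar discrete-time Kalman filter on a random-walk state observed in Gaussian noise; the noise variances q and r and the simulated signal are made-up example values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar model (assumed, not from the book): random-walk state
# x_k = x_{k-1} + w_k observed as y_k = x_k + v_k with Gaussian noise.
q, r = 0.01, 0.25      # process and measurement noise variances (example values)
n_steps = 200

# Simulate one sample path of the state and its noisy measurements.
x = np.cumsum(rng.normal(scale=np.sqrt(q), size=n_steps))
y = x + rng.normal(scale=np.sqrt(r), size=n_steps)

# Standard discrete-time Kalman filter recursion for this scalar model.
x_hat, p = 0.0, 1.0    # initial state estimate and its error variance
estimates = []
for y_k in y:
    p = p + q                                # predict: random walk leaves the mean unchanged
    k_gain = p / (p + r)                     # Kalman gain
    x_hat = x_hat + k_gain * (y_k - x_hat)   # update with the new measurement
    p = (1.0 - k_gain) * p
    estimates.append(x_hat)

print(f"RMSE of raw measurements:   {np.sqrt(np.mean((y - x) ** 2)):.3f}")
print(f"RMSE of filtered estimates: {np.sqrt(np.mean((np.array(estimates) - x) ** 2)):.3f}")
```

The filtered RMSE should typically come out well below the raw-measurement RMSE, which is the point of the recursion.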

Journal Article / DOI
TL;DR: This article describes two important problems that motivate the study of efficient gradient estimation algorithms, presents the likelihood ratio gradient estimator in a general setting in which the essential idea is most transparent, and derives likelihood-ratio gradient estimators for both time-homogeneous and non-time-homogeneous discrete-time Markov chains.
Abstract: Consider a computer system having a CPU that feeds jobs to two input/output (I/O) devices having different speeds. Let t be the fraction of jobs routed to the first I/O device, so that 1 - t is the fraction routed to the second. Suppose that a = a(t) is the steady-state amount of time that a job spends in the system. Given that t is a decision variable, a designer might wish to minimize a(t) over t. Since a(·) is typically difficult to evaluate analytically, Monte Carlo optimization is an attractive methodology. By analogy with deterministic mathematical programming, efficient Monte Carlo gradient estimation is an important ingredient of simulation-based optimization algorithms. As a consequence, gradient estimation has recently attracted considerable attention in the simulation community. It is our goal, in this article, to describe one efficient method for estimating gradients in the Monte Carlo setting, namely the likelihood ratio method (also known as the efficient score method). This technique has been previously described (in less general settings than those developed in this article) in [6, 16, 18, 21]. An alternative gradient estimation procedure is infinitesimal perturbation analysis; see [11, 12] for an introduction. While it is typically more difficult to apply to a given application than the likelihood ratio technique of interest here, it often turns out to be statistically more accurate. In this article, we first describe two important problems which motivate our study of efficient gradient estimation algorithms. Next, we will present the likelihood ratio gradient estimator in a general setting in which the essential idea is most transparent. The section that follows then specializes the estimator to discrete-time stochastic processes. We derive likelihood-ratio gradient estimators for both time-homogeneous and non-time-homogeneous discrete-time Markov chains. Later, we discuss likelihood ratio gradient estimation in continuous time. As examples of our analysis, we present the gradient estimators for time-homogeneous continuous-time Markov chains; non-time-homogeneous continuous-time Markov chains; semi-Markov processes; and generalized semi-Markov processes. (The analysis throughout these sections assumes the performance measure that defines a(t) corresponds to a terminating simulation.) Finally, we conclude the article with a brief discussion of the basic issues that arise in extending the likelihood ratio gradient estimator to steady-state performance measures.

442 citations
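
The abstract above describes the likelihood ratio (score function) gradient estimator. The sketch below applies the same idea to a stripped-down, single-step version of the routing example: each job goes to I/O device 1 with probability t, the exponential service-time means mean_fast and mean_slow are invented for the illustration, and the gradient of the expected service time with respect to t is estimated by weighting each replication's cost with the score d/dt log P(route; t). It is a toy model, not the paper's estimators for Markov or semi-Markov processes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy routing model (assumed, not taken from the paper): a job is sent to
# I/O device 1 with probability t and to device 2 with probability 1 - t.
t = 0.3
mean_fast, mean_slow = 1.0, 3.0   # assumed mean service times of the two devices
n = 200_000                       # number of independent replications

routed_to_1 = rng.random(n) < t
service = np.where(routed_to_1,
                   rng.exponential(mean_fast, n),
                   rng.exponential(mean_slow, n))

# Likelihood ratio gradient estimator: multiply each replication's cost by the
# score d/dt log P(route; t), which is 1/t for device 1 and -1/(1 - t) for device 2.
score = np.where(routed_to_1, 1.0 / t, -1.0 / (1.0 - t))
grad_estimate = service * score

# For this toy model the exact gradient of E[service time] w.r.t. t is
# mean_fast - mean_slow = -2, so the estimate should land close to -2.
print(f"LR gradient estimate: {grad_estimate.mean():.3f} +/- "
      f"{1.96 * grad_estimate.std(ddof=1) / np.sqrt(n):.3f} (exact: -2.0)")
```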

Journal Article / DOI
TL;DR: In this article, a nonlinear stochastic process is presented that, for each realization and at large times, reproduces Lüders' projection postulate, while the corresponding density operator undergoes a linear evolution reproducing von Neumann's projection postulate.
Abstract: A nonlinear stochastic process is presented that, for each realization and for large times, reproduces Lüders' projection postulate. The corresponding density operator undergoes a linear evolution reproducing von Neumann's projection postulate. The violation of the Bell inequality, for instance, is described with the two apparatuses acting independently on the composed system.

442 citations
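
The paper above presents a specific nonlinear stochastic process for quantum measurement; the toy simulation below only mimics its qualitative structure and should not be read as the model of the paper. It evolves the excited-state population p of a two-level system under an assumed nonlinear diffusion dp = lam * p * (1 - p) * dW: every individual realization is driven to p = 0 or p = 1, a projection-postulate-like collapse, while the ensemble average of p stays at its initial value, just as a linearly evolving density-matrix population would.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy collapse model (not the process studied in the paper):
#     dp = lam * p * (1 - p) * dW
# simulated with an Euler-Maruyama scheme for many independent realizations.
lam, dt, n_steps, n_runs = 2.0, 1e-3, 20_000, 2_000
p0 = 0.3                                  # initial population of state |1>

p = np.full(n_runs, p0)
for _ in range(n_steps):
    dw = rng.normal(scale=np.sqrt(dt), size=n_runs)
    p = np.clip(p + lam * p * (1.0 - p) * dw, 0.0, 1.0)

# Each run ends up (numerically) at 0 or 1; the fraction ending near 1 matches
# p0, and the ensemble mean stays at p0 throughout, since p is a martingale.
print(f"ensemble mean of p:        {p.mean():.3f}  (initial value {p0})")
print(f"fraction collapsed to |1>: {np.mean(p > 0.5):.3f}  (approximately {p0})")
```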

Journal Article / DOI
TL;DR: An "orthogonal" decomposition of an exponential family or mixture family of probability distributions, which has a natural hierarchical structure, is given; it is important for extracting intrinsic interactions in the firing patterns of an ensemble of neurons and for estimating their functional connections.
Abstract: An exponential family or mixture family of probability distributions has a natural hierarchical structure. This paper gives an "orthogonal" decomposition of such a system based on information geometry. A typical example is the decomposition of stochastic dependency among a number of random variables. In general, they have a complex structure of dependencies. Pairwise dependency is easily represented by correlation, but it is more difficult to measure effects of pure triplewise or higher order interactions (dependencies) among these variables. Stochastic dependency is decomposed quantitatively into an "orthogonal" sum of pairwise, triplewise, and further higher order dependencies. This gives a new invariant decomposition of joint entropy. This problem is important for extracting intrinsic interactions in firing patterns of an ensemble of neurons and for estimating its functional connections. The orthogonal decomposition is given in a wide class of hierarchical structures including both exponential and mixture families. As an example, we decompose the dependency in a higher order Markov chain into a sum of those in various lower order Markov chains.

441 citations
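
The abstract above separates pairwise correlation from pure triplewise interaction. As a small, self-contained illustration (the distribution is an assumed toy example, not data or code from the paper), the sketch below builds a smoothed parity code over three binary "neurons": every pairwise covariance vanishes, yet the third-order log-linear interaction term theta_123, the kind of e-coordinate such a hierarchical decomposition isolates, is clearly nonzero.

```python
import itertools
import numpy as np

# Assumed toy distribution over three binary neurons: a parity code mixed with
# a uniform background, so pairs look independent but a pure triplewise
# interaction remains.
eps = 0.2
p = {}
for x in itertools.product((0, 1), repeat=3):
    if sum(x) % 2 == 0:        # even-parity firing patterns get the extra mass
        p[x] = (1 - eps) / 4 + eps / 8
    else:                      # odd-parity patterns only get the uniform background
        p[x] = eps / 8

def cov(i, j):
    """Pairwise covariance of neurons i and j under the toy distribution."""
    e_i = sum(prob * x[i] for x, prob in p.items())
    e_j = sum(prob * x[j] for x, prob in p.items())
    e_ij = sum(prob * x[i] * x[j] for x, prob in p.items())
    return e_ij - e_i * e_j

print("pairwise covariances:", [round(cov(i, j), 6) for i, j in [(0, 1), (0, 2), (1, 2)]])

# Pure triplewise interaction: the third-order log-linear term
#   theta_123 = log[ p111 p100 p010 p001 / (p110 p101 p011 p000) ]
theta_123 = (np.log(p[1, 1, 1]) + np.log(p[1, 0, 0]) + np.log(p[0, 1, 0]) + np.log(p[0, 0, 1])
             - np.log(p[1, 1, 0]) - np.log(p[1, 0, 1]) - np.log(p[0, 1, 1]) - np.log(p[0, 0, 0]))
print("triplewise interaction theta_123:", round(float(theta_123), 3))
```

With eps = 0.2, all three pairwise covariances are exactly zero while theta_123 comes out near -8.8.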


Network Information
Related Topics (5)
Nonlinear system: 208.1K papers, 4M citations, 89% related
Robustness (computer science): 94.7K papers, 1.6M citations, 86% related
Estimator: 97.3K papers, 2.6M citations, 86% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 85% related
Differential equation: 88K papers, 2M citations, 84% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    159
2022    355
2021    985
2020    1,151
2019    1,119
2018    1,115