
Showing papers on "Probability distribution published in 1995"


Journal ArticleDOI
TL;DR: In this article, an alternative approach is presented that relies on the assumption that areas of true neural activity will tend to stimulate signal changes over contiguous pixels; it can improve statistical power by as much as fivefold over techniques that rely solely on adjusting per-pixel false positive probabilities.
Abstract: The typical functional magnetic resonance (fMRI) study presents a formidable problem of multiple statistical comparisons (i.e., > 10,000 in a 128 x 128 image). To protect against false positives, investigators have typically relied on decreasing the per pixel false positive probability. This approach incurs an inevitable loss of power to detect statistically significant activity. An alternative approach, which relies on the assumption that areas of true neural activity will tend to stimulate signal changes over contiguous pixels, is presented. If one knows the probability distribution of such cluster sizes as a function of per pixel false positive probability, one can use cluster-size thresholds independently to reject false positives. Both Monte Carlo simulations and fMRI studies of human subjects have been used to verify that this approach can improve statistical power by as much as fivefold over techniques that rely solely on adjusting per pixel false positive probabilities.
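
The cluster-size idea lends itself to a short Monte Carlo sketch: simulate pure-noise images, threshold at the per-pixel false positive probability, and record the largest contiguous cluster to build a null distribution of cluster sizes. The sketch below assumes independent Gaussian noise (real fMRI noise is spatially correlated, which the paper's simulations account for), and all parameter values are illustrative.

```python
import numpy as np
from scipy import ndimage
from scipy.stats import norm

def max_cluster_null(shape=(128, 128), per_pixel_p=0.01, n_sims=500, seed=0):
    """Monte Carlo null distribution of the largest supra-threshold cluster
    in pure noise, at a given per-pixel false positive probability."""
    rng = np.random.default_rng(seed)
    z_cut = norm.ppf(1.0 - per_pixel_p)
    max_sizes = np.empty(n_sims, dtype=int)
    for i in range(n_sims):
        mask = rng.standard_normal(shape) > z_cut
        labels, n_clusters = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, np.arange(1, n_clusters + 1))
        max_sizes[i] = int(sizes.max()) if n_clusters else 0
    return max_sizes

# A cluster-size threshold holding the image-wise false positive rate near 5%:
null = max_cluster_null()
print("reject clusters smaller than", int(np.percentile(null, 95)) + 1, "pixels")
```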

3,094 citations


Proceedings ArticleDOI
Reinhard Kneser, Hermann Ney
09 May 1995
TL;DR: This paper proposes to use distributions which are especially optimized for the task of back-off, which are quite different from the probability distributions that are usually used for backing-off.
Abstract: In stochastic language modeling, backing-off is a widely used method to cope with the sparse data problem. In case of unseen events this method backs off to a less specific distribution. In this paper we propose to use distributions which are especially optimized for the task of backing-off. Two different theoretical derivations lead to distributions which are quite different from the probability distributions that are usually used for backing-off. Experiments show an improvement of about 10% in terms of perplexity and 5% in terms of word error rate.
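
The optimized back-off distributions proposed here are the basis of what is now known as Kneser-Ney smoothing: the lower-order (back-off) distribution is built from continuation counts, i.e., how many distinct contexts a word follows, rather than from raw frequencies. A minimal sketch for a bigram model on a toy corpus (all data illustrative):

```python
from collections import Counter

def continuation_unigram(bigrams):
    """Back-off unigram from continuation counts: the number of distinct
    left contexts each word appears after, normalized."""
    contexts = {}
    for w1, w2 in bigrams:
        contexts.setdefault(w2, set()).add(w1)
    total = sum(len(s) for s in contexts.values())
    return {w: len(s) / total for w, s in contexts.items()}

tokens = "san francisco san francisco san francisco new york old york is foggy".split()
bigrams = list(zip(tokens, tokens[1:]))

raw = Counter(w for _, w in bigrams)
raw_unigram = {w: c / sum(raw.values()) for w, c in raw.items()}
# "francisco" is frequent but only ever follows "san", so its back-off
# probability drops, while "york" (seen after two contexts) gains mass.
print(raw_unigram["francisco"], continuation_unigram(bigrams)["francisco"])
```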

1,768 citations


Journal ArticleDOI
06 Jul 1995-Nature
TL;DR: In this paper, it is shown that the scaling of the probability distribution of a particular economic index can be described by a non-Gaussian process with dynamics that, for the central part of the distribution, correspond to that predicted for a Lévy stable process.
Abstract: The large-scale dynamical properties of some physical systems depend on the dynamical evolution of a large number of nonlinearly coupled subsystems. Examples include systems that exhibit self-organized criticality1 and turbulence2,3. Such systems tend to exhibit spatial and temporal scaling behaviour, that is, power-law behaviour of a particular observable. Scaling is found in a wide range of systems, from geophysical4 to biological5. Here we explore the possibility that scaling phenomena occur in economic systems, especially when the economic system is one subject to precise rules, as is the case in financial markets6-8. Specifically, we show that the scaling of the probability distribution of a particular economic index, the Standard & Poor's 500, can be described by a non-Gaussian process with dynamics that, for the central part of the distribution, correspond to that predicted for a Lévy stable process9-11. Scaling behaviour is observed for time intervals spanning three orders of magnitude, from 1,000 min to 1 min, the latter being close to the minimum time necessary to perform a trading transaction in a financial market. In the tails of the distribution the fall-off deviates from that for a Lévy stable process and is approximately exponential, ensuring that (as one would expect for a price-difference distribution) the variance of the distribution is finite. The scaling exponent is remarkably constant over the six-year period (1984-89) of our data. This dynamical behaviour of the economic index should provide a framework within which to develop economic models.
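
The scaling analysis can be sketched numerically: for a Lévy stable process the probability density at the origin scales as P(0; Δt) ~ Δt^(-1/α), so regressing the central density of aggregated returns against the time interval recovers the exponent. The sketch below uses synthetic stable data from scipy rather than the S&P 500 series, with illustrative parameters:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(1)
alpha_true = 1.4
x = levy_stable.rvs(alpha_true, 0.0, size=2**16, random_state=rng)  # "1-min returns"

lags, p0 = [1, 2, 4, 8, 16, 32], []
for k in lags:
    z = x[: len(x) // k * k].reshape(-1, k).sum(axis=1)   # returns at Δt = k
    h = np.quantile(np.abs(z), 0.05)                      # narrow central band
    p0.append(0.05 / (2 * h))                             # ≈ density at the origin
slope = np.polyfit(np.log(lags), np.log(p0), 1)[0]
print("alpha estimate:", -1.0 / slope)                    # P(0; Δt) ~ Δt^(-1/α)
```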

1,689 citations


Book
06 Feb 1995
TL;DR: Theoretical Probability Distributions, Empirical Distributions and Exploratory Data Analysis, and Methods for Multivariate Data are reviewed.
Abstract: Introduction; Review of Probability; Empirical Distributions and Exploratory Data Analysis; Theoretical Probability Distributions; Hypothesis Testing; Statistical Weather Forecasting; Forecast Verification; Time Series; Methods for Multivariate Data; Chapter Exercises; Appendices: Example Data Sets, Selected Probability Tables, Answers to Exercises; References; Subject Index

1,531 citations


Journal ArticleDOI
TL;DR: In inverse problems, obtaining a maximum likelihood model is usually not sufficient, as the theory linking data with model parameters is nonlinear and the a posteriori probability in the model space may not be easy to describe.
Abstract: Probabilistic formulation of inverse problems leads to the definition of a probability distribution in the model space. This probability distribution combines a priori information with new information obtained by measuring some observable parameters (data). As, in the general case, the theory linking data with model parameters is nonlinear, the a posteriori probability in the model space may not be easy to describe (it may be multimodal, some moments may not be defined, etc.). When analyzing an inverse problem, obtaining a maximum likelihood model is usually not sufficient, as we normally also wish to have information...

1,124 citations


Book
07 Aug 1995
TL;DR: In this book, the authors present a mathematical model of a GA, multimodal fitness functions, genetic drift, GA with sharing, repeat (parallel) GA, uncertainty estimates, and evolutionary programming, a variant of GA.
Abstract: Part 1, Preliminary statistics: random variables; random numbers; probability; probability distribution, distribution function and density function; joint and marginal probability distributions; mathematical expectation, moments, variances and covariances; conditional probability; Monte Carlo integration; importance sampling; stochastic processes; Markov chains; homogeneous, inhomogeneous, irreducible and aperiodic Markov chains; the limiting probability. Part 2, Direct, linear and iterative-linear inverse methods: direct inversion methods; model-based inversion methods; linear/linearized inverse methods; iterative linear methods for quasi-linear problems; Bayesian formulation; solution using probabilistic formulation. Part 3, Monte Carlo methods: enumerative or grid-search techniques; Monte Carlo inversion; hybrid Monte Carlo-linear inversion; directed Monte Carlo methods. Part 4, Simulated annealing methods: Metropolis algorithm; heat bath algorithm; simulated annealing without rejected moves; fast simulated annealing; very fast simulated reannealing; mean field annealing; using SA in geophysical inversion. Part 5, Genetic algorithms: a classical GA; schemata and the fundamental theorem of genetic algorithms; problems; combining elements of SA into a new GA; a mathematical model of a GA; multimodal fitness functions, genetic drift, GA with sharing, and repeat (parallel) GA; uncertainty estimates; evolutionary programming, a variant of GA. Part 6, Geophysical applications of SA and GA: 1-D seismic waveform inversion; pre-stack migration velocity estimation; inversion of resistivity sounding data for 1-D earth models; inversion of resistivity profiling data for 2-D earth models; inversion of magnetotelluric sounding data for 1-D earth models; stochastic reservoir modelling; seismic deconvolution by mean field annealing and Hopfield network. Part 7, Uncertainty estimation: methods of numerical integration; simulated annealing, the Gibbs' sampler; genetic algorithm, the parallel Gibbs' sampler; numerical examples.
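
As a flavor of Part 4, here is a minimal Metropolis simulated annealing loop applied to a toy nonlinear inversion; the cooling schedule, step size, and misfit function are illustrative, not the book's recipes:

```python
import numpy as np

def simulated_annealing(misfit, x0, lo, hi, n_iter=5000, t0=1.0, seed=0):
    """Metropolis SA: always accept downhill moves; accept uphill moves
    with probability exp(-Δmisfit / T), lowering T as iterations proceed."""
    rng = np.random.default_rng(seed)
    x, e = np.array(x0, dtype=float), misfit(x0)
    best_x, best_e = x.copy(), e
    for i in range(n_iter):
        temp = t0 / (1.0 + 0.01 * i)          # illustrative cooling schedule
        cand = np.clip(x + rng.normal(0.0, 0.1, size=x.shape), lo, hi)
        e_cand = misfit(cand)
        if e_cand < e or rng.random() < np.exp(-(e_cand - e) / temp):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x.copy(), e
    return best_x, best_e

# Toy "inversion" with a multimodal misfit: recover m from data d = sin(3m).
true_m = np.array([0.3, -0.7])
data = np.sin(3 * true_m)
print(simulated_annealing(lambda m: np.sum((np.sin(3 * m) - data) ** 2),
                          np.zeros(2), -1.0, 1.0))
```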

710 citations


Journal ArticleDOI
Philip Heidelberger
TL;DR: In this article, a survey of efficient techniques for estimating the probabilities of certain rare events in queueing and reliability models is presented, where the rare events of interest are long waiting times or buffer overflows in queueing systems and system failure events in reliability models of highly dependable computing systems.
Abstract: This paper surveys efficient techniques for estimating, via simulation, the probabilities of certain rare events in queueing and reliability models. The rare events of interest are long waiting times or buffer overflows in queueing systems, and system failure events in reliability models of highly dependable computing systems. The general approach to speeding up such simulations is to accelerate the occurrence of the rare events by using importance sampling. In importance sampling, the system is simulated using a new set of input probability distributions, and unbiased estimates are recovered by multiplying the simulation output by a likelihood ratio. Our focus is on describing asymptotically optimal importance sampling techniques. Using asymptotically optimal importance sampling, the number of samples required to get accurate estimates grows slowly compared to the rate at which the probability of the rare event approaches zero. In practice, this means that run lengths can be reduced by many orders of magnitude, compared to standard simulation. In certain cases, asymptotically optimal importance sampling results in estimates having bounded relative error. With bounded relative error, only a fixed number of samples are required to get accurate estimates, no matter how rare the event of interest is. The queueing systems studied include simple queues (e.g., GI/GI/1), Jackson networks, discrete time queues with multiple autocorrelated arrival processes that arise in the analysis of Asynchronous Transfer Mode communications switches, and tree structured networks of such switches. Both Markovian and non-Markovian reliability models are treated.
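
The core mechanism is easy to demonstrate: simulate under a tilted distribution that makes the rare event common, and undo the bias with the likelihood ratio. A minimal sketch for the tail of a sum of exponentials (an illustrative stand-in for a long waiting time), using the standard exponential change of measure:

```python
import numpy as np

rng = np.random.default_rng(0)
n, b, n_samples = 30, 70.0, 100_000     # P(sum of 30 Exp(1) r.v.'s > 70)

# Standard Monte Carlo: the event is so rare that almost no samples hit it.
s = rng.exponential(1.0, size=(n_samples, n)).sum(axis=1)
print("naive estimate:", np.mean(s > b))

# Importance sampling: tilt Exp(1) to Exp(rate = n/b), making the event
# typical, then reweight each sample by the likelihood ratio
#   L = prod_i f(x_i)/g(x_i) = exp(-S) / (rate^n * exp(-rate * S)).
rate = n / b
x = rng.exponential(1.0 / rate, size=(n_samples, n))
s_tilt = x.sum(axis=1)
log_lr = -s_tilt - n * np.log(rate) + rate * s_tilt
print("importance-sampling estimate:", np.mean((s_tilt > b) * np.exp(log_lr)))
```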

582 citations


Journal ArticleDOI
TL;DR: Buridan is an implemented least-commitment planner for probabilistic planning problems; it is proved that the algorithm is both sound and complete, and the interplay between generating plans and evaluating them is described.

406 citations


Journal ArticleDOI
TL;DR: The need for training procedures that result in experts with relatively independent opinions or for aggregation methods that implicitly or explicitly model the dependence among the experts is reviewed.
Abstract: This article reviews statistical techniques for combining multiple probability distributions. The framework is that of a decision maker who consults several experts regarding some events. The exper...
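
Two standard aggregation rules from this literature are the linear and logarithmic opinion pools; a minimal sketch follows (weights and probabilities illustrative, with no modeling of dependence among the experts, which the article identifies as the hard part):

```python
import numpy as np

def linear_pool(ps, w):
    """Weighted arithmetic average of the experts' probability vectors."""
    return np.average(ps, axis=0, weights=w)

def log_pool(ps, w):
    """Weighted geometric average, renormalized; strongly penalizes events
    that any well-weighted expert considers nearly impossible."""
    g = np.exp(np.average(np.log(ps), axis=0, weights=w))
    return g / g.sum()

experts = np.array([[0.7, 0.2, 0.1],    # three experts' distributions
                    [0.5, 0.4, 0.1],    # over three mutually exclusive events
                    [0.6, 0.1, 0.3]])
w = [0.5, 0.3, 0.2]
print(linear_pool(experts, w))
print(log_pool(experts, w))
```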

405 citations


Journal Article
TL;DR: A brief numerical analysis of the model reveals that uncertainty can account for a large proportion of the costs of the morning commute.
Abstract: Existing models of the commuting time-of-day choice are extended in order to analyze the effect of uncertain travel times. Travel time includes a time-varying congestion component and a random element specified by a probability distribution. Results from the uniform and exponential probability distributions are compared, and the optimal "head start" time that the commuter chooses to account for travel time variability, i.e., a safety margin that determines the probability of arriving late for work, is derived. The model includes a one-time lateness penalty for arriving late as well as the per-minute penalties for early and late arrival that other investigators have included. It also generalizes earlier work by accounting for the time variation in the predictable component of congestion, which interacts with uncertainty in interesting ways. A brief numerical analysis of the model reveals that uncertainty can account for a large proportion of the costs of the morning commute.
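
The head-start choice can be sketched as a one-dimensional expected-cost minimization. Below, travel time is a fixed component plus a uniform random delay, and the cost combines per-minute early and late penalties with a lump-sum lateness penalty; all parameter values are illustrative, and the paper's closed-form results are replaced by brute-force simulation:

```python
import numpy as np

t0, b = 20.0, 10.0                      # fixed travel time, max uniform delay (min)
alpha, beta, theta = 0.5, 2.0, 5.0      # $/min early, $/min late, lump sum if late

def expected_cost(h, n=100_000, seed=0):
    """Expected cost of leaving h minutes before work start (the head start).
    Reusing the seed gives common random numbers across candidate h values."""
    rng = np.random.default_rng(seed)
    arrival = t0 + rng.uniform(0, b, n) - h    # arrival relative to work start
    return (alpha * np.maximum(-arrival, 0).mean()
            + beta * np.maximum(arrival, 0).mean()
            + theta * np.mean(arrival > 0))

hs = np.linspace(t0, t0 + b, 101)
costs = [expected_cost(h) for h in hs]
best = hs[int(np.argmin(costs))]
print(f"optimal head start ≈ {best:.1f} min, safety margin ≈ {best - t0:.1f} min")
```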

390 citations


Journal ArticleDOI
TL;DR: A programmatic procedure for establishing the stability of queueing networks and scheduling policies is developed; it establishes not only positive recurrence and the existence of a steady-state probability distribution, but also the geometric convergence of an exponential moment.
Abstract: We develop a programmatic procedure for establishing the stability of queueing networks and scheduling policies. The method uses linear or nonlinear programming to determine what is an appropriate quadratic functional to use as a Lyapunov function. If the underlying system is Markovian, our method establishes not only positive recurrence and the existence of a steady-state probability distribution, but also the geometric convergence of an exponential moment. We illustrate this method on several example problems.
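
The Lyapunov idea behind the procedure can be illustrated on a toy chain: for a stable single queue, a quadratic function has strictly negative expected drift outside a finite set. The sketch below just verifies that drift condition by enumeration; the paper's contribution, finding the quadratic via linear or nonlinear programming for networks, is not reproduced:

```python
# Toy discrete-time queue: one arrival per slot w.p. p; if nonempty,
# one departure w.p. q. Stable when p < q.
p, q = 0.3, 0.5

def drift(x, V=lambda y: y * y):
    """E[V(X_{t+1}) - V(X_t) | X_t = x], enumerating arrival/service outcomes."""
    d = 0.0
    for a, pa in ((1, p), (0, 1 - p)):
        for s, ps in ((1, q), (0, 1 - q)):
            x_next = max(x + a - (s if x > 0 else 0), 0)
            d += pa * ps * (V(x_next) - V(x))
    return d

for x in (0, 1, 5, 20, 100):
    print(x, drift(x))   # negative, and increasingly so, for large x
```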

Book
01 Feb 1995
TL;DR: In this book, the authors develop probability models for environmental processes, including estimation of the parameters of the lognormal distribution and a three-parameter lognormal model, and apply them to environmental problems.
Abstract: Random Processes: Stochastic Processes in the Environment; Structure of the Book. Theory of Probability: Probability Concepts; Probability Laws; Conditional Probability and Bayes' Theorem; Summary; Problems. Probability Models: Discrete Probability Models; Continuous Random Variables; Moments, Expected Value, and Central Tendency; Variance, Kurtosis, and Skewness; Analysis of Observed Data; Summary; Problems. Bernoulli Processes: Conditions for Bernoulli Process; Development of Model; Binomial Distribution; Applications to Environmental Problems; Computation of B(n,p); Problems. Poisson Processes: Conditions for Poisson Process; Development of Model; Poisson Distribution; Examples; Applications to Environmental Problems; Computation of P(λ,t); Problems. Diffusion and Dispersion of Pollutants: Wedge Machine; Particle Frame Machine; Plume Model; Summary and Conclusions; Problems. Normal Processes: Conditions for Normal Process; Development of Model; Confidence Intervals; Applications to Environmental Problems; Computation of N(μ,σ); Problems. Dilution of Pollutants: Deterministic Dilution; Stochastic Dilution; Applications to Environmental Problems; Summary and Conclusions; Problems. Lognormal Processes: Conditions for Lognormal Process; Development of Model; Lognormal Probability Model; Estimating Parameters of the Lognormal Distribution; Three-Parameter Lognormal Model; Statistical Theory of Rollback; Applications to Environmental Problems; Summary and Conclusions; Problems. Index.

Journal ArticleDOI
TL;DR: In this paper, the authors present a simple technique that gives slightly better bounds than these and that more importantly requires only limited independence among the random variables, thereby importing a variety of standard results to the case of limited independence for free.
Abstract: Chernoff-Hoeffding (CH) bounds are fundamental tools used in bounding the tail probabilities of the sums of bounded and independent random variables (r.v.'s). We present a simple technique that gives slightly better bounds than these and that more importantly requires only limited independence among the random variables, thereby importing a variety of standard results to the case of limited independence for free. Additional methods are also presented, and the aggregate results are sharp and provide a better understanding of the proof techniques behind these bounds. These results also yield improved bounds for various tail probability distributions and enable improved approximation algorithms for jobshop scheduling. The limited independence result implies that a reduced amount and weaker sources of randomness are sufficient for randomized algorithms whose analyses use the CH bounds, e.g., the analysis of randomized algorithms for random sampling and oblivious packet routing.
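
A quick numeric comparison of the Hoeffding tail bound P(S - n/2 >= εn) <= exp(-2nε²) against the empirical tail of a Bernoulli sum; the paper's point, that bounds of this type survive with only limited independence, is stated but not reconstructed here (fully independent bits are used below):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 200_000
dev = (rng.random((trials, n)) < 0.5).sum(axis=1) - n / 2   # S - E[S], fair coins

for eps in (0.05, 0.10, 0.15):
    empirical = np.mean(dev >= eps * n)
    bound = np.exp(-2 * n * eps**2)         # Chernoff-Hoeffding tail bound
    print(f"eps={eps:.2f}  empirical={empirical:.2e}  CH bound={bound:.2e}")
```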

Journal ArticleDOI
TL;DR: The statistical results of this study provide a quantitative estimate of the dependence of the accuracy on the system size and the value of the self-affine exponent.
Abstract: Reliability and accuracy of the determination of self-affine exponents are studied and quantified from the analysis of synthetic self-affine profiles and surfaces. The self-affine exponent is measured using different methods either relying on the determination of a "fractal dimension" (i.e., box counting and divider methods) or directly analyzing the self-affine exponent. The second group of methods includes the variable bandwidth, the first return and the multireturn probability distribution, and the power spectrum. The accuracy of all these methods is assessed in terms of the difference between an "input" self-affine exponent used for the synthetic construction and the "output" exponent measured by those different methods. The statistical results of this study provide a quantitative estimate of the dependence of the accuracy on the system size and the value of the self-affine exponent. Artifacts in the measurement of self-affine profiles or surfaces, misorientation, signal amplification, and local geometric filtering, which lead to biased estimates of the self-affine exponent, are also discussed.

Journal ArticleDOI
TL;DR: It is demonstrated that the breakdown of Galilean invariance is responsible for intermittency, the operator product expansion is established, and it is indicated how the effects of pressure can be turned on perturbatively.
Abstract: We develop exact field-theoretic methods to treat turbulence when the effect of pressure is negligible. We find explicit forms of certain probability distributions, demonstrate that the breakdown of Galilean invariance is responsible for intermittency, and establish the operator product expansion. We also indicate how the effects of pressure can be turned on perturbatively.

Proceedings Article
18 Aug 1995
TL;DR: This paper proposes a method for elicitation of probabilities from a domain expert that is non-invasive and accommodates whatever probabilistic information the expert is willing to state, expressing it in a canonical form that is then used to derive second-order probability distributions over the desired probabilities.
Abstract: Although the usefulness of belief networks for reasoning under uncertainty is widely accepted, obtaining the numerical probabilities that they require is still perceived as a major obstacle. Often not enough statistical data is available to allow for reliable probability estimation. Available information may not be directly amenable for encoding in the network. Finally, domain experts may be reluctant to provide numerical probabilities. In this paper, we propose a method for elicitation of probabilities from a domain expert that is non-invasive and accommodates whatever probabilistic information the expert is willing to state. We express all available information, whether qualitative or quantitative in nature, in a canonical form consisting of (in)equalities expressing constraints on the hyperspace of possible joint probability distributions. We then use this canonical form to derive second-order probability distributions over the desired probabilities.
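
The canonical-form idea can be sketched by brute force: sample candidate joint distributions, keep those satisfying the expert's (in)equality constraints, and read off the induced second-order distribution of any probability of interest. The constraints, prior, and domain below are illustrative, and rejection sampling stands in for whatever machinery the paper actually uses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint distribution over (Disease, Test) with four atoms, ordered
# [D&T, D&~T, ~D&T, ~D&~T]. Uniform prior over the probability simplex.
samples = rng.dirichlet(np.ones(4), size=200_000)

# Expert's stated constraints (illustrative): P(D) <= 0.1 and P(T|D) >= 0.8.
pD = samples[:, 0] + samples[:, 1]
kept = samples[(pD <= 0.1) & (samples[:, 0] / pD >= 0.8)]

# Second-order distribution over the derived quantity P(D|T):
post = kept[:, 0] / (kept[:, 0] + kept[:, 2])
print(f"P(D|T): mean={post.mean():.3f}, "
      f"90% interval=({np.quantile(post, 0.05):.3f}, {np.quantile(post, 0.95):.3f})")
```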

Journal ArticleDOI
TL;DR: Finite-sample estimators are presented for the entropy and other functions of a discrete probability distribution when the data are a finite sample drawn from that probability distribution.
Abstract: This paper addresses the problem of estimating a function of a probability distribution from a finite set of samples of that distribution. A Bayesian analysis of this problem is presented, the optimal properties of the Bayes estimators are discussed, and as an example of the formalism, closed form expressions for the Bayes estimators for the moments of the Shannon entropy function are derived. Then numerical results are presented that compare the Bayes estimator to the frequency-counts estimator for the Shannon entropy. We also present the closed form estimators, all derived elsewhere, for the mutual information, $\chi^2$ covariance, and some other statistics.
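
The contrast between the frequency-counts estimator and the Bayes estimator is easy to reproduce. Under a Dirichlet prior the posterior mean of the Shannon entropy has the closed form E[H|n] = ψ(N+A+1) − Σᵢ (nᵢ+α)/(N+A) · ψ(nᵢ+α+1), with A = Kα; the sketch below compares the two on a small sample (prior strength and alphabet size illustrative):

```python
import numpy as np
from scipy.special import digamma

def entropy_plugin(counts):
    """Frequency-counts (plug-in) estimator of Shannon entropy, in nats."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_bayes(counts, alpha=1.0):
    """Posterior-mean entropy under a symmetric Dirichlet(alpha) prior."""
    n = counts.astype(float)
    N, A = n.sum(), alpha * len(n)
    return digamma(N + A + 1) - np.sum((n + alpha) / (N + A) * digamma(n + alpha + 1))

rng = np.random.default_rng(0)
p_true = rng.dirichlet(np.ones(50))
true_H = -np.sum(p_true * np.log(p_true))
counts = np.bincount(rng.choice(50, size=100, p=p_true), minlength=50)
# The plug-in estimate is biased downward on small samples; the Bayes
# estimate typically corrects much of that bias.
print(true_H, entropy_plugin(counts), entropy_bayes(counts))
```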

Journal ArticleDOI
TL;DR: In this paper, the probability distribution for rotated, squeezed and shifted quadratures is expressed in terms of the Wigner function and the density operator in the coordinate representation, and the inverse transformation generalizing the homodyne detection formula is obtained.
Abstract: The probability distribution for rotated, squeezed and shifted quadratures is shown to be expressed in terms of the Wigner function (as well as in terms of the Q-function and density operator in the coordinate representation). The inverse transformation generalizing the homodyne detection formula is obtained.
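
The forward direction of the relation, from the Wigner function to the quadrature distribution, is just a rotated marginal: pr(x; θ) = ∫ W(x cos θ − p sin θ, x sin θ + p cos θ) dp. A numeric sketch for a squeezed-vacuum Gaussian Wigner function (ħ = 1 conventions; the squeezing parameter is illustrative):

```python
import numpy as np

r = 0.5  # squeezing parameter; r = 0 gives the vacuum state

def wigner(q, p):
    """Squeezed-vacuum Wigner function, a normalized Gaussian."""
    return np.exp(-np.exp(2 * r) * q**2 - np.exp(-2 * r) * p**2) / np.pi

def quadrature_dist(theta, x):
    """pr(x; θ) as the marginal of the rotated Wigner function."""
    p = np.linspace(-8, 8, 1601)
    q_rot = x[:, None] * np.cos(theta) - p[None, :] * np.sin(theta)
    p_rot = x[:, None] * np.sin(theta) + p[None, :] * np.cos(theta)
    return wigner(q_rot, p_rot).sum(axis=1) * (p[1] - p[0])

x = np.linspace(-6, 6, 301)
for theta in (0.0, np.pi / 2):
    pr = quadrature_dist(theta, x)
    var = np.sum(x**2 * pr) * (x[1] - x[0])
    print(theta, var)   # ≈ e^{-2r}/2 at θ=0 (squeezed), e^{2r}/2 at θ=π/2
```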

Journal ArticleDOI
TL;DR: This paper presents the theoretical development of a novel, decomposition-based approach for the optimization of design models involving stochastic parameters; the approach handles linear and convex nonlinear models with uncertainty described by arbitrary probability distribution functions.

Proceedings ArticleDOI
20 Jun 1995
TL;DR: An algorithm and representation-level theory of illusory contour shape and salience is presented, relying on the fact that the probability that a particle following a random walk will pass through a given position and orientation on a path joining two boundary fragments can be computed directly as the product of two vector-field convolutions.
Abstract: We describe an algorithm and representation-level theory of illusory contour shape and salience. Unlike previous theories, our model is derived from a single assumption: namely, that the prior probability distribution of boundary completion shape can be modeled by a random walk in a lattice whose points are positions and orientations in the image plane (i.e., the space which one can reasonably assume is represented by neurons of the mammalian visual cortex). Our model does not employ numerical relaxation or other explicit minimization, but instead relies on the fact that the probability that a particle following a random walk will pass through a given position and orientation on a path joining two boundary fragments can be computed directly as the product of two vector-field convolutions. We show that for the random walk we define, the maximum likelihood paths are curves of least energy, that is, on average, random walks follow paths commonly assumed to model the shape of illusory contours. A computer model is demonstrated on numerous illusory contour stimuli from the literature.
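
The random-walk prior is easy to approximate by sampling, even without the paper's convolution machinery: propagate direction-diffusing walkers from a "source" boundary fragment and from a "sink" fragment with reversed orientation, and multiply the two visit fields; the product peaks along likely completion paths. Everything below (grid size, diffusion rate, poses) is illustrative, and orientation is marginalized away for brevity:

```python
import numpy as np

def visit_field(x0, y0, th0, n_particles=20_000, n_steps=60,
                sigma=0.15, shape=(64, 64), seed=0):
    """Histogram of positions visited by random walks from pose (x0, y0, th0):
    unit-length steps whose heading diffuses by N(0, sigma^2) per step."""
    rng = np.random.default_rng(seed)
    field = np.zeros(shape)
    x = np.full(n_particles, float(x0))
    y = np.full(n_particles, float(y0))
    th = np.full(n_particles, float(th0))
    for _ in range(n_steps):
        th += rng.normal(0.0, sigma, n_particles)
        x += np.cos(th)
        y += np.sin(th)
        ix, iy = np.round(x).astype(int), np.round(y).astype(int)
        ok = (ix >= 0) & (ix < shape[0]) & (iy >= 0) & (iy < shape[1])
        np.add.at(field, (ix[ok], iy[ok]), 1.0)
    return field / n_particles

source = visit_field(10, 32, 0.0)          # fragment pointing right
sink = visit_field(54, 32, np.pi, seed=1)  # fragment pointing left (reversed)
completion = source * sink                 # peaks along likely illusory contours
print(np.unravel_index(completion.argmax(), completion.shape))
```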

ReportDOI
01 Mar 1995
TL;DR: The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values, and suggestions for treating structural uncertainty in submodels are presented.
Abstract: The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
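
A minimal sketch of the two ingredients: a Latin hypercube sampler and a variance-ratio importance indicator Var(E[Y|Xᵢ])/Var(Y) estimated by binning. The toy model and all sizes are illustrative, and the report's replicated-LHS construction and validation step are omitted:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """One point per equal-probability stratum in each dimension,
    jittered uniformly within its stratum."""
    strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    return (strata + rng.random((n, d))) / n

def importance(xs, y, bins=20):
    """Variance ratio Var(E[Y|X_i]) / Var(Y), estimated by binning X_i."""
    ratios = []
    for i in range(xs.shape[1]):
        idx = np.clip((xs[:, i] * bins).astype(int), 0, bins - 1)
        means = np.array([y[idx == b].mean() for b in range(bins)])
        weights = np.array([(idx == b).mean() for b in range(bins)])
        ratios.append(float(np.sum(weights * (means - y.mean())**2) / y.var()))
    return ratios

rng = np.random.default_rng(0)
x = latin_hypercube(10_000, 3, rng)
y = 5 * x[:, 0] + np.sin(6 * x[:, 1]) + 0.1 * x[:, 2]   # toy model prediction
print(importance(x, y))   # x0 dominates; the nonlinear x1 effect is still found
```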

Journal ArticleDOI
TL;DR: In this article, the authors modeled residential water use as a customer-server interaction often encountered in queueing theory and derived expressions for the mean, variance and probability distribution of the flow rate and the corresponding pipe Reynolds number at points along a dead-end trunk line.
Abstract: Residential water use is visualized as a customer-server interaction often encountered in queueing theory. Individual customers are assumed to arrive according to a nonhomogeneous Poisson process, then engage water servers for random lengths of time. Busy servers are assumed to draw water at steady but random rates from the distribution system. These conditions give rise to a time-dependent Markovian queueing system having servers that deliver random rectangular pulses of water. Expressions are derived for the mean, variance, and probability distribution of the flow rate and the corresponding pipe Reynolds number at points along a dead-end trunk line. Comparison against computer-simulated results shows that the queueing model provides an excellent description of the temporal and spatial variations of the flow regime through a dead-end trunk line supplying water to a block of heterogeneous homes. The behavior of the local flow field given by the queueing model can be coupled with water-quality models that require ultra-fine temporal and spatial resolutions to predict the fate of contaminants moving through municipal distribution systems.
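
The rectangular-pulse mechanism is compact enough to simulate directly: Poisson arrivals, each drawing water at a random steady rate for a random duration, superposed into an instantaneous flow series. Parameters below are illustrative (and a homogeneous arrival process is used in place of the paper's time-varying one):

```python
import numpy as np

rng = np.random.default_rng(0)
horizon = 3600.0      # one hour, in seconds
lam = 0.05            # customer arrivals per second

n = rng.poisson(lam * horizon)
starts = rng.uniform(0, horizon, n)
durations = rng.exponential(60.0, n)       # mean 60 s busy period
intensities = rng.uniform(0.1, 0.3, n)     # litres per second while busy

t = np.arange(0.0, horizon, 1.0)
flow = np.zeros_like(t)
for s, d, q in zip(starts, durations, intensities):
    flow += q * ((t >= s) & (t < s + d))   # superpose rectangular pulses

# Campbell's theorem: stationary mean = lam * E[duration] * E[intensity].
print("mean flow:", flow.mean(), "(theory ≈", lam * 60.0 * 0.2, ")")
print("variance:", flow.var())
```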

Journal ArticleDOI
TL;DR: In this article, beam splitting, amplification and heterodyning are used to measure the Wigner function of a single light mode in optical homodyne tomography, where the WIGNer function is reconstructed from measured quadrature distributions.

Book ChapterDOI
01 Jan 1995
TL;DR: A stochastic programming model specifies the assumptions made concerning the system in mathematical terms, identifies system parameters with mathematical objects, formulates a problem to be solved, and uses the obtained result for descriptive or operative purposes.
Abstract: When formulating a stochastic programming problem, we usually start from a deterministic problem that we call the underlying deterministic problem. Then, observing that some of the parameters are random, we formulate another problem, the stochastic programming problem, by taking into account the probability distribution of the random elements in the underlying problem. When a decision can or has to be made in the presence of randomness at one single step, i.e., we do not wait for the occurrence of any event or the realization of some random variable(s), then we say that the stochastic programming model is static. The two words "model" and "problem" are used as synonyms. In the strict sense, the model specifies the assumptions made concerning the system in mathematical terms and identifies system parameters with mathematical objects. Having these, we formulate a problem to be solved and use the obtained result for descriptive or operative purposes.

Journal ArticleDOI
TL;DR: In this paper, the authors considered the ensemble of random symmetric n×n matrices specified by an orthogonal invariant probability distribution and showed that the normalized eigenvalue counting function of this ensemble converges in probability to a nonrandom limit as n→∞ and that this limiting distribution is the solution of a certain self-consistent equation.
Abstract: We consider the ensemble of random symmetric n×n matrices specified by an orthogonal invariant probability distribution. We treat this distribution as a Gibbs measure of a mean-field-type model. This allows us to show that the normalized eigenvalue counting function of this ensemble converges in probability to a nonrandom limit as n→∞ and that this limiting distribution is the solution of a certain self-consistent equation.
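
For the Gaussian orthogonal case, the solution of the self-consistent equation is Wigner's semicircle law, which a few lines can check empirically (matrix size and binning illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Symmetric Gaussian matrix scaled so the spectrum converges to [-2, 2].
a = rng.standard_normal((n, n))
eigs = np.linalg.eigvalsh((a + a.T) / np.sqrt(2 * n))

hist, edges = np.histogram(eigs, bins=40, range=(-2.2, 2.2), density=True)
x = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(np.maximum(4 - x**2, 0.0)) / (2 * np.pi)
print("max deviation from semicircle law:", np.abs(hist - semicircle).max())
```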

Journal ArticleDOI
TL;DR: In this article, a new class of continuous multivariate distributions on x ∈ ℝ^n, called υ-spherical distributions, is proposed; these generalize the classes of spherical (when υ(·) is the l_2 norm) and l_q-spherical (when υ(·) is the l_q norm) distributions.
Abstract: A new class of continuous multivariate distributions on x ∈ ℝ^n is proposed. We define these so-called υ-spherical distributions through properties of the density function in a location-scale context. We derive conditions for properness of υ-spherical distributions and discuss how to generate them in practice. The name "υ-spherical" is motivated by the fact that these distributions generalize the classes of spherical (when υ(·) is the l_2 norm) and l_q-spherical (when υ(·) is the l_q norm) distributions. Isodensity sets are still always situated around the location parameter μ, but exchangeability and axial symmetry are no longer imposed, as is illustrated in some examples. As an important special case, we define a class of distributions suggested by independent sampling from a generalization of exponential power distributions. This allows us to model skewness. Interestingly, all the robustness results found previously for spherical and l_q-spherical models carry over directly to υ-spherical models.

Journal ArticleDOI
TL;DR: In this article, the trend to equilibrium of solutions to the space-homogeneous Boltzmann equation for Maxwellian molecules, with angular cutoff as well as with infinite-range forces, is investigated; relations between several metrics for spaces of probability distributions are established and related to the Boltzmann equation by proving that Fourier-transformed solutions are at least as regular as the Fourier transform of the initial data.
Abstract: This paper deals with the trend to equilibrium of solutions to the space-homogeneous Boltzmann equation for Maxwellian molecules with angular cutoff as well as with infinite-range forces. The solutions are considered as densities of probability distributions. The Tanaka functional is a metric for the space of probability distributions, which has previously been used in connection with the Boltzmann equation. Our main result is that, if the initial distribution possesses moments of order 2+ε, then the convergence to equilibrium in this metric is exponential in time. In the proof, we study the relation between several metrics for spaces of probability distributions, and relate this to the Boltzmann equation, by proving that the Fourier-transformed solutions are at least as regular as the Fourier transform of the initial data. This is also used to prove that even if the initial data only possess a second moment, then $\int_{|v|>R} f(v,t)\,|v|^2\,dv \to 0$ as $R\to\infty$, and this convergence is uniform in time.

Journal ArticleDOI
James E. Smith
TL;DR: A very general framework is described for analyzing these kinds of problems in which, given certain "moments" of a distribution, one can compute bounds on the expected value of an arbitrary "objective" function.
Abstract: In many decision analysis problems, we have only limited information about the relevant probability distributions. In problems like these, it is natural to ask what conclusions can be drawn on the basis of this limited information. For example, in the early stages of analysis of a complex problem, we may have only limited fractile information for the distributions in the problem; what can we say about the optimal strategy or certainty equivalents given these few fractiles? This paper describes a very general framework for analyzing these kinds of problems where, given certain "moments" of a distribution, we can compute bounds on the expected value of an arbitrary "objective" function. By suitable choice of moment and objective functions we can formulate and solve many practical decision analysis problems. We describe the general framework and theoretical results, discuss computational strategies, and provide specific results for examples in dynamic programming, decision analysis with incomplete information, Bayesian statistics, and option pricing.
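
The moment-bound computation reduces to a linear program once the support is discretized: the unknowns are the probabilities placed on each support point, the moment conditions are linear equalities, and minimizing or maximizing the expected objective gives the two bounds. A sketch with illustrative moments and a call-option-like objective:

```python
import numpy as np
from scipy.optimize import linprog

# Bound E[f(X)] over all distributions on [0, 10] with E[X] = 3, E[X^2] = 12.
x = np.linspace(0, 10, 201)            # discretized support
f = np.maximum(x - 4.0, 0.0)           # objective: a call-option-like payoff

A_eq = np.vstack([np.ones_like(x), x, x**2])
b_eq = [1.0, 3.0, 12.0]                # normalization plus two moment constraints

lower = linprog(f, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
upper = linprog(-f, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(f"E[f(X)] is between {lower.fun:.4f} and {-upper.fun:.4f}")
```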

Journal ArticleDOI
TL;DR: The purpose of this paper is to show that the probability distribution of the first return time T scales with T as $P(T) \sim T^{H-2}$, and to apply the result to the characterization of on-off intermittency, a recently proposed mechanism for bursting.
Abstract: Herein, the term fractional Brownian motion is used to refer to a class of random walks with long-range correlated steps where the mean square displacement of the walker at large time t is proportional to $t^{2H}$ with $0 < H < 1$. For ordinary Brownian motion we obtain H=1/2. Let T denote the time at which the random walker starting at the origin first returns to the origin. The purpose of this paper is to show that the probability distribution of T scales with T as $P(T) \sim T^{H-2}$. Theoretical arguments and numerical simulations are presented to support the result. Additional issues explored include modifications to the power-law distribution when the random walk is biased and the application of the result to the characterization of on-off intermittency, a recently proposed mechanism for bursting.
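
For the ordinary Brownian case H = 1/2, the claim $P(T) \sim T^{H-2} = T^{-3/2}$ can be checked with a simple ±1 random walk (simulating fractional Brownian motion itself is more involved and omitted here); sizes and binning are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_walks, n_steps = 10_000, 2048

# ±1 random walk; T = first index at which the walk returns to the origin.
steps = rng.integers(0, 2, size=(n_walks, n_steps), dtype=np.int8) * 2 - 1
pos = steps.cumsum(axis=1, dtype=np.int16)
hit = pos == 0
returned = hit.any(axis=1)
T = hit.argmax(axis=1)[returned] + 1

bins = np.unique(np.logspace(1, np.log10(n_steps), 15).astype(int))
hist, edges = np.histogram(T, bins=bins, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = hist > 0
slope = np.polyfit(np.log(centers[mask]), np.log(hist[mask]), 1)[0]
print("measured exponent:", slope)   # expect ≈ H - 2 = -1.5
```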

Book ChapterDOI
21 Aug 1995
TL;DR: The trace semantics for labeled transition systems is extended to a randomized model of concurrent computation; a notion of probabilistic forward simulation is introduced and proved sound for the trace distribution precongruence.
Abstract: We extend the trace semantics for labeled transition systems to a randomized model of concurrent computation. The main objective is to obtain a compositional semantics. The role of a trace in the randomized model is played by a probability distribution over traces, called a trace distribution. We show that the preorder based on trace distribution inclusion is not a precongruence, and we build an elementary context, called the principal context, that is sufficiently powerful to characterize the coarsest precongruence that is contained in the trace distribution preorder. Finally, we introduce a notion of a probabilistic forward simulation and we prove that it is sound for the trace distribution precongruence. An important characteristic of probabilistic forward simulations is that they relate states to probability distributions over states.