
Showing papers on "Cumulative distribution function published in 1998"


Journal ArticleDOI
TL;DR: In this article, the probability distribution of stock price changes is studied by analyzing a database (the Trades and Quotes Database) documenting every trade for all stocks in three major US stock markets, for the two-year period Jan 1994 -- Dec 1995.
Abstract: The probability distribution of stock price changes is studied by analyzing a database (the Trades and Quotes Database) documenting every trade for all stocks in three major US stock markets, for the two-year period Jan 1994 -- Dec 1995. A sample of 40 million data points is extracted, which is substantially larger than those studied hitherto. We find an asymptotic power-law behavior for the cumulative distribution with an exponent alpha approximately 3, well outside the Lévy regime 0 < alpha < 2.
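
A minimal illustration of the kind of tail analysis described above: build the empirical complementary cumulative distribution of absolute returns and apply a Hill estimator of the tail exponent. The synthetic Student-t data (whose tail exponent is 3), the sample size, and the choice of k are assumptions for illustration, not the authors' dataset or procedure.

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic stand-in for normalized price changes: a Student-t sample with
    # 3 degrees of freedom has a power-law tail with exponent alpha = 3.
    g = np.abs(rng.standard_t(df=3, size=200_000))

    # Empirical complementary cumulative distribution P(|g| > x)
    x = np.sort(g)
    ccdf = 1.0 - np.arange(1, x.size + 1) / x.size

    # Hill estimator of the tail exponent from the k largest observations
    k = 2000
    threshold = x[-k - 1]                     # (k+1)-th largest value
    alpha_hat = 1.0 / np.mean(np.log(x[-k:] / threshold))
    print(f"estimated tail exponent alpha ~ {alpha_hat:.2f}")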

333 citations


Journal ArticleDOI
TL;DR: This paper derived the probability density function of the infinite sum of correlated lognormal variables and showed that it is reciprocal gamma distributed, under suitable parameter restrictions, and then obtained a closed-form analytic expression for the value of an arithmetic Asian option using the reciprocal gamma distribution as the state-price density function.
Abstract: Arithmetic Asian options are difficult to price and hedge as they do not have closed-form analytic solutions. The main theoretical reason for this difficulty is that the payoff depends on the finite sum of correlated lognormal variables, which is not lognormal and for which there is no recognizable probability density function. We use elementary techniques to derive the probability density function of the infinite sum of correlated lognormal random variables and show that it is reciprocal gamma distributed, under suitable parameter restrictions. A random variable is reciprocal gamma distributed if its inverse is gamma distributed. We use this result to approximate the finite sum of correlated lognormal variables and then value arithmetic Asian options using the reciprocal gamma distribution as the state-price density function. We thus obtain a closed-form analytic expression for the value of an arithmetic Asian option, where the cumulative density function of the gamma distribution, G(d) in our formula, plays the exact same role as N(d) in the Black-Scholes/Merton formula. In addition to being theoretically justified and exact in the limit, we compare our method against other algorithms in the literature and show our method is quicker, at least as accurate, and, in our opinion, more intuitive and pedagogically appealing than any previously published result, especially when applied to high yielding currency options.
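
The closed-form structure described above, with the gamma cumulative distribution function G(d) in the role of N(d), can be sketched as follows. This is a generic reciprocal-gamma moment-matching sketch rather than the paper's derivation: the two moments of the continuously averaged geometric Brownian motion, the moment-matching step, and the numeric inputs (S0, K, r, sigma, T) are assumptions for illustration.

    import numpy as np
    from scipy.stats import gamma

    def asian_call_recip_gamma(S0, K, r, sigma, T):
        """Approximate a continuously averaged arithmetic Asian call by
        matching the first two moments of the average to a reciprocal
        gamma distribution (1/A ~ Gamma(shape=a, scale=th))."""
        # First two moments of the arithmetic average under risk-neutral GBM
        m1 = S0 * (np.exp(r * T) - 1.0) / (r * T)
        m2 = (2.0 * S0**2 / (T**2 * (r + sigma**2))
              * ((np.exp((2.0 * r + sigma**2) * T) - 1.0) / (2.0 * r + sigma**2)
                 - (np.exp(r * T) - 1.0) / r))
        # Reciprocal gamma parameters from the two moments
        a = (2.0 * m2 - m1**2) / (m2 - m1**2)
        th = 1.0 / (m1 * (a - 1.0))
        # G(d) plays the same role as N(d) in the Black-Scholes formula
        G = lambda d, shape: gamma.cdf(d, shape, scale=th)
        return np.exp(-r * T) * (m1 * G(1.0 / K, a - 1.0) - K * G(1.0 / K, a))

    # Hypothetical example parameters
    print(asian_call_recip_gamma(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0))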

293 citations


Journal ArticleDOI
TL;DR: In this paper, two techniques are proposed to improve the performance of Latin hypercube sampling by finding the probabilistic means of equiprobable disjunct intervals in the variable's domain instead of using the cumulative distribution function directly, as is currently done.
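
A one-dimensional sketch of the idea, assuming the improvement amounts to representing each of n equiprobable strata by its conditional (probabilistic) mean rather than by an inverse-c.d.f. point; the standard normal example and n = 10 are illustrative choices, not the paper's setup.

    import numpy as np
    from scipy import integrate, stats

    def lhs_stratum_means(dist, n, rng=None):
        """Latin hypercube sample in one dimension that represents each of the
        n equiprobable strata by its conditional mean instead of a point
        obtained directly from the inverse cumulative distribution function."""
        rng = np.random.default_rng() if rng is None else rng
        edges = dist.ppf(np.linspace(0.0, 1.0, n + 1))
        means = np.empty(n)
        for i in range(n):
            # E[X | stratum i] = n * integral of x f(x) over the stratum
            val, _ = integrate.quad(lambda x: x * dist.pdf(x), edges[i], edges[i + 1])
            means[i] = n * val
        return rng.permutation(means)

    print(lhs_stratum_means(stats.norm(), n=10))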

214 citations


Journal ArticleDOI
TL;DR: In this paper, the authors corroborate the idea of a close connection between replica-symmetry breaking and aging in the linear response function for a large class of finite-dimensional systems with short-range interactions, which are characterized by a continuity condition with respect to weak random perturbations of the Hamiltonian.
Abstract: We corroborate the idea of a close connection between replica-symmetry breaking and aging in the linear response function for a large class of finite-dimensional systems with short-range interactions. In these systems, which are characterized by a continuity condition with respect to weak random perturbations of the Hamiltonian, the ``fluctuation dissipation ratio'' in off-equilibrium dynamics should be equal to the static cumulative distribution function of the overlaps. This allows for an experimental measurement of the equilibrium order parameter function.

188 citations


Journal ArticleDOI
TL;DR: The more general case where the two Gaussian noise processes describing the Rice process are correlated is considered; the resulting process, called the extended Suzuki process, can be used as a suitable statistical model for describing the fading behavior of large classes of frequency-nonselective land mobile satellite channels.
Abstract: This paper deals with the statistical characterization of a stochastic process that is the product of a Rice process and a lognormal process. We consider the more general case where the two Gaussian noise processes describing the Rice process are correlated. The resulting process is called the extended Suzuki process; it can be used as a suitable statistical model for describing the fading behavior of large classes of frequency-nonselective land mobile satellite channels. In particular, the statistical properties (e.g., probability density function (pdf) of amplitude and phase, level-crossing rate, and average duration of fades) of the Rice process with cross-correlated components, as well as of the proposed extended Suzuki process, are investigated. Moreover, all statistical model parameters are optimized numerically to fit the cumulative distribution function and the level-crossing rate of the underlying analytical model to measured data collected in different environments. Finally, an efficient simulation model is presented which is in excellent conformity with the proposed analytical model.
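
A toy discrete-time sketch of the envelope construction described above: two cross-correlated Gaussian components form a Rice envelope, which is multiplied by a lognormal shadowing factor, and the empirical cumulative distribution function of the product is tabulated. All parameter values are assumptions, and the Doppler-shaped temporal correlation used in a real channel simulator is omitted here.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    rho = 0.5              # cross-correlation of the two Gaussian components (assumed)
    m1, m2 = 1.0, 0.5      # line-of-sight means of the Rice components (assumed)
    sigma = 0.7            # standard deviation of the scattered components (assumed)
    mu_l, s_l = 0.0, 0.3   # lognormal shadowing parameters (assumed)

    # Cross-correlated zero-mean Gaussian components
    g1 = rng.normal(size=n)
    g2 = rho * g1 + np.sqrt(1.0 - rho**2) * rng.normal(size=n)

    # Rice envelope with correlated components times lognormal shadowing
    rice = np.hypot(sigma * g1 + m1, sigma * g2 + m2)
    shadow = np.exp(mu_l + s_l * rng.normal(size=n))
    eta = rice * shadow

    # Empirical cumulative distribution function of the resulting envelope
    levels = np.linspace(0.0, eta.max(), 200)
    cdf = np.searchsorted(np.sort(eta), levels) / n
    print(cdf[::40])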

110 citations


Journal ArticleDOI
TL;DR: In this article, both order quantity and lead time are considered as decision variables in a mixed inventory model and an algorithmic procedure is developed to find the optimal order quantity, and the effects of parameters are also studied.

95 citations


Journal ArticleDOI
TL;DR: In this paper, a robust reliability analysis method for durability of structural components subjected to external and inertial loads with time-dependent variable amplitudes is presented, considering uncertainties such as material properties, manufacturing tolerances, and initial crack size.
Abstract: An efficient reliability analysis method for durability of structural components subjected to external and inertial loads with time-dependent variable amplitudes is presented. This method is able to support reliability analysis of crack-initiation and crack-propagation lives of practical applications, considering uncertainties such as material properties, manufacturing tolerances, and initial crack size. Three techniques are employed to support the probabilistic durability prediction: (1) a strain-based approach for multiaxial crack-initiation-life prediction and a linear elastic fracture mechanics approach for crack-propagation-life prediction, (2) a statistics-based approach for reliability analysis, and (3) sensitivity analysis and optimization methods for searching the most probable point (MPP) in the random variable space to compute the fatigue failure probability using the first-order reliability analysis method. A two-point approximation method is employed to speed up the MPP search. A tracked-vehicle roadarm example is presented to demonstrate the feasibility of the proposed method.
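
The MPP-based failure-probability computation in item (3) can be sketched generically as below. The limit-state function is a hypothetical linear toy example in standard normal space, not the paper's fatigue model, and the two-point approximation used to accelerate the MPP search is not reproduced.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def form_failure_probability(g, ndim):
        """First-order reliability method: locate the most probable point (MPP)
        on the limit state g(u) = 0 in standard normal space, then approximate
        the failure probability as Phi(-beta) with beta = ||u_MPP||."""
        res = minimize(lambda u: u @ u, np.zeros(ndim),
                       constraints={"type": "eq", "fun": g}, method="SLSQP")
        beta = np.sqrt(res.fun)
        return norm.cdf(-beta), res.x

    # Hypothetical limit state: predicted fatigue life drops below a target life;
    # u are standard normal variables for, e.g., material scatter and initial crack size.
    g = lambda u: 3.0 - (1.2 * u[0] + 0.8 * u[1])
    pf, mpp = form_failure_probability(g, ndim=2)
    print(pf, mpp)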

77 citations


Journal ArticleDOI
TL;DR: In this paper, new central limit theorems and functional central limit theorems are obtained for Hilbert-valued arrays that are near epoch dependent on mixing processes, as well as for general Hilbert-valued adapted dependent heterogeneous arrays; these results are useful for delivering asymptotic distributions for parametric and nonparametric estimators and their functionals in time series econometrics.
Abstract: We obtain new central limit theorems (CLT's) and functional central limit theorems (FCLT's) for Hilbert-valued arrays near epoch dependent on mixing processes, and also new FCLT's for general Hilbert-valued adapted dependent heterogeneous arrays. These theorems are useful in delivering asymptotic distributions for parametric and nonparametric estimators and their functionals in time series econometrics. We give three significant applications for near epoch dependent observations: (1) A new CLT for any plug-in estimator of a cumulative distribution function (c.d.f.) (e.g., an empirical c.d.f., or a c.d.f. estimator based on a kernel density estimator), which can in turn deliver distribution results for many Von Mises functionals; (2) a new limiting distribution result for degenerate U -statistics, which delivers distribution results for Bierens's integrated conditional moment tests; (3) a new functional central limit result for Hilbert-valued stochastic approximation procedures, which delivers distribution results for nonparametric recursive generalized method of moment estimators, including nonparametric adaptive learning models.
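
For the plug-in c.d.f. estimators mentioned in application (1), a small sketch contrasting the empirical c.d.f. with a c.d.f. estimator obtained by integrating a kernel density estimate; the Gaussian kernel and the rule-of-thumb bandwidth are assumptions, not choices made in the paper.

    import numpy as np
    from scipy.stats import norm

    def ecdf(sample, x):
        """Empirical cumulative distribution function evaluated at the points x."""
        s = np.sort(sample)
        return np.searchsorted(s, x, side="right") / s.size

    def kernel_cdf(sample, x, h=None):
        """Plug-in c.d.f. estimator from a Gaussian kernel density estimate:
        F_hat(x) = average of Phi((x - X_i) / h) over the sample."""
        if h is None:
            h = 1.06 * np.std(sample) * sample.size ** (-1 / 5)  # rule of thumb
        return norm.cdf((x[:, None] - sample[None, :]) / h).mean(axis=1)

    data = np.random.default_rng(2).normal(size=500)
    grid = np.linspace(-3.0, 3.0, 7)
    print(ecdf(data, grid))
    print(kernel_cdf(data, grid))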

61 citations


Journal ArticleDOI
TL;DR: Noting that the probability density function of a continuous random variable has properties similar to those of a power spectral density, a new class of probability density function estimators, based on the autoregressive model, is described.
Abstract: Noting that the probability density function of a continuous random variable has similar properties to a power spectral density, a new class of probability density function estimators is described. The specific model examined is the autoregressive model, although the extension to other time series models is evident. An example is given to illustrate the approach.

45 citations


Journal ArticleDOI
TL;DR: In this paper, a stochastic representation is used to represent various probability distributions in this problem (mean probability density function and first passage time distributions) and it is shown clearly that the disorder dominates the problem and that the thermal distributions tend to zero-one laws.
Abstract: We study the continuum version of Sinai's problem of a random walker in a random force field in one dimension. A method of stochastic representation is used to represent various probability distributions in this problem (mean probability density function and first passage time distributions). This method reproduces already known rigorous results and also confirms directly some recent results derived using approximation schemes. We demonstrate clearly, in the Sinai scaling regime, that the disorder dominates the problem and that the thermal distributions tend to zero-one laws.

36 citations


Proceedings Article
01 Dec 1998
TL;DR: This work introduces a stochastic method for learning the cumulative distribution and an analogous deterministic technique for density estimation, and demonstrates convergence of these methods both theoretically and experimentally.
Abstract: We introduce two new techniques for density estimation. Our approach poses the problem as a supervised learning task which can be performed using Neural Networks. We introduce a stochastic method for learning the cumulative distribution and an analogous deterministic technique. We demonstrate convergence of our methods both theoretically and experimentally, and provide comparisons with the Parzen estimate. Our theoretical results demonstrate better convergence properties than the Parzen estimate.
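
A minimal stand-in for the supervised-learning formulation: fit a smooth, monotone mixture of sigmoids to the empirical cumulative distribution by least squares and differentiate the fit to obtain a density estimate. This is not the authors' network architecture or their stochastic training rule; the number of units, the optimizer, and the data are assumptions for illustration.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    data = np.sort(rng.normal(loc=1.0, scale=2.0, size=400))
    targets = (np.arange(1, data.size + 1) - 0.5) / data.size   # empirical c.d.f.

    K = 4   # number of sigmoid units (assumed)

    def model_cdf(params, x):
        a, b, w = params[:K], params[K:2 * K], params[2 * K:]
        w = np.exp(w) / np.exp(w).sum()                # positive weights summing to 1
        s = 1.0 / (1.0 + np.exp(-(np.outer(x, np.exp(a)) + b)))   # each unit monotone in x
        return s @ w

    def loss(params):
        return np.mean((model_cdf(params, data) - targets) ** 2)

    fit = minimize(loss, rng.normal(scale=0.1, size=3 * K), method="Nelder-Mead",
                   options={"maxiter": 20000, "maxfev": 20000})

    # Density estimate by (numerically) differentiating the fitted c.d.f.
    x = np.linspace(data.min(), data.max(), 200)
    eps = 1e-4
    pdf = (model_cdf(fit.x, x + eps) - model_cdf(fit.x, x - eps)) / (2 * eps)
    print(pdf[::40])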

Journal Article
TL;DR: The main aim of this paper, as discussed by the authors, is to establish an Ostrowski type inequality for the cumulative distribution function of a random variable taking values in a finite interval, and an application for a Beta random variable is given.
Abstract: The main aim of this paper is to establish an Ostrowski type inequality for the cumulative distribution function of a random variable taking values in a finite interval [a,b]. An application for a Beta random variable is given.

Journal ArticleDOI
TL;DR: In this paper, the probability density function (p.d.f.) of a random variable is approximated by a linear combination of B-splines, and the best model is determined by entropy analysis.

Journal ArticleDOI
TL;DR: In this paper, a trimmed version of the Mallows distance (Mallows 1972) between two cumulative distribution functions (c.d.f.'s) F and G is used to assess the similarity of two c.d.f.'s with respect to this distance at a controlled type I error rate.
Abstract: The problem of assessing similarity of two cumulative distribution functions (c.d.f.'s) has been the topic of a previous paper by the authors (Munk and Czado (1995)). There, we developed an asymptotic test based on a trimmed version of the Mallows distance (Mallows 1972) between two c.d.f.'s F and G. This allows one to assess the similarity of two c.d.f.'s with respect to this distance at a controlled type I error rate. In particular, this applies to bioequivalence testing within a purely nonparametric setting. In this paper, we investigate the finite sample behavior of this test. The effect of trimming and unequal sample sizes on the observed power and level is studied. Sample-size-driven recommendations for the choice of the trimming bounds are given in order to minimize the bias. Finally, assuming normality and homogeneous variances, we simulate the relative efficiency of the Mallows test with respect to the (asymptotically optimal) standard equivalence t test, which reveals the Mallows test as a robust alternative to the t test.
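
A sketch of the trimmed Mallows (L2-Wasserstein) distance between two samples, computed by comparing their quantile functions on (alpha, 1 - alpha); the trimming bound, grid size, and simulated data are illustrative, and an equivalence test would compare a suitably studentized version of this statistic against a prespecified similarity margin.

    import numpy as np

    def trimmed_mallows(x, y, alpha=0.05, m=1000):
        """Trimmed Mallows distance: root mean squared difference of the two
        empirical quantile functions over t in (alpha, 1 - alpha)."""
        t = np.linspace(alpha, 1.0 - alpha, m)
        return np.sqrt(np.mean((np.quantile(x, t) - np.quantile(y, t)) ** 2))

    rng = np.random.default_rng(4)
    a = rng.normal(0.0, 1.0, size=300)
    b = rng.normal(0.2, 1.0, size=250)
    print(trimmed_mallows(a, b, alpha=0.05))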

Journal ArticleDOI
TL;DR: A model of the absorption coefficient cumulative distribution function for a wide band spectral interval is presented in this article, where the dominant bands of water vapor and carbon dioxide are classified into three important categories, depending on the number and interaction of the participating vibrational transitions within the considered spectral interval.
Abstract: A model of the absorption coefficient cumulative distribution function for a wide band spectral interval is presented. The dominant bands of water vapor and carbon dioxide are classified into three important categories, depending on the number and interaction of the participating vibrational transitions within the considered spectral interval. Limiting models for the absorption coefficient cumulative distribution function are proposed for small and large pressures and compared with exact line-by-line calculations. The model results in the large and small pressure limits and for temperatures up to 1000 K are in good agreement with exact calculations.

Journal ArticleDOI
TL;DR: In this paper, the cumulative distribution function of order statistics is derived without the continuity assumption, and an inequality is developed showing that this distribution has an upper bound that equals the corresponding distribution when the continuity assumption is satisfied.

Posted Content
Jushan Bai1
TL;DR: In this paper, the least squares estimation of a change point in multiple regressions is studied, and the analytical density function and the cumulative distribution function for the general skewed distribution are derived.
Abstract: This paper studies the least squares estimation of a change point in multiple regressions. Consistency, rate of convergence, and asymptotic distributions are obtained. The model allows for lagged dependent variables and trending regressors. The error process can be dependent and heteroskedastic. For nonstationary regressors or disturbances, the asymptotic distribution is shown to be skewed. The analytical density function and the cumulative distribution function for the general skewed distribution are derived. The analysis applies to both pure and partial changes. The method is used to analyze the response of market interest rates to discount rate changes.
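
The least squares change-point estimator can be sketched as a grid search over candidate break dates, choosing the split that minimizes the combined sum of squared residuals of the two subsample regressions. The trimming fraction and the simulated design (an intercept shift at observation 60) are assumptions for illustration.

    import numpy as np

    def estimate_break(y, X, trim=0.1):
        """Least squares change-point estimate: the break date minimizing the
        total sum of squared residuals of the two separately fitted regressions."""
        n = y.size
        lo, hi = int(trim * n), int((1.0 - trim) * n)

        def ssr(yy, XX):
            beta, *_ = np.linalg.lstsq(XX, yy, rcond=None)
            resid = yy - XX @ beta
            return resid @ resid

        ssrs = [ssr(y[:k], X[:k]) + ssr(y[k:], X[k:]) for k in range(lo, hi)]
        return lo + int(np.argmin(ssrs))

    rng = np.random.default_rng(5)
    n = 120
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_pre, beta_post = np.array([0.0, 1.0]), np.array([1.0, 1.0])
    y = np.where(np.arange(n) < 60, X @ beta_pre, X @ beta_post) + 0.3 * rng.normal(size=n)
    print(estimate_break(y, X))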

Journal ArticleDOI
TL;DR: In this article, second-order saddlepoint approximations are given for these multivariate cumulative distribution functions (cdf's) in their most general settings, and probabilities of rectangular regions associated with these cdf's are also approximated directly using second-order saddlepoint methods.
Abstract: Four multivariate distributions commonly arise in sampling theory: the multinomial, multivariate hypergeometric, Dirichlet, and multivariate Polya distributions. Second-order saddlepoint approximations are given for approximating these multivariate cumulative distribution functions (cdf's) in their most general settings. Probabilities of rectangular regions associated with these cdf's are also approximated directly using second-order saddlepoint methods. All the approximations follow from characterizations of the multivariate distributions as conditional distributions. Applications to outlier discordancy tests and slippage tests are discussed.

Journal ArticleDOI
TL;DR: In this article, shape-preserving wavelet approximators of cumulative distribution functions and densities with a priori prescribed smoothness properties are introduced and evaluated for a variety of loss functions and analyzed their asymptotic behavior.
Abstract: In a previous paper we introduced a general class of shapepreserving wavelet approximating operators (approximators) which transform cumulative distribution functions (cdf) and densities into functions of the same type. Empirical versions of these operators are used in this paper to introduce, in an unified way, shape- preserving wavelet estimators of cdf and densities, with a priori prescribed smoothness properties. We evaluate their risk for a variety of loss functions and analyze their asymptotic behavior. We study the convergence rates depending on minimal additional assumptions about the cdf/ density. These assumptions are in terms of the function belonging to certain homogeneous Besov or Triebel- Lizorkin spaces and others. As a main evaluation tool the integral p-modulus of smoothness is used

Journal ArticleDOI
TL;DR: In this paper, a simple and accurate model of the wide band absorption coefficient cumulative distribution function is developed, which explicitly expresses the absorption coefficient in terms of the cumulative distribution function, simplifying the solution of the radiative transfer equation.

Posted Content
TL;DR: The authors derived the probability density function of the infinite sum of correlated lognormal random variables and showed that it is reciprocal gamma distributed, under suitable parameter restrictions, and then value arithmetic Asian options using the reciprocal gamma distribution as the state-price density function.
Abstract: Arithmetic Asian options are difficult to price and hedge as they do not have closed-form analytic solutions. The main theoretical reason for this difficulty is that the payoff depends on the finite sum of correlated lognormal variables, which is not lognormal and for which there is no recognizable probability density function. We use elementary techniques to derive the probability density function of the infinite sum of correlated lognormal random variables and show that it is reciprocal gamma distributed, under suitable parameter restrictions. A random variable is reciprocal gamma distributed if its inverse is gamma distributed. We use this result to approximate the finite sum of correlated lognormal variables and then value arithmetic Asian options using the reciprocal gamma distribution as the state-price density function. We thus obtain a closed-form analytic expression for the value of an arithmetic Asian option, where the cumulative density function of the gamma distribution, G(d) in our formula, plays the exact same role as N(d) in the Black-Scholes/Merton formula. In addition to being theoretically justified and exact in the limit, we compare our method against other algorithms in the literature and show our method is quicker, at least as accurate, and, in our opinion, more intuitive and pedagogically appealing than any previously published result, especially when applied to high yielding currency options.

Patent
Hidefumi Osawa1
07 Oct 1998
TL;DR: In this paper, an arithmetic encoding/decoding method for updating a cumulative probability when the generation frequency of a symbol to be encoded/decoded exceeds an allowable maximum value is presented.
Abstract: Coding efficiency is improved by adopting dynamic probability estimation, and the adaptation rate is adjusted by a minimum number of times by adding means for detecting a change in predicted state of encoding, thereby shortening encoding/decoding time. In an arithmetic encoding/decoding method for updating a cumulative probability when the generation frequency of a symbol to be encoded/decoded exceeds an allowable maximum value, an entropy associated with the generation state of the symbol to be encoded/decoded is calculated when the generation frequency has exceeded the allowable maximum value (S103), it is determined if the currently calculated entropy and previously calculated entropy have a significant difference (S104), and the cumulative probability is updated when it is determined that the two entropy values have the significant difference (S105).
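
A rough sketch of the update rule described above, not the patented encoder itself: when the total symbol frequency exceeds the allowable maximum, the entropy of the current counts is compared with the previously stored entropy, and the cumulative table used by the coder is refreshed only if the difference is significant. The maximum total and the entropy tolerance are assumed values.

    import numpy as np

    MAX_TOTAL = 1 << 14      # allowable maximum total frequency (assumed)
    ENTROPY_TOL = 0.05       # "significant difference" threshold in bits (assumed)

    class AdaptiveModel:
        def __init__(self, nsymbols):
            self.counts = np.ones(nsymbols, dtype=np.int64)
            self.cum = np.cumsum(self.counts)     # cumulative frequency table
            self.prev_entropy = None

        def update(self, symbol):
            self.counts[symbol] += 1
            if self.counts.sum() > MAX_TOTAL:
                p = self.counts / self.counts.sum()
                entropy = float(-(p * np.log2(p)).sum())
                if (self.prev_entropy is None
                        or abs(entropy - self.prev_entropy) > ENTROPY_TOL):
                    # predicted state changed significantly: refresh the table
                    self.cum = np.cumsum(self.counts)
                    self.prev_entropy = entropy
                self.counts = np.maximum(self.counts // 2, 1)  # keep totals bounded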

Proceedings ArticleDOI
12 Jan 1998
TL;DR: A method is introduced and demonstrated which uses parametric stability derivative data (in the form of regression equations) and probabilistic analysis techniques to evaluate the impact of uncertainty on the handling qualities characteristics of a family of aircraft alternatives.
Abstract: A method is introduced and demonstrated which uses parametric stability derivative data (in the form of regression equations) and probabilistic analysis techniques to evaluate the impact of uncertainty on the handling qualities characteristics of a family of aircraft alternatives. While the method is based on the use of elementary design parameters familiar to the configuration designer, it enables the computation of responses more familiar to the stability and control engineer. This connection is intended to bring about a more complete accounting of stability and handling quality characteristics in aircraft design, based on engineering analysis instead of historical data. Another key advantage of the method is that it allows for the quantification of analysis imprecision and information quantity/quality trades through fidelity uncertainty models. The metrics for these quantifications are the cumulative distribution function and probability sensitivity derivatives. The method is exemplified through the investigation of the longitudinal handling qualities trends for a defined High Speed Civil Transport design space, in the presence of fidelity uncertainty in the stability derivatives. Introduction: This paper describes techniques developed for evaluating aircraft stability and handling qualities. These techniques are part of a larger, overall design methodology under development by the authors. The core focus of the overall method is evaluating aircraft system feasibility and viability in a multidisciplinary and probabilistic fashion. A simplified view of the new approach is shown in Figure 1, and Ref. [1] provides a comprehensive description of key elements of, and the rationale behind, the method. In this setting, the desire to reduce design cycle time and to improve the quality of information available during conceptual design motivates the need for multidisciplinary analysis. As a consequence, the method begins with the building of parametric relationships of disciplinary metrics as a function of elementary design variables (e.g., configuration geometry), based on engineering analysis. These relationships are called metamodels. For example, response surface equations (RSEs) representing drag polars may be formed by using an aerodynamic analysis code with geometric variables as inputs. These RSEs, which capture the individual discipline physics for a class of aircraft, are then integrated into a sizing/synthesis code, which sizes the vehicle for a given mission. After uncertainty models are established for the operation of the vehicle (e.g., random variables for fuel cost, utilization rate, etc.), standard Monte Carlo simulation methods or Fast Probability Integration (FPI) techniques may then be used to determine the system feasibility and viability via the construction of cumulative probability distributions (CDFs) for key system constraints and objectives. The ultimate objective of these probabilistic feasibility and viability investigations is to find robust solutions, which are solutions that maximize the probability of achieving success while satisfying imposed constraints.
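
A toy version of the probabilistic step described above, assuming a simple response surface equation standing in for a handling-quality metric: Monte Carlo sampling of the inputs yields a cumulative distribution function, from which a probability of meeting a requirement and a finite-difference probability sensitivity can be read off. The response surface, input distributions, and requirement value are all hypothetical.

    import numpy as np

    rng = np.random.default_rng(6)

    def rse(x1, x2):
        """Toy response surface equation for a handling-quality metric as a
        function of two elementary design variables."""
        return 1.0 + 0.8 * x1 - 0.5 * x2 + 0.3 * x1 * x2

    def metric_cdf(mu1, n=100_000):
        # fidelity uncertainty modeled as scatter about the nominal inputs (assumed)
        x1 = rng.normal(mu1, 0.10, size=n)
        x2 = rng.normal(0.5, 0.05, size=n)
        y = np.sort(rse(x1, x2))
        return y, np.arange(1, n + 1) / n

    requirement = 1.6
    y, prob = metric_cdf(mu1=1.0)
    p_ok = np.searchsorted(y, requirement) / y.size
    y2, _ = metric_cdf(mu1=1.05)
    p_ok2 = np.searchsorted(y2, requirement) / y2.size
    # probability of satisfying the requirement and its sensitivity to the
    # nominal value of the first design variable
    print(p_ok, (p_ok2 - p_ok) / 0.05)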

Proceedings ArticleDOI
14 Sep 1998
TL;DR: In this paper, the characterization of measurement uncertainty by a probability density function is considered, and the performance evaluation of the probability density function estimate of the measurement is investigated, given by either the geometrical approach or the proposed approach, called "corrected normality probability".
Abstract: Uncertainty characterization of the measurement quantity by a probability density function is considered. The performance evaluation of the probability density function estimation of the measurement is investigated. It is given by either the geometrical approach or the proposed approach, called "corrected normality probability". An efficient method, the so-called Monte Carlo Latin hypercube sampling, is used to compare the predicted results of both approaches. The choice between these methods critically depends on the error variance and the model nonlinearity.

Journal ArticleDOI
TL;DR: For the Student's t cumulative distribution function F(x;n) with n ≥ 3 degrees of freedom, a corrected normal approximation, Φ(λx), was proposed as an extension of the well-known ordinary normal approximation.

Patent
02 Oct 1998
TL;DR: In this paper, a system and method are presented for automatically refreshing documents in a cache, so that each particular document is refreshed no more often and no less often than needed.
Abstract: The invention provides a system and method for automatically refreshing documents in a cache, so that each particular document is refreshed no more often and no less often than needed. For each document, the cache estimates a probability distribution of times for client requests for that document and a probability distribution of times for server changes to that document. Times for refresh are selected for each particular document in response to both the estimated probability distribution of times for client requests and the estimated probability distribution of times for server changes. The invention also provides a system and method for objectively estimating the value the cache is providing for the system including the cache. The cache estimates for each document a probability distribution of times for client requests for that document, and determines a cumulative probability distribution which reflects the estimated marginal hit rate at the storage limit of the cache and the marginal advantage of adding storage to the cache.

Journal ArticleDOI
TL;DR: Here it is demonstrated how to construct a continuous stopping boundary from an alpha spending function, which is useful in the design of clinical trials, for example when monitoring becomes more frequent as a trial approaches the boundary.
Abstract: Lan and DeMets (1983, Biometrika 70, 659-663) proposed a flexible method for monitoring accumulating data that does not require the number and times of analyses to be specified in advance yet maintains an overall Type I error, alpha. Their method amounts to discretizing a preselected continuous boundary by clumping the density of the boundary crossing time at discrete analysis times and calculating the resultant discrete-time boundary values. In this framework, the cumulative distribution function of the continuous-time stopping rule is used as an alpha spending function. A key assumption that underlies this method is that future analysis times are not chosen on the basis of the current value of the statistic. However, clinical trials may be monitored more frequently when they are close to crossing the boundary. In this situation, the corresponding continuous-time boundary should be used. Here we demonstrate how to construct a continuous stopping boundary from an alpha spending function. This capability is useful in the design of clinical trials. We use the Beta-Blocker Heart Attack Trial (BHAT) and AIDS Clinical Trials Group protocol 021 for illustration.

01 Apr 1998
TL;DR: This paper summarizes how the probability tables are utilized in this MCNP upgrade and compares this treatment with the approximate smooth treatment for some example problems.
Abstract: An upgrade for MCNP has been implemented to sample the neutron cross sections in the unresolved resonance range using probability tables. These probability tables are generated with the cross section processor code NJOY, by using the evaluated statistical information about the resonances to calculate cumulative probability distribution functions for the microscopic total cross section. The elastic, fission, and radiative capture cross sections are also tabulated as the average values of each of these partials conditional upon the value of the total. This paper summarizes how the probability tables are utilized in this MCNP upgrade and compares this treatment with the approximate smooth treatment for some example problems.
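
A sketch of probability-table sampling with made-up numbers rather than NJOY output: the total cross section is drawn from the cumulative probability table, and the partial cross sections are taken as the band-conditional averages.

    import numpy as np

    # Hypothetical probability table for one energy/temperature point:
    # cumulative probabilities of total cross-section bands, with the average
    # partial cross sections conditional on each band (barns).
    cum_prob  = np.array([0.20, 0.55, 0.85, 1.00])
    sigma_tot = np.array([ 8.0, 11.0, 15.0, 24.0])
    sigma_el  = np.array([ 6.5,  8.0, 10.0, 14.0])
    sigma_fis = np.array([ 0.0,  0.0,  0.0,  0.0])
    sigma_cap = np.array([ 1.5,  3.0,  5.0, 10.0])

    def sample_cross_sections(rng):
        """Sample the total cross section from the cumulative probability table;
        the partials are the averages conditional on the sampled band."""
        band = np.searchsorted(cum_prob, rng.random())
        return sigma_tot[band], sigma_el[band], sigma_fis[band], sigma_cap[band]

    rng = np.random.default_rng(7)
    print([sample_cross_sections(rng) for _ in range(3)])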

Journal ArticleDOI
TL;DR: A new statistical model named the Center-Distance Continuous Probability Model (CDCPM) for speech recognition is described, which is based on the Center-Distance Normal (CDN) distribution, and a distance measure for CDCPMs is proposed which is based on the Bayesian minimum classification error (MCE) discrimination.
Abstract: In this paper, a new statistical model named the Center-Distance Continuous Probability Model (CDCPM) for speech recognition is described, which is based on the Center-Distance Normal (CDN) distribution. In a CDCPM, the probability transition matrix is omitted, and the observation probability density function (PDF) in each state takes the form of an embedded multiple-model (EMM) based on the Nearest Neighbour rule. The experimental results on two giant real-world Chinese speech databases and a real-world continuous-manner 2000-phrase system show that this model is a powerful one. Also, a distance measure for CDCPMs is proposed which is based on the Bayesian minimum classification error (MCE) discrimination.

Book ChapterDOI
01 Jan 1998
TL;DR: The nature of probability distributions which are strongly stochastically compatible with a given possibility distribution, and the derivation of frequency distributions from empirical random sets, are discussed.
Abstract: The measurement of set-valued and interval data is considered. Their random interval mathematical representations are introduced. The conditions for distributional (probabilistic and possibilistic) representations and conversions to and among them are established. Some properties of empirical random sets and possibilistic histograms related to strong probabilistic compatibility are described. Finally, the nature of probability distributions which are strongly stochastically compatible with a given possibility distribution, and the derivation of frequency distributions from empirical random sets, are discussed.