Book

An introduction to probability theory

01 Jan 1968
TL;DR: The authors introduce probability theory for both advanced undergraduate students of statistics and scientists in related fields, drawing on real applications in the physical and biological sciences. "The book makes probability exciting." -Journal of the American Statistical Association
Abstract: This classic text and reference introduces probability theory for both advanced undergraduate students of statistics and scientists in related fields, drawing on real applications in the physical and biological sciences. "The book makes probability exciting." -Journal of the American Statistical Association
Citations
Journal ArticleDOI
Steven L. Heston
TL;DR: In this paper, a closed-form solution for the price of a European call option on an asset with stochastic volatility is derived. The solution technique is based on characteristic functions and can be applied to other problems.
Abstract: I use a new technique to derive a closed-form solution for the price of a European call option on an asset with stochastic volatility. The model allows arbitrary correlation between volatility and spot-asset returns. I introduce stochastic interest rates and show how to apply the model to bond options and foreign currency options. Simulations show that correlation between volatility and the spot asset’s price is important for explaining return skewness and strike-price biases in the Black-Scholes (1973) model. The solution technique is based on characteristic functions and can be applied to other problems.
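The pricing recipe the abstract describes reduces to evaluating two risk-neutral probabilities from the model's characteristic function by numerical integration. Below is a minimal Python sketch of that idea in Heston's original P1/P2 formulation; the parameter names (kappa, theta, sigma, rho, v0), the zero variance risk premium, and the integration cutoff are my own choices, not the paper's notation, and this formulation can be numerically delicate for very long maturities.

```python
# Hedged sketch: European call under the Heston (1993) stochastic-volatility
# model, priced by integrating the characteristic function numerically.
import numpy as np
from scipy.integrate import quad

def heston_call(S, K, r, tau, kappa, theta, sigma, rho, v0):
    """Call price as S*P1 - K*exp(-r*tau)*P2, Heston's j = 1, 2 convention."""
    def P(j):
        u = 0.5 if j == 1 else -0.5          # u_1, u_2 in Heston's notation
        b = kappa - rho * sigma if j == 1 else kappa  # lambda (risk premium) = 0

        def integrand(phi):
            d = np.sqrt((rho * sigma * 1j * phi - b) ** 2
                        - sigma ** 2 * (2 * u * 1j * phi - phi ** 2))
            g = (b - rho * sigma * 1j * phi + d) / (b - rho * sigma * 1j * phi - d)
            C = (r * 1j * phi * tau
                 + kappa * theta / sigma ** 2
                 * ((b - rho * sigma * 1j * phi + d) * tau
                    - 2 * np.log((1 - g * np.exp(d * tau)) / (1 - g))))
            D = ((b - rho * sigma * 1j * phi + d) / sigma ** 2
                 * (1 - np.exp(d * tau)) / (1 - g * np.exp(d * tau)))
            f = np.exp(C + D * v0 + 1j * phi * np.log(S))  # char. function
            return (np.exp(-1j * phi * np.log(K)) * f / (1j * phi)).real

        # Truncated semi-infinite integral; cutoff 100 is an assumption.
        return 0.5 + quad(integrand, 1e-8, 100.0)[0] / np.pi

    return S * P(1) - K * np.exp(-r * tau) * P(2)

# Example: at-the-money call, one year to expiry, illustrative parameters.
print(heston_call(S=100, K=100, r=0.03, tau=1.0,
                  kappa=2.0, theta=0.04, sigma=0.3, rho=-0.7, v0=0.04))
```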

7,867 citations

Journal ArticleDOI
TL;DR: A "fast EP" (FEP) is proposed which uses a Cauchy instead of Gaussian mutation as the primary search operator and is proposed and tested empirically, showing that IFEP performs better than or as well as the better of FEP and CEP for most benchmark problems tested.
Abstract: Evolutionary programming (EP) has been applied with success to many numerical and combinatorial optimization problems in recent years. EP has rather slow convergence rates, however, on some function optimization problems. In the paper, a "fast EP" (FEP) is proposed which uses a Cauchy instead of Gaussian mutation as the primary search operator. The relationship between FEP and classical EP (CEP) is similar to that between fast simulated annealing and the classical version. Both analytical and empirical studies have been carried out to evaluate the performance of FEP and CEP for different function optimization problems. The paper shows that FEP is very good at search in a large neighborhood while CEP is better at search in a small local neighborhood. For a suite of 23 benchmark problems, FEP performs much better than CEP for multimodal functions with many local minima while being comparable to CEP in performance for unimodal and multimodal functions with only a few local minima. The paper also shows the relationship between the search step size and the probability of finding a global optimum and thus explains why FEP performs better than CEP on some functions but not on others. In addition, the importance of the neighborhood size and its relationship to the probability of finding a near-optimum is investigated. Based on these analyses, an improved FEP (IFEP) is proposed and tested empirically. This technique mixes different search operators (mutations). The experimental results show that IFEP performs better than or as well as the better of FEP and CEP for most benchmark problems tested.
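The paper's central change is essentially a one-line swap of the mutation distribution. The Python sketch below contrasts Gaussian and Cauchy mutations inside a stripped-down (mu + mu) EP loop on the Rastrigin benchmark, a standard multimodal function; it omits the paper's self-adaptive step sizes and tournament selection, and the population size, step scale, and generation count are illustrative.

```python
# Hedged sketch: classical EP (Gaussian mutation) vs. fast EP (Cauchy
# mutation). The Cauchy's heavy tails make long escape jumps more likely.
import numpy as np

rng = np.random.default_rng(0)

def rastrigin(x):
    # Multimodal benchmark with many local minima; global minimum 0 at 0.
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

def ep(mutation, dim=10, mu=50, generations=200, eta=0.1):
    pop = rng.uniform(-5.12, 5.12, size=(mu, dim))
    for _ in range(generations):
        offspring = pop + eta * mutation(size=pop.shape)  # the only difference
        merged = np.vstack([pop, offspring])
        pop = merged[np.argsort(rastrigin(merged))[:mu]]  # keep the best mu
    return rastrigin(pop).min()

print("CEP (Gaussian mutation):", ep(rng.standard_normal))
print("FEP (Cauchy mutation):  ", ep(rng.standard_cauchy))
```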

3,412 citations

Journal ArticleDOI
TL;DR: In this article, a consistent nonparametric maximum likelihood estimator for the distribution of unobservables in duration models is developed, along with a computational strategy for implementing it; estimates from conventional random-effect procedures are shown to be very sensitive to the ad hoc choice of mixing distribution.
Abstract: Conventional analyses of single spell duration models control for unobservables using a random effect estimator with the distribution of unobservables selected by ad hoc criteria. Both theoretical and empirical examples indicate that estimates of structural parameters obtained from conventional procedures are very sensitive to the choice of mixing distribution. Conventional procedures overparameterize duration models. We develop a consistent nonparametric maximum likelihood estimator for the distribution of unobservables and a computational strategy for implementing it. For a sample of unemployed workers our estimator produces estimates in concordance with standard search theory while conventional estimators do not. Economic theories of search unemployment (Lippman and McCall [34]; Flinn and Heckman [14]), job turnover (Jovanovic [25]), mortality (Harris [17]), labor supply (Heckman and Willis [23]), and marital instability (Becker [3]) produce structural distributions for durations of occupancy of states. These theories generate qualitative predictions about the effects of changes in parameters on these structural distributions, and occasionally predict their functional forms. In order to test economic theories about durations and recover structural parameters, it is necessary to account for population variation in observed and unobserved variables unless it is assumed a priori that individuals are homogeneous. In every microeconomic study in which the hypothesis of heterogeneity is subject to test, it is not rejected. Temporally persistent unobserved components are an empirically important fact of life in microeconomic data (Heckman [19]). Since the appearance of papers by Silcock [39] and Blumen, Kogan, and McCarthy [5], social scientists have been aware that failure to adequately control for population heterogeneity can produce severe bias in structural estimates of duration models. Serious empirical analysts attempt to control for these unobservables.
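One way to see the estimator's key idea: the nonparametric MLE of a mixing distribution is discrete, so it can be approximated by re-weighting a fixed grid of support points with an EM loop. The Python sketch below does this for exponential durations with a latent rate; the grid, sample sizes, and names are illustrative, and updating the support-point locations themselves (part of the authors' computational strategy) is omitted for brevity.

```python
# Hedged sketch: discrete nonparametric mixing distribution for unobserved
# heterogeneity in a duration model, estimated by EM over a fixed grid.
import numpy as np

rng = np.random.default_rng(1)

# Simulated unemployment durations from a two-type population.
true_rates = rng.choice([0.2, 1.0], p=[0.5, 0.5], size=2000)
t = rng.exponential(1.0 / true_rates)

# Candidate support points; EM re-weights them, locations stay fixed.
support = np.linspace(0.05, 2.0, 40)
w = np.full(support.size, 1.0 / support.size)

for _ in range(500):
    # E-step: posterior probability of each support point per observation,
    # using the exponential density f(t | rate) = rate * exp(-rate * t).
    lik = support * np.exp(-np.outer(t, support))
    post = lik * w
    post /= post.sum(axis=1, keepdims=True)
    # M-step: new weights are the average posterior mass.
    w = post.mean(axis=0)

# The estimated mixing distribution should pile up near rates 0.2 and 1.0.
for s, wi in zip(support, w):
    if wi > 0.02:
        print(f"rate {s:.2f}: weight {wi:.3f}")
```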

2,940 citations

Journal ArticleDOI
TL;DR: In this article, the authors describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes and show convergence results for a general class of normal mixture models.
Abstract: We describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes. These models provide natural settings for density estimation and are exemplified by special cases where data are modeled as a sample from mixtures of normal distributions. Efficient simulation methods are used to approximate various prior, posterior, and predictive distributions. This allows for direct inference on a variety of practical issues, including problems of local versus global smoothing, uncertainty about density estimates, assessment of modality, and inference on the number of components. Also, convergence results are established for a general class of normal mixture models.
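For intuition about the model this inference targets, the Python sketch below draws one random density from a Dirichlet-process mixture of normals via the stick-breaking construction, truncated at a finite number of sticks. It illustrates the prior only, not the paper's posterior simulation methods, and the concentration, base measure, and kernel width are illustrative choices.

```python
# Hedged sketch: a random density from a (truncated) Dirichlet-process
# mixture of normals, built with the stick-breaking construction.
import math
import numpy as np

rng = np.random.default_rng(2)

alpha, truncation = 2.0, 100                 # DP concentration; stick count
v = rng.beta(1.0, alpha, size=truncation)    # stick-breaking fractions
weights = v * np.cumprod(np.concatenate([[1.0], 1 - v[:-1]]))
means = rng.normal(0.0, 3.0, size=truncation)  # base measure G0: N(0, 3^2)
scales = np.full(truncation, 0.5)              # common kernel width

def density(x):
    # Mixture of normal kernels with the DP weights drawn above.
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, scales))

for x in (-2.0, 0.0, 2.0):
    print(f"f({x:+.1f}) = {density(x):.4f}")
```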

2,473 citations

Journal ArticleDOI
Frank R. Hampel
TL;DR: In this article, the first derivative of an estimator viewed as a functional, and the ways in which it can be used to study local robustness properties, are discussed; a theory of robust estimation "near" strict parametric models is briefly sketched and applied to some classical situations.
Abstract: This paper treats essentially the first derivative of an estimator viewed as functional and the ways in which it can be used to study local robustness properties. A theory of robust estimation “near” strict parametric models is briefly sketched and applied to some classical situations. Relations between von Mises functionals, the jackknife and U-statistics are indicated. A number of classical and new estimators are discussed, including trimmed and Winsorized means, Huber-estimators, and more generally maximum likelihood and M-estimators. Finally, a table with some numerical robustness properties is given.
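The "first derivative of an estimator" has a finite-sample analogue, the sensitivity curve, which makes the robustness comparison concrete. The Python sketch below contrasts the sample mean, a 10%-trimmed mean, and a Huber M-estimator (solved here by simple iterative reweighting, my choice rather than the paper's method); the unbounded growth of the mean's sensitivity as the outlier moves out is exactly what a bounded influence function rules out.

```python
# Hedged sketch: sensitivity curves (finite-sample influence) of the mean,
# a trimmed mean, and a Huber M-estimator of location.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.standard_normal(50)

def huber(data, c=1.345, iters=100):
    # Iteratively reweighted location estimate; weight = min(1, c/|residual|).
    mu = np.median(data)
    for _ in range(iters):
        r = data - mu
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))
        mu = np.sum(w * data) / np.sum(w)
    return mu

def sensitivity(estimator, outlier):
    # (n+1) * (T(x plus outlier) - T(x)): effect of one contaminating point.
    n = x.size
    return (n + 1) * (estimator(np.append(x, outlier)) - estimator(x))

trimmed = lambda d: stats.trim_mean(d, 0.1)   # 10% trimming each tail
for z in (1.0, 5.0, 50.0):
    print(f"outlier {z:5.1f}: mean {sensitivity(np.mean, z):8.2f}  "
          f"trimmed {sensitivity(trimmed, z):6.2f}  "
          f"huber {sensitivity(huber, z):6.2f}")
```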

2,410 citations