On minimax statistical decision procedures and their admissibility
TLDR
In this paper, the authors considered the problem of making a decision on the basis of a sequence of observations on a random variable and gave two loss functions, each depending on the distribution of the random variable, the number of observations taken, and the decision made.

Abstract:
This paper is concerned with the problem of making a decision on the basis of a sequence of observations on a random variable. Two loss functions, each depending on the distribution of the random variable, the number of observations taken, and the decision made, are assumed given. Minimax problems can be stated for weighted sums of the two loss functions, or for either one subject to an upper bound on the expectation of the other. Under suitable conditions it is shown that solutions of the first type of problem provide solutions for all problems of the latter types, and that admissibility for a problem of the first type implies admissibility for problems of the latter types. Two examples are given: Estimation of the mean of a random variable which is (1) normal with known variance, (2) rectangular with known range. The resulting minimax estimates are, with a small class of exceptions, proved admissible among the class of all procedures with continuous risk functions. The two loss functions are in each case the number of observations, and an arbitrary nondecreasing function of the absolute error of estimate. Extensions to a function of the number of observations for the first loss function are indicated, and two examples are given for the normal case where the sample size can or must be randomised among more than a consecutive pair of integers.
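As a rough numerical illustration of the weighted-sum formulation above (not the paper's own construction), one can take the sample mean of n normal observations, use squared error in place of the paper's arbitrary nondecreasing error loss, and weight it against the sampling cost. The function name `weighted_risk` and the cost constant `c` are assumptions for this sketch:

```python
import math

def weighted_risk(n, sigma=1.0, c=0.01):
    # Weighted sum of the two losses for the fixed-sample-size sample mean:
    # sampling-cost term c * n plus squared-error risk sigma**2 / n.
    return c * n + sigma ** 2 / n

sigma, c = 1.0, 0.01
# Treating n as continuous, the weighted risk is minimized at n = sigma / sqrt(c).
n_star = sigma / math.sqrt(c)
best_n = min(range(1, 101), key=lambda n: weighted_risk(n, sigma, c))
print(n_star, best_n, weighted_risk(best_n, sigma, c))
```

With these numbers the continuous optimum n = 10 is also the best integer sample size; the paper's results concern the harder question of when such fixed or randomised sample sizes are minimax and admissible.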
Citations
Book Chapter DOI
Estimation with Quadratic Loss
W. James, Charles Stein, et al.
TL;DR: In this paper, the authors consider the problem of finding the best unbiased estimator of a linear function of the mean of a set of observed random variables, and show that for large samples the maximum likelihood estimator approximately minimizes the mean squared error when compared with other reasonable estimators.
Journal Article DOI
Admissible Estimators, Recurrent Diffusions, and Insoluble Boundary Value Problems
Book DOI
Decision theory : principles and approaches
TL;DR: This paper provides a guided tour of decision theory, in which the authors present decision-theoretic approaches to sample size along with foundational topics including the standard gamble and the axioms of probability.
Journal Article DOI
Lectures on the theory of estimation of many parameters
TL;DR: This paper presents a series of lectures given by Prof. Charles Stein of Stanford University at LOMI AN SSSR in the fall of 1976; the lectures discuss several problems related to the estimation of multivariate parameters and pose some unsolved problems.
References
Journal Article DOI
Sequential Confidence Intervals for the Mean of a Normal Distribution with Known Variance
Charles Stein, Abraham Wald, et al.
TL;DR: In this paper, the authors consider sequential procedures for obtaining confidence intervals of prescribed length and confidence coefficient for the mean of a normal distribution with known variance, and prove that the usual non-sequential procedure is optimum.
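The non-sequential baseline this TL;DR refers to is the standard fixed-sample-size calculation for a normal mean with known variance: choose the smallest n so that the two-sided interval has the prescribed length. A minimal sketch, assuming a 95% confidence coefficient and with the helper name `required_n` chosen for illustration (this is the textbook calculation, not the paper's sequential procedure):

```python
import math
from statistics import NormalDist

def required_n(sigma, length, conf=0.95):
    # Smallest n for which the two-sided interval
    # xbar +/- z * sigma / sqrt(n) has total length <= `length`.
    z = NormalDist().inv_cdf(0.5 + conf / 2.0)
    return math.ceil((2.0 * z * sigma / length) ** 2)

n = required_n(sigma=1.0, length=0.5)
print(n)
```

The paper's result is that, when the variance is known, no sequential stopping rule can improve on this fixed sample size.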