Journal Article (DOI)

Lower bounds on Bayes risks for estimating a normal variance: With applications†

TLDR
Brown and Gajek (1990) gave useful lower bounds on Bayes risks that improved on earlier bounds by various authors, many of which rely on the information inequality; for estimating a normal variance, this paper derives different lower bounds by using convexity of appropriate functionals rather than the information inequality.
Abstract
Brown and Gajek (1990) gave useful lower bounds on Bayes risks, which improve on earlier bounds by various authors. Many of these use the information inequality. For estimating a normal variance using the invariant quadratic loss and an arbitrary prior on the reciprocal of the variance that is a mixture of Gamma distributions, we obtain lower bounds on Bayes risks that are different from the Borovkov-Sakhanienko bounds. The main tool is convexity of appropriate functionals, as opposed to the information inequality. The bounds are then applied to many specific examples, including the multi-Bayesian setup (Zidek and his coauthors). Subsequent use of moment theory and geometry gives a number of new results on the efficiency of estimates that are linear in the sufficient statistic. These results complement earlier results of Donoho, Liu and MacGibbon (1990), Johnstone and MacGibbon (1992) and Vidakovic and DasGupta (1994) for the location case.
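To fix ideas, here is a brief sketch of the setup behind the abstract, written in standard notation that is not taken from the paper itself; the invariant quadratic loss is written in its usual scale-invariant form, and the paper's convexity argument is not reproduced.

\[
X_1,\dots,X_n \mid \sigma^2 \;\sim\; \text{i.i.d. } N(\mu,\sigma^2),
\qquad
L(\sigma^2,\delta) \;=\; \left(\frac{\delta}{\sigma^2}-1\right)^2 ,
\]
\[
\tau \;=\; \frac{1}{\sigma^2} \;\sim\; \pi, \quad
\pi(\tau) \;=\; \int \mathrm{Gamma}(\tau \mid \alpha,\beta)\, dG(\alpha,\beta),
\qquad
r(\pi) \;=\; \inf_{\delta}\; \mathbb{E}_{\pi}\,\mathbb{E}\!\left[L\big(\sigma^2,\delta(X)\big) \,\middle|\, \sigma^2\right].
\]

The paper's results are lower bounds on the Bayes risk \(r(\pi)\) for priors \(\pi\) of this Gamma-mixture form.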


Citations
Journal Article (DOI)

A survey on some inequalities for expectation and variance

TL;DR: In this paper, the authors survey their recently obtained inequalities for the expectation and variance of continuous random variables whose probability density functions are defined on a finite interval.
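As an illustration of the type of result such a survey covers (this classical bound is given only as an example and is not necessarily one emphasized in the cited paper): if a random variable is supported on a finite interval, then

\[
a \le X \le b \ \text{a.s.}
\quad\Longrightarrow\quad
a \le \mathbb{E}[X] \le b
\quad\text{and}\quad
\operatorname{Var}(X) \le \frac{(b-a)^2}{4},
\]

with equality in the variance bound when \(X\) places mass \(1/2\) at each endpoint (Popoviciu's inequality).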
Journal Article (DOI)

Variance Estimation in a Model with Gaussian Sub-Models.

TL;DR: It is shown that, in the variance estimation problem, efficiency gains arise by exploiting the sub-model structure through the use of two-step and weighted estimators, but that this advantage may deteriorate or be lost altogether for some two-step estimators as the number of sub-models increases or the distance between them decreases.
Book Chapter (DOI)

False vs. missed discoveries, Gaussian decision theory, and the Donsker-Varadhan principle

TL;DR: In this paper, it was shown that the minimax risk of estimating the Gaussian mean can be approximated by chasing a Brownian motion to the boundary of the parameter space.
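The Donsker-Varadhan principle named in the title is the classical variational representation of relative entropy; its standard statement is recorded below for reference, without attempting to summarize how the paper applies it to Gaussian decision theory.

\[
D_{\mathrm{KL}}(Q \,\|\, P)
\;=\;
\sup_{f}\; \Big\{ \mathbb{E}_{Q}\big[f(X)\big] \;-\; \log \mathbb{E}_{P}\big[e^{f(X)}\big] \Big\},
\]

where the supremum is over bounded measurable functions \(f\).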
Journal Article (DOI)

Bayes estimators of heterogeneity variance and T-systems

TL;DR: In this article, the problem of simultaneous inference for curve-confined natural parameters of independent, heterogeneous gamma random variables with known shape parameters is considered, and a loss function is suggested that is motivated by meta-analysis.
References
Book

Testing Statistical Hypotheses

TL;DR: The book covers the general decision problem, the probability background, uniformly most powerful tests, unbiasedness (theory and first applications), unbiasedness applied to normal distributions, invariance, and linear hypotheses.
Book

Statistical Decision Theory and Bayesian Analysis

TL;DR: An overview of statistical decision theory that emphasizes the use and application of its philosophical ideas and mathematical structure.
Journal Article (DOI)

Minimax Risk Over Hyperrectangles, and Implications

TL;DR: In this paper, it is shown that, for estimating the mean of a Gaussian shift when that mean lies in an orthosymmetric, quadratically convex subset of \(\ell_2\), the difficulty of the full problem is comparable to that of its hardest rectangular subproblem, so the minimax risk is within a small constant factor of the minimax risk among linear estimators.
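A brief sketch of the objects being compared, in standard Gaussian sequence-model notation that is not lifted from the paper itself:

\[
y_i \;=\; \theta_i + \epsilon_i, \qquad \epsilon_i \overset{\text{i.i.d.}}{\sim} N(0,\sigma^2), \qquad \theta \in \Theta \subset \ell_2,
\]
\[
R_N(\Theta) \;=\; \inf_{\hat\theta}\; \sup_{\theta\in\Theta}\; \mathbb{E}\,\big\|\hat\theta(y)-\theta\big\|_2^2,
\qquad
R_L(\Theta) \;=\; \inf_{\hat\theta\ \text{linear}}\; \sup_{\theta\in\Theta}\; \mathbb{E}\,\big\|\hat\theta(y)-\theta\big\|_2^2 .
\]

For orthosymmetric, quadratically convex \(\Theta\), the paper bounds \(R_L(\Theta)\) by a small constant multiple of \(R_N(\Theta)\).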