Open Access Journal Article

Bayes theory: Hartigan, Springer-Verlag, New York 1983, p. 145, DM 46,-

D. V. Lindley
01 Dec 1984
Vol. 31, Iss. 1, p. 214
Abstract
This is a book about the theory of Bayesian inference at a rather sophisticated mathematical level. It is based on lectures given to students who have already had a course in measure-theoretic probability, and has the rather clipped style of notes. This led me to some difficulties of comprehension, especially where typographical errors occur, as in the definition of a random variable. Against this, there is no unnecessary material, and space for a few human touches.

The development takes as fundamental the notion of expectation, though that word is scarcely used: it does not appear in the inadequate index, and has only a brief mention on page 17. The book begins therefore with linear, non-negative, continuous operators, and the treatment has the novelty that it does not require the total probability to be one: indeed, infinity is admitted, this having the advantage that improper distributions of the Jeffreys type can be included. There is an original and interesting account of marginal and conditional distributions under impropriety. For example, in discussing a uniform distribution over pairs (i, j) of integers, the sets j=1 and j=2 both have infinite probability and cannot therefore be compared; so the conditional probabilities p(i=1 | j=1) and p(i=1 | j=2) require separate discussion. My own view is that this feature is not needed, for although improper distributions have some interest in low dimensions (mainly in achieving an unnecessary match between Bayesian and Fisherian ideas), they fail in high dimensions, as Hartigan shows in chapter 9, where there is an admirable account of many normal means. A lesser objection is the complexity introduced by admitting impropriety: Bayes theorem takes 14 lines to state and 20 to prove.
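The impropriety example above can be written out schematically (the notation here is mine, not taken verbatim from the book). A "uniform" joint distribution on integer pairs assigns

```latex
P(j=1) \;=\; \sum_{i \in \mathbb{Z}} 1 \;=\; \infty,
\qquad
P(j=2) \;=\; \sum_{i \in \mathbb{Z}} 1 \;=\; \infty,
```

so the ratio P(j=1)/P(j=2) is an undetermined ∞/∞ and the two events cannot be compared; the conditional probabilities p(i=1 | j=1) and p(i=1 | j=2) are therefore not fixed by the joint distribution and must be specified separately.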
Chapter 5 is interestingly called "Making Probabilities" and discusses Jaynes' maximum-entropy principle, Jeffreys' invariance, and similarity as ways of constructing distributions; those produced by the first two methods are typically improper. This attitude is continued into chapter 8, where exponential families are introduced as those minimizing information subject to constraints. There is a discussion of decision theory, as distinct from inference, but no attempt to consider utility: all is with respect to an undefined loss function. The consideration of the different types of admissibility is very brief, and the opportunity to discuss the mathematically sensitive but practically meaningful aspects of this topic is lost. Other chapters are concerned with convergence, unbiasedness and confidence, multinomials, asymptotic normality, robustness and non-parametric procedures; the last is mainly devoted to a good account of the Dirichlet process.

Before all this mathematics, the book begins with a brief account of the various theories of probability: logical, empirical and subjective. At the end of the account is a fascinating discussion of why the author thinks "there is a probability 0.05 that there will be a large-scale nuclear war between the U.S. and the U.S.S.R. before 2000". This connection between mathematics and reality is most warmly to be welcomed.

The merit of this book lies in the novelty of the perspective presented. It is like looking at a courtyard from some unfamiliar window in an upper turret. Things look different from up there. Some corners of the courtyard are completely obscured. (It is surprising that there is no mention at all of the likelihood principle, and only an aside reference to likelihood.) Other matters are better appreciated because of the unfamiliar aspect: normal means, for example.
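The maximum-entropy construction mentioned above can be illustrated numerically. On a finite support with a fixed mean, the entropy-maximizing pmf has the exponential-family form p_k ∝ exp(λk), which is the chapter-8 view of exponential families as information-minimizers under moment constraints. A minimal sketch (the support {0, …, 10}, the target mean 3.0, and the bisection bounds are illustrative choices, not from the book):

```python
import math

def max_entropy_pmf(support, target_mean, lo=-5.0, hi=5.0, iters=200):
    """Maximum-entropy pmf on a finite support with a fixed mean.

    The solution has the exponential-family form p_k ∝ exp(lam * k);
    we find lam by bisection so that the implied mean hits target_mean.
    (The mean is strictly increasing in lam, since its derivative is
    the variance, so plain bisection converges.)
    """
    def mean_for(lam):
        w = [math.exp(lam * k) for k in support]
        z = sum(w)
        return sum(k * wk for k, wk in zip(support, w)) / z

    a, b = lo, hi  # bounds must bracket target_mean for this support
    for _ in range(iters):
        mid = 0.5 * (a + b)
        if mean_for(mid) < target_mean:
            a = mid
        else:
            b = mid
    lam = 0.5 * (a + b)
    w = [math.exp(lam * k) for k in support]
    z = sum(w)
    return [wk / z for wk in w], lam

support = list(range(11))  # {0, 1, ..., 10}
p, lam = max_entropy_pmf(support, target_mean=3.0)
```

Since the target mean (3.0) is below the midpoint of the support, the fitted λ comes out negative and the pmf decays geometrically in k, with the ratio p[k+1]/p[k] constant at exp(λ), as the exponential-family form requires.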
The book does not therefore present a balanced view of Bayesian theory but does provide an interesting and valuable account of many aspects of it and should command the attention of any statistical theorist.



Citations
Journal ArticleDOI

Minimally informative prior distributions for non-parametric Bayesian analysis

TL;DR: In this article, the authors address the problem of how to conduct a minimally informative, non-parametric Bayesian analysis and devise a model so that the posterior distribution satisfies a few basic properties.
Journal ArticleDOI

Nonparametric predictive inference for system reliability with redundancy allocation

TL;DR: In this paper, lower and upper probabilities for the reliability of k-out-of-m systems are presented, for both series and parallel systems, including series systems with independent ki-out-of-mi subsystems.
Journal ArticleDOI

Understanding Tourists’ Experiences at Local Markets in Phuket: An Analysis of TripAdvisor Reviews

TL;DR: In this article, the authors explore the tourists' experiences at local markets in Phuket by analyzing the online reviews from TripAdvisor and adopts a novel method by combining the KNIME A...
Journal ArticleDOI

Asymptotic normality of the posterior given a statistic

TL;DR: In this article, the authors established the asymptotic normality and determined the limiting variance of the posterior density for a multivariate parameter, given the value of a consistent Gaussian statistic satisfying a uniform local central limit theorem.
Book ChapterDOI

Observer theory, Bayes theory, and psychophysics

TL;DR: This paper discusses observer theory, gives a sympathetic analysis of its candidacy, describes its relationship to standard Bayesian analysis, and uses it to develop a new account of the relationship between computational theories and psychophysical data.