Journal ArticleDOI

Inferences from Multinomial Data: Learning About a Bag of Marbles

TLDR
In this article, the imprecise Dirichlet model is proposed for drawing inferences from multinomial data in cases where there is no prior information; inferences are expressed in terms of posterior upper and lower probabilities.
Abstract
A new method is proposed for making inferences from multinomial data in cases where there is no prior information. A paradigm is the problem of predicting the colour of the next marble to be drawn from a bag whose contents are (initially) completely unknown. In such problems we may be unable to formulate a sample space because we do not know what outcomes are possible. This suggests an invariance principle: inferences based on observations should not depend on the sample space in which the observations and future events of interest are represented. Objective Bayesian methods do not satisfy this principle. This paper describes a statistical model, called the imprecise Dirichlet model, for drawing coherent inferences from multinomial data. Inferences are expressed in terms of posterior upper and lower probabilities. The probabilities are initially vacuous, reflecting prior ignorance, but they become more precise as the number of observations increases. This model does satisfy the invariance principle. Two sets of data are analysed in detail. In the first example one red marble is observed in six drawings from a bag. Inferences from the imprecise Dirichlet model are compared with objective Bayesian and frequentist inferences. The second example is an analysis of data from medical trials which compared two treatments for cardiorespiratory failure in newborn babies. There are two problems: to draw conclusions about which treatment is more effective and to decide when the randomized trials should be terminated. This example shows how the imprecise Dirichlet model can be used to analyse data in the form of a contingency table.

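To make the abstract's description concrete, the following is a minimal sketch of the imprecise Dirichlet model's predictive bounds for a single category, assuming the standard form in which, after a category has been observed n_j times in N multinomial observations, its posterior lower and upper probabilities on the next draw are n_j/(N+s) and (n_j+s)/(N+s) for a learning parameter s > 0. The function name idm_bounds and the choice s = 2 are illustrative, not taken from the paper.

```python
# Minimal sketch of the imprecise Dirichlet model (IDM) predictive bounds,
# assuming lower = n_j / (N + s) and upper = (n_j + s) / (N + s).

def idm_bounds(n_j: int, N: int, s: float = 2.0) -> tuple[float, float]:
    """Posterior lower/upper probability that the next observation falls in a
    category already seen n_j times in N observations (s is the learning
    parameter; s = 2 is an illustrative choice)."""
    lower = n_j / (N + s)
    upper = (n_j + s) / (N + s)
    return lower, upper

# The paper's first example: one red marble observed in six drawings.
low, up = idm_bounds(n_j=1, N=6)
print(f"P(next marble is red) lies in [{low:.3f}, {up:.3f}]")  # [0.125, 0.375]

# Prior ignorance: with no observations the interval is vacuous, [0, 1].
print(idm_bounds(n_j=0, N=0))  # (0.0, 1.0)
```

Under these assumptions, with no observations (N = 0) the interval is the vacuous [0, 1], reflecting the prior ignorance described in the abstract, and it narrows as the number of observations grows.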

Citations
Journal ArticleDOI

Dempster belief functions are based on the principle of complete ignorance

TL;DR: This paper shows that a "principle of complete ignorance" plays a central role in decisions based on Dempster belief functions, when a random message is received and then, in a second stage, a true state of nature obtains.
Journal ArticleDOI

Formalization of Evidence: A Comparative Study

TL;DR: This article analyzes and compares several approaches to formalizing the notion of evidence in the context of a general-purpose reasoning system, including the approach used in NARS, which is designed according to the considerations of general-purpose intelligent systems and provides novel solutions to several traditional problems concerning evidence.
Journal ArticleDOI

Representation insensitivity in immediate prediction under exchangeability

TL;DR: This work considers immediate predictive inference, where a subject is asked to coherently model his beliefs about the next observation in terms of a predictive lower prevision, and studies when such predictive lower previsions are representation insensitive, meaning that they are essentially independent of the choice of the set of possible values for the random variables.
Journal ArticleDOI

A Model of Prior Ignorance for Inferences in the One-parameter Exponential Family

TL;DR: In this article, the authors propose a model of prior ignorance about a scalar variable based on a set of distributions M. In particular, they define the minimal properties that a set M of distributions should satisfy to be a model of prior ignorance without producing vacuous inferences.
Journal ArticleDOI

Inference after checking multiple Bayesian models for data conflict and applications to mitigating the influence of rejected priors

TL;DR: In this paper, the authors combine model checking with robust Bayes to guide inference whether or not a model is found to be inadequate for the purposes of data analysis; the resulting set of adequate models is then used in the second stage, either to summarize a combined posterior (such as a maximum-entropy posterior) or for inference according to the decision rules of the robust-Bayes approach.
References
Journal ArticleDOI

Bootstrap Methods: Another Look at the Jackknife

TL;DR: In this article, the authors discuss the problem of estimating the sampling distribution of a pre-specified random variable R(X, F) on the basis of the observed data x.
Book

Theory of Probability

TL;DR: In this book, the author introduces direct probabilities, approximate methods and simplifications, and significance tests for one new parameter and for various complications, together with frequency definitions and direct methods.
Journal ArticleDOI

A Bayesian Analysis of Some Nonparametric Problems

TL;DR: In this article, a class of prior distributions called Dirichlet process priors is proposed, with which many nonparametric statistical problems can be treated, yielding results that are comparable to the classical theory.
Book

Bayesian Inference in Statistical Analysis

TL;DR: In this book, the authors investigate the effect of non-normality on inferences about a population mean, with generalizations, and consider inferences about the mean when information comes from more than one source.