
Showing papers on "Bayesian probability published in 1981"


Journal ArticleDOI
TL;DR: In this article, a convenient notation for matrix-variate distributions is proposed, which, by focusing on the important underlying parameters, eases greatly the task of manipulating such distributions.
Abstract: SUMMARY We introduce and justify a convenient notation for certain matrix-variate distributions which, by its emphasis on the important underlying parameters, and the theory on which it is based, eases greatly the task of manipulating such distributions. Important examples include the matrix-variate normal, t, F and beta, and the Wishart and inverse Wishart distributions. The theory is applied to compound matrix distributions and to Bayesian prediction in the multivariate linear model.

458 citations
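
As a rough illustration of the matrix-variate normal family discussed above (a minimal sketch, not taken from the paper; the parameter values and the row/column-covariance parameterization MN(M, U, V) are assumptions), one can sample a matrix by pre- and post-multiplying an array of independent standard normals:

```python
import numpy as np

def sample_matrix_normal(M, U, V, rng):
    """Draw one sample from a matrix-variate normal MN(M, U, V),
    where U is the row covariance and V is the column covariance."""
    A = np.linalg.cholesky(U)          # U = A A'
    B = np.linalg.cholesky(V)          # V = B B'
    Z = rng.standard_normal(M.shape)   # i.i.d. N(0, 1) entries
    return M + A @ Z @ B.T             # M + A Z B' has the desired law

rng = np.random.default_rng(0)
M = np.zeros((3, 2))
U = np.array([[1.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.0]])        # hypothetical row covariance
V = np.array([[2.0, 0.4],
              [0.4, 1.0]])             # hypothetical column covariance
print(sample_matrix_normal(M, U, V, rng))
```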


Journal ArticleDOI
TL;DR: In this paper, the authors illustrate Bayesian and empirical Bayesian techniques that can be used to summarize the evidence in such data about differences among treatments, thereby obtaining improved estimates of the treatment effect in each experiment, including the one having the largest observed effect.
Abstract: Many studies comparing new treatments to standard treatments consist of parallel randomized experiments. In the example considered here, randomized experiments were conducted in eight schools to determine the effectiveness of special coaching programs for the SAT. The purpose here is to illustrate Bayesian and empirical Bayesian techniques that can be used to help summarize the evidence in such data about differences among treatments, thereby obtaining improved estimates of the treatment effect in each experiment, including the one having the largest observed effect. Three main tools are illustrated: 1) graphical techniques for displaying sensitivity within an empirical Bayes framework, 2) simple simulation techniques for generating Bayesian posterior distributions of individual effects and the largest effect, and 3) methods for monitoring the adequacy of the Bayesian model specification by simulating the posterior predictive distribution in hypothetical replications of the same treatments in the same eight schools.

263 citations
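
A minimal sketch of the second tool, simulating posterior distributions of the individual effects and the largest effect in a normal hierarchical model; the effect estimates, standard errors, grid, and flat priors below are illustrative assumptions, not the study's reported data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative effect estimates and standard errors for eight parallel
# experiments (hypothetical numbers, not taken from the paper).
y     = np.array([28.0,  8.0, -3.0,  7.0, -1.0,  1.0, 18.0, 12.0])
sigma = np.array([15.0, 10.0, 16.0, 11.0,  9.0, 11.0, 10.0, 18.0])

# Hierarchical model: y_j ~ N(theta_j, sigma_j^2), theta_j ~ N(mu, tau^2),
# flat prior on mu, uniform prior on tau over a grid.
tau_grid = np.linspace(0.01, 40, 400)

def log_post_tau(tau):
    v = sigma**2 + tau**2
    mu_hat = np.sum(y / v) / np.sum(1.0 / v)
    return (-0.5 * np.log(np.sum(1.0 / v))        # from integrating out mu
            - 0.5 * np.sum(np.log(v))
            - 0.5 * np.sum((y - mu_hat)**2 / v))

logp = np.array([log_post_tau(t) for t in tau_grid])
p_tau = np.exp(logp - logp.max())
p_tau /= p_tau.sum()

# Simulate posterior draws of the individual effects theta_j.
n_draws = 5000
taus = rng.choice(tau_grid, size=n_draws, p=p_tau)
v = sigma**2 + taus[:, None]**2
mu_hat = np.sum(y / v, axis=1) / np.sum(1.0 / v, axis=1)
mu = rng.normal(mu_hat, 1.0 / np.sqrt(np.sum(1.0 / v, axis=1)))
prec = 1.0 / sigma**2 + 1.0 / taus[:, None]**2
theta_mean = (y / sigma**2 + mu[:, None] / taus[:, None]**2) / prec
theta = rng.normal(theta_mean, 1.0 / np.sqrt(prec))

print("posterior means of the eight effects:", theta.mean(axis=0).round(1))
print("P(experiment 1 has the largest effect):",
      np.mean(theta.argmax(axis=1) == 0).round(2))
```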


Journal ArticleDOI
TL;DR: In this paper, a stochastic process is defined whose sample paths may be assumed to be either increasing hazard rates or decreasing hazard rates by properly choosing the parameter functions of the process.
Abstract: It is suggested that problems in a reliability context may be handled by a Bayesian non-parametric approach. A stochastic process is defined whose sample paths may be assumed to be either increasing hazard rates or decreasing hazard rates by properly choosing the parameter functions of the process. The posterior distributions of the hazard rates are derived for both exact and censored data. Bayes estimates of hazard rates, c.d.f.'s, densities, and means are found under squared-error-type loss functions. Some simulation is done and estimates graphed to better understand the estimators. Finally, estimates of the c.d.f. from some data in a paper by Kaplan and Meier are constructed. (Author)

227 citations
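
The paper's prior is a stochastic process on hazard rates; as a much simpler stand-in that still shows a Bayes estimate under squared-error loss with exact and censored data, here is a constant-hazard (exponential) model with a gamma prior, using hypothetical failure and censoring times:

```python
import numpy as np

# Simplified stand-in, not the paper's process prior: constant hazard rate
# lambda with a Gamma(a0, b0) prior.  With exact failure times t_i and
# right-censored times c_j, the likelihood is lambda^n * exp(-lambda * E),
# where E is the total exposure, so the posterior is Gamma(a0 + n, b0 + E)
# and the Bayes estimate under squared-error loss is the posterior mean.
a0, b0 = 1.0, 1.0                       # hypothetical prior hyperparameters
failures = np.array([2.3, 4.1, 5.0, 7.8])
censored = np.array([6.0, 9.5])

n = len(failures)
exposure = failures.sum() + censored.sum()
a_post, b_post = a0 + n, b0 + exposure

hazard_bayes = a_post / b_post          # posterior mean = Bayes estimate
print("Bayes estimate of the hazard rate:", round(hazard_bayes, 3))
```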


Journal ArticleDOI
TL;DR: In this paper, confirmatory factor analysis is considered from a Bayesian viewpoint, in which prior information on the parameters is incorporated in the analysis, and an iterative algorithm is developed to obtain the Bayes estimates.
Abstract: Confirmatory factor analysis is considered from a Bayesian viewpoint, in which prior information on the parameters is incorporated in the analysis. An iterative algorithm is developed to obtain the Bayes estimates. A numerical example based on longitudinal data is presented. A simulation study is designed to compare the Bayesian approach with the maximum likelihood method.

79 citations



Journal ArticleDOI
TL;DR: In this paper, the authors present a review of Bayesian procedures and results for analyzing sharp and non-sharp hypotheses with explicit use of prior information, including the use of power functions in practice.

70 citations


Journal ArticleDOI
TL;DR: In this paper, the analysis of transformation of observations in the linear model with normal errors proposed by Box & Cox (1964) is considered, and a different choice of noninformative unnormed prior is advocated, which is not outcome dependent.
Abstract: SUMMARY The analysis of transformation of observations in the linear model with normal errors proposed by Box & Cox (1964) is considered. A different choice of noninformative unnormed prior is advocated, which is not outcome dependent. This new selection of prior leads to a formal identity between likelihood and Bayesian inference, both for the estimation of the best transformation to normality and for assessing homoscedasticity and additivity under this transformation. Extension to a related problem is mentioned.

66 citations
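
For context, a minimal sketch of the standard Box & Cox (1964) profile log-likelihood for the transformation parameter, including the Jacobian term; the data and grid are hypothetical, and the paper's particular prior is not reproduced here.

```python
import numpy as np

def boxcox(y, lam):
    return np.log(y) if abs(lam) < 1e-8 else (y**lam - 1.0) / lam

def profile_loglik(lam, y, X):
    """Box-Cox profile log-likelihood for the linear model (beta and sigma^2
    profiled out); the (lam - 1) * sum(log y) term is the Jacobian."""
    z = boxcox(y, lam)
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    rss = np.sum((z - X @ beta)**2)
    n = len(y)
    return -0.5 * n * np.log(rss / n) + (lam - 1.0) * np.sum(np.log(y))

# Hypothetical positive responses and a design with intercept and one covariate.
rng = np.random.default_rng(2)
x = rng.uniform(1, 5, size=50)
y = np.exp(0.3 + 0.4 * x + rng.normal(0, 0.2, size=50))   # log scale is "true"
X = np.column_stack([np.ones_like(x), x])

grid = np.linspace(-1, 2, 121)
ll = np.array([profile_loglik(l, y, X) for l in grid])
print("lambda maximising the profile log-likelihood:", round(grid[ll.argmax()], 2))
```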



Journal ArticleDOI
01 Jan 1981
TL;DR: In this article, a Pareto optimal solution is proposed to the problem of finding a joint decision procedure for a group of n persons, which maximizes the generalized Nash product over the set of jointly achievable utility n-vectors.
Abstract: SUMMARY A solution is proposed to the problem of finding a joint decision procedure for a group of n persons. It is any Pareto optimal solution which maximizes the generalized Nash product over the set of jointly achievable utility n-vectors. This result was originally proposed in the theory of bargaining but is readily adapted to the statistical context. The individuals involved need not have identical utility functions or identical prior (posterior) distributions. The solution may be a non-randomized rule but is randomized when the individual opinions or preferences are sufficiently diverse. Applications to hypothesis testing and estimation are included.

51 citations
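
A minimal sketch of the selection rule over a finite list of jointly achievable utility vectors (hypothetical utilities, disagreement point, and weights; the paper's solution may also involve randomized rules, which this sketch ignores):

```python
import numpy as np

# Choose, among jointly achievable utility vectors, the one maximising the
# generalized Nash product prod_i (u_i - d_i)^{w_i}, with d the disagreement
# point and w the weights (all numbers hypothetical).
candidates = np.array([   # rows: candidate decision rules, columns: the n persons
    [0.9, 0.2, 0.4],
    [0.6, 0.6, 0.5],
    [0.3, 0.8, 0.7],
])
d = np.array([0.1, 0.1, 0.1])       # disagreement utilities
w = np.array([1.0, 1.0, 1.0])       # equal weights -> ordinary Nash product

log_product = np.sum(w * np.log(candidates - d), axis=1)
best = candidates[np.argmax(log_product)]
print("utility vector of the selected joint decision:", best)
```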


Journal ArticleDOI
TL;DR: The optimum fixed interval smoothing problem is solved using a Bayesian approach, assuming that the signal is Markov and is corrupted by independent noise (not necessarily additive), and a recursive algorithm to compute the a posteriori smoothed density is obtained.
Abstract: The optimum fixed interval smoothing problem is solved using a Bayesian approach, assuming that the signal is Markov and is corrupted by independent noise (not necessarily additive). A recursive algorithm to compute the a posteriori smoothed density is obtained. Using this recursive algorithm, the smoothed estimate of a binary Markov signal corrupted by an independent noise in a nonlinear manner is determined, demonstrating that the Bayesian approach presented in this paper is not restricted to the Gauss-Markov problem.

51 citations
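
A minimal sketch of fixed-interval smoothing for a binary Markov signal in noise, written as a standard forward-backward recursion over the two states; the transition matrix, observation model, and Gaussian noise below are assumptions rather than the paper's example.

```python
import numpy as np

P = np.array([[0.95, 0.05],     # P[i, j] = prob(state j at t+1 | state i at t)
              [0.10, 0.90]])

def obs_lik(y, state):          # p(y_t | x_t = state), up to a constant
    mean = [0.0, 1.0][state]
    return np.exp(-0.5 * (y - mean)**2)

rng = np.random.default_rng(3)
T = 100
x = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = rng.choice(2, p=P[x[t-1]])
y = x + rng.normal(0, 0.8, size=T)          # noisy observations of the signal

# Forward pass: alpha_t(i) = p(x_t = i | y_1..t)
alpha = np.zeros((T, 2))
alpha[0] = [obs_lik(y[0], s) * 0.5 for s in range(2)]
alpha[0] /= alpha[0].sum()
for t in range(1, T):
    pred = alpha[t-1] @ P
    alpha[t] = pred * [obs_lik(y[t], s) for s in range(2)]
    alpha[t] /= alpha[t].sum()

# Backward pass: beta_t(i) proportional to p(y_{t+1..T} | x_t = i)
beta = np.ones((T, 2))
for t in range(T - 2, -1, -1):
    lik = np.array([obs_lik(y[t+1], s) for s in range(2)])
    beta[t] = P @ (lik * beta[t+1])
    beta[t] /= beta[t].sum()

# Smoothed density: p(x_t | y_1..T) proportional to alpha_t * beta_t
smoothed = alpha * beta
smoothed /= smoothed.sum(axis=1, keepdims=True)
print("smoothed P(x_t = 1), first ten steps:", smoothed[:10, 1].round(2))
```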


Journal ArticleDOI
TL;DR: A general algorithm is proposed for approximating the similarity matrix and the resulting optimal partition, and it appears that the algorithm is successful at identifying the optimal partitions as well as those units whose group membership is doubtful.
Abstract: SUMMARY The amount of computation required for implementing the Bayesian cluster analysis suggested by Binder (1978) is often too large for exact results to be feasible. A general algorithm is proposed for approximating the similarity matrix and the resulting optimal partition. This algorithm is applied to artificial and to real data. For the real data, it appears that the algorithm is successful at identifying the optimal partitions as well as those units whose group membership is doubtful.
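
A minimal sketch of the similarity-matrix idea, i.e. the posterior probability that two units are clustered together, approximated by averaging over a handful of weighted candidate partitions; the partitions and weights are hypothetical, and the paper's actual approximation algorithm differs in detail.

```python
import numpy as np

partitions = np.array([      # rows: candidate partitions, columns: unit labels
    [0, 0, 1, 1, 2],
    [0, 0, 0, 1, 1],
    [0, 1, 1, 1, 2],
])
weights = np.array([0.5, 0.3, 0.2])   # hypothetical posterior weights

n = partitions.shape[1]
similarity = np.zeros((n, n))
for labels, w in zip(partitions, weights):
    # entry (i, j) accumulates the weight of partitions placing i and j together
    similarity += w * (labels[:, None] == labels[None, :])

print(np.round(similarity, 2))        # approximate co-clustering probabilities
```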

Journal ArticleDOI
TL;DR: In this article, a Bayesian approach to regression models with time-varying parameters, or state vector models, is presented, which allows for multiple observations for each time period.


Journal ArticleDOI
TL;DR: In this paper, a quasi-Bayes approach is proposed to estimate the failure intensity of a non-homogeneous Poisson process at the time of failure n. The proposed estimate has the qualitative properties one anticipates from the ordinary Bayes estimate, but it is easy to compute.
Abstract: A non-homogeneous Poisson process has empirically been shown to be useful in tracking the reliability growth of a system as it undergoes development. It is of interest to estimate the failure intensity of this model at the time of failure n. The maximum likelihood estimate is known, but it is desirable to have a Bayesian estimate to allow for input of prior information. Since the ordinary Bayes approach appears to be mathematically intractable, a quasi-Bayes approach is taken. The proposed estimate has the qualitative properties one anticipates from the ordinary Bayes estimate, but it is easy to compute. A numerical example illustrates the Bayesian character of the proposed estimate. A simulation study shows that the proposed estimate, when considered in the classical framework, generally has smaller r.m.s. error than the maximum likelihood estimate.
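
For orientation, a sketch of the maximum likelihood estimate of the failure intensity at the n-th failure for the power-law (Crow/AMSAA-type) non-homogeneous Poisson process often used in reliability growth; the failure times are hypothetical, and the paper's quasi-Bayes estimate itself is not reproduced.

```python
import numpy as np

# Hypothetical failure-truncated data: cumulative times of the n failures.
t = np.array([5.0, 12.0, 30.0, 49.0, 80.0, 140.0, 210.0, 330.0])
n, T = len(t), t[-1]

# Standard MLEs for intensity lambda * beta * t^(beta - 1).
beta_hat = n / np.sum(np.log(T / t[:-1]))                  # shape parameter
lam_hat = n / T**beta_hat                                   # scale parameter
intensity_at_n = lam_hat * beta_hat * T**(beta_hat - 1.0)   # = n * beta_hat / T
print("estimated failure intensity at the last failure:", round(intensity_at_n, 4))
```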

Journal ArticleDOI
David A. Schum1
TL;DR: In this paper, a Bayesian likelihood-ratio formulation for the inferential value of direct and circumstantial evidence is analyzed under conditions in which testimony comes from observers whose task and behavior are consistent with certain assumptions of Signal-Detection Theory (SDT) and Bayesian Inference Theory (BIT).

Journal ArticleDOI
TL;DR: Application of Bayes' theorem to the collected data provides insight, in terms of probabilities and odds, into the way preoperative conditions and operative results affect the ultimate treatment result.

Journal ArticleDOI
TL;DR: In this article, the determination of a stopping rule for the detection of the time of an increase in the success probability of a sequence of independent Bernoulli trials is discussed, and the results indicate that the detection procedure is quite effective.

Journal ArticleDOI
TL;DR: In this article, the problem of estimating the risk ratio of two binomial probabilities is re-examined from a Bayesian viewpoint, and a simple graphical presentation of risk ratio assessment is given in such a way that sensitivity to the selected prior distribution can be readily examined.
Abstract: The classical confidence interval approach has failed to find exact intervals, or even a consensus on the best approximate intervals, for the ratio of two binomial probabilities, the so-called risk ratio. The problem is reexamined from a Bayesian viewpoint, and a simple graphical presentation of the risk ratio assessment is given in such a way that sensitivity to the selected prior distribution can be readily examined.
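
A minimal sketch of the Bayesian assessment: independent Beta posteriors for the two binomial probabilities and Monte Carlo draws of their ratio, with the prior easy to vary for a sensitivity check; counts and priors below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
x1, n1 = 12, 200        # events / trials in group 1 (hypothetical)
x2, n2 = 5, 180         # events / trials in group 2 (hypothetical)
a, b = 1.0, 1.0         # Beta(1, 1) priors; change these to probe sensitivity

p1 = rng.beta(a + x1, b + n1 - x1, size=100_000)
p2 = rng.beta(a + x2, b + n2 - x2, size=100_000)
rr = p1 / p2            # posterior draws of the risk ratio

print("posterior median risk ratio:", round(np.median(rr), 2))
print("95% credible interval:", np.percentile(rr, [2.5, 97.5]).round(2))
print("P(risk ratio > 1):", round(np.mean(rr > 1), 3))
```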

Journal ArticleDOI
TL;DR: A dynamic programming model is proposed, monotonic properties of the optimal expected cumulative discounted reward are proved, and optimality properties are given for the case when one prior success probability is known.
Abstract: The paper is initially concerned with monotonic properties of the posterior success probabilities when the prior success probabilities are distributed according to an arbitrary joint distribution function (Bayesian approach). Next a dynamic programming model is proposed and monotonic properties of the optimal expected cumulative discounted reward are proved. Finally, optimality properties are given for the case when one prior success probability is known.
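
A minimal sketch of the dynamic programming idea for the special case of a two-armed Bernoulli bandit with independent Beta posteriors and a finite horizon (the paper's model is more general; the horizon, discount factor, and priors here are assumptions):

```python
from functools import lru_cache

GAMMA, HORIZON = 0.9, 20   # hypothetical discount factor and horizon

@lru_cache(maxsize=None)
def value(a1, b1, a2, b2, steps_left):
    """Optimal expected cumulative discounted reward given the current
    Beta(a_i, b_i) posteriors on the two success probabilities."""
    if steps_left == 0:
        return 0.0
    best = 0.0
    for arm, (a, b) in enumerate([(a1, b1), (a2, b2)]):
        p = a / (a + b)                      # posterior mean success probability
        if arm == 0:
            win  = value(a1 + 1, b1, a2, b2, steps_left - 1)
            lose = value(a1, b1 + 1, a2, b2, steps_left - 1)
        else:
            win  = value(a1, b1, a2 + 1, b2, steps_left - 1)
            lose = value(a1, b1, a2, b2 + 1, steps_left - 1)
        best = max(best, p * (1.0 + GAMMA * win) + (1.0 - p) * GAMMA * lose)
    return best

# Start from uniform Beta(1, 1) priors on both arms.
print("optimal expected discounted reward:", round(value(1, 1, 1, 1, HORIZON), 3))
```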

01 Dec 1981
TL;DR: The authors argued that both averaging and conservatism in the Bayesian task occur because subjects produce their judgments by using an adjustment strategy that is qualitatively equivalent to averaging, and two experiments were presented that support this view by showing qualitative errors in the direction of revisions in Bayesian inference that are well-accounted for by the simple adjustment strategy.
Abstract: : Two empirically well supported research findings in the judgment literature are that (1) human judgments often appear to follow an averaging rule, and (2) judgments in Bayesian inference tasks are usually conservative relative to optimal judgments. This paper argues that both averaging and conservatism in the Bayesian task occur because subjects produce their judgments by using an adjustment strategy that is qualitatively equivalent to averaging. Two experiments are presented that support this view by showing qualitative errors in the direction of revisions in the Bayesian task that are well-accounted for by the simple adjustment strategy. Two additional results are also discussed: (1) a tendency for subjects in one experiment to evaluate sample evidence according to representativeness rather than according to relative likelihood, and (2) a strong recency effect that may reflect the influence of the internal representation of sample information during the judgment process. (Author)
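
A small numerical illustration of the contrast the paper draws, using a hypothetical two-hypothesis "bookbag" task: the Bayesian posterior versus a crude averaging/adjustment response, which stays closer to the prior (i.e. is conservative). The adjustment weight is an assumption, not the paper's fitted strategy.

```python
prior_h1 = 0.5
p_red_h1, p_red_h2 = 0.7, 0.3           # chip composition under each hypothesis
reds, blues = 6, 2                       # observed sample (hypothetical)

# Bayesian revision.
lik_h1 = p_red_h1**reds * (1 - p_red_h1)**blues
lik_h2 = p_red_h2**reds * (1 - p_red_h2)**blues
bayes = prior_h1 * lik_h1 / (prior_h1 * lik_h1 + (1 - prior_h1) * lik_h2)

# Averaging-style adjustment: mix the prior with the sample proportion.
sample_proportion = reds / (reds + blues)
w = 0.5                                  # hypothetical adjustment weight
averaging = (1 - w) * prior_h1 + w * sample_proportion

print("Bayesian posterior P(H1):", round(bayes, 3))       # about 0.97
print("averaging-style judgment:", round(averaging, 3))   # 0.625, conservative
```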


Journal ArticleDOI
TL;DR: A class of lower bounds on the Bayesian probability of error, based on the f-divergence between the two hypotheses, is considered; it unifies and extends some well-known bounds.
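
For flavour, a numerical sketch of one classical bound of this type, the Bhattacharyya-coefficient lower bound on the Bayes error for two simple hypotheses with equal priors; the Gaussian hypotheses are hypothetical and the paper's specific class of bounds is not reproduced.

```python
import numpy as np

x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]
f0 = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)            # density under H0
f1 = np.exp(-0.5 * (x - 1.5)**2) / np.sqrt(2 * np.pi)    # density under H1
p0 = p1 = 0.5                                            # equal priors

bayes_error = np.sum(np.minimum(p0 * f0, p1 * f1)) * dx
rho = np.sum(np.sqrt(f0 * f1)) * dx                      # Bhattacharyya coefficient
lower_bound = 0.5 * (1.0 - np.sqrt(1.0 - 4.0 * p0 * p1 * rho**2))

print("exact Bayes error:        ", round(bayes_error, 4))
print("Bhattacharyya lower bound:", round(lower_bound, 4))
```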



Journal ArticleDOI
01 Nov 1981
TL;DR: An interactive algorithm is proposed for the problem of selecting one of a finite number of alternatives, each evaluated in terms of a number of conflicting criteria; the procedure ultimately implies probability distributions on the utilities of each alternative.
Abstract: An interactive algorithm is proposed for the problem of selecting one of a finite number of alternatives where each is evaluated in terms of a number of conflicting criteria. A simple form of utility function is assumed, and the possibility is modeled probabilistically that the decisionmaker may at any time indicate a preference between alternatives in conflict with his true utility. On this basis, a formal Bayesian inferential procedure is applied to a sequence of pairwise choices between alternatives made by the decisionmaker to yield estimates of the unknown parameters of the utility function. This ultimately implies probability distributions on the utilities of each alternative. The sequence of pairwise comparisons continues until a satisfactorily short list of alternatives remains after elimination of those inferred to be significantly worse than the best.
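
A minimal sketch of the inferential core (hypothetical setup, not the paper's exact procedure): a linear utility with one unknown weight, a decision-maker whose stated pairwise preferences are wrong with a small probability, and a grid posterior on the weight updated after each comparison.

```python
import numpy as np

rng = np.random.default_rng(6)
alternatives = rng.uniform(0, 1, size=(6, 2))    # scores on the two criteria
w_true, eps = 0.7, 0.1                           # true weight, error probability

grid = np.linspace(0.0, 1.0, 101)
posterior = np.ones_like(grid) / len(grid)       # uniform prior on the weight

for _ in range(20):
    i, j = rng.choice(len(alternatives), size=2, replace=False)
    ui = w_true * alternatives[i, 0] + (1 - w_true) * alternatives[i, 1]
    uj = w_true * alternatives[j, 0] + (1 - w_true) * alternatives[j, 1]
    prefers_i = (ui > uj) if rng.uniform() > eps else (ui <= uj)  # noisy answer
    # likelihood of the stated preference at each candidate weight
    gi = grid * alternatives[i, 0] + (1 - grid) * alternatives[i, 1]
    gj = grid * alternatives[j, 0] + (1 - grid) * alternatives[j, 1]
    agrees = (gi > gj) == prefers_i
    posterior *= np.where(agrees, 1 - eps, eps)
    posterior /= posterior.sum()

print("posterior mean of the criterion weight:", round(np.sum(grid * posterior), 2))
```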

Journal ArticleDOI
TL;DR: This study addresses not only model selection, but also model occurrence, i.e., the process by which ‘nature’ chooses a statistical framework in which to generate the data of interest.

Journal ArticleDOI
TL;DR: The objective of the paper is to offer a strategy for progressively specifying a model within a class of linear models; the strategy aims at displaying the precise role of each assumption, at offering alternatives to unnecessarily restrictive specifications, and at improving the robustness of the inference procedures the authors discuss.

Journal ArticleDOI
Gary Smith
TL;DR: In this paper, an alternative hierarchical approach is described and illustrated, in which the identification of a limited number of distinct reasons for prior uncertainty can be converted into a full prior covariance matrix.
Abstract: Linear expenditure systems are widely used to describe consumption and portfolio decisions. However, the complexity of these models makes estimation a formidable task. In earlier work, an exchangeability assumption was used to incorporate subjective a priori information into the estimation of asset demand equations. Here, an alternative hierarchical approach is described and illustrated. This procedure provides a framework in which the identification of a limited number of distinct reasons for prior uncertainty can be converted into a full prior covariance matrix. Such a matrix can then be combined with prior means and the sample data to yield Bayesian parameter estimates.
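
A minimal sketch of the general mechanism (not the paper's expenditure-system application): a prior covariance assembled from two distinct sources of prior uncertainty and combined with prior means and sample data in a conjugate normal linear model; all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n, k = 60, 3
X = rng.normal(size=(n, k))
beta_true = np.array([0.5, -0.2, 0.8])
sigma2 = 1.0                                      # known noise variance
y = X @ beta_true + rng.normal(0, np.sqrt(sigma2), size=n)

m0 = np.zeros(k)                                  # prior means
# Two "reasons" for prior uncertainty: a component shared by all coefficients
# plus an independent coefficient-specific component.
common, specific = 0.3, 0.2
V0 = common * np.ones((k, k)) + specific * np.eye(k)

# Conjugate normal update: combine prior mean/covariance with the sample data.
precision_post = np.linalg.inv(V0) + X.T @ X / sigma2
V_post = np.linalg.inv(precision_post)
m_post = V_post @ (np.linalg.inv(V0) @ m0 + X.T @ y / sigma2)
print("Bayesian coefficient estimates:", m_post.round(2))
```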

ReportDOI
01 Mar 1981
TL;DR: In this article, the role of coherence considerations in the definition and measurement of subjective probability is discussed, and a general version of De Finetti's coherence theorem is proved using a variant of Farkas' Lemma.
Abstract: This report discusses the role of coherence considerations in the definition and measurement of subjective probability. A general version of De Finetti's coherence theorem, namely that either a set of betting probabilities obeys the laws of probability or else a sure win is possible for the bettor, is proved using a variant of Farkas' Lemma. This theorem provides the basis for several admissibility theorems for scoring-rule probabilities, under a generalization of scoring rules suggested by Lindley. Linear programming methods for identifying and reconciling incoherence are discussed, and a comparison is made with Bayesian reconciliation methods.
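
A minimal sketch of the linear programming idea for identifying incoherence (hypothetical events and prices, not the report's own formulation): search for bounded stakes that guarantee the bettor a strictly positive gain in every state of the world.

```python
import numpy as np
from scipy.optimize import linprog

# Two events with A a sub-event of B, so coherence requires P(A) <= P(B);
# the quoted prices below violate this, so a sure win should be found.
prices = np.array([0.6, 0.4])                  # quoted P(A), P(B)
indicators = np.array([[1, 0, 0],              # event A over the three states
                       [1, 1, 0]])             # event B over the three states
gains = indicators - prices[:, None]           # bettor's gain per unit stake

n_events, n_states = gains.shape
# Variables: stakes s_1..s_m in [-1, 1] and the guaranteed gain t; maximise t.
c = np.zeros(n_events + 1)
c[-1] = -1.0
A_ub = np.hstack([-gains.T, np.ones((n_states, 1))])   # t <= sum_k s_k gain_k(w)
b_ub = np.zeros(n_states)
bounds = [(-1.0, 1.0)] * n_events + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("largest guaranteed gain:", round(-res.fun, 3))
print("incoherent (sure win exists):", -res.fun > 1e-9)
```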

Journal ArticleDOI
TL;DR: A simple urn model is presented for earthquake prediction statistics that is equivalent to the Bayesian models of Collins, Guagenti and Scirocco, and Kijko.
Abstract: A simple urn model is presented for earthquake prediction statistics. This model is equivalent to the Bayesian models of Collins (1977), Guagenti and Scirocco (1980), and Kijko (1981).