
Showing papers on "Bayesian probability published in 1976"


Proceedings ArticleDOI
07 Jun 1976
TL;DR: Describes a subjective Bayesian inference method that realizes some of the advantages of both formal and informal approaches, together with the modifications needed to deal with the inconsistencies usually found in collections of subjective statements.
Abstract: The general problem of drawing inferences from uncertain or incomplete evidence has invited a variety of technical approaches, some mathematically rigorous and some largely informal and intuitive. Most current inference systems in artificial intelligence have emphasized intuitive methods, because the absence of adequate statistical samples forces a reliance on the subjective judgment of human experts. We describe in this paper a subjective Bayesian inference method that realizes some of the advantages of both formal and informal approaches. Of particular interest are the modifications needed to deal with the inconsistencies usually found in collections of subjective statements.
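
A minimal sketch of the odds-likelihood form of Bayes' rule that this kind of subjective updating builds on; the LS/LN values, the prior, and the linear interpolation used to handle uncertain evidence are illustrative assumptions, not necessarily the paper's exact scheme.

```python
def odds(p):
    return p / (1.0 - p)

def prob(o):
    return o / (1.0 + o)

def update(prior_h, ls, ln, p_e_given_obs):
    """Odds-likelihood update of P(H) given uncertain evidence E.

    ls = P(E|H)/P(E|~H) (sufficiency), ln = P(~E|H)/P(~E|~H) (necessity).
    p_e_given_obs is the current degree of belief that E actually holds.
    The posterior is linearly interpolated between the 'E false' and
    'E true' extremes -- one simple way to handle uncertain evidence.
    """
    p_h_given_e = prob(ls * odds(prior_h))       # evidence definitely present
    p_h_given_not_e = prob(ln * odds(prior_h))   # evidence definitely absent
    return p_h_given_not_e + (p_h_given_e - p_h_given_not_e) * p_e_given_obs

# Example: a rule with strong sufficiency and weak necessity.
print(update(prior_h=0.03, ls=20.0, ln=0.5, p_e_given_obs=0.8))
```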

540 citations



Journal ArticleDOI
TL;DR: Inference for a Bernoulli process is presented from a Bayesian point of view.
Abstract: (1976). Inference for a Bernoulli Process (a Bayesian View) The American Statistician: Vol. 30, No. 3, pp. 112-119.
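
For readers wanting the mechanics, a minimal sketch of conjugate Beta-Bernoulli updating, the standard machinery behind a Bayesian view of a Bernoulli process; the uniform prior and the counts are illustrative.

```python
from scipy import stats

# Beta(a, b) prior on the Bernoulli success probability p.
a, b = 1.0, 1.0                 # uniform prior, an illustrative choice
successes, failures = 7, 3      # hypothetical observed data

a_post, b_post = a + successes, b + failures
posterior = stats.beta(a_post, b_post)

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```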

121 citations



Posted Content
01 Jan 1976
TL;DR: In this article, Monte Carlo (MC) is used to estimate posterior moments of both structural and reduced form parameters of an equation system, making use of the prior density, the likelihood, and Bayes' Theorem.
Abstract: Monte Carlo (MC) is used to draw parameter values from a distribution defined on the structural parameter space of an equation system. Making use of the prior density, the likelihood, and Bayes' Theorem it is possible to estimate posterior moments of both structural and reduced form parameters. The MC method allows a rather liberal choice of prior distributions. The number of elementary operations to be performed need not be an explosive function of the number of parameters involved. The method overcomes some existing difficulties of applying Bayesian methods to medium size models. The method is applied to a small scale macro model. The prior information used stems from considerations regarding short and long run behavior of the model and from extraneous observations on empirical long term ratios of economic variables. Likelihood contours for several parameter combinations are plotted, and some marginal posterior densities are assessed by MC.
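
A toy sketch of the idea for a single-parameter regression: draw parameter values from the prior and weight them by the likelihood, so that the normalized weights are proportional to the posterior and yield posterior moments. The model, prior, and data here are illustrative; the paper's treatment of full equation systems is considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "structural" model: y_i = theta * x_i + e_i, e_i ~ N(0, 1).
x = np.linspace(0.0, 1.0, 20)
theta_true = 2.0
y = theta_true * x + rng.normal(size=x.size)

# Draw candidate parameter values from the prior (here N(0, 5^2)).
draws = rng.normal(0.0, 5.0, size=50_000)

# Weight each draw by its likelihood; by Bayes' theorem the normalized
# weights are proportional to the posterior density at the draws.
resid = y[None, :] - draws[:, None] * x[None, :]
log_lik = -0.5 * np.sum(resid**2, axis=1)
w = np.exp(log_lik - log_lik.max())
w /= w.sum()

post_mean = np.sum(w * draws)
post_var = np.sum(w * (draws - post_mean) ** 2)
print("posterior mean:", post_mean, "posterior sd:", np.sqrt(post_var))
```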

74 citations


Journal ArticleDOI
TL;DR: The results of both experiments suggest the necessity of a two-factor theory for short-term memory and show that both θs and θτ decrease as a function of increasing retention interval.

54 citations


Journal ArticleDOI
TL;DR: This work presents a Bayesian procedure that allows using test data gathered both at the component level of a multicomponent system and at the system level.
Abstract: On occasion, reliability analysts have test data gathered both at the component level of a multicomponent system and at the system level. When the system test data provide no information on component performance, classical statistical techniques in all but trivial cases do not allow using both sets of data. We present a Bayesian procedure that does allow using both sets. The procedure for attribute data makes use of a lemma that relates the moments of the prior and posterior distributions of reliability to the test data. The procedure for variables data assumes the time to failure distribution of each component is exponential.
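
A sketch of one way to combine the two kinds of attribute data, assuming Beta priors on component reliabilities and a series system (system reliability equal to the product of component reliabilities), with the system-level binomial likelihood applied as importance weights. This is not the paper's moment-based lemma, and all counts are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Component-level attribute data: (successes, failures) for each component.
comp_data = [(18, 2), (25, 1)]
# System-level attribute data (series system assumed for this sketch).
sys_successes, sys_failures = 9, 1

n_draws = 100_000
comp_draws = np.column_stack([
    rng.beta(1 + s, 1 + f, size=n_draws) for s, f in comp_data
])
sys_rel = comp_draws.prod(axis=1)              # series-system reliability

# Reweight the component-level posterior draws by the system-level likelihood.
w = sys_rel**sys_successes * (1.0 - sys_rel)**sys_failures
w /= w.sum()

print("posterior mean system reliability:", np.sum(w * sys_rel))
print("ignoring system data:", sys_rel.mean())
```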

48 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider sequential tests of the hypothesis H0: θ ≤ θ0 and compare the properties of the approximate Bayesian test, the sequential probability ratio test, and the fixed sample size test.
Abstract: SUMMARY Let X1, X2, ... denote independent random variables which are normally distributed with unknown mean θ and unit variance. We consider sequential tests of the hypothesis H0: θ ≤ θ0. The tests which we consider were shown by Schwarz (1962) to approximate the optimal Bayesian tests with respect to a general loss structure and any prior density which is everywhere positive. Their continuation regions are bounded subsets of the (n, Sn) plane, where Sn is the cumulative sum. We give both inequalities and asymptotic expressions for the power function and the expected sample size. We also give comparisons of the properties of the approximate Bayesian test, the sequential probability ratio test, and the fixed sample size test.
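
For comparison purposes, a minimal sketch of one of the competing procedures, Wald's sequential probability ratio test for a normal mean with unit variance; the hypothesized means and error rates are illustrative, and the Schwarz-type approximate Bayesian boundaries are not reproduced here.

```python
import numpy as np

def sprt_normal_mean(xs, theta0=0.0, theta1=0.5, alpha=0.05, beta=0.05):
    """Wald SPRT for H0: theta = theta0 vs H1: theta = theta1, unit variance."""
    a, b = np.log(beta / (1 - alpha)), np.log((1 - beta) / alpha)
    llr = 0.0
    for n, x in enumerate(xs, start=1):
        # Log-likelihood-ratio increment for N(theta, 1) observations.
        llr += (theta1 - theta0) * x - 0.5 * (theta1**2 - theta0**2)
        if llr <= a:
            return "accept H0", n
        if llr >= b:
            return "accept H1", n
    return "no decision", len(xs)

rng = np.random.default_rng(2)
print(sprt_normal_mean(rng.normal(0.5, 1.0, size=1000)))
```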

43 citations


Journal ArticleDOI
TL;DR: In this article, likelihood and Bayesian methods are presented and shown to permit inference concerning the relative goodness of several potential model candidates with respect to a given set of flood events, further combined within a decision theoretic structure for examination of the anticipated economic consequences of model uncertainty regarding decisions concerning flood protection levels.
Abstract: Constrained by computational feasibility, attempts to describe random natural phenomena of complex origin analytically can lead to a multiplicity of simplistic potentially representative model forms, as has occurred in the case of flood frequency analysis. Classical statistical methods inadequately confront this model uncertainty. Likelihood and Bayesian methods are presented and shown to permit inference concerning the relative goodness of several potential model candidates with respect to a given set of flood events. The Bayesian inferences are further combined within a decision theoretic structure for examination of the anticipated economic consequences of model uncertainty regarding decisions concerning flood protection levels. Results show that Bayesian methods supply more precise information but require greater effort. Since a model world of simplistic forms may never be defined absolutely, both decision and inference remain subject to the astute judgment of the analyst.
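
A sketch of the Bayesian model-comparison step for two candidate flood-frequency models, computing marginal likelihoods by averaging the likelihood over uniform priors on a bounded parameter grid and converting them to posterior model probabilities. The candidate distributions (Gumbel and lognormal), the prior boxes, and the flood record are illustrative, not the paper's.

```python
import numpy as np
from scipy import stats
from scipy.special import logsumexp

# Hypothetical annual flood peaks (arbitrary units).
floods = np.array([412., 310., 285., 397., 540., 330., 265., 450., 380., 295.,
                   475., 350., 305., 420., 515., 340., 290., 405., 360., 300.])

def log_marginal_gumbel(y, locs, scales):
    ll = stats.gumbel_r.logpdf(y, loc=locs[:, None, None],
                               scale=scales[None, :, None]).sum(axis=-1)
    # Uniform prior over the grid box: marginal likelihood = average likelihood.
    return logsumexp(ll) - np.log(ll.size)

def log_marginal_lognorm(y, sigmas, scales):
    ll = stats.lognorm.logpdf(y, s=sigmas[:, None, None],
                              scale=scales[None, :, None]).sum(axis=-1)
    return logsumexp(ll) - np.log(ll.size)

m1 = log_marginal_gumbel(floods, np.linspace(250, 500, 120), np.linspace(20, 200, 120))
m2 = log_marginal_lognorm(floods, np.linspace(0.05, 1.0, 120), np.linspace(250, 500, 120))

# Equal prior model probabilities: posterior odds equal the Bayes factor.
post1 = 1.0 / (1.0 + np.exp(m2 - m1))
print("P(Gumbel | data) =", post1, " P(lognormal | data) =", 1.0 - post1)
```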

37 citations


Journal ArticleDOI
TL;DR: The Bayesian algorithm presented in this paper provides a generalized procedure for determining the minimum cost sample size n* and acceptance number c* for single sample attribute acceptance plans.
Abstract: The Bayesian algorithm presented in this paper provides a generalized procedure for determining the minimum cost sample size n* and acceptance number c* for single sample attribute acceptance plans. The algorithm is applicable to a broad range of acceptance sampling problems, assuming only that the distributions of product quality are discrete, and that the sampling cost is either a linear or strictly convex function of the sample size. Experimental results are presented that compare the solution quality and the computational requirements of this algorithm with three types of previously reported procedures: (1) Bayesian decision tree methods, (2) analytic approximation methods, and (3) direct search techniques. The results indicate that the algorithm produces the optimal solution with minimal computational requirements over a wide range of acceptance sampling problem types.
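
A brute-force direct-search sketch of the minimum-cost problem itself (one of the comparison baselines mentioned above, not the paper's algorithm), assuming a discrete prior on the lot fraction defective, binomial sampling, a linear sampling cost, and illustrative accept/reject cost figures.

```python
import numpy as np
from scipy.stats import binom

# Discrete prior on the lot fraction defective.
p_vals = np.array([0.01, 0.03, 0.08])
p_prob = np.array([0.60, 0.30, 0.10])

LOT_SIZE = 1000
COST_SAMPLE = 1.0        # linear sampling cost per item inspected
COST_ACCEPT_DEF = 50.0   # cost per defective item accepted into use
COST_REJECT = 400.0      # fixed cost of rejecting (screening) a lot

def expected_cost(n, c):
    cost = COST_SAMPLE * n
    for p, w in zip(p_vals, p_prob):
        pa = binom.cdf(c, n, p)                      # P(accept lot | p)
        cost += w * (pa * COST_ACCEPT_DEF * p * (LOT_SIZE - n)
                     + (1 - pa) * COST_REJECT)
    return cost

best = min(((expected_cost(n, c), n, c)
            for n in range(0, 201) for c in range(0, n + 1)),
           key=lambda t: t[0])
print("minimum expected cost %.2f at n*=%d, c*=%d" % best)
```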

31 citations


Journal ArticleDOI
TL;DR: In this article, a Bayesian full information analysis of the simultaneous equations model, based upon an extended natural conjugate prior density, is reviewed and extended; the prior is compatible with the independent specification for each equation of a marginal prior density in the multivariate Student form.
Abstract: The paper reviews and extends a Bayesian full information analysis of the simultaneous equations model, based upon an extended natural conjugate prior density. The extended prior density belongs to a closed family and is compatible with the independent specification for each equation of a marginal prior density in the multivariate Student form. The paper establishes properties of the resulting posterior density, of some conditional and marginal densities, and of the posterior densities in the special case of seemingly unrelated regression equations.

Book ChapterDOI
01 Jan 1976
TL;DR: In this article, the author gives an account of Ramsey test conditionals adequate to construct iterative probability models free of the serious difficulties that Stalnaker raised against the earlier construction.
Abstract: In ‘Rational Belief Change, Popper Functions and Counterfactuals’ (Harper [1]), I used Popper’s treatment of conditional probability and an account of conditionals motivated by Ramsey’s test for acceptability of hypotheticals to construct iterative probability models. An iterative probability model is to be an extension of the Bayesian representation of rational belief that allows for iterated shifting by conditionalization on new inputs even when this involves iterated revisions of previously accepted evidence. Stalnaker has raised serious difficulties with my construction. The source of these difficulties is my account of Ramsey test conditionals. In this paper I give an account of Ramsey test conditionals adequate to construct iterative probability models free of all such difficulties.

Journal ArticleDOI
TL;DR: A method is described in which distributions of parameters in problems of Bayesian statistical inference are handled as arrays in computer storage, which results in a very flexible approach to problems in nonlinear regression, fitting of frequency functions, model discrimination, etc.
Abstract: SUMMARY A method is described in which distributions of parameters in problems of Bayesian statistical inference are handled as arrays in computer storage. This results in a very flexible approach to problems in nonlinear regression, fitting of frequency functions, model discrimination, etc. It is particularly valuable in finding marginal and conditional parameter distributions and distributions of functions of the parameters in a model. It does so with no requirements for linearization of models or for specific forms for distribution functions. If more than a few parameters are present the requirements for computer storage may be large.
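
A minimal sketch of the array approach for a two-parameter nonlinear regression: the joint posterior is stored as a grid of values in memory, and marginals and posterior moments come from summing over axes. The exponential-decay model, flat prior, and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data from a nonlinear model y = a * exp(-b * t) + noise.
t = np.linspace(0, 10, 25)
y = 3.0 * np.exp(-0.4 * t) + rng.normal(0, 0.1, size=t.size)

# Store the joint posterior as an array over a parameter grid.
a_grid = np.linspace(1.0, 5.0, 200)
b_grid = np.linspace(0.05, 1.0, 200)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")

resid = y[None, None, :] - A[..., None] * np.exp(-B[..., None] * t[None, None, :])
log_post = -0.5 * np.sum(resid**2, axis=-1) / 0.1**2   # flat prior, known sigma
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Marginal and conditional distributions come from summing/slicing the array.
marg_a = post.sum(axis=1)
print("posterior mean of a:", np.sum(marg_a * a_grid))
print("posterior mean of b:", np.sum(post.sum(axis=0) * b_grid))
```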

Journal ArticleDOI
TL;DR: An overview of Bayesian statistical decision theory is presented in the tutorial spirit, followed by selected applications of the Bayesian approach to parameter estimation, pattern recognition, image processing, computer-aided medical diagnosis, optimal diagnostic test selection, and radiotherapy treatment planning.
Abstract: An overview of Bayesian statistical decision theory is presented in the tutorial spirit. A section on fundamental principles is followed by selected applications of the Bayesian approach to parameter estimation, pattern recognition, image processing, computer-aided medical diagnosis, optimal diagnostic test selection, and radiotherapy treatment planning.
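
In the same tutorial spirit, a minimal sketch of the core decision rule: choose the action that minimizes posterior expected loss, here for a two-state diagnostic problem. The posterior probabilities and loss matrix are illustrative.

```python
import numpy as np

# Posterior probabilities of two disease states given the test results.
posterior = np.array([0.7, 0.3])     # P(healthy | data), P(diseased | data)

# loss[action, state]: rows = actions (no treatment, treat), columns = states.
loss = np.array([[0.0, 10.0],
                 [1.0,  0.0]])

expected_loss = loss @ posterior
best_action = int(np.argmin(expected_loss))
print("expected losses:", expected_loss, "-> choose action", best_action)
```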

Journal ArticleDOI
TL;DR: In this article, a Bayes decision theoretic approach is used to seek a decision that minimizes the Bayes risk function; it also evaluates the decision taken and compares the expected cost of delaying the construction to the worth of the additional information resulting from such a delay.
Abstract: Economic and sociopolitical aspects of land reclamation in areas that necessitate drainage are combined with technical problems to yield a set of possible decisions, among which an optimum design is chosen. The loss or objective function is the sum of the expected damage caused by the submersion of given crops resulting from extreme rainfall events, and the initial cost of the reclamation. When the parameter uncertainty in the probability distribution function of extreme events is taken into account in the loss function, a Bayes risk function is obtained. The Bayes decision theoretic (BDT) approach consists in seeking a decision that minimizes the Bayes risk function. The BDT also evaluates the decision taken and compares the expected cost of delaying the construction to the worth of additional information resulting from such a delay. The practical example of an intensive agricultural system with different type soils and crops illustrates the methodology. Crop loss functions and probability distributions of events are assumed on the basis of empirical observations.
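
A sketch of the Bayes-risk calculation for choosing a design level when the extreme-event parameter is uncertain: the exceedance probability is averaged over posterior draws of the parameter before being added to the construction cost. The exponential rainfall model, posterior, costs, and horizon are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Posterior draws for the exponential rate of extreme rainfall depths
# (parameter uncertainty), e.g. from a gamma posterior.
rate_draws = rng.gamma(shape=20.0, scale=1.0 / 400.0, size=20_000)  # per mm

DESIGNS_MM = np.arange(40, 201, 10)     # candidate design rainfall depths
COST_PER_MM = 2_000.0                   # construction cost per mm of capacity
DAMAGE = 500_000.0                      # damage if the design depth is exceeded
EVENTS_PER_YEAR = 1.0
HORIZON_YEARS = 30

def bayes_risk(d):
    # Exceedance probability averaged over the posterior of the rate parameter.
    p_exceed = np.exp(-rate_draws * d).mean()
    expected_damage = DAMAGE * EVENTS_PER_YEAR * HORIZON_YEARS * p_exceed
    return COST_PER_MM * d + expected_damage

risks = [bayes_risk(d) for d in DESIGNS_MM]
best = DESIGNS_MM[int(np.argmin(risks))]
print("design minimizing Bayes risk:", best, "mm")
```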

Journal ArticleDOI
TL;DR: In this paper, posterior producer and consumer risks in a Bayesian reliability demonstration test are compared using the fitted inverted gamma and uniform distributions as the priors.
Abstract: Equipment mean time between failures (MTBF) is assumed to be a frequency random variable. The goodness of fit of the uniform prior as a probability model for the MTBF is compared to the goodness of fit of the inverted gamma prior for actual failure data. These distributions can both be adequately fitted to the same failure data when the method of moments is used to fit the distributions. A comparison of posterior producer and consumer risks in a Bayesian reliability demonstration test is made using the fitted inverted gamma and uniform distributions as the priors. There can be rather large differences in the values of the posterior risks even when the two priors fit the data equally well or equally poorly.
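
A sketch of the method-of-moments fits mentioned above, matching an inverted gamma and a uniform prior to the first two sample moments of hypothetical MTBF data (the inverted-gamma formulas assume the fitted shape parameter exceeds 2); both fitted priors then share the same mean and standard deviation.

```python
import numpy as np
from scipy import stats

# Hypothetical observed MTBF values (hours) across equipments.
mtbf = np.array([120., 95., 150., 80., 200., 110., 130., 90., 160., 105.])
m, v = mtbf.mean(), mtbf.var(ddof=1)

# Inverted gamma prior by the method of moments (requires alpha > 2).
alpha = m**2 / v + 2.0
beta = m * (alpha - 1.0)
inv_gamma = stats.invgamma(alpha, scale=beta)

# Uniform prior by the method of moments.
half_width = np.sqrt(3.0 * v)
uniform = stats.uniform(loc=m - half_width, scale=2.0 * half_width)

print("inverted gamma: mean %.1f, sd %.1f" % (inv_gamma.mean(), inv_gamma.std()))
print("uniform:        mean %.1f, sd %.1f" % (uniform.mean(), uniform.std()))
```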

Journal ArticleDOI
TL;DR: In this paper, an extension of Bayesian analysis to dynamic systems is used to obtain an algorithm which describes the time history of variance (uncertainty) in estimates of water quality parameters.
Abstract: An explicit treatment of the uncertainty in the state of water quality in a body of water can provide a quantitative basis for sampling decisions. Filtering theory, an extension of Bayesian analysis to dynamic systems, is used to obtain an algorithm which describes the time history of variance (uncertainty) in estimates of water quality parameters. Uncertainties arising from measurement errors, incompleteness of data, and random fluctuations exhibited by natural phenomena are taken into account. Sampling design capabilities are illustrated in an evaluation of sampling frequencies for the National Eutrophication Survey. The adequacy of any sampling program is dependent on the available prior data and on the value associated with reductions in uncertainty.
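
A sketch of the variance time history delivered by filtering theory for a scalar water-quality state: the Kalman error-variance recursion depends only on the dynamics, the noise levels, and when measurements are taken, which is what makes it usable for comparing sampling frequencies. The persistence and noise parameters are illustrative.

```python
import numpy as np

def variance_history(sample_every, n_days=60, phi=0.95, q=0.05, r=0.5, p0=1.0):
    """Error-variance recursion of a scalar Kalman filter.

    phi: daily persistence of the water-quality state,
    q: process-noise variance, r: measurement-noise variance,
    sample_every: a measurement is taken every `sample_every` days.
    The variance sequence does not depend on the measured values themselves.
    """
    p, hist = p0, []
    for day in range(1, n_days + 1):
        p = phi**2 * p + q                    # prediction (time update)
        if day % sample_every == 0:
            p = p - p**2 / (p + r)            # measurement update
        hist.append(p)
    return np.array(hist)

for k in (2, 7, 14):
    print("sampling every %2d days -> mean variance %.3f"
          % (k, variance_history(k).mean()))
```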


Journal ArticleDOI
TL;DR: In this article, the posterior distributions and the posterior bounds of the reliability functions are derived for the one- and two-parameter exponential distributions, and the posteriors are tabulated, plotted, and studied for robustness.
Abstract: The posterior distributions and the posterior bounds of the reliability functions have been derived for the one- and two-parameter exponential distributions. Using Grubbs' (1971) data, the posteriors are tabulated and plotted and their robustness studied.
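
A sketch for the one-parameter exponential case with a gamma prior on the failure rate, from which the posterior mean and posterior bounds of the reliability function R(t) = exp(-λt) follow in closed form; the failure times and prior parameters are illustrative, not Grubbs' data.

```python
import numpy as np
from scipy import stats

# Hypothetical exponential failure times (hours).
times = np.array([52., 18., 75., 33., 120., 9., 61., 44.])
n, total_time = times.size, times.sum()

# Gamma(a0, b0) prior on the failure rate lambda (rate parametrization).
a0, b0 = 1.0, 50.0
a_post, b_post = a0 + n, b0 + total_time
lam_post = stats.gamma(a_post, scale=1.0 / b_post)

t = 50.0                                           # mission time of interest
mean_R = (b_post / (b_post + t)) ** a_post         # exact E[exp(-lambda*t)]
lam_lo, lam_hi = lam_post.ppf([0.05, 0.95])
print("posterior mean of R(t): %.3f" % mean_R)
print("90%% posterior bounds on R(t): [%.3f, %.3f]"
      % (np.exp(-lam_hi * t), np.exp(-lam_lo * t)))
```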

Journal ArticleDOI
TL;DR: In the testing of new vehicles to establish an estimate of a system failure rate or of system reliability, the classical statistical approach does not take into account experience with previous systems or the intent of the designer.
Abstract: In the testing of new vehicles to establish an estimate of a system failure rate or of system reliability, the classical statistical approach does not take into account experience with previous systems or the intent of the designer. We have applied the B..




Book ChapterDOI
01 Jan 1976
TL;DR: Bayesian analysis is used in evaluating the decision implications of a point-of-purchase merchandising experiment.
Abstract: Use of Bayesian analysis in evaluating the decision implications of a point-of-purchase merchandising experiment.

Journal ArticleDOI
TL;DR: In this paper, a Bayesian estimation of CES production functions is presented; it is easier to use than methods developed so far and allows a direct comparison with the maximum likelihood estimator.


01 Jan 1976
TL;DR: In this paper, the authors investigate the behavior of a two-parameter failure model in which both parameters behave as random variables, motivated by the use of the reliability prediction and its coefficient of variation as the prior in a Bayesian setting.
Abstract: In the last five or six years there has been considerable and rising interest in the Bayesian approach to reliability. To a practicing reliability scientist, such an approach would seem quite appealing because it provides a way to formulate a distributional form for the unknown parameters inherent in the failure model, based on prior convictions or information available to the analyst. Evans and Drake have given an excellent account of the use of Bayesian theory in reliability. Furthermore, from a practical point of view, Feduccia states that employing the reliability prediction and its associated measure of uncertainty, the coefficient of variation, as the 'prior' in a Bayesian setting makes sense. The aim of the present study is to investigate the behavior of a two-parameter failure model in which both parameters behave as random variables.

Proceedings ArticleDOI
25 May 1976
TL;DR: A multi-step method for forming variable valued logic hypotheses is given and an experiment applying this method to a problem in medical diagnosis is described.
Abstract: A multi-step method for forming variable valued logic hypotheses is given. An experiment applying this method to a problem in medical diagnosis is described. The results are compared to those of a direct (one-step) method and to a classical Bayesian approach.
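
A minimal sketch of a classical Bayesian baseline of the kind such comparisons use, here a Laplace-smoothed naive Bayes classifier over binary symptoms; the training data are hypothetical and the paper's exact Bayesian formulation may differ.

```python
import numpy as np

# Tiny hypothetical training set: rows = patients, columns = binary symptoms.
X = np.array([[1, 0, 1],
              [1, 1, 1],
              [0, 1, 0],
              [0, 0, 1],
              [1, 1, 0],
              [0, 0, 0]])
y = np.array([1, 1, 0, 0, 1, 0])          # 1 = disease present

def fit_naive_bayes(X, y, alpha=1.0):
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    # Laplace-smoothed P(symptom = 1 | class).
    cond = np.array([(X[y == c].sum(axis=0) + alpha) /
                     ((y == c).sum() + 2 * alpha) for c in classes])
    return classes, priors, cond

def predict(x, classes, priors, cond):
    log_post = (np.log(priors)
                + (x * np.log(cond) + (1 - x) * np.log(1 - cond)).sum(axis=1))
    return classes[np.argmax(log_post)]

model = fit_naive_bayes(X, y)
print("predicted class for symptoms [1, 0, 0]:", predict(np.array([1, 0, 0]), *model))
```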

Journal ArticleDOI
TL;DR: In this article, the authors developed the Bayesian estimation of the parameters of Solow's distributed lag model with implicit autocorrelation of disturbances in its autoregressive form.