
Showing papers on "Bayesian probability published in 1988"




Book ChapterDOI
01 Jan 1988
TL;DR: Bayesian methods provide a formalism for reasoning about partial beliefs under conditions of uncertainty, where propositions are given numerical parameters signifying the degree of belief accorded them under some body of knowledge, and the parameters are combined and manipulated according to the rules of probability theory.
Abstract: Bayesian methods provide a formalism for reasoning about partial beliefs under conditions of uncertainty. In this formalism, propositions are given numerical parameters signifying the degree of belief accorded them under some body of knowledge, and the parameters are combined and manipulated according to the rules of probability theory. Bayesian philosophers see the conditional relationship as more basic than that of joint events, i.e., more compatible with the organization of human knowledge. Any joint probability function represents a complete probabilistic model. Joint distribution functions are mathematical constructs of primarily theoretical use. The prevailing convention in the Bayesian formalism is to assume that probabilistic summaries of virtual evidence are produced independently of previous information; they are interpreted as local binary relations between the evidence and the hypothesis upon which it bears, independent of other information in the system.
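As a minimal illustration of the belief-updating rule this formalism rests on (the numbers below are hypothetical, chosen only to show the mechanics of Bayes' theorem in Python):

def bayes_update(prior_h, p_e_given_h, p_e_given_not_h):
    """Return P(H | e) from P(H), P(e | H) and P(e | not H)."""
    evidence = p_e_given_h * prior_h + p_e_given_not_h * (1.0 - prior_h)
    return p_e_given_h * prior_h / evidence

# Hypothetical numbers: a weak prior belief revised by moderately diagnostic evidence.
print(bayes_update(prior_h=0.1, p_e_given_h=0.8, p_e_given_not_h=0.3))   # ~0.23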

747 citations


Journal ArticleDOI
TL;DR: This paper examined several grounds for doubting the value of much of the special attention recently devoted to unit root econometrics and showed that unit root hypotheses are less well connected to economic theory than is often suggested or assumed.

545 citations


Journal ArticleDOI
TL;DR: This article found that when random sampling was performed and observed by the subjects themselves, their judgments conformed more to Bayesian theory than to the representativeness hypothesis, whereas when random sampling was only verbally asserted, judgments followed the representativeness heuristic.
Abstract: Do subjects, in probability revision experiments, generally neglect base rates due to the use of a representativeness heuristic, or does the use of base rates depend on what we call the internal problem representation? In Experiment 1, we used Kahneman and Tversky's (1973) engineer-lawyer problem, where random sampling of descriptions is crucial to the internal representation of the problem as one in probability revision. If random sampling was performed and observed by the subjects themselves, then their judgments conformed more to Bayesian theory than to the representativeness hypothesis. If random sampling was only verbally asserted, judgments followed the representativeness heuristic. In Experiment 2 we used the soccer problem, which has the same formal structure but which the subjects' everyday experience already represents as a probability revision problem. With this change in content, subjects' judgments were indistinguishable from Bayesian performance. We conclude that by manipulating presentation and content, one can elicit either base rate neglect or base rate use, as well as points in between. This result suggests that representativeness is neither an all-purpose mental strategy nor even a tendency, but rather a function of the content and the presentation of crucial information. From its origins circa 1660 until the mid-nineteenth century, probability theory was closely identified with rational thinking. In Laplace's famous phrase, probability theory was believed to be "only common sense reduced to calculus" (Laplace, 1814/1951, p. 196). For the classical probabilists, their calculus codified the intuitions of an elite of reasonable men in the face of uncertainty. And if these reasonable intuitions deviated from the laws of probability theory, it was the latter that were cast into doubt. Such discrepancies actually influenced the way in which probability theory developed mathematically (Daston, 1980). In the early decades of the nineteenth century, probability theory shifted from being a description of the intuitions of rational individuals to one of the behavior of the irrational masses (Porter, 1986). But in the 1960s and 1970s experimental psychology reestablished the link between probability theory and rational thinking under uncertainty. However, the new alliance differed from the old in two important respects. First, it was now probability theory, rather than intuitive judgments, that was the normative standard. Although probabilists have from time to time doubted whether the additivity law holds in all cases (Shafer, 1978), and although there is evidence that different statistical approaches suggest different answers to the same problem (Birnbaum, 1983), psychologists have generally assumed that statistics spoke with one voice—a necessary assumption for the new normative approach. Second, the link between probability theory and human thinking has become the subject of experimental research. First, by using urn-and-balls problems (e.g., Edwards, 1968; Phillips & Edwards, 1966) and then more
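A small worked calculation of the kind of base-rate problem at issue; the counts and likelihood ratio are hypothetical, not those used in the experiments:

# Hypothetical engineer-lawyer style setting: 30 engineers and 70 lawyers in the pool,
# and a description judged 4 times as likely for an engineer as for a lawyer.
prior_engineer = 30 / 100                     # the base rate
likelihood_ratio = 4.0                        # P(description | engineer) / P(description | lawyer)

posterior_odds = likelihood_ratio * prior_engineer / (1 - prior_engineer)
posterior_engineer = posterior_odds / (1 + posterior_odds)
print(posterior_engineer)                     # ~0.63; ignoring the base rate would give ~0.80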

365 citations


Book ChapterDOI
01 Jan 1988
TL;DR: In this paper, the authors apply the principles of Bayesian reasoning to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions, and propose a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
Abstract: The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and likelihoods) are assigned in appropriate hypothesis spaces using the Maximum Entropy Principle, and then manipulated via Bayes' Theorem. Bayesian hypothesis testing requires careful consideration of the prior ranges of any parameters involved, and this leads to a quantitative statement of Occam's Razor. As an example of this general principle we offer a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
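The Occam's-razor effect described above can be sketched numerically. The sketch below uses a zero-mean Gaussian prior on polynomial coefficients and a known noise level (rather than the maximum-entropy assignments of the chapter); all data and settings are illustrative:

import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)       # data generated from a straight line

sigma, alpha = 0.1, 1.0   # assumed noise level and prior precision of the coefficients

def log_evidence(degree):
    """Analytic evidence of a polynomial model under a zero-mean Gaussian coefficient prior."""
    Phi = np.vander(x, degree + 1, increasing=True)           # basis-function design matrix
    cov = sigma**2 * np.eye(x.size) + (Phi @ Phi.T) / alpha   # marginal covariance of the data
    return multivariate_normal(mean=np.zeros(x.size), cov=cov).logpdf(y)

for degree in (1, 2, 5):
    print(degree, round(log_evidence(degree), 1))
# Unnecessary extra parameters tend to lower the evidence: a quantitative Occam's razor.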

274 citations


Book ChapterDOI
01 Jan 1988

222 citations


Proceedings Article
21 Aug 1988
TL;DR: A Bayesian technique for unsupervised classification of data and its computer implementation, AutoClass, which performs as well as or better than other automatic classification systems when run on the same data and contains no ad hoc similarity measures or stopping criteria.
Abstract: This paper describes a Bayesian technique for unsupervised classification of data and its computer implementation, AutoClass. Given real valued or discrete data, AutoClass determines the most probable number of classes present in the data, the most probable descriptions of those classes, and each object's probability of membership in each class. The program performs as well as or better than other automatic classification systems when run on the same data and contains no ad hoc similarity measures or stopping criteria. AutoClass has been applied to several databases in which it has discovered classes representing previously unsuspected phenomena.
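A minimal sketch of the class-membership calculation that underlies this kind of Bayesian mixture classification, assuming two Gaussian classes with already-estimated parameters; AutoClass itself also determines the number of classes and the class descriptions, which is not shown here:

import numpy as np
from scipy.stats import norm

weights = np.array([0.6, 0.4])                     # mixing proportions of two hypothetical classes
means, sds = np.array([0.0, 3.0]), np.array([1.0, 0.5])

def membership(x):
    """Posterior probability that observation x belongs to each class."""
    joint = weights * norm.pdf(x, means, sds)
    return joint / joint.sum()

print(membership(2.0))                             # ~[0.43, 0.57]: a soft assignment, not a hard label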

195 citations


Journal ArticleDOI
TL;DR: In this paper, a review of Bayesian parameter estimation is given; it is shown that the approach of Tarantola and Valette can be derived within classical probability theory, and that the Bayesian approach allows a full resolution and uncertainty analysis, which is discussed in Part II of the paper.
Abstract: This paper gives a review of Bayesian parameter estimation. The Bayesian approach is fundamental and applicable to all kinds of inverse problems. Its basic formulation is probabilistic. Information from data is combined with a priori information on model parameters. The result is called the a posteriori probability density function and it is the solution to the inverse problem. In practice an estimate of the parameters is obtained by taking its maximum. Well-known estimation procedures like least-squares inversion or l1-norm inversion result, depending on the type of noise and a priori information given. Due to the a priori information the maximum will be unique and the estimation procedures will be stable except (in theory) for the most pathological problems which are very unlikely to occur in practice. The approach of Tarantola and Valette can be derived within classical probability theory. The Bayesian approach allows a full resolution and uncertainty analysis which is discussed in Part II of the paper.
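A small sketch of the a posteriori maximization step for a linear inverse problem, assuming Gaussian noise and a Gaussian prior so that the maximum reduces to damped least squares; the operator and data are made up for illustration:

import numpy as np

G = np.array([[1.0, 2.0], [1.0, -1.0], [2.0, 1.0]])   # forward operator (made up)
d = np.array([3.1, 0.2, 3.9])                          # observed data
sigma_d, sigma_m = 0.1, 1.0                            # data noise and prior spread
m_prior = np.zeros(2)

# The MAP estimate minimizes ||d - G m||^2 / sigma_d^2 + ||m - m_prior||^2 / sigma_m^2.
A = G.T @ G / sigma_d**2 + np.eye(2) / sigma_m**2
b = G.T @ d / sigma_d**2 + m_prior / sigma_m**2
m_map = np.linalg.solve(A, b)
print(m_map)                                           # least-squares solution stabilized by the prior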

178 citations


Journal Article
TL;DR: In this article, the authors define a Bayesian approximation of a belief function and show that combining the Bayesian approximations of belief functions is computationally less demanding than combining the belief functions themselves.
Abstract: An often mentioned obstacle for the use of Dempster-Shafer theory for the handling of uncertainty in expert systems is the computational complexity of the theory. One cause of this complexity is the fact that in Dempster-Shafer theory the evidence is represented by a belief function which is induced by a basic probability assignment, i.e. a probability measure on the powerset of possible answers to a question, and not by a probability measure on the set of possible answers to a question, as in a Bayesian approach. In this paper, we define a Bayesian approximation of a belief function and show that combining the Bayesian approximations of belief functions is computationally less demanding than combining the belief functions themselves, while in many practical applications replacing the belief functions by their Bayesian approximations will not essentially affect the result.
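A sketch of one such Bayesian approximation, in which the mass of each focal set is redistributed over the singletons it contains and the result is normalized; the exact definition should be checked against the paper, and the example masses are invented:

from collections import defaultdict

def bayesian_approximation(bpa):
    """bpa maps frozensets of answers to masses summing to 1; returns a distribution on singletons."""
    scores = defaultdict(float)
    for focal_set, mass in bpa.items():
        for element in focal_set:
            scores[element] += mass
    total = sum(scores.values())
    return {element: score / total for element, score in scores.items()}

# Invented basic probability assignment over three possible answers.
bpa = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.4, frozenset({"a", "b", "c"}): 0.2}
print(bayesian_approximation(bpa))     # {'a': ~0.56, 'b': ~0.33, 'c': ~0.11}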

161 citations


Journal ArticleDOI
Andrew C. Lorenc1, O. Hammon1
TL;DR: In this paper, the authors provide a theoretical framework for the quality control of data from a large variety of types of observations, with different accuracies and reliabilities, and apply Bayes' theorem to derive the well-known formula for the combination of data with errors.
Abstract: This work attempts to provide a theoretical framework for the quality control of data from a large variety of types of observations, with different accuracies and reliabilities. Bayes' theorem is introduced, and is used in a simple example with Gaussian error distributions to derive the well-known formula for the combination of data with errors. A simple model is proposed whereby the error in each datum is either from a known Gaussian distribution, or a gross error, in which case the observation gives no useful information. Bayes' theorem is applied to this, and it is shown that usual operational practice, which is to reject outlying data and to treat the rest as if their errors are Gaussian, is a reasonable approximation to the correct Bayesian analysis. Appropriate rejection criteria are derived in terms of the observational error and the prior probability of a gross error. These ideas have been implemented in a computer program to check pressure, wind, temperature and position data from ships, weather ships, buoys and coastal synoptic reports. Historical information on the accuracies and reliabilities of various classifications of observation is used to provide prior estimates of observational errors and the prior probabilities of gross error. The latter are then updated in the light of information from a current forecast, and from nearby observations (allowing for the inaccuracies and possible gross errors in these) to give new estimates. The final probabilities can be used to reject or accept the data in an objective analysis. Results from trials of this system are given. It is shown to be possible using an archive generated by the system to update the prior error statistics necessary to make the method truly objective. Some practical case studies are shown, and compared with careful human quality control.
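A sketch of the gross-error check described above: the posterior probability that a datum is a gross error given its departure from the background value, assuming a Gaussian good-data model and a flat density for gross errors; all numbers are illustrative, not taken from the paper:

import math

def prob_gross_error(departure, sigma, prior_gross=0.05, gross_density=0.01):
    """P(gross error | departure from the background), Gaussian good-data model assumed."""
    good = (1 - prior_gross) * math.exp(-0.5 * (departure / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    bad = prior_gross * gross_density
    return bad / (good + bad)

for departure in (1.0, 3.0, 6.0):
    print(departure, round(prob_gross_error(departure, sigma=1.0), 3))
# Small departures leave the datum acceptable; large ones push the probability of a gross error towards 1.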

Journal ArticleDOI
TL;DR: In this paper, the author constructs the joint probability distribution of an arbitrary collection of structure factors gathered from one or several crystal forms of an unknown molecule, each comprising one or several isomorphous structures related by substitution operations, possibly containing solvent regions and known fragments, and obeying a set of non-crystallographic symmetries, and shows that this distribution can be effectively approximated by the saddlepoint method.
Abstract: In this first of three papers on a full Bayesian theory of crystal structure determination, it is shown that all currently used sources of phase information can be represented and combined through a universal expression for the joint probability distribution of structure factors. Particular attention is given to situations arising in macromolecular crystallography, where the proper treatment of non-uniform distributions of atoms is absolutely essential. A procedure is presented, in stages of gradually increasing complexity, for constructing the joint probability distribution of an arbitrary collection of structure factors. These structure factors may be gathered from one or several crystal forms of an unknown molecule, each comprising one or several isomorphous structures related by substitution operations, possibly containing solvent regions and known fragments, and/or obeying a set of non-crystallographic symmetries. This universal joint probability distribution can be effectively approximated by the saddlepoint method, using maximum-entropy distributions of atoms [Bricogne (1984) Acta Cryst. A40, 410-445] and a generalization of structure-factor algebra. Atomic scattering factors may assume arbitrary complex values, so that this formalism applies to neutron as well as to X-ray diffraction methods. This unified procedure will later be extended by the construction of conditional distributions allowing phase extension, and of likelihood functions capable of detecting and characterizing all potential sources of phase information considered so far, thus completing the formulation of a full Bayesian inference scheme for crystal structure determination.

Journal ArticleDOI
TL;DR: In this paper, a Bayesian procedure is presented for estimating the reliability of a series system of independent binomial subsystems and components, and the posterior distribution of the overall missile-system reliability from which the required estimates are obtained is computed.
Abstract: A Bayesian procedure is presented for estimating the reliability of a series system of independent binomial subsystems and components. The method considers either test or prior data (perhaps both or neither) at the system, subsystem, and component level. Beta prior distributions are assumed throughout. Inconsistent prior judgments are averaged within the simple-to-use procedure. The method is motivated by the following practical problem. It is required to estimate the overall reliability of a certain air-to-air heat-seeking missile system containing five major subsystems with up to nine components per subsystem. The posterior distribution of the overall missile-system reliability from which the required estimates are obtained is computed.
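A sketch of the component-level part of such a calculation, with beta priors updated by hypothetical binomial test data and the system reliability obtained as the product of independent component reliabilities via Monte Carlo draws; the paper's pooling of system- and subsystem-level information is not shown:

import numpy as np

rng = np.random.default_rng(1)

# (successes, trials, prior_a, prior_b) for three hypothetical components in series.
components = [(48, 50, 1, 1), (29, 30, 1, 1), (95, 100, 1, 1)]

draws = np.ones(10_000)
for successes, trials, a, b in components:
    draws *= rng.beta(a + successes, b + trials - successes, size=10_000)

print(draws.mean(), np.percentile(draws, [5, 95]))   # posterior mean and 90% interval for system reliability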

Journal ArticleDOI
TL;DR: A number of canonical econometric problems are described and analyzed to illustrate the power of the Bayesian approach in econometrics and other areas of science.

Journal ArticleDOI
TL;DR: The use of statistics and probabilities as legal evidence has recently come under increased scrutiny as discussed by the authors, and individuals' ability to use statistical information as well as their ability to understand and use an expert's Bayesian explanation of that evidence has been of special concern.
Abstract: The use of statistics and probabilities as legal evidence has recently come under increased scrutiny. Judges' and jurors' ability to understand and use this type of evidence has been of special concern. Finkelstein and Fairley (1970) proposed introducing Bayes' theorem into the courtroom to help the fact-finder evaluate this type of evidence. The present study addressed individuals' ability to use statistical information as well as their ability to understand and use an expert's Bayesian explanation of that evidence. One hundred and eighty continuing education students were presented with a transcript purportedly taken from an actual trial and were asked to make several subjective probability judgments regarding blood-grouping evidence. The results extend to the trial process previous psychological research suggesting that individuals generally underutilize statistical information, as compared to a Bayesian model. In addition, subjects in this study generally ignored the expert's Bayesian explanation of the statistical evidence.
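The kind of Bayesian explanation an expert might offer for blood-grouping evidence can be made concrete with a small calculation; the match frequency and prior are hypothetical:

def posterior_guilt(prior_guilt, match_frequency):
    """P(guilt | matching blood type), assuming the true perpetrator would certainly match."""
    prior_odds = prior_guilt / (1 - prior_guilt)
    posterior_odds = prior_odds * (1.0 / match_frequency)   # likelihood ratio of the match
    return posterior_odds / (1 + posterior_odds)

# A blood type shared by 10% of the population, starting from a 25% prior probability of guilt.
print(posterior_guilt(0.25, 0.10))                           # ~0.77: informative, but far from certain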

Journal ArticleDOI
TL;DR: In this paper, the authors developed conjugate prior distributions for the von Mises distribution, which they used to compute a posterior distribution of the location of an emergency transmitter in a downed aircraft.
Abstract: We study the problem of determining the location of an emergency transmitter in a downed aircraft. The observations are bearings read at fixed stations. A Bayesian approach, yielding a posterior map of probable locations, seems reasonable in this situation. We therefore develop conjugate prior distributions for the von Mises distribution, which we use to compute a posterior distribution of the location. An approximation to the posterior distribution yields accurate, rapidly computable answers. A common problem with this kind of data is the possibility that signals will reflect off orographic terrain features, resulting in wild bearings. Such bearings can affect the posterior distribution severely. We develop a sensitivity analysis, based on the idea of predictive distribution, to reject wild bearings. The method, which is based on an asymptotic argument, nonetheless performs well in a small simulation study. When the preceding approximation is used, the sensitivity analysis is practical in terms ...
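A sketch of the bearings-only location posterior, scoring a grid of candidate positions by von Mises likelihoods of the observed bearings; the stations, bearings and concentration parameter are invented, and the conjugate-prior and outlier-rejection machinery of the paper is not reproduced:

import numpy as np
from scipy.stats import vonmises

stations = np.array([[0.0, 0.0], [10.0, 0.0]])       # invented station positions
bearings = np.array([np.pi / 4, 3 * np.pi / 4])      # bearings in radians, anticlockwise from east
kappa = 20.0                                          # assumed concentration of the bearing errors

xs, ys = np.meshgrid(np.linspace(-5, 15, 200), np.linspace(-5, 15, 200))
log_post = np.zeros_like(xs)                          # flat prior over the grid
for (sx, sy), theta in zip(stations, bearings):
    predicted = np.arctan2(ys - sy, xs - sx)
    log_post += vonmises.logpdf(theta - predicted, kappa)

best = np.unravel_index(np.argmax(log_post), log_post.shape)
print(xs[best], ys[best])                             # posterior mode, near the bearings' intersection (~(5, 5))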

Journal ArticleDOI
TL;DR: In this paper, Bayesian procedures for specification analysis or diagnostic checking of modeling assumptions for structural equations of econometric models are developed and applied using Monte Carlo numerical methods, and checks on the validity of identifying restrictions, exogeneity assumptions and other specifying assumptions are performed using posterior distributions for discrepancy vectors and functions representing departures from specifying assumptions.


Journal ArticleDOI
TL;DR: The decision theoretic/Bayesian approach is shown to provide a formal justification for the sample sizes often used in practice and shows the conditions under which such sample sizes are clearly inappropriate.
Abstract: A new strategy for the design of Phase II clinical trials is presented which utilizes the information provided by the prior distribution of the response rate, the costs of treating a patient, and the losses or gains resulting from the decisions taken at the completion of the study. A risk function is derived from which one may determine the optimal Bayes sampling plan. The decision theoretic/Bayesian approach is shown to provide a formal justification for the sample sizes often used in practice and shows the conditions under which such sample sizes are clearly inappropriate.

Journal ArticleDOI
TL;DR: Taking the Bayesian approach in solving the discrete-time parameter estimation problem has two major results: the unknown parameters are legitimately included as additional system states, and the computational objective becomes calculation of the entire posterior density instead of just its first few moments.
Abstract: Taking the Bayesian approach in solving the discrete-time parameter estimation problem has two major results: the unknown parameters are legitimately included as additional system states, and the computational objective becomes calculation of the entire posterior density instead of just its first few moments. This viewpoint facilitates intuitive analysis, allowing increased qualitative understanding of the system behavior. With the actual posterior density in hand, the true optimal estimate for any given loss function can be calculated. Although the computational burden of doing so might preclude online use, it does provide a clearly justified baseline for comparative studies. These points are demonstrated by analyzing a scalar problem with a single unknown, and by comparing an established point estimator's performance to the true optimal estimate.
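A sketch of the "entire posterior density" viewpoint for a scalar problem with a single unknown parameter, computed on a grid under a flat prior; the system and numbers are illustrative, not those analyzed in the paper:

import numpy as np

rng = np.random.default_rng(2)
a_true, sigma = 0.7, 0.5
y = np.zeros(100)
for k in range(1, y.size):                  # simulate y[k] = a*y[k-1] + Gaussian noise
    y[k] = a_true * y[k - 1] + rng.normal(scale=sigma)

grid = np.linspace(-1, 1, 401)              # flat prior over a grid of candidate values of a
log_post = np.array([-0.5 * np.sum((y[1:] - a * y[:-1]) ** 2) / sigma**2 for a in grid])
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, grid)

print(grid[np.argmax(post)])                # posterior mode; the whole density is available, not just moments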

Journal ArticleDOI
TL;DR: Numerical integration strategies involving novel iterative, adaptive uses of Cartesian product and spherical rule quadrature formulae, together with importance sampling techniques, are developed for irregular, multi-parameter likelihoods.

Journal ArticleDOI
TL;DR: Bayesian methods for obtaining point and interval estimates from data gathered from capture-recapture surveys are presented and a numerical example involving the estimation of the size of a fish population is given to illustrate the methods.
Abstract: To estimate the total size of a closed population, a multiple capture-recapture sampling design can be used. This sampling design has been used traditionally to estimate the size of wildlife populations and is becoming more widely used to estimate the size of hard-to-count human populations. This paper presents Bayesian methods for obtaining point and interval estimates from data gathered from capture-recapture surveys. A numerical example involving the estimation of the size of a fish population is given to illustrate the methods.
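A sketch of such an estimate for the simplest two-sample capture-recapture design, with hypothetical counts and a flat prior on the population size; the paper treats the more general multiple-recapture setting:

import numpy as np
from scipy.stats import hypergeom

n1, n2, m = 60, 50, 12                       # first catch (marked), second catch, marked recaptures
N_values = np.arange(n1 + n2 - m, 1001)      # the population must contain every animal ever seen

likelihood = hypergeom.pmf(m, N_values, n1, n2)     # P(m recaptures | population size N)
posterior = likelihood / likelihood.sum()           # flat prior on N up to 1000

mean_N = np.sum(N_values * posterior)
interval = N_values[np.searchsorted(np.cumsum(posterior), [0.025, 0.975])]
print(round(mean_N), interval)               # point estimate and a 95% credible interval for N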


Book ChapterDOI
01 Jan 1988
TL;DR: In this paper, the joint posterior probability that multiple frequencies are present, independent of their amplitude and phase, and the noise level, is calculated for computer simulated data and for real data ranging from magnetic resonance to astronomy to economic cycles.
Abstract: Bayesian spectrum analysis is still in its infancy. It was born when E. T. Jaynes derived the periodogram as a sufficient statistic for determining the spectrum of a time sampled data set containing a single stationary frequency. Here we extend that analysis and explicitly calculate the joint posterior probability that multiple frequencies are present, independent of their amplitude and phase, and the noise level. This is then generalized to include other parameters such as decay and chirp. Results are given for computer simulated data and for real data ranging from magnetic resonance to astronomy to economic cycles. We find substantial improvements in resolution over Fourier transform methods.
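A sketch of the single-frequency case this work generalizes: for one stationary sinusoid in Gaussian noise of known standard deviation, the posterior over frequency is approximately proportional to exp(C(f)/sigma^2), where C(f) is the periodogram; the simulated data and settings are illustrative, and the joint multiple-frequency posterior is not reproduced:

import numpy as np

rng = np.random.default_rng(3)
t = np.arange(256)
sigma = 1.0
data = np.cos(2 * np.pi * 0.123 * t + 0.4) + rng.normal(scale=sigma, size=t.size)

freqs = np.linspace(0.01, 0.49, 2000)
C = np.abs(np.exp(-2j * np.pi * np.outer(freqs, t)) @ data) ** 2 / t.size   # the periodogram C(f)
post = np.exp((C - C.max()) / sigma**2)
post /= np.trapz(post, freqs)

print(freqs[np.argmax(post)])    # sharply peaked near the true frequency 0.123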

Book
03 Mar 1988
TL;DR: This book is an elementary and practical introduction to probability theory and the per sonal (or subjective) view of probability is adopted throughout, and emphasis is placed on how values are assigned to probabilities in practice, i.e. the measurement of probabilities.
Abstract: This book is an elementary and practical introduction to probability theory. It differs from other introductory texts in two important respects. First, the personal (or subjective) view of probability is adopted throughout. Second, emphasis is placed on how values are assigned to probabilities in practice, i.e. the measurement of probabilities. The personal approach to probability is in many ways more natural than other current formulations, and can also provide a broader view of the subject. It thus has a unifying effect. It has also assumed great importance recently because of the growth of Bayesian Statistics. Personal probability is essential for modern Bayesian methods, and it can be difficult for students who have learnt a different view of probability to adapt to Bayesian thinking. This book has been produced in response to that difficulty, to present a thorough introduction to probability from scratch, and entirely in the personal framework.

Journal ArticleDOI
TL;DR: Hampel's concept of qualitative robustness (or stability) is applied to estimates of ‘generalized parameters’ (that is, estimates which take values in an abstract metric space) and the incompatibility between robustness and consistency is proved.

Journal ArticleDOI
TL;DR: In this article, the authors developed a model of sampling plans for variables with a polynomial loss function, in which the decision function is either one-sided or two-sided.
Abstract: SUMMARY In this paper, we develop a model of sampling plans for variables with a polynomial loss function, in which the decision function is either one-sided or two-sided. Based on a Bayesian approach, we suggest a simple finite algorithm for the determination of the optimal single sampling plan. Furthermore, for the case of a symmetric two-sided decision function, we propose an approximate method for determining its optimal single sampling plan.

Journal ArticleDOI
TL;DR: In this article, a Bayesian nonparametric approach to a (right) censored data problem was proposed, based on three assumptions: (a) the new patients and the previous sample patients are all deemed to be exchangeable with regard to survival time, (b) the posterior prediction rule, in the case of no censoring or ties among (say n) observed survival times, assigns equal probability of 1/(n + 1) to each of the n + 1 open intervals determined by these values.
Abstract: This article considers a Bayesian nonparametric approach to a (right) censored data problem. Although the results are applicable to a wide variety of such problems, including reliability analysis, the discussion centers on medical survival studies. We extend the posterior distribution of percentiles given by Hill (1968) to obtain predictive posterior probabilities for the survival of one or more new patients, using data from other individuals having the same disease and given the same treatment. The analysis hinges on three assumptions: (a) The new patients and the previous sample patients are all deemed to be exchangeable with regard to survival time. (b) The posterior prediction rule, in the case of no censoring or ties among (say n) observed survival times, assigns equal probability of 1/(n + 1) to each of the n + 1 open intervals determined by these values. (c) The censoring mechanisms are “noninformative.” Detailed discussion of these assumptions is presented from a Bayesian point of view. I...

Journal ArticleDOI
01 Nov 1988
TL;DR: It is shown that all of the examples of Dempster's rule of combination given in the original paper are identical to Bayesian probability theory, and that the paper does not fully illustrate the Shafer-Dempster approach to uncertainty management.
Abstract: In a previously published paper (ibid., vol.17, no.6, p.968-77, 1987) P.L. Bogler made four points that require further clarification and/or correction concerning Bayesian probabilistic reasoning for the multisensor fusion of identification data: 1) the Bayesian approach forces a common level of abstraction to be defined for all sensors, a level of abstraction that is not meaningful for some sensors; 2) Bayesian results can be unstable and intuitively unsatisfying; 3) Bayesian results are not commutative; and 4) Bayesian results for friend/foe identification can force false inferences concerning the identification of specific aircraft types. These assertions are reviewed and shown to be incorrect. In addition, it is shown that all of the examples of Dempster's rule of combination are identical to Bayesian probability theory. The contention here is not that the Shafer-Dempster approach to uncertainty management is identical to Bayesian probability theory, but rather that the Shafer-Dempster approach is not fully illustrated.
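The point about Dempster's rule can be illustrated directly: when both mass functions are Bayesian (all mass on singletons), Dempster's rule of combination reduces to the usual Bayesian multiply-and-renormalize step; the aircraft-identification masses below are hypothetical:

def dempster_combine_bayesian(m1, m2):
    """Combine two mass functions whose focal elements are all singletons."""
    combined = {h: m1[h] * m2[h] for h in m1}
    total = sum(combined.values())           # 1 minus the conflict
    return {h: v / total for h, v in combined.items()}

prior  = {"friend": 0.5, "foe": 0.3, "neutral": 0.2}
sensor = {"friend": 0.6, "foe": 0.3, "neutral": 0.1}
print(dempster_combine_bayesian(prior, sensor))
# Identical to Bayes' theorem with the sensor masses playing the role of likelihoods.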

Journal Article
TL;DR: An introduction to a seminal issue of the Boston University Law Review on the use of various formal theories - including probability theory and Bayes' theorem - to dissect factual inference and proof in litigation.
Abstract: What is Bayesianism?.- A Reconceptualization of Civil Trials.- The New Evidence Scholarship: Analyzing the Process of Proof.- Analyzing the Process of Proof: A Brief Rejoinder.- The Role of Evidential Weight in Criminal Proof.- Do We Need a Calculus of Weight to Understand Proof Beyond a Reasonable Doubt?.- Second-Order Evidence and Bayesian Logic.- A Comment in Defense of Reverend Bayes.- A First Look at "Second-Order Evidence".- The Construction of Probability Arguments.- Beating and Boulting an Argument.- Probability and the Processes of Discovery, Proof, and Choice.- Insensitivity, Commitment, Belief, and Other Bayesian Virtues, or, Who Put the Snake in the Warlord's Bed?.- Mapping Inferential Domains.- Summing Up: The Society of Bayesian Trial Lawyers.- Name Index.