
Showing papers on "Bayes' theorem published in 1988"




Journal ArticleDOI
TL;DR: It is shown that Bayes's theorem is the optimal IPR for the particular information measures and criterion functional adopted and that its use leads to the output information being exactly equal to the given input information.
Abstract: In this article statistical inference is viewed as information processing involving input information and output information. After introducing information measures for the input and output information, an information criterion functional is formulated and optimized to obtain an optimal information processing rule (IPR). For the particular information measures and criterion functional adopted, it is shown that Bayes's theorem is the optimal IPR. This optimal IPR is shown to be 100% efficient in the sense that its use leads to the output information being exactly equal to the given input information. Also, the analysis links Bayes's theorem to maximum-entropy considerations.
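
A hedged sketch of the result summarized above, in assumed notation (π for the prior, f for the likelihood, g for the output density; none of these symbols are taken from the article): the optimal information processing rule is Bayes' theorem, and for that rule the output information exactly matches the input information.

```latex
% Sketch only; the symbols are assumptions, not the article's notation.
% Input information: prior \pi(\theta) and likelihood f(x \mid \theta).
% Output information: the density g(\theta \mid x) produced by the rule.
\[
  g(\theta \mid x)
  \;=\;
  \frac{\pi(\theta)\, f(x \mid \theta)}
       {\int \pi(\theta')\, f(x \mid \theta')\, d\theta'}
  \;=\;
  \frac{\pi(\theta)\, f(x \mid \theta)}{h(x)} .
\]
% With this choice of g the rule is "100% efficient": the criterion
% (output information minus input information) equals zero, so no input
% information is lost and none is manufactured.
```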

295 citations


Book ChapterDOI
01 Jan 1988
TL;DR: In this paper, the authors apply the principles of Bayesian reasoning to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions, and propose a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
Abstract: The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and likelihoods) are assigned in appropriate hypothesis spaces using the Maximum Entropy Principle, and then manipulated via Bayes' Theorem. Bayesian hypothesis testing requires careful consideration of the prior ranges of any parameters involved, and this leads to a quantitative statement of Occam's Razor. As an example of this general principle we offer a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
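
A hedged sketch of the quantitative Occam's razor that such an analysis yields, in its standard Laplace-approximation form (the notation is assumed, not taken from the chapter): the evidence for a model is roughly its best-fit likelihood times an "Occam factor" measuring how much of the prior parameter range the data leave plausible.

```latex
% Model comparison by posterior probability (sketch; notation assumed):
%   P(M \mid D) \propto P(D \mid M)\, P(M),
%   P(D \mid M) = \int P(D \mid w, M)\, P(w \mid M)\, dw .
% For one parameter with prior width \Delta w and posterior width \delta w,
\[
  P(D \mid M) \;\approx\;
  \underbrace{P(D \mid \hat{w}, M)}_{\text{best-fit likelihood}}
  \times
  \underbrace{\frac{\delta w}{\Delta w}}_{\text{Occam factor}} ,
\]
% so an extra basis function is accepted only if its improvement in fit
% outweighs the extra Occam penalty -- the rule used to pick the optimal
% number of parameters in the regression example.
```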

274 citations


Journal ArticleDOI
Andrew C. Lorenc, O. Hammon
TL;DR: In this paper, the authors provide a theoretical framework for the quality control of data from a large variety of types of observations, with different accuracies and reliabilities, and apply Bayes' theorem to derive the well-known formula for the combination of data with errors.
Abstract: This work attempts to provide a theoretical framework for the quality control of data from a large variety of types of observations, with different accuracies and reliabilities. Bayes' theorem is introduced, and is used in a simple example with Gaussian error distributions to derive the well-known formula for the combination of data with errors. A simple model is proposed whereby the error in each datum is either from a known Gaussian distribution, or a gross error, in which case the observation gives no useful information. Bayes' theorem is applied to this, and it is shown that usual operational practice, which is to reject outlying data and to treat the rest as if their errors are Gaussian, is a reasonable approximation to the correct Bayesian analysis. Appropriate rejection criteria are derived in terms of the observational error and the prior probability of a gross error. These ideas have been implemented in a computer program to check pressure, wind, temperature and position data from ships, weather ships, buoys and coastal synoptic reports. Historical information on the accuracies and reliabilities of various classifications of observation is used to provide prior estimates of observational errors and the prior probabilities of gross error. The latter are then updated in the light of information from a current forecast, and from nearby observations (allowing for the inaccuracies and possible gross errors in these) to give new estimates. The final probabilities can be used to reject or accept the data in an objective analysis. Results from trials of this system are given. It is shown to be possible using an archive generated by the system to update the prior error statistics necessary to make the method truly objective. Some practical case studies are shown, and compared with careful human quality control.
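
A minimal Python sketch of this style of check, assuming a two-component error model in which a datum either carries Gaussian error about the background value or is a gross error spread uniformly over some range; the function, symbols and numbers are illustrative assumptions, not the paper's operational scheme.

```python
import math

def prob_gross_error(y, x_b, sigma_o, sigma_b, p_gross, gross_range):
    """Posterior probability that observation y is a gross error, given a
    background (prior) estimate x_b.

    Illustrative two-component model:
      - with prob. 1 - p_gross: y - x_b ~ Normal(0, sigma_o**2 + sigma_b**2)
      - with prob. p_gross:     y is uninformative, density 1 / gross_range
    """
    var = sigma_o**2 + sigma_b**2
    dens_good = math.exp(-0.5 * (y - x_b)**2 / var) / math.sqrt(2 * math.pi * var)
    dens_gross = 1.0 / gross_range
    num = p_gross * dens_gross
    return num / (num + (1.0 - p_gross) * dens_good)

# Illustrative numbers: a pressure report 8 hPa away from the forecast value.
p = prob_gross_error(y=1008.0, x_b=1000.0, sigma_o=1.0, sigma_b=1.5,
                     p_gross=0.01, gross_range=100.0)
print(f"P(gross error | y) = {p:.3f}")  # reject the datum if this exceeds a threshold
```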

147 citations


Journal ArticleDOI
TL;DR: Various approaches to inductive reasoning, such as probability kinematics based on information measures and the combination of uncertain or default information as studied in the field of Artificial Intelligence, are discussed.

113 citations


Journal ArticleDOI
TL;DR: A mixed model using a gamma-Poisson distribution with a random scale parameter having an inverse gamma prior and an empirical Bayes approach is used to estimate relative risks for geographic regions and annual rates for demographic groups within each region.
Abstract: A mixed model is proposed for the analysis of geographic variability in mortality rates. In addition to demographic parameters and random geographic parameters, the model includes additional random-effects parameters to adjust for extra-Poisson variability. The model uses a gamma-Poisson distribution with a random scale parameter having an inverse gamma prior. An empirical Bayes approach is used to estimate relative risks for geographic regions and annual rates for demographic groups within each region. Lung cancer in Missouri is used to motivate and illustrate the procedure. Observed disease-specific death rates of specific age/sex groups, within regional units such as counties or cities, are generally quite unreliable for all but the largest units. The amount of information available from any one unit is generally limited. But modeling the variability between and within units can improve estimates, as demonstrated frequently in empirical Bayes examples. A numerical comparison with the fixed eff...
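
A hedged Python sketch of gamma-Poisson (empirical Bayes) smoothing of the kind described above; the moment-matching fit, function name and county numbers are illustrative assumptions rather than the paper's model, which also includes demographic effects and an extra variance component.

```python
import numpy as np

def eb_poisson_rates(deaths, person_years):
    """Empirical-Bayes smoothed rates under a gamma-Poisson model (sketch).

    Model: deaths_i ~ Poisson(rate_i * person_years_i), rate_i ~ Gamma(a, b)
    with mean a / b. Hyperparameters are fitted crudely by moments; each
    unit's rate is then shrunk toward the overall mean, small units most.
    """
    raw = deaths / person_years
    m = raw.mean()
    # Between-unit variance: total variance of raw rates minus average Poisson noise.
    v_between = max(raw.var() - np.mean(m / person_years), 1e-12)
    b = m / v_between
    a = m * b
    # Gamma posterior for each unit is Gamma(a + deaths, b + person_years).
    return (a + deaths) / (b + person_years)

deaths = np.array([3.0, 0.0, 25.0, 7.0])      # illustrative county death counts
pyears = np.array([2e4, 5e3, 1.5e5, 6e4])     # illustrative person-years at risk
print(eb_poisson_rates(deaths, pyears))        # smoothed, shrunken rates
```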

102 citations


Journal ArticleDOI
TL;DR: The use of statistics and probabilities as legal evidence has recently come under increased scrutiny as discussed by the authors, and individuals' ability to use statistical information as well as their ability to understand and use an expert's Bayesian explanation of that evidence has been of special concern.
Abstract: The use of statistics and probabilities as legal evidence has recently come under increased scrutiny. Judges' and jurors' ability to understand and use this type of evidence has been of special concern. Finkelstein and Fairley (1970) proposed introducing Bayes' theorem into the courtroom to aid the fact-finder in evaluating this type of evidence. The present study addressed individuals' ability to use statistical information as well as their ability to understand and use an expert's Bayesian explanation of that evidence. One hundred and eighty continuing education students were presented with a transcript purportedly taken from an actual trial and were asked to make several subjective probability judgments regarding blood-grouping evidence. The results extend to the trial process previous psychological research suggesting that individuals generally underutilize statistical information, as compared to a Bayesian model. In addition, subjects in this study generally ignored the expert's Bayesian explanation of the statistical evidence.
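
For orientation, a hedged worked example of the kind of Bayesian update an expert might present for blood-grouping evidence; the prior and the 10% blood-type frequency are illustrative assumptions, not figures from the study.

```latex
% Illustrative numbers only.
% Prior: P(\text{source}) = 0.2, so prior odds = 0.2 / 0.8 = 0.25.
% Evidence E: a matching blood type with population frequency 10\%,
% giving likelihood ratio P(E \mid S) / P(E \mid \bar{S}) = 1 / 0.10 = 10.
\[
  \underbrace{\frac{P(S \mid E)}{P(\bar{S} \mid E)}}_{\text{posterior odds}}
  \;=\;
  \underbrace{\frac{P(E \mid S)}{P(E \mid \bar{S})}}_{=\,10}
  \times
  \underbrace{\frac{P(S)}{P(\bar{S})}}_{=\,0.25}
  \;=\; 2.5,
  \qquad
  P(S \mid E) \;=\; \frac{2.5}{1 + 2.5} \;\approx\; 0.71 .
\]
```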

86 citations


Journal ArticleDOI
TL;DR: In this article, a transient stability study of the Taiwan power system using a probabilistic approach was performed using a Bayes' theorem, where the stochastic nature of prefault system loading conditions, as well as other initiating factors, such as the number of faulted circuits and the location and type of faults, was recognized during the compilation of the outage statistics of Taiwan power systems.
Abstract: A transient stability study of the Taiwan power system is performed using a probabilistic approach. The stochastic nature of prefault system loading conditions, as well as other initiating factors, such as the number of faulted circuits and the location and type of faults, was recognized during the compilation of the outage statistics of the Taiwan power system. Thus, a probabilistic stability index which takes these random characteristics of system faults into account is computed by using the concept of conditional probability. In addition, the contributions of various fault events to system instability are analyzed by using Bayes' theorem. The effect of load-level uncertainties on stability indices is also examined.
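
A hedged Python sketch of a probabilistic stability index built from conditional probabilities, and of Bayes' theorem used to attribute instability to fault events; the fault categories and all probabilities are illustrative assumptions, not the paper's Taiwan statistics.

```python
# P(fault event i) from outage statistics (illustrative; values sum to 1).
fault_prob = {
    "3-phase fault, near bus": 0.05,
    "line-to-ground fault, mid-line": 0.60,
    "line-to-line fault, remote": 0.35,
}
# P(instability | fault event i) from transient stability simulations (illustrative).
instab_given_fault = {
    "3-phase fault, near bus": 0.40,
    "line-to-ground fault, mid-line": 0.02,
    "line-to-line fault, remote": 0.05,
}

# Probabilistic stability index: overall probability of instability.
p_instab = sum(fault_prob[f] * instab_given_fault[f] for f in fault_prob)
print(f"P(instability) = {p_instab:.4f}")

# Bayes' theorem: contribution of each fault event to system instability.
for f in fault_prob:
    p_f_given_instab = fault_prob[f] * instab_given_fault[f] / p_instab
    print(f"P({f} | instability) = {p_f_given_instab:.3f}")
```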

76 citations


Journal ArticleDOI
TL;DR: In this article, some relevant theory is reviewed, new criteria for identifying suitable quasirandom sequences are defined, and some extensions to the basic integration rules are suggested; various quasirandom methods are then compared on the sort of integrals that arise in Bayesian inference and are shown to be much more efficient than Monte Carlo methods.
Abstract: Practical Bayesian statistics with realistic models usually gives posterior distributions that are analytically intractable, and inferences must be made via numerical integration. In many cases, the integrands can be transformed into periodic functions on the unit $d$-dimensional cube, for which quasirandom sequences are known to give efficient numerical integration rules. This paper reviews some relevant theory, defines new criteria for identifying suitable quasirandom sequences and suggests some extensions to the basic integration rules. Various quasirandom methods are then compared on the sort of integrals that arise in Bayesian inference and are shown to be much more efficient than Monte Carlo methods.
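
A hedged Python sketch, not one of the paper's rules: a simple Halton (quasirandom) rule compared with plain Monte Carlo on a smooth periodic integrand over the unit square, the setting in which quasirandom points are known to do well.

```python
import math
import random

def halton(i, base):
    """i-th term (i >= 1) of the van der Corput / Halton sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def integrand(x, y):
    # A smooth 1-periodic test function whose integral over [0, 1]^2 is exactly 1.
    return (1.0 + 0.5 * math.cos(2 * math.pi * x)) * (1.0 + 0.5 * math.cos(2 * math.pi * y))

n = 4096
mc = sum(integrand(random.random(), random.random()) for _ in range(n)) / n
qmc = sum(integrand(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)) / n
print(f"Monte Carlo: {mc:.6f}   quasirandom: {qmc:.6f}   exact: 1.0")
```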

73 citations


Journal ArticleDOI
TL;DR: The utility of optimal sampling strategy coupled with adaptive study design in the determination of individual patient and population pharmacokinetic parameter values is evaluated, and it is shown that the four optimal points analyzed with the maximum a posteriori probability Bayesian estimator faithfully reproduced both microscopic and hybrid pharmacokinetic parameter values for individual patients.
Abstract: We have evaluated the utility of optimal sampling strategy coupled with adaptive study design in the determination of individual patient and population pharmacokinetic parameter values. In 9 patients with cystic fibrosis receiving a short (1-minute) infusion of ceftazidime, pharmacokinetic parameter values were determined with a nonlinear least-squares estimator analyzing a traditional, geometrically spaced set of 12 postinfusion serum samples drawn over 8 hours. These values were compared with values generated from four-sample subsets of the 12 obtained at optimal times and analyzed by a nonlinear least-squares estimator, as well as a maximum a posteriori probability Bayesian estimator with prior distributions placed on beta and clearance. The four sampling times were determined according to an adaptive design optimization technique that employs sequential updating of population prior distributions on parameter values. Compared with the 12-point determination, the four optimal points analyzed with the maximum a posteriori probability Bayesian estimator faithfully reproduced both microscopic and hybrid pharmacokinetic parameter values for individual patients and, consequently, also produced accurate measures of population central tendency and dispersion. This has important implications in being able to more efficiently derive target patient population pharmacokinetic information for new drugs. This should also allow generation of better concentration-effect relationships in populations of interest.
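
A hedged Python sketch of maximum a posteriori (MAP) Bayesian estimation for a simple one-compartment model with a normal prior on the log parameters; the model, priors, sampling times, concentrations and the use of scipy here are all illustrative assumptions, not the study's software or data.

```python
import numpy as np
from scipy.optimize import minimize

t = np.array([0.5, 1.5, 4.0, 8.0])          # four sampling times (h), assumed
conc = np.array([60.0, 45.0, 22.0, 6.5])    # measured concentrations (mg/L), assumed
dose = 1000.0                                # dose (mg), assumed

# Log-normal population priors on clearance CL (L/h) and volume V (L), assumed.
prior_mean = np.log(np.array([6.0, 15.0]))
prior_sd = np.array([0.4, 0.3])
sigma_obs = 3.0                              # residual SD (mg/L), assumed

def neg_log_posterior(log_theta):
    cl, v = np.exp(log_theta)
    pred = (dose / v) * np.exp(-(cl / v) * t)            # one-compartment prediction
    loglik = -0.5 * np.sum(((conc - pred) / sigma_obs) ** 2)
    logprior = -0.5 * np.sum(((log_theta - prior_mean) / prior_sd) ** 2)
    return -(loglik + logprior)

fit = minimize(neg_log_posterior, x0=prior_mean, method="Nelder-Mead")
cl_hat, v_hat = np.exp(fit.x)
print(f"MAP estimates: CL = {cl_hat:.2f} L/h, V = {v_hat:.2f} L, "
      f"half-life = {0.693 * v_hat / cl_hat:.2f} h")
```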

69 citations


Journal ArticleDOI
TL;DR: In this paper, a hierarchical Bayes approach to the problem of estimating N in the binomial distribution is presented, which provides a simple and flexible way of specifying prior information, and also allows a convenient representation of vague prior knowledge using limiting, improper, prior forms.
Abstract: A hierarchical Bayes approach to the problem of estimating N in the binomial distribution is presented. This provides a simple and flexible way of specifying prior information, and also allows a convenient representation of vague prior knowledge. It yields solutions to the problems of interval estimation, prediction and decision making, as well as that of point estimation. The Bayes estimator compares favourably with the best previously proposed point estimators in the literature. The binomial model has unknown parameters N and θ. Most of the literature about statistical analysis of this model has focused on point estimation of N, while interval estimation, prediction and decision making have been little considered; see § 2. I adopt a hierarchical Bayes approach. This provides a simple way of specifying prior information, and also allows a convenient representation of vague prior knowledge using limiting, improper, prior forms. It leads to solutions of the problems of interval estimation, prediction and decision making, as well as that of point estimation. A difficulty with Bayesian analysis of this problem has been the absence of a sufficiently flexible and tractable family of prior distributions, mainly due to the fact that N is an integer. The present approach gets around this by first assuming that N has a Poisson distribution. The resulting hyperparameters are then continuous-valued, and one may use existing results about conjugate and vague priors in better understood settings.
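
A hedged Python sketch of the hierarchical idea: place a Poisson(μ) prior on N and compute its posterior by truncated enumeration. For brevity the success probability θ is held fixed and μ is a single assumed value, whereas the paper treats θ as unknown and puts priors on the hyperparameters; the counts are illustrative.

```python
import math

counts = [16, 18, 22, 25, 27]   # illustrative binomial observations with unknown N

def log_binom(n, k):
    if k < 0 or k > n:
        return float("-inf")
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def posterior_N(counts, mu, theta, n_max=200):
    """Posterior over N under a Poisson(mu) prior, with theta fixed (sketch)."""
    post = {}
    for n in range(max(counts), n_max + 1):
        log_prior = n * math.log(mu) - mu - math.lgamma(n + 1)
        log_lik = sum(log_binom(n, k) + k * math.log(theta)
                      + (n - k) * math.log(1 - theta) for k in counts)
        post[n] = math.exp(log_prior + log_lik)
    z = sum(post.values())
    return {n: p / z for n, p in post.items()}

post = posterior_N(counts, mu=30.0, theta=0.7)
print("posterior mode of N:", max(post, key=post.get))
print("P(N <= 35):", round(sum(p for n, p in post.items() if n <= 35), 3))
```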

Journal ArticleDOI
TL;DR: The separable suboptimal strategy is proposed and compared with the optimal one with respect to classification accuracy, and the primary results are illustrated by simple examples.

Journal ArticleDOI
TL;DR: The decision theoretic/Bayesian approach is shown to provide a formal justification for the sample sizes often used in practice and to identify the conditions under which such sample sizes are clearly inappropriate.
Abstract: A new strategy for the design of Phase II clinical trials is presented which utilizes the information provided by the prior distribution of the response rate, the costs of treating a patient, and the losses or gains resulting from the decisions taken at the completion of the study. A risk function is derived from which one may determine the optimal Bayes sampling plan. The decision theoretic/Bayesian approach is shown to provide a formal justification for the sample sizes often used in practice and shows the conditions under which such sample sizes are clearly inappropriate.

Journal ArticleDOI
TL;DR: The authors present a model for the behavior of software failures that fits into the general framework of empirical Bayes problems; however, they take a proper Bayes approach for inference by viewing the situation as a Bayes empirical-Bayes problem.
Abstract: The authors present a model for the behavior of software failures. Their model fits into the general framework of empirical Bayes problems; however, they take a proper Bayes approach for inference by viewing the situation as a Bayes empirical-Bayes problem. An approximation due to D.V. Lindley (1980) plays a central role in the analysis. They show that the Littlewood-Verrall model (1973) is an empirical Bayes model and discuss a fully Bayes analysis of it using the Bayes empirical-Bayes setup. Finally, they apply both models to some actual software failure data and compare their predictive performance.

Journal ArticleDOI
TL;DR: In this paper, multidimensional probability density and discriminant function estimates are derived from the (C, 1) means of the multiple Fourier series; convergence conditions are presented and Bayes risk consistency is established with no restrictions placed on the class-conditional densities.

Journal ArticleDOI
TL;DR: In this article, the authors obtained Bayes estimates of the parameters and reliability function of a 3-parameter Weibull distribution and compared posterior standard-deviation estimates with the corresponding asymptotic standard deviation estimates of their maximum likelihood counterparts.
Abstract: The authors obtain Bayes estimates of the parameters and reliability function of a 3-parameter Weibull distribution and compare posterior standard-deviation estimates with the corresponding asymptotic standard-deviation estimates of their maximum likelihood counterparts. Numerical examples are given.

Journal ArticleDOI
TL;DR: In this paper, an approximation for the posterior mean and standard deviation of the ability parameter in an item response model is proposed based on a Taylor series approximation of a posterior mean conditional on the item parameters.
Abstract: An approximation is proposed for the posterior mean and standard deviation of the ability parameter in an item response model. The procedure assumes that approximations to the posterior mean and covariance matrix of item parameters are available. It is based on the posterior mean of a Taylor series approximation to the posterior mean conditional on the item parameters. The method is illustrated for the two-parameter logistic model using data from an ACT math test with 39 items. A numerical comparison with the empirical Bayes method using n = 400 examinees shows that the point estimates are very similar but the standard deviations under empirical Bayes are about 2% smaller than those under Bayes. Moreover, when the sample size is decreased to n = 100, the standard deviation under Bayes is shown to increase by 14% in some cases.

Journal ArticleDOI
TL;DR: Bayesian methods for obtaining point and interval estimates from data gathered from capture-recapture surveys are presented and a numerical example involving the estimation of the size of a fish population is given to illustrate the methods.
Abstract: To estimate the total size of a closed population, a multiple capture-recapture sampling design can be used. This sampling design has been used traditionally to estimate the size of wildlife populations and is becoming more widely used to estimate the size of hard-to-count human populations. This paper presents Bayesian methods for obtaining point and interval estimates from data gathered from capture-recapture surveys. A numerical example involving the estimation of the size of a fish population is given to illustrate the methods.
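
A hedged Python sketch of the Bayesian calculation for the simplest (two-sample) capture-recapture design; the flat prior, the grid and the fish-survey numbers are illustrative assumptions, and the paper's methods cover general multiple-capture designs.

```python
import math

n1, n2, m = 120, 100, 25      # marked in sample 1, caught in sample 2, recaptures

def log_lik(N):
    # Hypergeometric likelihood: m recaptures among n2 fish when n1 of N are marked.
    return (math.lgamma(n1 + 1) - math.lgamma(m + 1) - math.lgamma(n1 - m + 1)
            + math.lgamma(N - n1 + 1) - math.lgamma(n2 - m + 1)
            - math.lgamma(N - n1 - (n2 - m) + 1)
            - (math.lgamma(N + 1) - math.lgamma(n2 + 1) - math.lgamma(N - n2 + 1)))

grid = range(n1 + n2 - m, 3000)           # N cannot be smaller than n1 + n2 - m
w = [math.exp(log_lik(N)) for N in grid]  # flat prior over the grid
z = sum(w)
post = [x / z for x in w]

mean_N = sum(N * p for N, p in zip(grid, post))
cdf, lo, hi = 0.0, None, None             # simple 95% credible interval from the CDF
for N, p in zip(grid, post):
    cdf += p
    if lo is None and cdf >= 0.025:
        lo = N
    if hi is None and cdf >= 0.975:
        hi = N
print(f"posterior mean N = {mean_N:.0f}, 95% interval = [{lo}, {hi}]")
```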

Journal ArticleDOI
TL;DR: This article reintroduces a different form of Bayes' theorem that allows calculation of posttest probabilities by adding quantities known as "weights" that combine information found in both a test's sensitivity and specificity.
Abstract: This article reintroduces a different form of Bayes' theorem that allows calculation of posttest probabilities by adding quantities known as "weights." A weight combines information found in both a test's sensitivity and specificity. A single value can describe how a given test result changes the posttest probability of disease. The use of weights and this form of Bayes' theorem should allow more widespread understanding and use of probability theory in clinical practice.
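
A hedged Python sketch of the idea: on the log-odds scale Bayes' theorem becomes addition, and each test result contributes a "weight" built from the test's sensitivity and specificity. This is the standard log-likelihood-ratio form; the article's exact scaling of its weights may differ, and the numbers below are illustrative.

```python
import math

def weight(sensitivity, specificity, positive):
    """Additive weight contributed by a single test result (log likelihood ratio)."""
    if positive:
        return math.log(sensitivity / (1.0 - specificity))
    return math.log((1.0 - sensitivity) / specificity)

def posttest_probability(pretest_prob, results):
    """results: list of (sensitivity, specificity, is_positive) tuples,
    assumed conditionally independent given disease status."""
    logit = math.log(pretest_prob / (1.0 - pretest_prob))
    logit += sum(weight(se, sp, pos) for se, sp, pos in results)
    return 1.0 / (1.0 + math.exp(-logit))

# Illustrative: pretest probability 10%, one positive test (Se 90%, Sp 80%),
# then one negative test (Se 70%, Sp 95%).
p = posttest_probability(0.10, [(0.90, 0.80, True), (0.70, 0.95, False)])
print(f"posttest probability = {p:.3f}")
```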

Journal Article
TL;DR: Introduction to a Seminal Issue of the Boston University Law Review on the Use of Various Formal Theories - including Probability Theory and Bayes' Theorem - to Dissect Factual Inference and Proof in Litigation.
Abstract: What is Bayesianism?.- A Reconceptualization of Civil Trials.- The New Evidence Scholarship: Analyzing the Process of Proof.- Analyzing the Process of Proof: A Brief Rejoinder.- The Role of Evidential Weight in Criminal Proof.- Do We Need a Calculus of Weight to Understand Proof Beyond a Reasonable Doubt?.- Second-Order Evidence and Bayesian Logic.- A Comment in Defense of Reverend Bayes.- A First Look at "Second-Order Evidence".- The Construction of Probability Arguments.- Beating and Boulting an Argument.- Probability and the Processes of Discovery, Proof, and Choice.- Insensitivity, Commitment, Belief, and Other Bayesian Virtues, or, Who Put the Snake in the Warlord's Bed?.- Mapping Inferential Domains.- Summing Up: The Society of Bayesian Trial Lawyers.- Name Index.

Journal ArticleDOI
TL;DR: A method of computing upper and lower bounds on the pairwise Bayes risk for composite classes is developed and numerical examples of the application of the bounding techniques to a problem involving the classification of aircraft are discussed.
Abstract: Upper and lower bounds on the Bayes risk for multiple, composite-hypothesis classification are obtained. Bounds on the Bayes risk for M simple classes are derived in terms of the risk functions for (M-1) classes, and so on, until the desired result depends only on the pairwise (M=2) Bayes risks. A method of computing upper and lower bounds on the pairwise Bayes risk for composite classes is developed. Algorithms for computing the upper and lower bounds for the general M-class case and for composite-hypothesis classes are presented. Numerical examples of the application of the bounding techniques to a problem involving the classification of aircraft are discussed. Results for the bounds and other performance measures are compared for the most interesting cases.


Journal ArticleDOI
TL;DR: In this article, the prediction problem for linear regression models with elliptical errors when the Bayes prior is non-informative is considered, and it is shown that the prediction density under the elliptical error assumption is exactly the same as that obtained with normally distributed errors.

Journal ArticleDOI
TL;DR: It is shown that this strategy for hypothesis change precludes the solution of certain problems of inductive inference by mechanical means—problems which are solvable by mechanical Means when the restriction to this Bayesian strategy is lifted.
Abstract: The price is failure on a class of inductive inference problems that are easily solved, in contrast, by nonBayesian mechanical learners. By “mechanical” is meant “simulable by Turing machine”. One of the central tenets of Bayesianism, which is common to the heterogeneous collection of views which fall under this rubric, is that hypothesis change proceeds via conditionalization on accumulated evidence, the posterior probability of a given hypothesis on the evidence being computed using Bayes's theorem. We show that this strategy for hypothesis change precludes the solution of certain problems of inductive inference by mechanical means—problems which are solvable by mechanical means when the restriction to this Bayesian strategy is lifted. Our discussion proceeds as follows. After some technical preliminaries, the concept of (formal) learner is introduced along with a criterion of inferential success. Next we specify a class of inductive inference problems, and then define the notion of “Bayesian behavior” on those problems. Finally, we exhibit an inductive inference problem from the specified class such that (a) some nonmechanical Bayesian learner solves the problem, (b) some nonBayesian mechanical learner solves the problem, (c) some mechanical learner manifests Bayesian behavior on the problem, but (d) no mechanical Bayesian learner solves the problem. Insofar as possible terminology and notation are drawn from Osherson, Stob, and Weinstein [1986].

Journal ArticleDOI
TL;DR: A Bayesian analysis of this model is presented, with emphasis on the situation where vague prior knowledge is represented by limiting, improper, prior forms; it provides a test for reliability growth, estimates of the number of faults, and an evaluation of current system reliability.
Abstract: A system has an unknown number of faults. Each fault causes a failure of the system, and is then located and removed. The failure times are independent exponential random variables with common mean. A Bayesian analysis of this model is presented, with emphasis on the situation where vague prior knowledge is represented by limiting, improper, prior forms. This provides a test for reliability growth, estimates of the number of faults, an evaluation of current system reliability, and a prediction of the time to full debugging. Three examples are given. Keywords: Bayes factor; Improper prior; Non-homogeneous Poisson process; Reliability growth; Software reliability.

Journal ArticleDOI
TL;DR: Different models that account for conditional dependence of symptoms in Bayes' theorem are applied to data on upper gastrointestinal bleeding.
Abstract: Bayes' theorem and conditional dependence of symptoms: different models applied to data on upper gastrointestinal bleeding.

Journal ArticleDOI
TL;DR: An inspection procedure which is jointly optimal from the queue's operational characteristics and quality-control perspectives is found using a queue-inspection renewal cycle analysis (with overall expected profit per unit time as the optimization objective).
Abstract: This article provides a Bayes approach for the quality control of a production process which is defined by an M/G/1 queue. An inspection procedure which is jointly optimal from the queue's operational characteristics and quality-control perspectives is found using a queue-inspection renewal cycle analysis (with overall expected profit per unit time as the optimization objective). Numerical results are obtained, highlighting the relationships between quality control and (queue-like) production management.

Journal ArticleDOI
TL;DR: The Simplicity Postulate is a condition imposed by Jeffreys [1948] and [1961] on the prior probability distributions over candidate laws, which he interprets as reasonable degrees of belief; this paper examines its status within the Bayesian theory of inductive inference.
Abstract: This paper is about the Bayesian theory of inductive inference, and in particular about the status of a condition, called by Jeffreys the Simplicity Postulate, imposed in Jeffreys [1948] and [1961] on the so-called prior probability distributions. I shall explain what the Simplicity Postulate says presently: first, some background. The context of the discussion will be a set of possible laws h_i, ostensibly governing some given domain of phenomena, and a test designed to discriminate between them. The prior probabilities of the h_i are here simply their pre-test probabilities; the posterior, or post-test, probability distribution is obtained by combining likelihoods with prior probabilities according to Bayes's Theorem: posterior probability ∝ prior probability × likelihood, where the constant of proportionality is the reciprocal of the prior probability of the test outcome e. The likelihood of h_i given e is equal to the probability of e, conditional on h_i, and in those cases where h_i describes a well-defined statistical model which determines a probability distribution over a set of data-points of which e is one, the likelihood of h_i on e is just the probability assigned to e by h_i. The prior, and hence also the posterior, probabilities are understood to be relativised to a stock of well-confirmed background theories about the structure of the test, presumed to be neutral between the h_i. These probabilities are interpreted by Jeffreys as reasonable degrees of belief. In such circumstances it might seem natural to make the prior probabilities of the h_i equal. For reasons which will become apparent shortly, Jeffreys instead stipulates that they should be a decreasing function of the complexity of the h_i, where the complexity of a hypothesis is measured by its number of independent adjustable parameters, i.e., the

Journal ArticleDOI
TL;DR: In this article, a Bayesian approach to modelling outlying observations is presented and examined in relation to members of the exponential family in general and the exponential distribution in particular.
Abstract: A Bayesian approach to modelling outlying observations is presented and examined in relation to members of the exponential family in general and the exponential distribution in particular.

Journal ArticleDOI
TL;DR: By applying maximum likelihood and empirical Bayes estimation techniques to a succession of log-linear models for Poisson data, one can incorporate the actuarial notions of risk classification, model-based smoothing, credibility theory, and experience rating under a unified statistical approach to loss prediction.
Abstract: For predicting accident frequencies, a succession of log-linear models for Poisson data, some of which include nested random effects, is introduced. By applying maximum likelihood and empirical Bayes estimation techniques to these models, one can incorporate the actuarial notions of risk classification, model-based smoothing, credibility theory, and experience rating under a unified statistical approach to loss prediction. The performance of these methods is evaluated by using accident data from California.
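
A hedged Python sketch of the credibility-style shrinkage that such random-effects models deliver, reduced to a one-way gamma-Poisson setting; the moment estimates, class counts and exposures are illustrative assumptions, not the paper's log-linear models or California data.

```python
import numpy as np

claims = np.array([4.0, 0.0, 12.0, 7.0, 2.0])           # accident counts per risk class (assumed)
exposure = np.array([120.0, 35.0, 300.0, 150.0, 60.0])  # policy-years per class (assumed)

raw = claims / exposure
overall = claims.sum() / exposure.sum()
# Between-class variance of the true rates, estimated by moments.
var_between = max(((raw - overall) ** 2).mean() - (overall / exposure).mean(), 1e-12)

# Buhlmann-style credibility: Z = exposure / (exposure + k), where k is the
# ratio of the within-class (Poisson) variance scale to the between-class variance.
k = overall / var_between
z = exposure / (exposure + k)
credibility_rate = z * raw + (1 - z) * overall           # experience-rated estimates
print(np.round(credibility_rate, 4))
```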