
Showing papers on "Bayesian inference published in 1983"


Journal ArticleDOI
TL;DR: It is shown how the successfully used Kalman filter can be easily understood by statisticians if the authors use a Bayesian formulation and some well-known results in multivariate statistics.
Abstract: This is an expository article. Here we show how the successfully used Kalman filter, popular with control engineers and other scientists, can be easily understood by statisticians if we use a Bayesian formulation and some well-known results in multivariate statistics. We also give a simple example illustrating the use of the Kalman filter for quality control work.
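As a rough illustration of the Bayesian reading of the Kalman filter described above, here is a minimal Python sketch of a scalar filter in which the measurement step is just normal-prior/normal-likelihood conditioning. The random-walk state model, noise variances, and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kalman_predict(mean, var, process_var):
    """Time update for a random-walk state model (an illustrative choice)."""
    return mean, var + process_var

def kalman_update(prior_mean, prior_var, obs, obs_var):
    """Bayesian measurement update: a normal prior combined with a normal likelihood."""
    gain = prior_var / (prior_var + obs_var)          # Kalman gain
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Toy quality-control run: track a drifting process level from noisy readings.
rng = np.random.default_rng(0)
mean, var = 0.0, 1.0
for y in rng.normal(loc=1.5, scale=0.5, size=5):
    mean, var = kalman_predict(mean, var, process_var=0.01)
    mean, var = kalman_update(mean, var, obs=y, obs_var=0.25)
    print(f"posterior mean {mean:.3f}, posterior variance {var:.3f}")
```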

595 citations


01 Jan 1983
TL;DR: A particular nondeterministic operator is given, based on statistical mechanics, for updating the truth values of hypotheses, and a learning rule is described which allows a parallel system to converge on a set of weights that optimizes its perceptual inferences.
Abstract: When a vision system creates an interpretation of some input data, it assigns truth values or probabilities to internal hypotheses about the world. We present a non-deterministic method for assigning truth values that avoids many of the problems encountered by existing relaxation methods. Instead of representing probabilities with real numbers, we use a more direct encoding in which the probability associated with a hypothesis is represented by the probability that it is in one of two states, true or false. We give a particular nondeterministic operator, based on statistical mechanics, for updating the truth values of hypotheses. The operator ensures that the probability of discovering a particular combination of hypotheses is a simple function of how good that combination is. We show that there is a simple relationship between this operator and Bayesian inference, and we describe a learning rule which allows a parallel system to converge on a set of weights that optimizes its perceptual inferences.
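A minimal Python sketch of the kind of statistical-mechanics update the abstract describes: each binary hypothesis unit is set to "true" with a logistic probability of its total weighted input (its energy gap). The specific energy function, weights, and temperature are illustrative assumptions, not the paper's exact operator.

```python
import numpy as np

def update_units(states, weights, biases, temperature=1.0, rng=None):
    """One sweep of stochastic updates over binary (true/false) hypothesis units.

    Each unit is set to 1 with probability given by a logistic function of its
    energy gap, so better combinations of hypotheses are visited more often.
    """
    rng = rng or np.random.default_rng()
    for i in rng.permutation(len(states)):
        energy_gap = weights[i] @ states + biases[i]
        p_true = 1.0 / (1.0 + np.exp(-energy_gap / temperature))
        states[i] = 1 if rng.random() < p_true else 0
    return states

# Tiny example: three mutually supporting hypotheses.
rng = np.random.default_rng(1)
W = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([-0.5, -0.5, -0.5])
s = np.zeros(3, dtype=int)
for _ in range(10):
    s = update_units(s, W, b, temperature=0.5, rng=rng)
print("final truth values:", s)
```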

542 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explore the potential of Bayesian inference as a theoretical framework for describing how people evaluate hypotheses and identify a set of logically possible forms of non-Bayesian behavior.
Abstract: Bayesian inference provides a general framework for evaluating hypotheses. It is a normative method in the sense of prescribing how hypotheses should be evaluated. However, it may also be used descriptively by characterizing people's actual hypothesis-evaluation behavior in terms of its consistency with or departures from the model. Such a characterization may facilitate the development of psychological accounts of how that behavior is produced. This article explores the potential of Bayesian inference as a theoretical framework for describing how people evaluate hypotheses. First, it identifies a set of logically possible forms of non-Bayesian behavior. Second, it reviews existing research in a variety of areas to see whether these possibilities are ever realized. The analysis shows that in some situations several apparently distinct phenomena are usefully viewed as special cases of the same kind of behavior, whereas in other situations previous investigations have applied a common label (e.g., confirmation bias) to several distinct phenomena. It also calls into question a number of attributions of judgmental bias, suggesting that in some cases the bias is different from what has previously been claimed, whereas in others there may be no bias at all.

542 citations


Journal ArticleDOI
TL;DR: The present article shows that the previous normative analysis of solutions to problems such as the cab problem was incomplete, and that problems of this type require both a signal detection theory and a judgment theory for their proper Bayesian analysis.
Abstract: Several investigators concluded that humans neglect base rate information when asked to solve Bayesian problems intuitively. This conclusion is based on a comparison between normative (calculated) and subjective (responses by naive judges) solutions to problems such as the cab problem. The present article shows that the previous normative analysis was incomplete. In particular, problems of this type require both a signal detection theory and a judgment theory for their proper Bayesian analysis. In Bayes' theorem, posterior odds equals prior odds times the likelihood ratio. Previous solutions have assumed that the likelihood ratio is independent of the base rate, whereas signal detection theory (backed up by data) implies that this ratio depends on base rate. Before the responses of humans are compared with a normative analysis, it seems desirable to be sure that the normative analysis is accurate.
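To make the odds-form argument concrete, here is a small Python sketch using the standard cab-problem numbers (15% Blue cabs, a witness who is 80% accurate), followed by an equal-variance signal detection variant in which shifting the witness's criterion changes the likelihood ratio itself. The d' value and criteria are illustrative assumptions.

```python
from scipy.stats import norm

def posterior_odds(prior_odds, hit_rate, false_alarm_rate):
    """Bayes' theorem in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * (hit_rate / false_alarm_rate)

# Standard cab-problem numbers: 15% of cabs are Blue, witness is 80% accurate.
prior_odds_blue = 0.15 / 0.85
odds = posterior_odds(prior_odds_blue, hit_rate=0.80, false_alarm_rate=0.20)
print("P(Blue | witness says Blue) =", odds / (1 + odds))   # about 0.41

def sdt_rates(d_prime, criterion):
    """Hit and false-alarm rates for an equal-variance signal detection model."""
    hit = norm.sf(criterion - d_prime)   # P(say Blue | cab is Blue)
    fa = norm.sf(criterion)              # P(say Blue | cab is Green)
    return hit, fa

# If the witness shifts the criterion when Blue cabs are rare, the likelihood
# ratio itself depends on the base rate rather than being a fixed 0.80/0.20.
for c in (0.0, 0.5, 1.0):
    hit, fa = sdt_rates(d_prime=1.7, criterion=c)
    print(f"criterion {c}: likelihood ratio = {hit / fa:.2f}")
```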

266 citations


Journal ArticleDOI
TL;DR: Birnbaum and Mellers as mentioned in this paper showed that neither Bayes' theorem nor a subjective Bayesian model that allows for "conservatism" due to misperception or response bias could account for the data.
Abstract: Michael H. Birnbaum (University of Illinois at Urbana-Champaign) and Barbara A. Mellers (University of California, Berkeley). Subjects made judgments of the probability of an event given base-rate information and the opinion of a source. Base rate and the source's hit and false-alarm rates were manipulated in a within-subjects design. Hit rate and false-alarm rate were manipulated to produce sources of varied expertise and bias. The base rate, the source's opinion, and the source's expertise and bias all had large systematic effects. Although there was no evidence of a "base-rate fallacy," neither Bayes' theorem nor a subjective Bayesian model that allows for "conservatism" due to misperception or response bias could account for the data. Responses were consistent with a scale-adjustment averaging model developed by Birnbaum & Stegner (1979). In this model, the source's report corresponds to a scale value that is adjusted according to the source's bias. This adjusted value is weighted as a function of the source's expertise and averaged with the subjective value of the base rate. These results are consistent with a coherent body of experiments in which the same model could account for a variety of tasks involving the combination of information from different sources. The question, How should humans revise their beliefs? has been studied by philosophers and mathematicians, and the question, How do humans form opinions and revise them? has been investigated by psychologists. Early research that compared the two questions concluded that Bayes' theorem was a useful starting point for the description of human inference but that humans are "conservative," or revise their probability judgments in a manner less extreme than implied by Bayes' theorem (Edwards, 1968; Peterson & Beach, 1967; Slovic & Lichtenstein, 1971). Edwards (1968) discussed three interpretations of conservatism: misperception, misaggregation, and response bias. Misperception includes the possibility that objective probabilities are transformed to subjective probabilities by a psychophysical function. Misaggregation refers to use of a non-Bayesian rule to combine evidence. Response bias refers to nonlinearity in the judgment function relating judged probabilities to subjective likelihoods. Early experimental work attempted to separate
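A minimal Python sketch of a scale-adjustment averaging judgment of the general form the abstract describes: the source's report is adjusted for bias and then averaged with the base-rate value, weighted by expertise. The particular parameterization and numbers are assumptions for illustration, not Birnbaum & Stegner's exact equations.

```python
def scale_adjustment_average(base_rate_value, base_rate_weight,
                             source_value, source_bias, source_expertise):
    """Illustrative scale-adjustment averaging judgment (assumed parameterization).

    The source's report is first adjusted for its bias, then averaged with the
    subjective value of the base rate, with weight growing with expertise.
    """
    adjusted = source_value - source_bias
    weight = source_expertise
    total = base_rate_weight + weight
    return (base_rate_weight * base_rate_value + weight * adjusted) / total

# Example: a moderately expert, slightly optimistic source reporting 0.8
# against a subjective base-rate value of 0.3.
print(scale_adjustment_average(0.3, 1.0, 0.8, source_bias=0.1, source_expertise=2.0))
```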

244 citations


Journal ArticleDOI
TL;DR: Bayesian techniques for samples from classical, generalized and multivariate Pareto distributions are described and emphasis is placed on choosing proper prior distributions that do not lead to anomalous posterior densities.

153 citations


Journal ArticleDOI
J. R. Quinlan
TL;DR: A new system called INFERNO is introduced, which is probabilistic but makes no assumptions whatsoever about the joint probability distributions of pieces of knowledge, so the correctness of inferences can be guaranteed.
Abstract: Expert systems commonly employ some means of drawing inferences from domain and problem knowledge, where both the knowledge and its implications are less than certain. Methods used include subjective Bayesian reasoning, measures of belief and disbelief, and the Dempster-Shafer theory of evidence. Analysis of systems based on these methods reveals important deficiencies in areas such as the reliability of deductions and the ability to detect inconsistencies in the knowledge from which deductions were made. A new system called INFERNO addresses some of these points. Its approach is probabilistic but makes no assumptions whatsoever about the joint probability distributions of pieces of knowledge, so the correctness of inferences can be guaranteed. INFERNO informs the user of inconsistencies that may be present in the information presented to it, and can make suggestions about changing the information to make it consistent. An example from a Bayesian system is reworked, and the conclusions reached by that system and INFERNO are compared.
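INFERNO's own propagation rules are not reproduced in the abstract; the Python sketch below only illustrates the general idea of inference without joint-distribution assumptions, using distribution-free (Frechet) bounds on conjunctions and disjunctions of uncertain propositions. The function names and intervals are illustrative.

```python
def and_bounds(a, b):
    """Distribution-free bounds on P(A and B) from bounds on P(A) and P(B).

    a and b are (lower, upper) probability intervals; no assumption is made
    about the joint distribution, so the result is guaranteed to contain the
    true probability whatever the dependence between A and B.
    """
    lo = max(0.0, a[0] + b[0] - 1.0)
    hi = min(a[1], b[1])
    return lo, hi

def or_bounds(a, b):
    """Distribution-free bounds on P(A or B)."""
    lo = max(a[0], b[0])
    hi = min(1.0, a[1] + b[1])
    return lo, hi

print(and_bounds((0.7, 0.9), (0.6, 0.8)))   # (0.3, 0.8)
print(or_bounds((0.7, 0.9), (0.6, 0.8)))    # (0.7, 1.0)
```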

152 citations


Posted Content
TL;DR: This paper is an introduction to the analysis of games with incomplete information, using a Bayesian model, and the concept of virtual utility is developed as a tool for characterizing efficient incentive-compatible coordination mechanisms.
Abstract: This paper is an introduction to the analysis of games with incomplete information, using a Bayesian model. The logical foundations of the Bayesian model are discussed. To describe rational behavior of players in a Bayesian game, two basic solution concepts are presented: Bayesian equilibrium, for games in which the players cannot communicate; and Bayesian incentive-compatibility, for games in which the players can communicate. The concept of virtual utility is developed as a tool for characterizing efficient incentive-compatible coordination mechanisms.

137 citations


Journal Article
TL;DR: The probability P is a real-valued function defined by axioms due to Kolmogorov; using the fact that A ∩ B and B ∩ A are the same, one obtains Bayes' theorem.
Abstract: An abstract definition of probability can be given by considering a set S, called the sample space, and possible subsets A, B, ..., the interpretation of which is left open. The probability P is a real-valued function defined by the following axioms due to Kolmogorov [9]: (i) P(A) ≥ 0 for every subset A of S; (ii) P(S) = 1; (iii) P(A ∪ B) = P(A) + P(B) for disjoint subsets A and B. The conditional probability of A given B is defined as P(A|B) = P(A ∩ B)/P(B). From this definition and using the fact that A ∩ B and B ∩ A are the same, one obtains Bayes' theorem, P(A|B) = P(B|A)P(A)/P(B). From the three axioms of probability and the definition of conditional probability, one obtains the law of total probability, P(B) = Σi P(B|Ai)P(Ai), where the Ai form a partition of S.

121 citations


Book ChapterDOI
TL;DR: In this chapter, the authors discuss numerical methods for evaluating key characteristics of posterior and predictive density functions; analytical methods nevertheless remain indispensable for evaluating these densities, either fully or conditionally on a few parameters amenable to numerical treatment.
Abstract: This chapter discusses Bayesian inference and identification. A Bayesian analysis of the simultaneous equation model (SEM) proceeds along the same lines as any other Bayesian analysis. Thus, if the analyst has chosen to work in a given parameter space, a prior density on that space is defined and Bayes' theorem is applied to revise this prior density in the light of available data. The resulting posterior density is then used to solve problems of decision and inference. Predictive densities for future observations can also be derived. The chapter discusses numerical methods for evaluating key characteristics of posterior and predictive density functions. For models with many parameters, such as most simultaneous equation models, analytical methods remain indispensable for evaluating these densities, either fully, or conditionally on a few parameters amenable to numerical treatment, or approximately to construct importance functions for Monte Carlo integration. The classes of prior densities permitting analytical evaluation of the posterior density are limited. In most Bayesian analyses they comprise essentially the so-called noninformative and natural-conjugate families.
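A minimal Python sketch of the Monte Carlo integration with an importance function mentioned above: a posterior moment is approximated by weighting draws from a proposal density by the ratio of the unnormalised posterior to that proposal. The normal target and Student-t proposal are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def importance_posterior_mean(log_post, theta, log_q):
    """Approximate a posterior mean by importance sampling.

    theta holds draws from an importance function with log-density log_q; each
    draw is weighted by the ratio of the unnormalised posterior density to the
    proposal density, and the weights are then normalised.
    """
    log_w = log_post(theta) - log_q
    w = np.exp(log_w - log_w.max())   # subtract the max for numerical stability
    w /= w.sum()
    return float(np.sum(w * theta))

# Illustrative target: an unnormalised N(2, 1) posterior; Student-t proposal.
rng = np.random.default_rng(0)
draws = 3.0 * rng.standard_t(df=5, size=20_000)      # samples from a scaled t(5)
log_q = stats.t.logpdf(draws, df=5, scale=3.0)        # proposal log-density
mean = importance_posterior_mean(lambda t: -0.5 * (t - 2.0) ** 2, draws, log_q)
print(mean)   # close to 2
```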

120 citations


Journal ArticleDOI
TL;DR: In this paper, a new integral identity is adapted from Carlson to represent the moments of quadratic forms under multivariate normal and, more generally, elliptically contoured distributions, which permits the computation of such moments by simple quadrature.
Abstract: This article reviews and interprets recent mathematics of special functions, with emphasis on integral representations of multiple hypergeometric functions. B.C. Carlson's centrally important parameterized functions R and ℛ, initially defined as Dirichlet averages, are expressed as probability-generating functions of mixed multinomial distributions. Various nested families generalizing the Dirichlet distributions are developed for Bayesian inference in multinomial sampling and contingency tables. In the case of many-way tables, this motivates a new generalization of the function ℛ. These distributions are also useful for the modeling of populations of personal probabilities evolving under the process of inference from statistical data. A remarkable new integral identity is adapted from Carlson to represent the moments of quadratic forms under multivariate normal and, more generally, elliptically contoured distributions. This permits the computation of such moments by simple quadrature.

Journal ArticleDOI
TL;DR: Bayesian methods have been applied to many problems, such as real estate tax assessment, economic forecasting, and monetary reform as mentioned in this paper, as well as the development of Bayesian computer programs.
Abstract: It is an honour to present this paper at St John's College, Cambridge, Sir Harold Jeffreys' college. As you all probably know, Sir Harold has made outstanding, pioneering contributions to the development of Bayesian statistical methodology and applications of it to many problems. In appreciation of his great work, our NBER-NSF Seminar on Bayesian Inference has recently published a book (Zellner, 1980a) honouring him. Jeffreys (1967) set a fine example for us by emphasizing both theory and applications in his work. It is this theme, the interaction between theory and application in Bayesian econometrics, that I shall emphasize in what follows. The rapid growth of Bayesian econometrics from its practically non-existent state in the early 1960s to the present (Zellner, 1981) has involved work on Bayesian inference and decision techniques, applications of them to econometric problems, and the development of Bayesian computer programs. Selected applications include Geisel (1970, 1975), who used Bayesian prediction and odds ratios to compare the relative performance of simple Keynesian and Quantity of Money Theory macroeconomic models. Peck (1974) utilized Bayesian estimation techniques in an analysis of investment behaviour of firms in the electric utility industry. Varian (1975) developed and applied Bayesian methods for real estate tax assessment problems. Flood and Garber (1980a, b) applied Bayesian methods in a study of monetary reforms using data from the German and several other hyperinflations. Evans (1978) employed posterior odds ratios in a study to determine which of three alternative models best explains the German hyperinflation data. Cooley and LeRoy (1981), Shiller (1973), Zellner and Geisel (1970), and Zellner and Williams (1973) employed a Bayesian approach in studies of time series models for US money demand, investment and personal consumption data. Production function models have been analysed from the Bayesian point of view by Sankar (1969), Rossi (1980) and Zellner and Richard (1973). Tsurumi (1976) and Tsurumi and Tsurumi (1981) used Bayesian techniques to analyse structural change problems. Reynolds (1980) developed and applied Bayesian estimation and testing procedures in an analysis of survey data relating to health status, income and other variables. Litterman (1980) has formulated a Bayesian vector autoregressive model that he employed (and is employing) to generate forecasts of major US macroeconomic variables that compare very

Proceedings Article
08 Aug 1983
TL;DR: This paper presents a review of different approximate reasoning techniques which have been proposed for dealing with uncertain or imprecise knowledge, especially in expert systems based on production rule methodology.
Abstract: This paper presents a review of different approximate reasoning techniques which have been proposed for dealing with uncertain or imprecise knowledge, especially in expert systems based on production rule methodology. Theoretical approaches such as Bayesian inference, Shafer's belief theory or Zadeh's possibility theory, as well as more empirical proposals such as the ones used in MYCIN or in PROSPECTOR, are considered. The presentation is focused on two basic inference schemes: deductive inference and the combination of several uncertain or imprecise pieces of evidence relating to the same matter. Several kinds of uncertainty are taken into account in the models described in the paper: different degrees of certainty or of truth may be associated with the observed or produced facts or with the "if ..., then ..." rules; moreover, the statements of facts or of rules may be imprecise or fuzzy, and the values of the degrees of certainty which are used may be only approximately known. An extensive bibliography, referred to in the text, is appended.



Journal ArticleDOI
TL;DR: In this paper, the expected value of a random vector with respect to a set-valued probability measure is defined, and a strong law of large numbers is derived in this setting.
Abstract: In this paper we define the expected value of a random vector with respect to a set-valued probability measure. The concepts of independent and identically distributed random vectors are appropriately defined, and a strong law of large numbers is derived in this setting. Finally, an example of a set-valued probability useful in Bayesian inference is provided.



Journal ArticleDOI
TL;DR: Bayesian procedures for inferences about the odds ratio and the proportion ratio p1/p2 are given in this paper, but their procedures are not suitable for the analysis of large numbers of samples.
Abstract: Bayesian procedures for inferences about the odds ratio and the proportion ratio p1/p2 are given. The CDF and moments for each of these quantities are calculated using a prior distribution on (p1, p2) that is the product of two independent beta densities. The Bayesian credible intervals obtained using these procedures appear to be somewhat shorter than the intervals obtained using ancillary statistics.
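The paper derives the CDF and moments of these ratios analytically; the Python sketch below only checks the setup by simulation, drawing p1 and p2 from the independent beta posteriors implied by independent beta priors and taking quantiles of p1/p2 as an equal-tailed credible interval. Counts, prior parameters, and names are illustrative.

```python
import numpy as np

def ratio_credible_interval(x1, n1, x2, n2, a=1.0, b=1.0, level=0.95,
                            draws=100_000, rng=None):
    """Equal-tailed credible interval for the proportion ratio p1/p2 by simulation.

    Independent Beta(a, b) priors on p1 and p2 give independent beta posteriors;
    we sample both and take quantiles of the ratio.
    """
    rng = rng or np.random.default_rng()
    p1 = rng.beta(a + x1, b + n1 - x1, size=draws)
    p2 = rng.beta(a + x2, b + n2 - x2, size=draws)
    ratio = p1 / p2
    lo, hi = np.quantile(ratio, [(1 - level) / 2, 1 - (1 - level) / 2])
    return lo, hi

# 12 successes out of 40 versus 20 out of 50, with uniform priors.
print(ratio_credible_interval(12, 40, 20, 50, rng=np.random.default_rng(0)))
```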

01 Sep 1983
TL;DR: In this article, a general Bayes rule for the problem of selecting all populations which are close to a control or standard is derived and the rate of convergence of the empirical Bayes risk to the minimum risk is investigated.
Abstract: This paper deals with the problem of selecting all populations which are close to a control or standard. A general Bayes rule for the above problem is derived. Empirical Bayes rules are derived when the populations are assumed to be uniformly distributed. Under some conditions on the marginal and prior distributions, the rate of convergence of the empirical Bayes risk to the minimum Bayes risk is investigated.


Journal ArticleDOI
TL;DR: In this paper, a multivariate linear model with missing observations in a nested pattern is discussed, where the predictive density of the missing observations is taken into account in determining the posterior distribution of B and its mean and variance matrix.
Abstract: We discuss the case of the multivariate linear model Y = XB + E with Y an (n × p) matrix, and so on, when there are missing observations in the Y matrix in a so-called nested pattern. We propose an analysis that arises by incorporating the predictive density of the missing observations in determining the posterior distribution of B, and its mean and variance matrix. This involves us with matric-T variables. The resulting analysis is illustrated with some Canadian economic data.

14 Sep 1983
TL;DR: The implementation of confidence computing methods based on Dempster-Shafer theory, a generalization of Bayesian inference, is described; the procedures are implemented in the rule-based system ROSIE and apply to all valid mass assignments.
Abstract: This report describes the implementation of confidence computing methods based on Dempster-Shafer theory. The theory is applicable to tactical decision problems that can be formulated in terms of sets of mutually exclusive and exhaustive propositions. Dempster's combining procedure, a generalization of Bayesian inference, is used to combine probability mass assignments supplied by independent bodies of evidence. The computing procedures are implemented in the rule-based system ROSIE, and apply to all valid mass assignments. An ordering strategy is used to combine various kinds of assignments by using different procedures that exploit the special features of each. Applications to platform typing and contact association are demonstrated.
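The report's ROSIE implementation is not shown here; the Python sketch below gives only the underlying arithmetic of Dempster's combining procedure for two basic probability assignments over a frame of mutually exclusive and exhaustive propositions. The example masses and platform-type frame are illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    Masses are dicts mapping frozensets of propositions to mass; mass falling on
    empty (conflicting) intersections is discarded and the rest is renormalised.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two bodies of evidence about a contact's platform type.
frame = frozenset({"sub", "surface", "air"})
m1 = {frozenset({"sub"}): 0.6, frame: 0.4}
m2 = {frozenset({"sub", "surface"}): 0.7, frame: 0.3}
print(dempster_combine(m1, m2))
```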


Journal ArticleDOI
TL;DR: The procedure is shown to have some justification on philosophical grounds, and practical justification is given in that finding the set of feasible estimates Θ̂ is computationally feasible in particular cases; the cases investigated here include median, minimum mean square error (MMSE), and maximum a posteriori probability (MAP) estimation.
Abstract: Statistical inference procedures are considered when less complete prior information is available than is usually assumed. For the purposes of this paper, the prior information is taken to be the specification of a set of probability measures 𝒫. With any one prior probability measure the corresponding Bayes' estimate may be found; the recommended inference procedure when a whole set of prior probabilities 𝒫 is available is to find the whole set of estimates corresponding to 𝒫, called the set of feasible estimates Θ̂. The procedure is shown to have some justification on philosophical grounds. Practical justification is also given in that finding Θ̂ is computationally feasible in particular cases; the cases investigated here include median, minimum mean square error (MMSE), and maximum a posteriori probability (MAP) estimation.
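A minimal Python sketch of the feasible-set idea under stated assumptions: for a binomial observation and a set of Beta(a, b) priors standing in for 𝒫, each prior yields an MMSE (posterior-mean) estimate, and the collection of these estimates plays the role of the set of feasible estimates Θ̂. The prior family and data are illustrative.

```python
import numpy as np

def feasible_mmse_estimates(successes, trials, prior_grid):
    """Range spanned by the MMSE (posterior-mean) estimates of a binomial
    parameter under a set of Beta(a, b) priors."""
    estimates = [(a + successes) / (a + b + trials) for a, b in prior_grid]
    return min(estimates), max(estimates)

# A set of priors: all Beta(a, b) with a and b between 0.5 and 5.
grid = [(a, b) for a in np.linspace(0.5, 5, 10) for b in np.linspace(0.5, 5, 10)]
print(feasible_mmse_estimates(successes=7, trials=10, prior_grid=grid))
```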

Journal ArticleDOI
TL;DR: This article examines the use of regression analysis for allocating indirect costs and develops alternative allocation approaches and measures of precision that use linear prediction theory and Bayesian inference; the proposed methods are illustrated with a university indirect cost study.
Abstract: This article examines the use of regression analysis for allocating indirect costs. When multiple regression is used to estimate the weights of several allocation factors, conventional standard errors and correlation coefficients can be misleading with respect to the statistical precision of the cost allocations. This article develops alternative allocation approaches and measures of precision that use linear prediction theory and Bayesian inference. The proposed methods are illustrated using a university indirect cost study.



Journal ArticleDOI
TL;DR: Statistics is an indispensable tool in clinical research, and disagreements over the use of various approaches are reflected in the letters-to-the-editor section of this issue (1, 2).
Abstract: Excerpt: Statistics is an indispensable tool in clinical research. Disagreements over the use of various approaches such as those reflected in the letters-to-the-editor section of this issue (1,2) s...