
Showing papers on "Bayesian inference published in 1988"


Book
01 Jan 1988
TL;DR: This book is a general statistics text covering probability, random variables, joint distributions, expected values, limit theorems, survey sampling, estimation of parameters and fitting of probability distributions, hypothesis testing and goodness of fit, analysis of variance, analysis of categorical data, linear least squares, and decision theory and Bayesian inference.
Abstract: 1. Probability. 2. Random Variables. 3. Joint Distributions. 4. Expected Values. 5. Limit Theorems. 6. Distributions Derived from the Normal Distribution. 7. Survey Sampling. 8. Estimation of Parameters and Fitting of Probability Distributions. 9. Testing Hypotheses and Assessing Goodness of Fit. 10. Summarizing Data. 11. Comparing Two Samples. 12. The Analysis of Variance. 13. The Analysis of Categorical Data. 14. Linear Least Squares. 15. Decision Theory and Bayesian Inference.

3,521 citations


Book ChapterDOI
01 Jan 1988
TL;DR: Bayesian methods provide a formalism for reasoning about partial beliefs under conditions of uncertainty, where propositions are given numerical parameters signifying the degree of belief accorded them under some body of knowledge, and the parameters are combined and manipulated according to the rules of probability theory.
Abstract: Publisher Summary Bayesian methods provide a formalism for reasoning about partial beliefs under conditions of uncertainty. In this formalism, propositions are given numerical parameters signifying the degree of belief accorded them under some body of knowledge, and the parameters are combined and manipulated according to the rules of probability theory. Bayesian philosophers see the conditional relationship as more basic than that of joint events, i.e., more compatible with the organization of human knowledge. Any joint probability function represents a complete probabilistic model. Joint distribution functions are mathematical constructs of primarily theoretical use. The prevailing convention in the Bayesian formalism is to assume that probabilistic summaries of virtual evidence are produced independently of previous information; they are interpreted as local binary relations between the evidence and the hypothesis upon which it bears, independent of other information in the system.

747 citations
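The belief-updating calculus this chapter describes reduces, in the simplest case, to Bayes' rule applied to a single hypothesis and a single piece of evidence. A minimal sketch follows; the hypothesis, prior and likelihoods are illustrative numbers, not taken from the chapter.

```python
# Minimal illustration of Bayesian belief updating: a prior degree of belief in a
# hypothesis H is revised by evidence e via Bayes' rule. Numbers are illustrative.
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Return P(H | e) from P(H), P(e | H) and P(e | not H)."""
    joint_h = prior_h * p_e_given_h
    joint_not_h = (1.0 - prior_h) * p_e_given_not_h
    return joint_h / (joint_h + joint_not_h)

# Belief in H rises from 0.30 to about 0.77 once e is observed.
print(posterior(prior_h=0.30, p_e_given_h=0.80, p_e_given_not_h=0.10))
```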


Journal ArticleDOI
TL;DR: This paper examines several grounds for doubting the value of much of the special attention recently devoted to unit root econometrics and shows that unit root hypotheses are less well connected to economic theory than is often suggested or assumed.

545 citations


Book ChapterDOI
01 Jan 1988
TL;DR: In this paper, the authors apply the principles of Bayesian reasoning to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions, and propose a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.
Abstract: The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and likelihoods) are assigned in appropriate hypothesis spaces using the Maximum Entropy Principle, and then manipulated via Bayes’ Theorem. Bayesian hypothesis testing requires careful consideration of the prior ranges of any parameters involved, and this leads to a quantitative statement of Occam’s Razor. As an example of this general principle we offer a solution to an important problem in regression analysis: determining the optimal number of parameters to use when fitting graphical data with a set of basis functions.

274 citations
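The quantitative Occam's Razor described above can be conveyed by comparing models of different sizes through an (approximate) marginal likelihood, which automatically penalizes extra parameters. The sketch below uses a polynomial basis, synthetic data, and a BIC-style approximation to the log evidence; these are illustrative assumptions, not the paper's exact construction.

```python
# Hedged sketch: pick the number of basis functions by an evidence-style criterion
# (BIC approximation to the log marginal likelihood). Data, noise level and the
# polynomial basis are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)   # synthetic data

def approx_log_evidence(degree):
    """BIC surrogate for the log marginal likelihood of a polynomial fit."""
    design = np.vander(x, degree + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ coef
    n, k = x.size, degree + 1
    sigma2 = resid @ resid / n
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return log_lik - 0.5 * k * np.log(n)    # complexity penalty: the Occam factor analogue

best_degree = max(range(1, 10), key=approx_log_evidence)
print("preferred number of basis functions:", best_degree + 1)
```

Models with too few basis functions fit the data poorly, while models with too many are penalized by the complexity term, so the criterion settles on an intermediate size.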


Journal ArticleDOI
TL;DR: In this article, the authors investigated the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem in empirical geomagnetic modeling, with critical examination of recently published studies.
Abstract: The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the Earth's core-mantle boundary (CMB) from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.

161 citations
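The role of the damping parameter lambda in such studies can be illustrated with generic damped (Tikhonov-style) least squares, the usual form of the uniqueness portion of a linear inverse problem; the forward matrix and data below are synthetic stand-ins, not geomagnetic quantities.

```python
# Generic damped least-squares sketch: lambda trades data misfit against model
# norm. G and d are synthetic stand-ins, not geomagnetic observations.
import numpy as np

rng = np.random.default_rng(5)
G = rng.normal(size=(50, 20))                      # forward operator (synthetic)
m_true = rng.normal(size=20)
d = G @ m_true + rng.normal(scale=0.5, size=50)    # noisy observations

def damped_solution(lam):
    """Solve min ||G m - d||^2 + lam * ||m||^2 in closed form."""
    A = G.T @ G + lam * np.eye(G.shape[1])
    return np.linalg.solve(A, G.T @ d)

for lam in (0.01, 1.0, 100.0):
    m = damped_solution(lam)
    print(f"lambda={lam:7.2f}  misfit={np.linalg.norm(G @ m - d):6.2f}  "
          f"model norm={np.linalg.norm(m):5.2f}")
# Larger lambda shrinks the model norm at the cost of a larger data misfit, which
# is why the criterion used to choose lambda matters.
```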


Journal ArticleDOI
01 May 1988
TL;DR: This paper shows that the difficulties McDermott described result from insisting on logic as the language of commonsense reasoning, and that if (Bayesian) probability is used instead, none of the technical difficulties found in using logic arise.
Abstract: The paper examines issues connected with the choice of the best method for representing and reasoning about common sense. McDermott (1978) has shown that a direct translation of common sense reasoning into logical form leads to insurmountable difficulties. It is shown, in the present work, that if Bayesian probability is used instead of logic as the language of such reasoning, none of the technical difficulties found in using logic arise. Bayesian inference is applied to a simple example of linguistic information to illustrate the potential of this type of inference for artificial intelligence.

161 citations


Journal ArticleDOI
TL;DR: In this paper, Bricogne et al. used the saddlepoint method to construct a joint probability distribution of an arbitrary collection of structure factors from one or several crystal forms of an unknown molecule, each comprising one or many isomorphous structures related by substitution operations, possibly containing solvent regions and known fragments, obeying a set of non-crystallographic symmetries.
Abstract: In this first of three papers on a full Bayesian theory of crystal structure determination, it is shown that all currently used sources of phase information can be represented and combined through a universal expression for the joint probability distribution of structure factors. Particular attention is given to situations arising in macromolecular crystallography, where the proper treatment of non-uniform distributions of atoms is absolutely essential. A procedure is presented, in stages of gradually increasing complexity, for constructing the joint probability distribution of an arbitrary collection of structure factors. These structure factors may be gathered from one or several crystal forms of an unknown molecule, each comprising one or several isomorphous structures related by substitution operations, possibly containing solvent regions and known fragments, and/or obeying a set of non-crystallographic symmetries. This universal joint probability distribution can be effectively approximated by the saddlepoint method, using maximum-entropy distributions of atoms [Bricogne (1984) Acta Cryst. A40, 410-445] and a generalization of structure-factor algebra. Atomic scattering factors may assume arbitrary complex values, so that this formalism applies to neutron as well as to X-ray diffraction methods. This unified procedure will later be extended by the construction of conditional distributions allowing phase extension, and of likelihood functions capable of detecting and characterizing all potential sources of phase information considered so far, thus completing the formulation of a full Bayesian inference scheme for crystal structure determination.

137 citations


Book ChapterDOI
TL;DR: The chapter discusses nonparametric inference about stress–strength reliability, the normal and the Weibull stress–strength models, extensions of the basic stress–strength model, and Bayesian inference procedures.
Abstract: Publisher Summary When ascertaining the reliability of equipment or the viability of a material, it is also necessary to take into account the stress conditions of the operating environment. That is, uncertainty about the actual environmental stress to be encountered should be modeled as random. The terminology stress–strength model makes explicit that both stress and strength are treated as random variables. In the simplest stress–strength model, X is the stress placed on the unit by the operating environment and Y is the strength of the unit. A unit is able to perform its intended function if its strength is greater than the stress imposed upon it. Reliability is defined as the probability that the unit is strong enough to overcome the stress. This model has found an increasing number of applications in civil, mechanical, and aerospace engineering. The chapter discusses nonparametric inference about stress–strength reliability, the normal and the Weibull stress–strength models, extensions of the basic stress–strength model, and Bayesian inference procedures.

128 citations
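For the normal stress–strength model mentioned in the chapter, reliability R = P(Y > X) has a closed form when stress X and strength Y are independent normals. The parameter values below are illustrative assumptions.

```python
# Reliability R = P(Y > X) for independent normal stress X and strength Y:
# R = Phi((mu_Y - mu_X) / sqrt(sigma_X^2 + sigma_Y^2)). Values are illustrative.
from math import erf, sqrt

def normal_stress_strength_reliability(mu_x, sigma_x, mu_y, sigma_y):
    z = (mu_y - mu_x) / sqrt(sigma_x**2 + sigma_y**2)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))    # standard normal CDF at z

# About 0.94: the unit's strength exceeds the imposed stress ~94% of the time.
print(normal_stress_strength_reliability(mu_x=50.0, sigma_x=5.0, mu_y=65.0, sigma_y=8.0))
```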


Journal Article
TL;DR: In this paper, the transferable belief model (TBM), often understood as a generalization of the Bayesian model or of the upper and lower probabilities model, is given an interpretation that dissociates it completely from any model based on probability functions.
Abstract: Two models are proposed to quantify someone's degree of belief, based respectively on probability functions, the Bayesian model, and on belief functions, the transferable belief model (Shafer 1976). The first, and by far the oldest, is well established and supported by excellent axiomatic and behavioural arguments. The model based on belief functions is often understood as some kind of generalization either of the Bayesian model or of the upper and lower probabilities model. Therefore we present our interpretation of the model developed initially by Shafer in his book (1977) and called here the transferable belief model. The major point of our interpretation is that we try to dissociate completely the transferable belief model from any model based on probability functions.

91 citations


Journal ArticleDOI
TL;DR: The use of statistics and probabilities as legal evidence has recently come under increased scrutiny as discussed by the authors, and individuals' ability to use statistical information as well as their ability to understand and use an expert's Bayesian explanation of that evidence has been of special concern.
Abstract: The use of statistics and probabilities as legal evidence has recently come under increased scrutiny. Judges' and jurors' ability to understand and use this type of evidence has been of special concern. Finkelstein and Fairley (1970) proposed introducing Bayes' theorem into the courtroom to aid the fact-finder in evaluating this type of evidence. The present study addressed individuals' ability to use statistical information as well as their ability to understand and use an expert's Bayesian explanation of that evidence. One hundred and eighty continuing education students were presented with a transcript purportedly taken from an actual trial and were asked to make several subjective probability judgments regarding blood-grouping evidence. The results extend to the trial process previous psychological research suggesting that individuals generally underutilize statistical information, as compared to a Bayesian model. In addition, subjects in this study generally ignored the expert's Bayesian explanation of the statistical evidence.

86 citations
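The Bayesian explanation at issue in such studies amounts to posterior odds = prior odds × likelihood ratio, in the style proposed by Finkelstein and Fairley. The blood-group frequency and the prior below are illustrative assumptions, not figures from the study.

```python
# Posterior odds = prior odds * likelihood ratio. The 40% population frequency of
# the matching blood group and the 0.25 prior are illustrative, not from the study.
def update_with_evidence(prior_prob, p_evidence_if_source, p_evidence_if_not_source):
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * (p_evidence_if_source / p_evidence_if_not_source)
    return posterior_odds / (1.0 + posterior_odds)

# A match on a blood group shared by 40% of the population raises the probability
# from 0.25 to about 0.45; it is probative but far from conclusive on its own.
print(update_with_evidence(prior_prob=0.25, p_evidence_if_source=1.0, p_evidence_if_not_source=0.40))
```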


Journal ArticleDOI
TL;DR: A new approach to evaluation function learning using classical pattern classification methods based on Bayesian learning is presented; it resulted in dramatic improvements over a linear evaluation function that had performed at world championship level.

Journal ArticleDOI
TL;DR: In this article, some relevant theory is reviewed, new criteria for identifying suitable quasirandom sequences are defined, and some extensions to the basic integration rules are suggested; various quasirandom methods are then compared on the sort of integrals that arise in Bayesian inference and are shown to be much more efficient than Monte Carlo methods.
Abstract: Practical Bayesian statistics with realistic models usually gives posterior distributions that are analytically intractable, and inferences must be made via numerical integration. In many cases, the integrands can be transformed into periodic functions on the unit $d$-dimensional cube, for which quasirandom sequences are known to give efficient numerical integration rules. This paper reviews some relevant theory, defines new criteria for identifying suitable quasirandom sequences and suggests some extensions to the basic integration rules. Various quasirandom methods are then compared on the sort of integrals that arise in Bayesian inference and are shown to be much more efficient than Monte Carlo methods.
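A minimal sketch of the comparison described: integrating a smooth periodic function on the unit square with a Halton-type quasirandom sequence versus plain Monte Carlo. The integrand and the particular sequence are illustrative; the paper studies more general sequences, criteria and rule extensions.

```python
# Hedged sketch: quasirandom (Halton-type) vs. plain Monte Carlo integration of a
# periodic integrand on the unit square whose exact integral is 1.
import numpy as np

def van_der_corput(n, base):
    """First n points of the 1-D van der Corput sequence in the given base."""
    seq = np.zeros(n)
    for i in range(n):
        f, x, k = 1.0, 0.0, i + 1
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        seq[i] = x
    return seq

def integrand(u, v):
    return 1.0 + np.sin(2 * np.pi * u) * np.sin(2 * np.pi * v)

n = 4096
qmc_estimate = integrand(van_der_corput(n, 2), van_der_corput(n, 3)).mean()
rng = np.random.default_rng(1)
mc_estimate = integrand(rng.random(n), rng.random(n)).mean()
print("quasirandom error:", abs(qmc_estimate - 1.0))
print("Monte Carlo error:", abs(mc_estimate - 1.0))
```

The quasirandom estimate typically comes much closer to the true value for the same number of integrand evaluations, which is the efficiency gain the paper quantifies.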

Journal ArticleDOI
TL;DR: The authors present a model for the behavior of software failures that fits into the general framework of empirical Bayes problems; however, they take a proper Bayes approach for inference by viewing the situation as a Bayes empirical-Bayes problem.
Abstract: The authors present a model for the behavior of software failures. Their model fits into the general framework of empirical Bayes problems; however, they take a proper Bayes approach for inference by viewing the situation as a Bayes empirical-Bayes problem. An approximation due to D.V. Lindley (1980) plays a central role in the analysis. They show that the Littlewood-Verrall model (1973) is an empirical Bayes model and discuss a fully Bayes analysis of it using the Bayes empirical-Bayes setup. Finally, they apply both models to some actual software failure data and compare their predictive performance.

Book ChapterDOI
04 Jul 1988
TL;DR: The interpretation of the model developed initially by Shafer in his book (1977) is presented and called here the transferable belief model; the major point of the interpretation is that the authors try to dissociate completely the transferable belief model from any model based on probability functions.
Abstract: Two models are proposed to quantify someone's degree of belief, based respectively on probability functions, the Bayesian model, and on belief functions, the transferable belief model (Shafer 1976). The first, and by far the oldest, is well established and supported by excellent axiomatic and behavioural arguments. The model based on belief functions is often understood as some kind of generalization either of the Bayesian model or of the upper and lower probabilities model. Therefore we present our interpretation of the model developed initially by Shafer in his book (1977) and called here the transferable belief model. The major point of our interpretation is that we try to dissociate completely the transferable belief model from any model based on probability functions.

Journal ArticleDOI
TL;DR: In this article, numerical integration strategies for irregular, multi-parameter likelihoods are developed, involving novel iterative, adaptive uses of Cartesian product and spherical-rule quadrature formulae together with importance sampling techniques.

Journal ArticleDOI
TL;DR: Hampel's concept of qualitative robustness (or stability) is applied to estimates of ‘generalized parameters’ (that is, estimates which take values in an abstract metric space) and the incompatibility between robustness and consistency is proved.

Journal ArticleDOI
01 Nov 1988
TL;DR: It is shown that all of the examples of Dempster's rule of combination give results identical to those of Bayesian probability theory, and that the Shafer-Dempster approach to uncertainty management is not fully illustrated.
Abstract: In a previously published paper (ibid., vol.17, no.6, p.968-77, 1987) P.L. Bogler made four points that require further clarification and/or correction concerning Bayesian probabilistic reasoning for the multisensor fusion of identification data: 1) the Bayesian approach forces a common level of abstraction to be defined for all sensors, a level of abstraction that is not meaningful for some sensors; 2) Bayesian results can be unstable and intuitively unsatisfying; 3) Bayesian results are not commutative; and 4) Bayesian results for friend/foe identification can force false inferences concerning the identification of specific aircraft types. These assertions are reviewed and shown to be incorrect. In addition, it is shown that all of the examples of Dempster's rule of combination give results identical to those of Bayesian probability theory. The contention here is not that the Shafer-Dempster approach to uncertainty management is identical to Bayesian probability theory, but rather that the Shafer-Dempster approach is not fully illustrated.
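The claim that the examples reduce to Bayesian updating can be checked directly: when every mass function puts belief only on singleton hypotheses, Dempster's rule of combination reproduces Bayes' rule. The friend/foe masses below are illustrative.

```python
# With masses confined to singleton hypotheses, Dempster's rule coincides with
# Bayesian updating. The friend/foe numbers are illustrative.
def dempster_combine(m1, m2):
    """Combine two singleton-only mass functions (dict: hypothesis -> mass)."""
    unnormalized = {h: m1[h] * m2[h] for h in m1}    # only identical singletons intersect
    conflict = 1.0 - sum(unnormalized.values())
    return {h: v / (1.0 - conflict) for h, v in unnormalized.items()}

prior = {"friend": 0.5, "foe": 0.5}
sensor = {"friend": 0.9, "foe": 0.1}                 # sensor report as singleton masses
print(dempster_combine(prior, sensor))               # {'friend': 0.9, 'foe': 0.1}

# The same result from Bayes' rule, reading the sensor masses as likelihoods.
joint = {h: prior[h] * sensor[h] for h in prior}
total = sum(joint.values())
print({h: v / total for h, v in joint.items()})
```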

Journal ArticleDOI
TL;DR: In this paper, it was shown that $B$-identifiability allows a Bayesian solution to the testing problem, where an equivalence relation is defined over parametrizations of probability functions.
Abstract: Identifiability problems have previously precluded a general approach to testing the hypothesis of a "pure" distribution against the alternative of a mixture of distributions. Three types of identifiability are defined, and it is shown that $B$-identifiability allows a Bayesian solution to the testing problem. First, an equivalence relation is defined over parametrizations of probability functions. Then the projection onto the quotient space is shown to give a $B$-identifiable parametrization. Bayesian inference proceeds using the Bayes factor as a "test" criterion.
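A hedged sketch of how the Bayes factor serves as the "test" criterion: compare the marginal likelihood of a single ("pure") normal against a two-component normal mixture, here estimated by crude simulation from the prior. The priors, the synthetic data, and the Monte Carlo estimator are illustrative assumptions, not the paper's construction.

```python
# Hedged sketch: Bayes factor for a pure normal vs. a two-component normal mixture,
# with marginal likelihoods estimated by simple prior simulation. Priors, data and
# the estimator are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, size=60)           # synthetic data from the "pure" model

def marginal_pure(n_draws=2000):
    mus = rng.normal(0.0, 2.0, n_draws)         # prior over the single mean
    return np.mean([stats.norm(mu, 1.0).pdf(data).prod() for mu in mus])

def marginal_mixture(n_draws=2000):
    mus1 = rng.normal(0.0, 2.0, n_draws)        # priors over the two component means
    mus2 = rng.normal(0.0, 2.0, n_draws)
    liks = [(0.5 * stats.norm(m1, 1.0).pdf(data) + 0.5 * stats.norm(m2, 1.0).pdf(data)).prod()
            for m1, m2 in zip(mus1, mus2)]
    return np.mean(liks)

bayes_factor = marginal_pure() / marginal_mixture()
print("Bayes factor (pure vs. mixture):", bayes_factor)   # > 1 favours the pure model
```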


Journal ArticleDOI
TL;DR: A Bayesian approach to the estimation of an odds ratio from case-control data is considered and a log-normal approximation to the density is shown to be adequate for practical purposes.
Abstract: A Bayesian approach to the estimation of an odds ratio from case-control data is considered. The exact posterior density of the odds ratio and its moments are derived. A log-normal approximation to the density is shown to be adequate for practical purposes. Mechanisms for setting prior parameters are discussed and some examples are presented.
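For a 2×2 case-control table, the familiar normal approximation to the posterior of the log odds ratio (under a vague prior) illustrates the kind of log-normal approximation the paper examines; the cell counts here are illustrative, not data from the paper.

```python
# Approximate posterior for an odds ratio from a 2x2 case-control table, using the
# usual normal approximation on the log scale (vague prior assumed). Cell counts
# are illustrative.
from math import exp, log, sqrt

a, b = 30, 70    # exposed / unexposed among cases
c, d = 15, 85    # exposed / unexposed among controls

log_or = log((a * d) / (b * c))
se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)

# Approximate 95% posterior interval for the odds ratio (log-normal on the OR scale).
lower, upper = exp(log_or - 1.96 * se), exp(log_or + 1.96 * se)
print(f"odds ratio ~ {exp(log_or):.2f}, 95% interval ~ ({lower:.2f}, {upper:.2f})")
```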

Journal ArticleDOI
TL;DR: In this article, it is shown that paradoxes arise in conditional probability calculations due to incomplete specification of the problem at hand, and that Renyi's axiomatic setup does not resolve them.


Journal ArticleDOI
TL;DR: In this paper, the authors modeled the log per capita real gross domestic product as a third-order autoregression with a pair of complex roots whose amplitude is smaller than the amplitude of the real root, and interpreted the behavior of this time series in terms of these two amplitudes, the periodicity of the complex roots, and the standard deviation of the disturbance.
Abstract: Log per capita real gross domestic product is modeled as a third-order autoregression with a pair of complex roots whose amplitude is smaller than the amplitude of the real root. The behavior of this time series is interpreted in terms of these two amplitudes, the periodicity of the complex roots, and the standard deviation of the disturbance. Restrictions are evaluated and inference is conducted using the likelihood principle, applying Monte Carlo integration with importance sampling. These Bayesian procedures efficiently cope with restrictions that are awkward to handle in a classical approach. We find very little difference in the amplitudes of real roots between countries and of complex roots relative to within-country uncertainty. There are some substantial differences in the periodicities of complex roots, and the greatest differences between countries are found in the standard deviation of the disturbance.
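The amplitudes and periodicity described above are read off the roots of the fitted autoregression. The sketch below fits an AR(3) by ordinary least squares to a synthetic trending series and reports the root amplitudes and complex-root periodicity; the series and the least-squares fit are illustrative, whereas the paper conducts a full Bayesian analysis via Monte Carlo integration with importance sampling.

```python
# Hedged sketch: fit an AR(3) by least squares and interpret its characteristic
# roots (real-root amplitude, complex-root amplitude and periodicity). The series
# and the estimation method are illustrative.
import numpy as np

rng = np.random.default_rng(3)
y = np.cumsum(0.02 + rng.normal(scale=0.02, size=200))   # synthetic log-GDP-like series

# OLS for y_t = c + a1*y_{t-1} + a2*y_{t-2} + a3*y_{t-3} + e_t.
Y = y[3:]
X = np.column_stack([np.ones(Y.size), y[2:-1], y[1:-2], y[:-3]])
c, a1, a2, a3 = np.linalg.lstsq(X, Y, rcond=None)[0]

# Characteristic roots lambda solve lambda^3 - a1*lambda^2 - a2*lambda - a3 = 0.
for r in np.roots([1.0, -a1, -a2, -a3]):
    if abs(r.imag) > 1e-8:
        print(f"complex root: amplitude {abs(r):.3f}, periodicity {2 * np.pi / abs(np.angle(r)):.1f}")
    else:
        print(f"real root: amplitude {abs(r):.3f}")
```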

Journal ArticleDOI
TL;DR: In this paper, the reliability function of a parallel redundant system whose components share a common unknown environment cannot be characterized by any of the well-known classes of distributions that have been proposed in the mathematical theory of reliability.
Abstract: A multivariate distribution for describing the life-lengths of the components of a system which operates in an environment that is different from the test bench environment has been proposed by Lindley and Singpurwalla (1986). In this paper, the properties of the reliability function of such a system are studied and comparisons made with the reliability function obtained under the assumption of independence. It is interesting to note that the reliability function of parallel redundant systems whose components share a common unknown environment cannot be characterized by any of the well-known classes of distributions that have been proposed in the mathematical theory of reliability. This observation suggests the need for defining a new class of failure distributions. A formula for making Bayesian inferences for the reliability function is also given. Keywords: reliability of dependent component systems; crossing properties; golden ratio; Bayesian inference in reliability; classes of life distributions; robustness of the independence assumption.
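The dependence induced by a shared environment can be conveyed by a small simulation in the spirit of the Lindley-Singpurwalla setup: conditionally exponential component lives whose common failure rate is scaled by a random environment factor, with the parallel-system reliability compared against what the independence assumption would give. The gamma environment and the rate values are illustrative assumptions.

```python
# Hedged simulation sketch: two components with conditionally exponential lives
# share a common gamma-distributed environment factor; compare parallel-system
# reliability with the independence assumption. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n, t = 200_000, 1.0                          # number of draws, mission time
lam1, lam2 = 1.0, 1.5                        # nominal (test-bench) failure rates

eta = rng.gamma(shape=3.0, scale=1.0 / 3.0, size=n)   # common environment, mean 1
life1 = rng.exponential(1.0 / (lam1 * eta))
life2 = rng.exponential(1.0 / (lam2 * eta))

parallel_dependent = np.mean(np.maximum(life1, life2) > t)

# Under independence, multiply the marginal failure probabilities.
marg1, marg2 = np.mean(life1 > t), np.mean(life2 > t)
parallel_independent = 1.0 - (1.0 - marg1) * (1.0 - marg2)

print(f"shared environment: {parallel_dependent:.3f}  independence: {parallel_independent:.3f}")
# The shared environment makes component failures positively dependent, so the
# independence assumption overstates the reliability of the parallel system.
```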

Journal ArticleDOI
TL;DR: In this article, a Bayesian approach for the general formulation of the problem, using a class of priors which involve the unknown ratio explicitly, is presented, and the posterior distribution of the ratio is obtained analytically and its properties are investigated.
Abstract: A variety of statistical problems (e.g., slope-ratio and parallel-line bioassay, calibration, bioequivalence) can be viewed as questions of inference on the ratio of two coefficients in a suitably constructed linear model. This paper develops a Bayesian approach for the general formulation of the problem, using a class of priors which involve the unknown ratio explicitly. The posterior distribution of the ratio is obtained analytically and its properties are investigated, especially the sensitivity to the choice of the prior. Examples are given of applications to slope-ratio bioassay, comparison of the mean effects of two drugs, and a bioequivalence problem.

Journal ArticleDOI
TL;DR: The Simplicity Postulate, as discussed by the authors, is a condition imposed by Jeffreys [1948] and [1961] on the so-called prior probability distributions, which are interpreted as reasonable degrees of belief.
Abstract: This paper is about the Bayesian theory of inductive inference, and in particular about the status of a condition, called by him the Simplicity Postulate, imposed by Jeffreys [1948] and [1961] on the so-called prior probability distributions. I shall explain what the Simplicity Postulate says presently: first, some background. The context of the discussion will be a set of possible laws h_i, ostensibly governing some given domain of phenomena, and a test designed to discriminate between them. The prior probabilities of the h_i are here simply their pre-test probabilities; the posterior, or post-test, probability distribution is obtained by combining likelihoods with prior probabilities according to Bayes's Theorem: posterior probability ∝ prior probability × likelihood, where the normalizing constant is the prior probability of the test outcome e. The likelihood of h_i given e is equal to the probability of e conditional on h_i, and in those cases where h_i describes a well-defined statistical model which determines a probability distribution over a set of data-points of which e is one, the likelihood of h_i on e is just the probability assigned to e by h_i. The prior, and hence also the posterior, probabilities are understood to be relativised to a stock of well-confirmed background theories about the structure of the test, presumed to be neutral between the h_i. These probabilities are interpreted by Jeffreys as reasonable degrees of belief. In such circumstances it might seem natural to make the prior probabilities of the h_i equal. For reasons which will become apparent shortly, Jeffreys instead stipulates that they should be a decreasing function of the complexity of the h_i, where the complexity of a hypothesis is measured by its number of independent adjustable parameters.
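The proportionality stated in the abstract is just Bayes's Theorem; written out in standard notation (not the paper's own), it reads:

```latex
% Bayes's Theorem for a law h_i and test outcome e: the posterior is proportional
% to prior times likelihood, with P(e) as the normalizing constant.
P(h_i \mid e) \;=\; \frac{P(h_i)\, P(e \mid h_i)}{P(e)} \;\propto\; P(h_i)\, P(e \mid h_i)
```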

Posted ContentDOI
TL;DR: In this paper, the authors weigh the advantages and disadvantages of stochastic coefficients and suggest procedures to address the identification and estimation problem with weaker and noncontradictory assumptions, arguing that the real aim of inference is prediction and that "imprecise" parameter estimates of a coherent model are acceptable if they forecast well.
Abstract: A general stochastic coefficients model developed by Swamy and Tinsley serves as a reference point for discussion in this second of a series of three articles. Other well-known specifications are related to the model. The authors weigh the advantages and disadvantages of stochastic coefficients and suggest procedures to address the identification and estimation problem with weaker and noncontradictory assumptions. They argue that the real aim of inference is prediction and that "imprecise" parameter estimates of a coherent model are acceptable if they forecast well.

Journal ArticleDOI
TL;DR: In this paper, a new approach for inference from accelerated life tests is presented, which formulates such problems as inference under Kalman filter models with correlated observation errors, and focuses on exponential life distributions, and the power rule as a time transformation function.
Abstract: SUMMARY We present a new approach for inference from accelerated life tests. Our approach formulates such problems as inference under Kalman filter models with correlated observation errors. We restrict attention to exponential life distributions, and the power rule as a time transformation function. Extensions to other time transformation functions are straightforward; however, extensions to other distributions involve non-linear filtering and are not considered. The advantages of our formulation are that we are able to incorporate uncertainty in the time transformation function and also are able to allow it to change with the stress. To validate our approach we consider some simulated data; to give it a sense of reality we apply it to actual data.

Journal ArticleDOI
TL;DR: A sequential, Bayesian, probabilistic indexing model that explicitly combines expert opinion with data about the system's performance is presented, which has a recursive formula that makes the model computationally feasible for large information bases.
Abstract: A decision maker's performance relies on the availability of relevant information. In many environments, the relation between the decision maker's informational needs and the information base is complex and uncertain. A fundamental concept of information systems, such as decision support and document retrieval, is the probability that the retrieved information is useful to the decision maker's query. This paper presents a sequential, Bayesian, probabilistic indexing model that explicitly combines expert opinion with data about the system's performance. The expert opinion is encoded into probability statements. These statements are modified by the users' feedback about the relevance of the retrieved information to their queries. The predictive probability that a datum in the information base is applicable to the current query is a logistic function of the expert opinion and the feedback. This feedback enters the computation through a measure of association between the current query-datum pair with previous...
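A hedged sketch of the mechanism described: a query-datum relevance probability starts from an expert-elicited prior and is pushed up or down by accumulated relevance feedback through a logistic (log-odds) combination. The log-odds parameterization and the feedback weighting below are assumptions for illustration, not the paper's exact formula.

```python
# Hedged sketch: sequential update of the probability that a datum is relevant to
# a query, combining an expert prior with +/-1 user relevance feedback on the
# log-odds scale. The weighting scheme is an illustrative assumption.
from math import exp, log

def logit(p):
    return log(p / (1.0 - p))

def predictive_relevance(expert_prior, feedbacks, weight=0.5):
    """Logistic combination of expert opinion and accumulated relevance feedback."""
    score = logit(expert_prior) + weight * sum(feedbacks)
    return 1.0 / (1.0 + exp(-score))

# Expert thinks the datum is moderately relevant; three users said relevant and
# one said not relevant, lifting the indexing probability from 0.40 to about 0.64.
print(predictive_relevance(expert_prior=0.40, feedbacks=[+1, +1, -1, +1]))
```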

Journal Article
TL;DR: In this paper, a Bayesian analysis of accident data is used in the identification of hazardous locations, and empirical comparisons of the results from the Bayesian model and from classical statistical analyses are also included.
Abstract: A Bayesian analysis of accident data is used in the identification of hazardous locations. The Bayesian model used in the analysis is developed and discussed. Empirical comparisons of the results from the Bayesian analysis and from classical statistical analyses are also included. These comparisons suggest that there is an appreciable difference among the various identification techniques and that some classically based statistical techniques may be prone to err in the direction of false negatives.