
Showing papers on "Bayesian inference published in 1978"


Book ChapterDOI
TL;DR: In this article, an objective procedure of evaluation of the prior distribution in a Bayesian model is developed and the classical ignorance prior distribution is newly interpreted as the locally impartial prior distribution.
Abstract: In developing an estimate of the distribution of a future observation it becomes natural and necessary to consider a distribution over the space of parameters. This justifies the use of Bayes procedures in statistical inference. An objective procedure of evaluation of the prior distribution in a Bayesian model is developed and the classical ignorance prior distribution is newly interpreted as the locally impartial prior distribution.

146 citations


Journal ArticleDOI
TL;DR: Shafer's recent book offers an original and challenging alternative to the Bayesian theory of epistemic probability, rejecting the traditional Bayesian rule of conditioning as too limited a procedure for revising prior opinions in the light of new evidence.
Abstract: Glenn Shafer's recent book offers an original and challenging alternative to the Bayesian theory of epistemic probability. Shafer has two principal objections to Bayesian inference. In the first place he rejects as too restrictive the orthodox requirement that degrees of belief should be additive. In his view this has the consequence, amongst others, that the Bayesian theory is unable to represent complete ignorance. 'It does not allow one to withhold belief from a proposition without according that belief to the negation of the proposition' (p. 23). In the second place he rejects the Bayesian rule of conditioning as too limited a procedure for revising prior opinions in the light of new evidence. The rule is criticised for requiring that the new evidence be expressible as a proposition, that the proposition be one for which the prior degree of belief is already established, and that the proposition be known with certainty. For Shafer, a body of evidence is 'a segment of our experience and natural knowledge' (p. 175) and cannot, in general and without distortion, be conceptualised in the way required. In place of the Bayesian rule of conditioning Shafer advocates a rule of 'combination', originally proposed by A. P. Dempster (1967), whose purpose is to combine two or more belief functions based on distinct bodies of evidence into a single belief function based on the combined evidence. I shall now look in more detail at these characteristic features of Shafer's theory. If degrees of belief are interpreted by the normal means of betting rates, the condition of additivity follows essentially from the conventions ruling the

41 citations


Journal ArticleDOI
TL;DR: An extension of the so-called secretary problem in which each alternative has an observable value drawn from a distribution unknown a priori is given, and it is shown that when maximizing the probability of selecting the best candidate, learning does not contribute to the solution.
Abstract: Consideration is given to an extension of the so-called secretary problem in which each alternative has an observable value drawn from a distribution unknown a priori. A uniform distribution is considered here, because this gives analytical solutions which are easily compared with previous work. It is shown that when maximizing the probability of selecting the best candidate, learning does not contribute to the solution. When maximizing expected value, learning does play a role, giving a solution intermediate between that based on ranks and that based on known distributions.

40 citations
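The value-based side of this result can be sketched numerically. The simulation below covers the full-information, expected-value variant (values observed, U(0,1) distribution known), not the paper's rank-based analysis: the optimal policy accepts a candidate whose value beats the expected value of continuing, which satisfies the recursion v_{k+1} = (1 + v_k^2)/2 with v_0 = 0. The horizon and trial count are illustrative choices.

```python
import random

def value_thresholds(n):
    # v[k] = expected value achievable with k candidates still to come,
    # under known U(0,1) values: v_{k+1} = (1 + v_k**2) / 2, v_0 = 0.
    v = [0.0]
    for _ in range(n):
        v.append((1 + v[-1] ** 2) / 2)
    return v

def full_information_run(n, rng):
    """Accept the first candidate whose value beats the continuation value."""
    v = value_thresholds(n)
    xs = [rng.random() for _ in range(n)]
    for i, x in enumerate(xs):
        remaining = n - i - 1
        if x >= v[remaining]:          # current value beats expected future value
            return x
    return xs[-1]                      # forced to take the last candidate

rng = random.Random(0)
n, trials = 20, 20_000
avg = sum(full_information_run(n, rng) for _ in range(trials)) / trials
```

For n = 20 the recursion gives a continuation value of about 0.92, and the Monte Carlo average of the accepted value should match it closely.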


Journal ArticleDOI
TL;DR: Issues such as optimal parameter estimation, use of synthetic generation in design problems, and the effects of parameter uncertainty on statistical estimation are discussed and applied to the problem of reservoir storage-yield analysis.

33 citations



Journal ArticleDOI
TL;DR: In this article, a Bayesian treatment of the problem of inference about the reliability of a multicomponent stress-strength system which functions if s or more of k identical components simultaneously operate is provided.
Abstract: This paper provides a Bayesian treatment of the problem of inference about the reliability of a multicomponent stress-strength system which functions if s or more of k identical components simultaneously operate. All stresses and strengths are assumed to be independent, exponentially distributed random variables. Exact and approximate asymptotic posterior distributions for the reliability are derived, and the results are illustrated by a numerical example.

25 citations
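The posterior reliability described here can be sketched by Monte Carlo rather than by the paper's exact derivation. Assuming independent exponential strengths (rate λ) and stresses (rate μ), a single component survives with probability p = μ/(λ + μ), and the s-of-k system reliability is a binomial tail in p. The flat-prior gamma posteriors and the sample data below are illustrative assumptions, not the paper's setup.

```python
import math
import random

def posterior_reliability(strengths, stresses, s, k, draws=5000, seed=1):
    """Posterior mean of s-of-k stress-strength reliability (Monte Carlo sketch)."""
    rng = random.Random(seed)
    # With a flat prior on each exponential rate, the posterior is
    # Gamma(shape=n, scale=1/sum(data)); this prior choice is an assumption.
    n_x, t_x = len(strengths), sum(strengths)
    n_y, t_y = len(stresses), sum(stresses)
    total = 0.0
    for _ in range(draws):
        lam = rng.gammavariate(n_x, 1.0 / t_x)   # strength rate draw
        mu = rng.gammavariate(n_y, 1.0 / t_y)    # stress rate draw
        p = mu / (lam + mu)                      # P(one strength exceeds one stress)
        # system functions if at least s of the k components survive
        total += sum(math.comb(k, j) * p ** j * (1 - p) ** (k - j)
                     for j in range(s, k + 1))
    return total / draws
```

With strengths that dominate the stresses, e.g. `posterior_reliability([10.0]*30, [1.0]*30, s=3, k=5)`, the posterior reliability should be close to 1.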



Journal ArticleDOI
TL;DR: A modeling approach based on random differential equations, the principle of maximum entropy, and Bayesian inference is more useful than deterministic modeling for applied problems in ecology and resource management.
Abstract: The usefulness of Bayesian inference for specifying and updating the parameters of models of ecological systems based on long-term investigations has been examined. Our analyses indicate that in these systems this inferential procedure provides a rational basis for specifying the parameters and other input quantities of the model. Thus a modeling approach based on random differential equations, the principle of maximum entropy, and Bayesian inference is more useful than deterministic modeling for applied problems in ecology and resource management.

9 citations


Journal ArticleDOI
TL;DR: A response is made to recent discussions critical of the Bayesian learning procedure on the basis of empirically observed deviations from its prescriptions, and examples of surprising learning behaviours and decision strategies are generated.
Abstract: A response is made to the recent discussions critical of the Bayesian learning procedure on the basis of empirically observed deviations from its prescriptions. Bayes' theorem is embedded in a more general class of learning rules which allow for departure from the demands of idealized rational behaviour. Such departures are termed learning impediments or disabilities. Some particular forms and interpretations of impediment functions are presented. Consequences of learning disabilities for the likelihood principle, stable estimation and admissible decision-making are explored. Examples of surprising learning behaviours and decision strategies are generated. Deeper understanding of Bayesian learning and its characteristics results.

6 citations
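One simple member of such a generalized class of learning rules (an illustrative stand-in, not one of the paper's impediment functions) is a tempered update, posterior_i ∝ prior_i · likelihood_i^γ: γ = 1 recovers Bayes' theorem, while γ < 1 under-weights the evidence, mimicking the conservatism observed in experiments.

```python
def tempered_update(prior, likelihoods, gamma=1.0):
    """Generalized learning rule: posterior_i proportional to prior_i * likelihood_i ** gamma.

    gamma = 1 is exact Bayesian conditioning; gamma < 1 acts as a simple
    'learning impediment' that damps the evidence; gamma = 0 ignores it.
    """
    raw = [p * (l ** gamma) for p, l in zip(prior, likelihoods)]
    z = sum(raw)
    return [r / z for r in raw]

# Full Bayes moves belief further than the impeded learner does.
bayes = tempered_update([0.5, 0.5], [0.8, 0.2], gamma=1.0)    # approximately [0.8, 0.2]
impeded = tempered_update([0.5, 0.5], [0.8, 0.2], gamma=0.5)  # between prior and Bayes
```

Iterating the impeded rule still converges on the truth given enough data, which is one way such departures can remain compatible with admissible decision-making.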


Journal ArticleDOI
TL;DR: The yield of betting procedures that win money from non-Bayesian bookmakers is used as the basis for measures of the nearness to Bayesian form of the odds they quote, and it is shown that the usual confidence intervals for the correlation coefficient are approximately Bayesian in the case of four pairs of observations.
Abstract: Many mathematical proofs of the necessity of Bayesian inference show, in effect, that betting procedures which have positive expected return for every value of an unknown parameter can be constructed to win money from non-Bayesian bookmakers. The yield of such betting procedures is here used as the basis for measures of the nearness to Bayesian form of the odds quoted by non-Bayesian bookmakers. Using one of these measures it is shown that the usual confidence intervals for the correlation coefficient are approximately Bayesian in the case of four pairs of observations.

6 citations


Journal ArticleDOI
TL;DR: The interpretation of the subjective Bayesian approach to statistical inference is examined, and an alternative approach in which the 'probabilities' that the authors assign to the models are regarded as in themselves uninterpretable, but nevertheless capable of giving rise to a predictive distribution that can be interpreted in a degree of belief sense is suggested.
Abstract: This paper examines the interpretation of the subjective Bayesian approach to statistical inference. In section 1, I outline the structure of the Bayesian approach. In sections 2 and 3, I argue that it is difficult to assign degrees of belief to probability models, as seems to be required in a subjective Bayesian argument. I then propose two ways out of this difficulty, and examine their implications for current Bayesian practice: (i) First, I suggest that we should link our models, which are to be regarded as models for our degrees of belief, to subsets of the set of all possible data sets that could arise from the cases in which we envisage making predictions. Degrees of belief are then assigned to the subsets (sections 4-6). (ii) In view of some difficulties with interpretation (i), I suggest an alternative approach in which we regard the 'probabilities' that we assign to the models as in themselves uninterpretable, but nevertheless capable of giving rise to a predictive distribution that can be interpreted in a degree of belief sense (sections 7 and 8).

01 Mar 1978
TL;DR: In this article, the posterior of the parameters involved when sampling is from the xi-normal with parameters alpha and beta is investigated; the inference problem is approached from a Bayesian point of view.
Abstract: Recently, Saunders (1974) generalized the so-called reciprocal property for normality. Distributions having this property are called xi-normal, and it is of interest to make statistical inferences about the relevant parameters when sampling from such a distribution. Previous work on such problems has been from the sampling viewpoint. This paper approaches the inference problem from the Bayesian point of view and investigates the posterior of the parameters involved when sampling is from the xi-normal with parameters alpha and beta. Two special cases are examined.

ReportDOI
01 Sep 1978
TL;DR: Results of several simulations showed that the degree of confidence that a decisionmaker can have in his decision is markedly affected by values of the interactive influence of certain parameters on the probability of deciding that an examinee had attained a specified degree of mastery through a program of instruction.
Abstract: In any testing or evaluation program, there will be some percentage of false positives and false negatives, i.e., misclassifications will occur. A decisionmaker therefore needs to make a best estimate about the true level of proficiency of an examinee. A multiparameter, programmable model was developed to examine the interactive influence of certain parameters on the probability of deciding that an examinee had attained a specified degree of mastery through a program of instruction. The parameters, readily obtainable from decisionmakers, include: (1) the number of assumed mastery states ('master,' 'intermediate,' 'nonmaster'), (2) the prior distribution of scores from similar examinee groups, and (3) the number of test trials or items that could be given. Results of several simulations showed that the degree of confidence that a decisionmaker can have in his decision (e.g., 'x%' certainty that an examinee is a master) is markedly affected by values for the abovementioned parameters. A key feature of a Bayesian model is that testing time, manpower, expense, and test length can be reduced if the 'prior' information is accurate and valid for the particular tested group. If not, little can be gained from a Bayesian model. Simulated test results also showed that a test can be too short to be of any decisionmaking value.
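The core of such a model can be sketched as a discrete Bayes update over assumed mastery states. The states, their per-item probabilities of a correct response, and the uniform prior below are all illustrative assumptions, not the report's calibrated values.

```python
import math

def mastery_posterior(prior, p_correct, n_items, n_right):
    """Posterior over mastery states given n_right correct answers on n_items."""
    # Binomial likelihood of the observed score under each state's item probability.
    like = {s: math.comb(n_items, n_right) * p ** n_right * (1 - p) ** (n_items - n_right)
            for s, p in p_correct.items()}
    raw = {s: prior[s] * like[s] for s in prior}
    z = sum(raw.values())
    return {s: r / z for s, r in raw.items()}

# Hypothetical three-state setup: per-item accuracy for each state, uniform prior.
states = {'master': 0.9, 'intermediate': 0.7, 'nonmaster': 0.4}
prior = {s: 1 / 3 for s in states}
post = mastery_posterior(prior, states, n_items=10, n_right=9)
```

With 9 of 10 items correct, the posterior mass shifts heavily toward 'master'; with an accurate prior drawn from similar examinee groups, fewer items suffice for the same confidence, which is the trade-off the report describes.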

Journal ArticleDOI
TL;DR: The intent of this paper is to provide theoretical guidance to the practical problem of group reliability prediction and show that a weighted average is a satisfactory and perhaps the only satisfactory method of arriving at group predictions.
Abstract: On the basis of an assumed Bayesian model and a set of criteria for judging reliability predictions, it is shown that:
1. Group predictions outperform individual predictions.
2. A weighted average is a satisfactory and perhaps the only satisfactory method of arriving at group predictions.
3. For a series system, group predictions of component reliabilities combined to form a prediction of system reliability are preferred to a group prediction of system reliability.
4. Prior averaging and updating via Bayes' rule is s-consistent and seems preferable to averaging posterior individual predictions.
The intent of this paper is to provide theoretical guidance to the practical problem of group reliability prediction.

ReportDOI
01 Jun 1978
TL;DR: A procedure called decomposed error analysis is proposed, which takes quantified assessments of different kinds of error, such as random sampling fluctuations and mismeasurement, and synthesizes them into a global assessment of error.
Abstract: The practical problem of appraising the accuracy of estimates--before or after they have been obtained--is analysed. A procedure called decomposed error analysis is proposed, which takes quantified assessments of different kinds of error, such as random sampling fluctuations and mismeasurement, and synthesizes them into a global assessment of error. It replaces and enlarges classical statistical inference approaches in a personalist format which does not depend on Bayesian updating. Applications from the private and public sector are presented.

Journal ArticleDOI
TL;DR: Empirical Bayes techniques are developed for observer inference about c when π is known to the subject and unknown, and a Bayes rule, a minimax rule and a beta-minimax rule are constructed for the subject when he is uncertain about π.
Abstract: An observer is to make inference statements about a quantity p, called a propensity and bounded between 0 and 1, based on the observation that p does or does not exceed a constant c. The propensity p may have an interpretation as a proportion, as a long-run relative frequency, or as a personal probability held by some subject. Applications in medicine, engineering, political science, and, most especially, human decision making are indicated. Bayes solutions for the observer are obtained based on prior distributions in the mixture of beta distribution family; these are then specialized to power-function prior distributions. Inference about log p and log odds is considered. Multiple-action problems are considered in which the focus of inference shifts to the process generating the propensities p, both in the case of a process parameter π known to the subject and unknown. Empirical Bayes techniques are developed for observer inference about c when π is known to the subject. A Bayes rule, a minimax rule and a beta-minimax rule are constructed for the subject when he is uncertain about π.
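The observer's update can be illustrated numerically: given a Beta(a, b) prior on the propensity p and the single observation that p exceeds c, the posterior is the prior truncated to (c, 1). The midpoint-rule integration below is a sketch of that truncated posterior's mean, not the paper's closed-form mixture-of-betas solution.

```python
def truncated_beta_mean(a, b, c, n=100_000):
    """E[p | p > c] under a Beta(a, b) prior, by midpoint-rule integration."""
    num = den = 0.0
    for i in range(n):
        p = c + (1 - c) * (i + 0.5) / n          # midpoint of the i-th subinterval
        w = p ** (a - 1) * (1 - p) ** (b - 1)    # unnormalized Beta(a, b) density
        num += p * w
        den += w
    return num / den                             # normalizing constant cancels
```

Under a uniform prior (a = b = 1), learning only that p > c moves the posterior mean from 1/2 to (1 + c)/2, so `truncated_beta_mean(1, 1, 0.5)` should return about 0.75.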

Book ChapterDOI
01 Jan 1978
TL;DR: In this article, the authors develop the apparatus for studying multivariate relationships in the context of economic models with more than two variables, where the dependence is formulated to be linear in the parameters leading to the so-called general linear model.
Abstract: Most of the economic relationships that are studied empirically involve more than two variables. For example, the demand for food on the part of a given consumer would depend on the consumer’s income, the price of food, and the prices of other commodities. Similarly, the demand for labor on the part of a firm would depend on anticipated output and relative factor prices. One can give many more such examples. What is common among them is that often the dependence is formulated to be linear in the parameters, leading to the so-called general linear model. Once the parameters of the model are estimated we are interested in the probability characteristics of the estimators. Since we are, typically, dealing with the estimators of more than one parameter simultaneously, it becomes important to develop the apparatus for studying multivariate relationships. This we shall do in the discussion to follow.