
Showing papers on "Bayesian inference published in 1977"


Journal ArticleDOI
TL;DR: The modification required to the face-value likelihood is investigated, and conditions are derived under which no modification is necessary and the data can be taken at face value.
Abstract: If the reported data of an experiment have been subject to selection, then inference from such data should be modified accordingly. We investigate the modification required to the face-value likelihood. In particular, we derive conditions under which no modification is necessary and the data can be taken at face value.

70 citations
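For concreteness, one textbook form of the adjustment referred to above (a generic illustration, not the paper's exact result): if the data x are reported only when a selection event S occurs, the face-value density is replaced by the density conditioned on S,

    \[
      f_{\mathrm{sel}}(x \mid \theta) \;=\; \frac{f(x \mid \theta)}{P(S \mid \theta)}, \qquad x \in S .
    \]

When P(S | θ) does not depend on θ, the correction is a constant factor, the likelihood is unchanged, and the data can indeed be taken at face value.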


Journal ArticleDOI
TL;DR: In this article, the tail area for a nested sharp hypothesis is compared to the Bayes factor based on the event of “significance” considered as data; this Bayes factor, based on an insufficient statistic, is in turn expressed as a weighted average of full-data Bayes factors.
Abstract: Inequalities are given relating the tail area for a nested sharp hypothesis to the Bayes factor based on the event of “significance” considered as data. This Bayes factor based on an insufficient statistic is, in turn, expressed as a weighted average of full-data Bayes factors. Lindley's “statistical paradox” is generalized and other comparisons made in the normal sampling context. A new Bayesian interpretation is given for the traditional two-tailed critical level. An example and the discussion suggest a negative answer to the question in the title.

56 citations


Journal ArticleDOI
TL;DR: The problem of estimating the reliability of a system which is undergoing development testing is considered from a Bayesian standpoint in this paper, where m sets of binomial trials are performed under conditions which lead to an ordering, θ1 < θ2 < ... < θm, of the binomial parameters, with θm, the final reliability of the system, the parameter of interest.
Abstract: The problem of estimating the reliability of a system which is undergoing development testing is considered from a Bayesian standpoint. Formally, m sets of binomial trials are performed under conditions which lead to an ordering, θ1 < θ2 < ... < θm, of the binomial parameters. The parameter of interest is θm, the final underlying reliability of the system. The marginal posterior pdf for θm is easily obtained when uniform prior pdf's are assumed. The method is illustrated.

49 citations
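The marginal posterior described above can also be approximated by simple Monte Carlo. The sketch below (Python; the trial counts are made up, and the paper itself obtains the posterior analytically rather than by simulation) draws from the uniform prior on the order-constrained region and reweights by the binomial likelihoods.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical development-test results: (successes, failures) at each of m stages.
    data = [(3, 4), (7, 3), (9, 1)]
    m = len(data)

    # Uniform prior on the ordered region theta_1 < ... < theta_m: sort iid uniforms.
    n_draws = 200_000
    theta = np.sort(rng.uniform(size=(n_draws, m)), axis=1)

    # Importance weights proportional to the product of the binomial likelihoods.
    log_w = np.zeros(n_draws)
    for i, (s, f) in enumerate(data):
        log_w += s * np.log(theta[:, i]) + f * np.log1p(-theta[:, i])
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # Weighted draws of the last component approximate the marginal posterior of theta_m.
    print("posterior mean of theta_m:", round(float(np.sum(w * theta[:, -1])), 3))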


Journal ArticleDOI
TL;DR: An invariance principle is used to show that λ⁻¹ is the prior density representing ignorance concerning the parameter λ of a Poisson process, and a compatibility principle is applied to find an ignorance prior for the probabilities pᵢ of the categories in a multinomial model as mentioned in this paper.
Abstract: An invariance principle is used to show that λ⁻¹ is the prior density representing ignorance concerning the parameter λ of a Poisson process, and a compatibility principle is applied to find an ignorance prior for the probabilities pᵢ of the categories in a multinomial model. It is shown that ∏pᵢ⁻¹ is the only positive and exchangeable prior that satisfies the compatibility condition.

41 citations
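A standard scale-invariance argument of the kind the abstract alludes to (the paper's own derivation may differ in detail) runs as follows: ignorance about the rate λ should be unaffected by a change of time units λ → cλ, so

    \[
      \pi(\lambda)\,d\lambda \;=\; \pi(c\lambda)\,d(c\lambda)
      \quad\Longrightarrow\quad
      \pi(\lambda) \;=\; c\,\pi(c\lambda)\ \text{ for all } c>0
      \quad\Longrightarrow\quad
      \pi(\lambda) \;\propto\; \lambda^{-1}.
    \]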


Journal ArticleDOI
TL;DR: In this paper, the authors present an alternative statistical sampling model that proceeds from a different perspective and, consequently, avoids some of the assumptions made by more conventional methods, and brings together, in a Bayesian framework, the book value, audit value and informed judgment sources of evidence that can be used to evaluate a composite account.
Abstract: A typical part of the public auditor's opinion formulation process is the evaluation of composite account balances resulting from the aggregation of a large number of subsidiary components. Numerous statistical methods have been proposed as a means of assisting the auditor in estimating the total value of such accounts or, equivalently, in estimating the error in the stated book value of such accounts. These methods, for the most part, are a direct adaptation of standard classical and Bayesian statistical procedures. Recent work by Kaplan [1973b] and Neter and Loebbecke [1975] has raised serious questions about the adequacy of applying many of these methods to the auditing environment. These difficulties arise when the auditor chooses to develop a statistical inference on the total error, based upon the relationship between book value and sample audit value of the subsidiary components of the account. Because of the usual low error rates experienced in such accounting populations, these auxiliary (ratio, regression, differences, etc.) estimators all suffer from a lack of normal distribution robustness of the sampling distribution. The problem is particularly apparent within the customary range of auditing sample sizes. In this paper, we present an alternative statistical sampling model that proceeds from a different perspective and, consequently, avoids some of the assumptions made by more conventional methods. Our analysis brings together, in a Bayesian framework, the book value, audit value, and informed judgment sources of evidence that can be used to evaluate a composite account. When a noninformative prior judgment

32 citations



Journal ArticleDOI
TL;DR: Bayesian model comparison methods are shown to lead to a predictive distribution for the decision problem without the intermediate step of model selection in an advertising budget decision in which the functional form of sales response to advertising is unknown.
Abstract: Analysis for marketing decisions often involves the consideration of several alternative statistical models. As an intermediate step prior to making policy decisions, a single model is typically selected and used to guide subsequent decisions. In this paper, Bayesian model comparison methods are shown to lead to a predictive distribution for the decision problem without the intermediate step of model selection. This approach utilizes all available information bearing on the validity of the alternative models, as well as information concerning model parameters, in order to draw inferences regarding the criterion of interest relative to the decision to be made. The procedure is illustrated in the context of an advertising budget decision in which the functional form of sales response to advertising is unknown.

14 citations
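The predictive distribution without model selection is a posterior-probability-weighted mixture of the per-model predictives. A minimal numerical sketch of that idea (Python; the two response forms, priors, and data are hypothetical, not the paper's advertising example):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Hypothetical data: sales response y to advertising spend x.
    x = np.linspace(1.0, 10.0, 20)
    y = 2.0 * np.log(x) + rng.normal(0.0, 0.3, size=x.size)
    sigma = 0.3                                  # noise sd, assumed known for simplicity

    # Two candidate response forms (illustrative, not the paper's specification).
    designs = {"linear": np.column_stack([np.ones_like(x), x]),
               "log":    np.column_stack([np.ones_like(x), np.log(x)])}
    new_rows = {"linear": np.array([1.0, 12.0]),          # predict at x = 12
                "log":    np.array([1.0, np.log(12.0)])}

    log_ml, pred = {}, {}
    for name, X in designs.items():
        V0 = 10.0 * np.eye(2)                    # vague normal prior on the coefficients
        # Marginal likelihood of y under this model (Gaussian, by conjugacy).
        log_ml[name] = stats.multivariate_normal.logpdf(
            y, mean=np.zeros(x.size), cov=X @ V0 @ X.T + sigma**2 * np.eye(x.size))
        # Posterior mean of the coefficients, then the model's point prediction.
        Vn = np.linalg.inv(np.linalg.inv(V0) + X.T @ X / sigma**2)
        pred[name] = new_rows[name] @ (Vn @ X.T @ y / sigma**2)

    # Posterior model probabilities (equal prior odds) weight the predictions.
    lm = np.array(list(log_ml.values()))
    w = np.exp(lm - lm.max()); w /= w.sum()
    print("model probabilities:", dict(zip(log_ml, w.round(3))))
    print("model-averaged prediction at x = 12:",
          round(float(sum(w * np.array(list(pred.values())))), 2))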


Journal ArticleDOI
01 Oct 1977-Synthese
TL;DR: Sham monism, as discussed by the authors, is a form of personalistic statistical inference that pays lip service to monism while covertly introducing physical probabilities, thereby trivializing the central monistic thesis.
Abstract: Current systems of statistical inference which utilize personal probabilities (in short, personalistic systems) are either dualistic or monistic. Dualistic systems postulate the existence of objective or physical probabilities (which are usually identified with limiting relative frequencies of observable events), whereas monistic systems do not countenance such probabilities. The central thesis of monism is that statistics can get along quite well without physical probability and the related concepts of objective randomness and random process. Monistic systems may be either pure or sham. Sham monists pay lip service to monism but covertly introduce physical probabilities, and thus trivialize the central thesis. They accomplish this by introducing the same sort of probability models that dualist statisticians do, under the guise of 'personal' probability distributions of observable random variables conditional on the unknown value of a physical parameter. For example, a sham monist will treat problems that a dualist would describe as involving the unknown mean of a normally distributed population in the same way the dualist would, with conditionally independent trials governed by a normal law, except that he refuses to call the probabilities determined by the law 'physical probabilities'. He insists that they are merely special kinds of personal probabilities. The same sort of approach is used to treat all the standard problems of statistics, i.e., the probability models which govern the sham monist's observable random variables are going to be the same as the ones used by dualists and objectivists, except that they will be labelled differently. A notable adherent of sham monism was the late L. J. Savage, who advocated pure monism when he was theorizing on a foundational level, but who shifted ground when he tried to incorporate the standard problems of statistics into his theoretical framework (cf. [3], [13]). The difference between sham monists and dualists is that the latter overtly postulate the existence of physical probabilities, whereas the former covertly postulate them.

10 citations


Journal ArticleDOI
TL;DR: In many scientific disciplines, there is frequently a need to describe purely spatial interactions among objects located at the sites of a lattice as mentioned in this paper, and the equilibrium states of physical and biological phenomena occurring simultaneously at sites in more than one dimension (for example, ferromagnetism, crystal formation, patterns of infection).
Abstract: In many scientific disciplines, there is frequently a need to describe purely spatial interactions among objects located at the sites of a lattice. Of particular interest are the equilibrium states of physical and biological phenomena occurring simultaneously at sites in more than one dimension (for example, ferromagnetism, crystal formation, patterns of infection). Markov random fields form a wide class of intuitively appealing models for spatial interaction. Binary ones have been studied extensively in statistical mechanics where they are known as Ising models.

8 citations
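Binary Markov random fields of the kind described can be simulated by Gibbs sampling each site given its lattice neighbours. A small illustrative sketch (Python; lattice size and interaction strength are arbitrary values, not taken from the paper):

    import numpy as np

    rng = np.random.default_rng(2)

    n, beta = 20, 0.6        # lattice size and interaction strength (arbitrary values)
    s = rng.choice([-1, 1], size=(n, n))

    for _ in range(100):     # Gibbs sweeps over the whole lattice
        for i in range(n):
            for j in range(n):
                # Sum of the four nearest neighbours (periodic boundary conditions).
                nb = (s[(i - 1) % n, j] + s[(i + 1) % n, j]
                      + s[i, (j - 1) % n] + s[i, (j + 1) % n])
                # Full conditional: P(s_ij = +1 | neighbours) under the Ising model.
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                s[i, j] = 1 if rng.uniform() < p_up else -1

    print("mean magnetisation:", s.mean())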


Journal ArticleDOI
TL;DR: In this paper, an adaptive control algorithm is formulated which provides, in real time, an optimal adjustment to a controller set-point in response to changes in the operating characteristics of the process.
Abstract: An adaptive control algorithm is formulated which provides, in real time, an optimal adjustment to a controller set-point in response to changes in the operating characteristics of the process. Such an algorithm is applicable to a variety of industrial processes: for example, those in which the product is either a containerized consumable or a sheet of some type. A Bayesian model is developed for the univariate case and the optimal control policy is found. Multivariate extensions are also considered, as is the problem of process overhaul.

8 citations
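One way to realise such real-time adjustment is recursive conjugate updating of the process offset, with the set-point moved to keep the output on target. A minimal sketch under normal assumptions (Python; all quantities are hypothetical, and the paper's actual policy also weighs adjustment costs and overhaul, which are omitted here):

    import numpy as np

    rng = np.random.default_rng(3)

    target = 500.0             # desired output, e.g. net fill weight (hypothetical units)
    sigma = 2.0                # known short-term process standard deviation (assumed)
    mu, tau2 = 0.0, 4.0        # normal prior on the process offset from the set-point
    setpoint = target

    for t in range(50):
        # One new observation; the true offset drifts slowly upward (simulated).
        y = setpoint + 0.05 * t + rng.normal(0.0, sigma)

        # Conjugate normal update of the believed offset, using (y - setpoint).
        prec = 1.0 / tau2 + 1.0 / sigma**2
        mu = (mu / tau2 + (y - setpoint) / sigma**2) / prec
        tau2 = 1.0 / prec

        # Move the set-point so the expected output returns to target.
        # (A production scheme would also discount old data to track the drift;
        # that refinement is omitted for brevity.)
        setpoint = target - mu

    print("final set-point:", round(setpoint, 2))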


Journal ArticleDOI
TL;DR: In this paper, the authors present some proved arguments which they believe also support Fisher's view of the maximum likelihood estimate (MLE) as being the most informative estimate of an unknown parameter.
Abstract: Bayesian methods have been widely discussed from both a philosophical and a practical point of view. Lindley's review (1970) contains interesting arguments in support of Bayesian methods, and his monograph contains an extensive bibliography of the subject area. Although accepted by many, Bayesian methods have also been severely criticized, Fisher being among the critics. Savage (1975) reviewed Fisher's work, and his paper provides insights into Fisher's objections to Bayesian methods. An extensive bibliography of the Bayesian subject area is also contained in Savage's paper. From Savage's comments [1975, pp. 456-57], one learns that Fisher regarded the maximum likelihood estimate (MLE) as being the most informative estimate of an unknown parameter. Certainly, this view has theoretical support in an asymptotic sense, but as Efron (1975) stated, "Fisher believes that the MLE is optimum as an information gathering statistic in finite samples, not just asymptotically". The papers of Godambe (1960) and Bhopkar (1972) tend to support Fisher's view of the MLE. Here we present some proved arguments which we believe also support Fisher's view. Ironically, our arguments utilize the Bayesian notions of prior and posterior distributions. It should be noted that we do not attempt to define a mathematical measure of information. Rather, we regard information simply as "acquired knowledge". Suppose we assign a prior probability density function g(θ) to a parameter θ, and suppose we assume that a vector of observations X has a sampling density f(x | θ). The prior density expresses our knowledge of the value of θ before experimentation, and the posterior density defined by
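The abstract is cut off at this point; the posterior density it is presumably about to define is the usual one given by Bayes' theorem,

    \[
      g(\theta \mid x) \;=\; \frac{f(x \mid \theta)\, g(\theta)}
                                  {\int f(x \mid \theta')\, g(\theta')\, d\theta'} ,
    \]

which expresses the knowledge of θ acquired after observing x.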

Journal ArticleDOI
TL;DR: The Bayesian inference paradigm is proposed as a highly promising vehicle through which social researchers and policy makers may work and communicate more effectively, not only with each other but also among themselves.

Journal ArticleDOI
TL;DR: In this article, a linear discriminant function which maximizes the difference between the two group means can be found, and this discriminant function is used to classify property-liability (P-L) insurance firms into either a solvent or a distressed group based on a number of financial characteristics.
Abstract: In a recent article in this Journal, Cooley has presented an excellent analysis of the use of Bayesian inference in discriminant analysis. His objective was to classify property-liability (P-L) insurance firms into either a solvent or a distressed group based on a number of financial characteristics. The linear discriminant function which maximizes the difference between the two group means can be written
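The abstract breaks off before displaying the function; the standard Fisher linear discriminant it presumably refers to (generic notation, not necessarily Cooley's) has coefficient vector proportional to the inverse pooled covariance matrix times the difference of the group mean vectors,

    \[
      w \;\propto\; S^{-1}\left(\bar{x}_1 - \bar{x}_2\right),
      \qquad
      z \;=\; w^{\top} x ,
    \]

with a firm assigned to the solvent or distressed group according to which side of a cutoff its score z falls on.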

Journal ArticleDOI
TL;DR: In this article, a simple linear stochastic learning model and a Bayesian model are used to estimate the probabilities of reciprocated choice in a simple coalitions game, and the results illustrate the way in which models developed in one area of research, probability learning, may find useful application in another, coalition formation.
Abstract: This paper concerns decision processes in systems at the group level. The accuracy and domain of a model of rational choice developed by Siegel are increased through the use of complementary models of the process by which subjects estimate the probabilities of reciprocated choice in a simple coalitions game. Two alternative models are presented, one a simple linear stochastic learning model, the other a Bayesian model. Both are tested in conjunction with the Siegel model using data from experiments on coalition formation conducted by Ofshe and Ofshe. Although both models perform significantly better than a static null model, which makes predictions corresponding closely to the original Siegel model, the performance of the linear model is superior to that of the Bayesian model in all but one case. The results illustrate the way in which models developed in one area of research, probability learning, may find useful application in another, coalition formation.
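The two complementary estimation mechanisms can be sketched as follows (an illustrative reconstruction in Python, not the authors' exact specification): the linear stochastic learning model nudges the current estimate toward each observed outcome, while the Bayesian model maintains a Beta posterior over the probability of reciprocated choice.

    import numpy as np

    rng = np.random.default_rng(4)

    p_true = 0.7                          # hypothetical true reciprocation probability
    outcomes = rng.uniform(size=30) < p_true

    # Linear stochastic learning model: p <- p + alpha * (outcome - p).
    alpha, p_lin = 0.2, 0.5
    # Bayesian model: Beta(a, b) posterior over the reciprocation probability.
    a, b = 1.0, 1.0

    for reciprocated in outcomes:
        p_lin += alpha * (float(reciprocated) - p_lin)
        a += float(reciprocated)
        b += 1.0 - float(reciprocated)

    print("linear-model estimate:", round(p_lin, 3),
          "| Bayesian posterior mean:", round(a / (a + b), 3))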

Journal ArticleDOI
TL;DR: A scheme in which model construction and operation are considered as distinct processes has been designed for the differential diagnosis of goiters and the influence of classification and observation errors and of the recognition method on the diagnostic accuracy has been determined.
Abstract: A scheme in which model construction and operation are considered as distinct processes has been designed for the differential diagnosis of goiters. The influence of classification and observation errors and of the recognition method on the diagnostic accuracy has been determined.

ReportDOI
01 Mar 1977
TL;DR: Development has been completed on Bayesian Software Correction Limit Policies, designed to determine the optimum time value that minimizes the long-run average cost of debugging at two levels, and on an Imperfect Software Debugging Model that assumes errors are not corrected with certainty.
Abstract: Work has been completed on the development of Bayesian Software Correction Limit Policies designed to determine the optimum time value that minimizes the long-run average cost of debugging at two levels - corrective action undertaken by the programmer (Phase I) and action undertaken by a system analyst or system designer if the error is not corrected in Phase I (Phase II). Two models are developed - one assuming the cost of observations of error occurrence and correction time, prior to implementation of the optimum policy, is negligible; the other incorporating the cost of observations. Work was also completed on an Imperfect Software Debugging Model that assumes errors are not corrected with certainty. By assuming the initial number of errors, the probability of successfully correcting an error, and a constant error occurrence rate are all known, formulas for such quantities as the distribution of time to completely debugged software, the distribution of time to a specified number of remaining errors, and the expected number of errors detected by time t can be derived. Work is currently in progress on extending the Imperfect Debugging Model to incorporate error correction time, estimation of model parameters, and development of a Bayesian model; developing bivariate software reliability models in which system errors are classified as serious and non-serious; developing empirical models for software error data; developing software reliability demonstration plans for making accept/reject decisions for software packages; and investigating the effects of changes in prior distributions and/or model parameters on quantities of interest.
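A Monte Carlo sketch of an imperfect-debugging process of the kind described (Python; the report derives these quantities analytically, the parameter values here are made up, and the sketch assumes each remaining error generates failures at a common rate λ): N0 initial errors, each detected error corrected only with probability p.

    import numpy as np

    rng = np.random.default_rng(5)

    N0, lam, p = 20, 0.5, 0.8   # initial errors, per-error failure rate, P(successful fix)
    n_runs = 5_000
    completion = np.empty(n_runs)

    for r in range(n_runs):
        remaining, t = N0, 0.0
        while remaining > 0:
            # Time to the next error occurrence: exponential with rate remaining * lam.
            t += rng.exponential(1.0 / (remaining * lam))
            # The exposed error is corrected only with probability p (imperfect debugging).
            if rng.uniform() < p:
                remaining -= 1
        completion[r] = t

    print("estimated mean time to completely debugged software:", round(completion.mean(), 2))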

01 Dec 1977
TL;DR: The Bayesian model demonstrated several advantages over previous models: use at the beginning of the program, inclusion of subjective information, and giving weight to future program budgets.
Abstract: : The Bayesian model developed to predict costs-at-completion on weapon system programs is an extension of research done by M. Zaki El-Sabban. The model assumes cost is a random variable and is normally distributed. Budgeted costs are used to develop the prior probability distribution. Actual cost information is used for the Bayesian updating of the probability distribution. The mean of the updated probability distribution is the new estimated cost-at-completion for the program. The model was compared with a non-linear regression model and a linear extrapolation model on five weapon system programs. On three of the programs the non-linear regression model estimated the final cost the greater percentage of the time. On the remaining two programs the Bayesian model estimated the final cost the greater percentage of the time. The Bayesian model demonstrated several advantages over previous models: use at the beginning of the program, inclusion of subjective information, and giving weight to future program budgets.
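The updating step described is the standard normal-normal conjugate calculation (generic notation, not the report's): with a prior cost-at-completion C ~ N(μ₀, σ₀²) built from budgeted costs, and actual-cost evidence giving a mean x̄ from n observations with variance σ², the posterior mean, which becomes the new estimate-at-completion, is the precision-weighted average

    \[
      \mu_1 \;=\; \frac{\mu_0/\sigma_0^{2} + n\bar{x}/\sigma^{2}}
                       {1/\sigma_0^{2} + n/\sigma^{2}},
      \qquad
      \sigma_1^{2} \;=\; \left(\frac{1}{\sigma_0^{2}} + \frac{n}{\sigma^{2}}\right)^{-1}.
    \]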

Journal ArticleDOI
TL;DR: In this article, the authors apply the Harsanyi's Bayesian model to two-person zero-sum differential games, where the players have incomplete information on the initial state, and introduce behavioural strategies as projective limits of finite tensor products of probability measures.


Journal Article
TL;DR: In this paper, uncertainties in non-destructive inspection (NDI) techniques and information on the probability distribution function of defects in a given structure are taken into account in order to use the results of NDI effectively.
Abstract: The uncertainties in non-destructive inspection techniques and the information on the probability distribution function of defects in a given structure were taken into account in order to use the results of NDI effectively. Bayesian analysis was found effective for estimating the probability of a defect in the inspected structure even when there were very few data on the detectability of NDI and on the PDF of initially existing defects.
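The Bayesian step involved can be illustrated with a single inspection (generic notation, not the paper's): if a defect is present with prior probability p and the NDI technique detects an existing defect with probability POD (assuming, for simplicity, that it raises no false calls), then after an inspection that finds nothing,

    \[
      P(\text{defect} \mid \text{no indication})
        \;=\; \frac{(1-\mathrm{POD})\,p}{(1-\mathrm{POD})\,p + (1-p)} .
    \]

Repeating the update over successive inspections shows how even an imperfect technique progressively reduces the assessed probability of a remaining defect.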