
Showing papers in "The Statistician in 1986"



Journal ArticleDOI
TL;DR: These notes and other information about the course are available on www.statslab.cam.ac.uk/∼beresty/teach/StoCal/stocal.html; Christina Goldschmidt, Stefan Grosskinsky, Gregory Miermont, and James Norris have all contributed to these notes.
Abstract: These notes and other information about the course are available on www.statslab.cam.ac.uk/∼beresty/teach/StoCal/stocal.html Christina Goldschmidt, Stefan Grosskinsky, Gregory Miermont, and James Norris have all contributed to these notes. However, errors are my responsibility only. Comments and corrections are most welcome and should be sent to N.Berestycki AT statslab.cam.ac.uk Date last modified: April 23, 2010

827 citations


Journal ArticleDOI
TL;DR: This book uses realistic examples to explore a wide variety of applications, such as inventory and production control, reliability, maintenance, and queueing, computer and communication systems, and will be of considerable interest to practitioners and researchers in operations research, statistics, computer science and engineering.
Abstract: The main concern of this text is the application of stochastic models to practical situations involving uncertainty and dynamism. A unique feature is the integrated treatment of models and computational methods for stochastic design and stochastic optimization problems. The book uses realistic examples to explore a wide variety of applications, such as inventory and production control, reliability, maintenance, and queueing, computer and communication systems. Exercises and suggestions for further reading are provided at the end of each chapter. The book was written with advanced students in mind; however, as it contains a wealth of material not found in other texts, it will also be of considerable interest to practitioners and researchers in operations research, statistics, computer science and engineering.

822 citations



Journal ArticleDOI
TL;DR: A monograph on cumulant and spectral estimates of stochastic processes and random fields, covering spectral estimates of the Grenander-Rosenblatt type, lag window estimates, cross-spectra of multivariate Gaussian sequences, and statistical estimation of higher order spectra.
Abstract: I. Cumulant Estimates of Stochastic Processes and Random Fields. II. Spectral Estimates of Stochastic Processes and Random Fields. III. Investigation of Spectral Estimates of the Grenander-Rosenblatt Type. IV. Investigations of Lag Window Estimates. V. Estimates of Cross-Spectra of Multivariate Gaussian Sequences. VI. Statistical Estimation of Higher Order Spectra. Appendices. References. Index.

250 citations




Journal ArticleDOI

163 citations



Journal ArticleDOI
TL;DR: An introductory text covering central tendency, variability, correlation, fundamentals of measurement, research design, statistical inference, hypothesis testing, and one- and two-way analysis of variance.
Abstract: Preface to the First Edition. Making Numbers Make Sense. Concepts of Central Tendency. Measures of Variability. Correlation. Some Fundamentals of Measurement. Can You Trust the Conclusions? (A Brief Digression on the Design of Research Studies). Elements of Statistical Inference. Nuts and Bolts of Estimation. The Logic of Hypothesis Testing. Inferences Involving Averages. Inferences Involving Correlation Coefficients. Inferences Involving Statistical Independence. One-Way Analysis of Variance. Two-Way Analysis of Variance. Some Advanced Topics.

104 citations


Journal ArticleDOI
TL;DR: This book covers modelling of and inference about process quality, the methods and philosophy of statistical process control, control charts, capability analysis, designed experiments for process design and improvement, and acceptance sampling.
Abstract: Quality Improvement in the Modern Business Environment. STATISTICAL METHODS USEFUL IN QUALITY IMPROVEMENT. Modeling Process Quality. Inferences About Process Quality. BASIC METHODS OF STATISTICAL PROCESS CONTROL AND CAPABILITY ANALYSIS. Methods and Philosophy of Statistical Process Control. Control Charts for Variables. Control Charts for Attributes. Process and Measurement Systems Capability Analysis. OTHER STATISTICAL PROCESS MONITORING AND CONTROL TECHNIQUES. Cumulative Sum and Exponentially Weighted Moving Average Control Charts. Other Univariate SPC Techniques. Multivariate Process Monitoring and Control. Engineering Process Control and SPC. PROCESS DESIGN AND IMPROVEMENT WITH DESIGNED EXPERIMENTS. Factorial and Fractional Factorial Designs for Process Design and Improvement. Process Optimization with Designed Experiments. ACCEPTANCE SAMPLING. Lot-by-Lot Acceptance Sampling for Attributes. Other Acceptance Sampling Techniques. Appendix. Bibliography. Answers to Selected Exercises. Index.




Journal ArticleDOI
TL;DR: The most widely received theory of significance tests, due largely to Fisher, is very well received, yet its logic is not consistent with Bayes' theorem; in practice, however, a good Fisherian often tends toward the inference he would make with explicit subjective priors.
Abstract: The best (most widely) received theory for tests of significance is that due largely to Fisher. Embellished with Neyman's mathematics, Fisher's theory is very well received. But Fisher's logic is not consistent with Bayes' theorem. And Bayes' theorem is beyond reproach. Thus, Fisher's logic is deficient. However, in practice, there is often some redress. Indeed, sometimes Fisher's level of significance P coincides mathematically with the posterior probability of the null hypothesis, i.e. P = p(H0|E), where E is the sample event (evidence). More generally, a good Fisherian tends intuitively (although certainly not inevitably) toward the inference he would make if he employed Bayes' theorem with explicit subjective priors. In effect, he is almost Bayesian.

Journal ArticleDOI
G. A. Whitmore
TL;DR: In this article, the use of first-passage-time distributions associated with multidimensional Brownian motion as models for duration data subject to competing risks is discussed.
Abstract: The use of first-passage-time distributions associated with multidimensional Brownian motion is studied as models for duration data subject to competing risks.

Journal ArticleDOI
TL;DR: An overview of samples, censuses and sampling error, measurement errors and invalidity, and reliability.
Abstract: Samples, Censuses, and Sampling Error Measurement Errors and Invalidity Reliability Conclusion


Journal ArticleDOI
Nicholas N. N. Nuamah
TL;DR: In this article, a method chosen from among ordinary least squares and the analysis of covariance is used.
Abstract: A method is used from among ordinary least squares and the analysis of covariance.


Journal ArticleDOI
TL;DR: The continuity-corrected arc sine transformation for binomial variates (Walters, 1979) provides a simple and accurate method for calculating power for the "exact" test for a 2 x 2 contingency table.
Abstract: The continuity-corrected arc sine transformation for binomial variates (Walters, 1979) provides a simple and accurate method for calculating power for the "exact" test for a 2 x 2 contingency table. Sample size calculations using this method, however, require iteration. A modification is presented which yields accurate approximations for equal and unequal sample sizes using closed-form expressions.
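As a rough companion to the abstract above, here is a minimal Python sketch of the standard (uncorrected) arcsine-based closed-form sample-size approximation for comparing two proportions. It is not Walters' continuity-corrected formula; the function name and defaults are illustrative only.

```python
from math import asin, sqrt, ceil
from statistics import NormalDist

def arcsine_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided test comparing
    two proportions, via the variance-stabilising transformation
    phi = 2*arcsin(sqrt(p)), for which Var(phi-hat) ~ 1/n.
    (Uncorrected sketch, not Walters' continuity-corrected version.)"""
    z = NormalDist()
    h = abs(2 * asin(sqrt(p1)) - 2 * asin(sqrt(p2)))  # Cohen's effect size h
    # Var of the difference of two independent transformed proportions is 2/n
    n = 2 * (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) ** 2 / h ** 2
    return ceil(n)

# e.g. detecting 0.5 vs 0.7 at alpha = 0.05 with 80% power
n = arcsine_sample_size(0.5, 0.7)   # about 93 per group
```

The result agrees closely with the usual normal-approximation formula for two proportions, which is the point of the transformation: a closed form with no iteration.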

Journal ArticleDOI
TL;DR: In this paper, the role of diagnostics in the choice of a Bayesian model is discussed, and the authors suggest that their use can be justified by the approximation that the probability that a simple model is true equals 1.
Abstract: The role of diagnostics in the choice of a Bayesian model is considered. It is suggested that their use can be justified by the approximation that the probability that a simple model is true equals 1.

Journal ArticleDOI
TL;DR: In this article, the authors studied the geometry of bivariate regression from a sampling viewpoint, obtaining an intuitive but rigorous proof that enhancement occurs in one half of the space of possible observations, together with results on the relative sizes of the regions in which 'classical', 'net', and 'cooperative' suppression occur.
Abstract: The geometry of bivariate regression is studied from a sampling viewpoint. This leads to an intuitive, but rigorous proof that enhancement occurs in one half of the space of possible observations. In addition, we obtain specific results concerning the relative size of the spaces in which 'classical', 'net', and 'cooperative suppression' occur. Introduction: It is sometimes possible to increase the observed multiple correlation coefficient of a regression, R2, by including a second independent variable, X2, that is uncorrelated with Y. When this happens, X2 makes up for its lack of correlation with Y by correlation with that part of X1 that is orthogonal to Y. In that case, "X2 increases the variance accounted for in Y by 'suppressing' some of the variance in X1 that is irrelevant to Y" (Cohen & Cohen, 1975, p. 87). This phenomenon is sometimes called 'classical suppression', as defined by Horst (1941). Darlington (1968, see also Lubin, 1957) extended the concept of a suppressor variable to be a variable that has non-negative correlation with Y, but a negative partial regression coefficient when used in conjunction with another variable that is positively correlated with Y. Conger (1974) further broadened the definition of suppression, by shifting the emphasis from the suppressor to the variable whose variation is suppressed. He defined a suppressor as a variable whose inclusion increases the magnitude of the partial regression coefficient of the other predictor variable (in effect, suppressing extraneous variation that hindered the performance of the predictor). The concept of suppression has not only been widely used in psychology (Cohen & Cohen, 1975) but also appears under different names in other fields of research (Conger, 1974). Currie & Korabinski (1984) shifted the emphasis away from partial regression coefficients entirely, to the increase in the multiple correlation coefficient.
They defined 'enhancement' to occur when the squared multiple correlation coefficient exceeds the sum of squared simple correlations with Y. Clearly, the concepts of enhancement and suppression are related. Currie & Korabinski (1984) noted that enhancement implies suppression (an unfortunate conflict of terminology) and studied the occurrence of these phenomena in the context of a fixed orientation between the observation vector, Y, and the Xi's. We propose to study these phenomena from the more natural sampling viewpoint, where the orientation between the Xi's is fixed and Y is allowed to vary. This leads to an intuitive, but rigorous geometric interpretation that clarifies the meaning of enhancement and the various types of suppression. An immediate consequence of the geometry is that enhancement occurs in half of the space of possible observations. This result explains the empirical findings of Currie & Korabinski who observed, by a careful counting technique for a special case, that
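The enhancement condition described in this abstract can be checked numerically from the three pairwise correlations alone, using the standard formula for the squared multiple correlation with two predictors. A short Python sketch (function names are mine, not the authors'):

```python
def r_squared(r_y1, r_y2, r_12):
    """Squared multiple correlation of Y on X1 and X2, from the pairwise
    correlations r(Y,X1), r(Y,X2) and r(X1,X2)."""
    return (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)

def enhanced(r_y1, r_y2, r_12):
    """Enhancement in the sense of Currie & Korabinski (1984):
    R^2 exceeds the sum of the squared simple correlations with Y."""
    return r_squared(r_y1, r_y2, r_12) > r_y1**2 + r_y2**2

# Classical suppression: X2 uncorrelated with Y still raises R^2
# from 0.25 to 0.25 / (1 - 0.36), roughly 0.391, when r(X1, X2) = 0.6.
```

Here X2 contributes nothing directly (r(Y, X2) = 0), yet R² rises because X2 removes variance in X1 that is irrelevant to Y, exactly the classical-suppression scenario the abstract recounts.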

Journal ArticleDOI
TL;DR: In this article, the authors report their experiences with using tests of linearity in time series proposed by McLeod & Li (1983) and Keenan (1985) and report their application to detect nonlinearity on a large collection of time series.
Abstract: We report our experiences with using tests of linearity in time series proposed by McLeod & Li (1983) and Keenan (1985). Empirical significance levels are checked and power studies reported when the tests are applied to bilinear and SETAR time series models. We also report their application to detect nonlinearity on a large collection of time series. In general the performance of the former statistic is poor except for large sample sizes, whilst the latter is superior both for bilinear and SETAR time series. In an appendix we give the MINITAB macros so that anyone with this package could easily implement the tests used.
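For readers without MINITAB, the McLeod & Li statistic is straightforward to sketch: it is a Ljung-Box portmanteau statistic applied to the autocorrelations of the squared, mean-centred series, referred to a chi-square distribution with m degrees of freedom. A stdlib-only Python sketch of the statistic under that understanding (not the authors' macros):

```python
def mcleod_li(x, m=10):
    """McLeod-Li portmanteau statistic for nonlinearity: a Ljung-Box
    statistic computed on the squared deviations of the series.
    Large values relative to chi-square(m) suggest nonlinearity."""
    n = len(x)
    xbar = sum(x) / n
    s = [(v - xbar) ** 2 for v in x]          # squared deviations
    sbar = sum(s) / n
    d = [v - sbar for v in s]                 # centred squared series
    c0 = sum(v * v for v in d) / n            # lag-0 autocovariance
    q = 0.0
    for k in range(1, m + 1):
        ck = sum(d[t] * d[t - k] for t in range(k, n)) / n
        rk = ck / c0                          # lag-k autocorrelation
        q += rk * rk / (n - k)
    return n * (n + 2) * q
```

The p-value step (comparing the statistic with chi-square quantiles) is omitted since the standard library has no chi-square CDF; any statistics package supplies it.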

Journal ArticleDOI
TL;DR: Bayesian attitudes to modelling strategy and model choice are outlined, for both formal and informal analysis.
Abstract: Bayesian attitudes to modelling strategy and model choice are outlined, for both formal and informal analysis.


Journal ArticleDOI
TL;DR: Interpreting and Using Regression sets out the actual procedures researchers employ, places them in the framework of statistical theory, and shows how good research takes account both of statistical theory and real-world demands.
Abstract: Interpreting and Using Regression sets out the actual procedures researchers employ, places them in the framework of statistical theory, and shows how good research takes account both of statistical theory and real world demands.

Journal ArticleDOI
TL;DR: In this article, the authors present two examples of statistical modeling in a Bayesian framework, to demonstrate both the scope of current integration methods and the ease with which a sensitivity analysis can be carried out.
Abstract: Numerical integration techniques now exist which permit a very flexible approach to Bayesian modelling. Low dimensional summaries can be extracted from joint posterior distributions of up to 15 dimensions. The sensitivity of particular summaries to changes in both the model and the prior can thus be investigated. This paper discusses the various aspects of sensitivity in a Bayesian analysis and demonstrates something of the power of numerical integration via two examples. Practical data analysis consistent with the Bayesian paradigm has recently been given a substantial boost with the development of efficient high-dimensional numerical integration methods. These allow the computation of posterior moments and low dimensional summaries such as univariate and bivariate marginal distributions from high-dimensional posterior densities. For early examples using Monte Carlo integration methods, see Stewart (1979) or van Dijk & Kloek (1978). Naylor & Smith (1982) describe an iterative procedure which makes repeated use of Gauss-Hermite rules over three-dimensional cartesian product grids. A general review of progress in this area is given by Smith et al. (1985). In 1983, a research project funded by the SERC was established at the University of Nottingham to extend the ideas of Naylor & Smith to higher dimensions. That project has confirmed that numerical integration in up to 15 dimensions can be carried out routinely, if mixed strategies involving cartesian product rules, spherical rules and Monte Carlo procedures are used. Details are given by Shaw (1985a,b) and will be published elsewhere. Freed from constraints imposed by analytical tractability, a Bayesian analysis is straightforward and extremely flexible. In computing terms, all that is required is a routine for evaluating the likelihood for selected values of the model parameters. Prior distributions need not be restricted to conjugate classes and can more effectively represent prior opinion.
The sensitivity of particular inferences to changes in the form of either the model or the prior can be readily investigated. This paper presents two examples of statistical modelling in a Bayesian framework, to demonstrate both the scope of current integration methods and the ease with which a sensitivity analysis can be carried out. Although a Bayesian analysis is commonly portrayed as the revision of subjective beliefs in the light of available data, the idea of a Bayesian sensitivity analysis is not new. It has frequently been advocated that a single data set should motivate many prior to posterior analyses. These arguments are reviewed in Section 2. What is still needed is a practical definition of sensitivity in this context. Consider a graph in which several versions of a marginal posterior density are superimposed. The fact that some of these curves are distinguishable from each other, at the level of resolution employed by the display, does not automatically imply that the margin is sensitive to choice of either the model or the prior. In some circumstances, gross discrepancies in the tails of the distributions may be unimportant providing the mode or the mean is stable. In other circumstances, the estimation of extreme tail area probabilities may be the sole purpose of the analysis. One possibility
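As a toy illustration of the Gauss-Hermite idea this abstract mentions (in one dimension only, nothing like the 15-dimensional mixed strategies the paper describes), posterior moments can be formed as ratios of quadrature sums. The three-point rule below is exact for polynomial integrands up to degree five; the names and the toy "posterior" are my own:

```python
from math import sqrt, pi

# Three-point Gauss-Hermite rule for integrals of the form
#   integral of g(x) * exp(-x**2) dx
# (nodes 0 and +/- sqrt(3/2); exact for polynomials up to degree 5).
NODES = [-sqrt(1.5), 0.0, sqrt(1.5)]
WEIGHTS = [sqrt(pi) / 6, 2 * sqrt(pi) / 3, sqrt(pi) / 6]

def gh_posterior_moment(g):
    """E[g(theta)] under the toy 'posterior' proportional to exp(-theta**2),
    i.e. a N(0, 1/2) density, computed as a ratio of quadrature sums --
    a one-dimensional caricature of the Naylor & Smith approach."""
    num = sum(w * g(x) for w, x in zip(WEIGHTS, NODES))
    den = sum(WEIGHTS)   # = sqrt(pi), the normalising constant
    return num / den

# Second moment of N(0, 1/2) is its variance, 0.5.
```

In practice the likelihood replaces the closed-form weight function, and the nodes are iteratively rescaled to track the posterior's location and spread, which is what makes the method usable when no conjugate analysis exists.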

Journal ArticleDOI
TL;DR: In this article, it is shown that the incorporation of J dummy variables is equivalent to estimation with T − J data points (T being the total number of observations).
Abstract: It is shown that the incorporation of J dummy variables is equivalent to estimation with T − J data points (T being the total number of observations).