
Showing papers in "The Statistician in 1987"



Journal ArticleDOI
TL;DR: A book on modelling financial time series, covering the features of financial returns, modelling and forecasting price volatility, the accuracy of autocorrelation estimates, tests of the random walk hypothesis, forecasting trends in prices, evidence against the efficiency of futures markets, option valuation, and an appendix giving a computer program for modelling financial time series.
Abstract: Features of Financial Returns Modelling Price Volatility Forecasting Standard Deviations The Accuracy of Autocorrelation Estimates Testing the Random Walk Hypothesis Forecasting Trends in Prices Evidence Against the Efficiency of Futures Markets Valuing Options Appendix: A Computer Program for Modelling Financial Time Series.

1,115 citations



Journal ArticleDOI
TL;DR: Programs, Policies and Evaluation Tailoring Evaluations Identifying Issues and Formulating Questions Assessing the Need for a Program Expressing and Assessing Program Theory Monitoring Program Process and Performance Strategies for Impact Assessment Randomized Designs for impact assessment Quasi-Experimental Impact Assessments Assessment of Full-Coverage Programs Measuring Efficiency The Social Context of Evaluation
Abstract: Programs, Policies and Evaluation Tailoring Evaluations Identifying Issues and Formulating Questions Assessing the Need for a Program Expressing and Assessing Program Theory Monitoring Program Process and Performance Strategies for Impact Assessment Randomized Designs for Impact Assessment Quasi-Experimental Impact Assessments Assessment of Full-Coverage Programs Measuring Efficiency The Social Context of Evaluation

300 citations



Journal ArticleDOI
TL;DR: The method of residual maximum likelihood (REML) is used to illustrate some of the many uses of variance components, including recovery of inter-block information and efficient combination of results from several trials, or series of trials.
Abstract: Methods of estimating components of variance are described and compared. The method of residual maximum likelihood (REML) is used to illustrate some of the many uses of variance components. Particular attention is paid to the analysis of unbalanced data, including recovery of inter-block information and efficient combination of results from several trials, or series of trials. A powerful and easy-to-use computer program, REML, is available for use on most modern minicomputers.

214 citations
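As a pointer to what variance-component estimation involves, here is a minimal Python sketch for a balanced one-way random-effects layout, where the familiar ANOVA estimators coincide with REML whenever the between-group estimate is non-negative; the group sizes, true component values and data are made up, and the REML program mentioned in the abstract is not used or reproduced.

```python
import numpy as np

# Balanced one-way random-effects model: y_ij = mu + a_i + e_ij.
# For balanced data the ANOVA estimators below agree with REML when the
# between-group component estimate is non-negative (illustrative data only).
rng = np.random.default_rng(0)
k, n = 8, 5                                   # groups, observations per group
sigma2_a, sigma2_e = 2.0, 1.0                 # true components (assumed)
y = (rng.normal(0.0, np.sqrt(sigma2_a), size=(k, 1))
     + rng.normal(0.0, np.sqrt(sigma2_e), size=(k, n)))

group_means = y.mean(axis=1)
grand_mean = y.mean()

msb = n * np.sum((group_means - grand_mean) ** 2) / (k - 1)     # between-group mean square
msw = np.sum((y - group_means[:, None]) ** 2) / (k * (n - 1))   # within-group mean square

sigma2_e_hat = msw
sigma2_a_hat = max((msb - msw) / n, 0.0)
print("within-group variance :", round(sigma2_e_hat, 3))
print("between-group variance:", round(sigma2_a_hat, 3))
```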


Journal ArticleDOI
TL;DR: In this paper, the authors use the running mean and standard deviation of all observations made on the process since start-up as substitutes for the unknown true values of the process mean and variance.
Abstract: In some quality control problems, it is not known what the exact process mean and standard deviation are under control, but it is desired to determine whether there have been drifts from the conditions obtained at the process start-up. This situation is not well-covered by standard cumulative sum procedures, which generally assume known process parameters. This paper uses the running mean and standard deviation of all observations made on the process since start-up as substitutes for the unknown true values of the process mean and standard deviation. Using some theoretical properties of independence of residuals, two pairs of cusums are set up: one testing for constancy of location of the process, and the other for constancy of the spread. While the process is under control, both these cusum pairs are of approximately normal N(0, 1) quantities (and therefore are well understood), but if the location, the spread or both change, then non-centrality is introduced into one or both of the location and scale cusum pairs, and the affected cusum drifts out of control. It is shown that the procedure performs well in detecting changes in the process, even in comparison with the often utopian situation in which the process mean and variance are known exactly prior to the start of the cusum. The process parameters must be specified precisely if large numbers of spurious signals are to be avoided. While it has usually been assumed in discussion of cusums that the process specification is known exactly, there are many circumstances in which it is to some extent uncertain, and under these conditions the assumption of known mean and variance is inappropriate. A case of particular interest to the author arises in the control of assaying in a chemical laboratory for bias and precision. A good way of doing this (Mandel, 1964) is by using reference materials. A reference material is a batch of the same sort of material as is assayed in the laboratory (obtained for example from a previous consignment), portions of which are assayed regularly along with the production material. A departure of the assays of the reference material from their previous mean or standard deviation indicates a change in bias or precision. Note that the control required is simply for any change in the values obtained for the reference material. It is immaterial what the true assay of the reference material (the ideal target process mean) actually is, and the quality control system should not be predicated on the assumption of an exact knowledge of the process mean. This indeterminacy presents

199 citations
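The following is a simplified Python sketch of the self-starting idea described above, assuming a standard cusum reference value of k = 0.5 and made-up data; it standardises each new observation by the running mean and standard deviation of all earlier observations and accumulates an upper and a lower location cusum. The paper's exact transformation to normality and the companion cusums for the spread are not reproduced here.

```python
import numpy as np

def self_starting_cusums(x, k=0.5):
    """Simplified self-starting CUSUM for location: standardise each new
    observation by the running mean and SD of all *previous* observations,
    then accumulate upper and lower CUSUMs of the (approximately N(0,1))
    standardised values.  Illustrative only."""
    x = np.asarray(x, dtype=float)
    s_hi = s_lo = 0.0
    upper, lower = [], []
    for i in range(2, len(x)):           # need at least 2 prior points for an SD
        prev = x[:i]
        z = (x[i] - prev.mean()) / prev.std(ddof=1)
        s_hi = max(0.0, s_hi + z - k)     # detects upward drift in location
        s_lo = min(0.0, s_lo + z + k)     # detects downward drift in location
        upper.append(s_hi)
        lower.append(s_lo)
    return np.array(upper), np.array(lower)

# Example: in-control start-up, then a step change in the process mean
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(10.0, 1.0, 50), rng.normal(11.0, 1.0, 50)])
up, lo = self_starting_cusums(data)
print("max upper CUSUM:", round(up.max(), 2))   # grows once the shift begins
```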





Journal ArticleDOI
TL;DR: Novel numerical integration and interpolation methods, which exploit the opportunities offered by modern interactive computing and graphics facilities, are outlined and illustrated.
Abstract: One of the main obstacles to the routine implementation of Bayesian methods has been the absence of efficient algorithms for carrying out the computational tasks implicit in the Bayesian approach. In this paper, recent progress towards overcoming this problem is reviewed. In particular, novel numerical integration and interpolation methods, which exploit the opportunities offered by modern interactive computing and graphics facilities, are outlined and illustrated.
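As a toy illustration of the kind of computational task involved, the sketch below evaluates a one-parameter posterior by simple grid integration in Python; the binomial data and logit-normal prior are invented for the example, and the adaptive quadrature and interpolation schemes reviewed in the paper are not reproduced.

```python
import numpy as np

# Toy posterior summary by deterministic numerical integration: binomial
# likelihood with a (hypothetical) logit-normal prior on the success
# probability theta.  A plain grid rule stands in for the adaptive schemes.
n, y = 20, 14                                    # made-up data
theta = np.linspace(1e-6, 1 - 1e-6, 2001)
dtheta = theta[1] - theta[0]

logit = np.log(theta / (1 - theta))
prior = np.exp(-0.5 * (logit / 1.5) ** 2) / (theta * (1 - theta))  # logit-normal kernel
likelihood = theta ** y * (1 - theta) ** (n - y)

unnorm = prior * likelihood
norm_const = np.sum(unnorm) * dtheta             # approximates the normalising integral
posterior = unnorm / norm_const

post_mean = np.sum(theta * posterior) * dtheta
print("posterior mean of theta:", round(post_mean, 3))
```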


Journal ArticleDOI
TL;DR: This work presents some fundamental objections to the Monte Carlo method of numerical integration, a method long known to numerical analysts and brought to the attention of the Bayesian statistics community by Kloek & van Dijk (1978).
Abstract: We present some fundamental objections to the Monte Carlo method of numerical integration. As Bayesian inference is applied to more and more complex and realistic models combined with more and more realistic prior distributions, we become increasingly dependent on numerical methods to explore the resulting complex, high-dimensional posterior distributions. In particular, there has been considerable interest lately in techniques of numerical integration. The Monte Carlo method, which has long been known to numerical analysts, was brought to the attention of the Bayesian statistics community by Kloek & van Dijk (1978), although Stewart had been using it in this context several years earlier; see Stewart & Johnson (1971). There are many variations and elaborations of Monte Carlo integration, but for our purposes it is enough to study the most basic problem. Consider the one-dimensional integral $k = \int_{-\infty}^{\infty} f(x)\,dx$.
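For readers unfamiliar with the method under criticism, here is a basic importance-sampling Monte Carlo estimate of an integral of the form $k = \int f(x)\,dx$, written in Python with an invented integrand and a normal importance density; the estimate and its Monte Carlo standard error are exactly the quantities whose reliability the paper questions.

```python
import numpy as np

# Monte Carlo (importance sampling) estimate of k = integral of f(x) dx.
# f is an unnormalised kernel made up for illustration; g is a normal
# importance density; k is estimated by the sample mean of f(X)/g(X), X ~ g.
rng = np.random.default_rng(1)

def f(x):                        # unnormalised integrand (invented)
    return np.exp(-0.5 * (x - 1.0) ** 2) * (1.0 + 0.1 * np.sin(3.0 * x))

mu, sigma = 1.0, 1.5             # importance density N(mu, sigma^2)
x = rng.normal(mu, sigma, size=100_000)
g = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

weights = f(x) / g
k_hat = weights.mean()
se_hat = weights.std(ddof=1) / np.sqrt(len(x))
print(f"k_hat = {k_hat:.4f} +/- {se_hat:.4f}")   # the Monte Carlo error at issue
```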

Journal ArticleDOI
TL;DR: In this paper, new models for multiple time series, based on matrix-variate normal extensions of the dynamic linear model (DLM), are introduced and illustrated in an application to international currency exchange rate data; they provide a tractable, sequential procedure for estimating unknown covariance structure between series.
Abstract: New models for multiple time series are introduced and illustrated in an application to international currency exchange rate data. The models, based on matrix-variate normal extensions of the dynamic linear model (DLM), provide a tractable, sequential procedure for estimation of unknown covariance structure between series. A principal components analysis is carried out providing a basis for easy model assessment. A practically important elaboration of the model incorporates time- variation in covariance matrices.
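A minimal univariate sketch of the sequential DLM updating that underlies such models is given below, assuming a local-level (random-walk) state with known observation and evolution variances and made-up exchange-rate data; the matrix-variate extension and the estimation of cross-series covariance structure described in the paper are not reproduced.

```python
import numpy as np

# Sequential (Kalman-style) filtering for a univariate local-level DLM,
# the scalar building block that matrix-variate models extend.
def dlm_filter(y, m0=0.0, C0=1e6, V=1.0, W=0.1):
    m, C = m0, C0
    means, variances = [], []
    for obs in y:
        a, R = m, C + W              # prior for the state at time t
        f, Q = a, R + V              # one-step forecast and its variance
        A = R / Q                    # adaptive coefficient
        m, C = a + A * (obs - f), R - A * A * Q
        means.append(m)
        variances.append(C)
    return np.array(means), np.array(variances)

rng = np.random.default_rng(2)
rates = 1.5 + np.cumsum(rng.normal(0, 0.02, 200))    # made-up exchange rate series
level, level_var = dlm_filter(rates, V=0.02**2, W=0.01**2)
print("final filtered level:", round(level[-1], 3))
```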

Journal ArticleDOI
TL;DR: An edited volume presenting a systems framework for reliability and risk in water engineering, covering hydraulic structures, risk-based assessment of dam safety, water supply systems with uncorrelated and correlated inputs, and the use of Bayesian and multicriterion analysis in decision making.
Abstract: I. Introduction.- Water Engineering Reliability and Risk: A System Framework.- II. Reliability and Risk in Structures.- II.1 Design Concepts Based on Risk and Reliability of Structures for Uncorrelated Inputs.- Reliability in Hydraulic Design.- Engineering Risk in Regional Drought Studies.- Incidents and Failures of Hydraulic Structures Subject to Independent Floods.- Reliability of Hydraulic Structures Possessing Random Loading and Resistance.- Probabilistic Design of Water-Retaining Structures.- II.2 Risk Based Assessment of Dam Safety.- Use of Risk-Based Analysis in Making Decisions on Dam Safety.- A Comparison of Methods for Risk Assessment of Dams.- Risk Analysis Considerations for Dam Safety.- Consequences of the Failure of a Water Storage System.- III. Reliability and Risk in Water Supply Systems.- III.1 Water Supply Systems: Uncorrelated Inputs.- Reliability of Water Supply Systems.- Application of Models for Reliability Assessment in Reservoir Operation.- III.2 Water Supply Systems: Correlated Inputs.- The Return Period of a Reservoir System Failure.- Reliability in Multipurpose Reservoir Operation: Case Studies with Correlated Inflows.- Engineering Risk in Flood Studies Using Multivariate Partial Duration Series.- Conjunctive Use of Surface and Groundwater in a Problem of Environmental Protection: A Case in Salento Peninsula in Southern Italy.- IV. Reliability and Risk as Factors in Decision Making.- IV. 1 Elements of Uncertainty Analysis for Decision-Making.- The Impact of Catchment Modeling on Hydrologic Reliability.- Empirical and Causal Models in Hydrologic Reliability Analysis.- Elements of Bayesian Analysis of Uncertainty in Hydrological Reliability and Risk Models.- IV. 2 Applications and Advances.- Reliability Estimation of Underground Water Control Systems Under Natural and Sample Uncertainty.- Target-Related Reliability in Surface Water System Operation.- Bayesian Analysis: Further Advances and Applications.- IV. 3 Multicriterion and Conflict Analysis.- Risk Aspects in the Determination of Optimal Cropping Patterns Hiessl.- Reliability Aspects of Multicriterion Watershed Management.- A Min-Max Operating Rule for the Management of a Multipurpose Reservoir.- Formal Incorporation of Risk into Conflict Analysis.


Journal ArticleDOI
TL;DR: It is argued that, when a plaintiff in a civil suit must establish several distinct issues, there is no need to depart from traditional understandings of probability, and that separate pieces of evidence about the individual issues can be combined to modify the probability of the case as a whole.
Abstract: If, in a civil suit, the plaintiff has to establish several distinct issues in order to succeed, is it necessary for the probability of their conjunction to exceed one half, or is it sufficient to establish each component issue with probability exceeding one half? This paper analyses the way in which separate pieces of evidence about the individual issues can be combined to modify the probability of the case as a whole. Contrary to a conclusion of Jonathan Cohen, it is argued that there is no need to depart from traditional understandings of probability in this problem.
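A one-line numerical illustration of the underlying puzzle, with made-up probabilities: each of two independent issues can be established with probability exceeding one half while their conjunction is not.

```python
# Two independent issues, each established with probability 0.7: each exceeds
# one half, yet their conjunction falls short of it (invented numbers).
p_issue_1 = 0.7
p_issue_2 = 0.7
p_conjunction = p_issue_1 * p_issue_2   # assumes the issues are independent
print(p_conjunction)                    # 0.49 < 0.5
```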

Journal ArticleDOI
TL;DR: In this article, the author focuses on the problem of evaluating forensic science evidence against two narrower alternatives: C, the event that the suspect was at the crime scene, and C̄, the event that the suspect was not at the crime scene, and uses Bayes' theorem to identify the most appropriate range of questions which the scientist should address in order to be of greatest assistance to the investigator or to the court.
Abstract: The debate about whether or not Bayesian inference provides a model for the process of assessing evidence in a court of law has generated a good amount of literature and some strongly voiced opinions. Let me start by making my own position clear: I am a forensic scientist and I am interested in the modelling of the legal process only to the extent that it enables the role of the scientist's evidence to be defined. The broader arguments centre on concepts such as the odds on guilt, but my professional colleagues jealously guard their detachment from the deliberations of guilt or otherwise and, personally, I am reluctant to handle equations which contain guilt probabilities. I shall concentrate on the problem of evaluating forensic science evidence against two narrower alternatives: C, the event that the suspect was at the crime scene, and C̄, the event that the suspect was not at the crime scene. The great advantage of Bayesian inference is that it enables us to identify and, in principle, to answer the most appropriate range of questions which the scientist should address to be of greatest assistance to the investigator or to the court. Bayes' theorem shows us that, while the investigator or court is concerned with questions of the type: "what is the probability that the suspect was at the crime scene?", the scientist, through the likelihood ratio, should address questions of the type "what is the probability of the evidence given that the suspect was at the crime scene?" and "what is the probability of the evidence given that the suspect was not at the crime scene?". While this might appear almost self-evident to a practising Bayesian, it is only recently that forensic scientists have begun to be converted. Nevertheless there is now a growing realisation that Bayesian methods have something to offer and I am optimistic that over the coming years we can do a lot to fan that small flame. I propose next to describe the essentials of the forensic science transfer problem; then to describe briefly some of the work that has been done so far on developing solutions; then to look at ways in which developments could take place from here, emphasising the major challenges that are to be faced.
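The likelihood-ratio reasoning described above can be shown with a small worked example; the probabilities below are invented for illustration and are not taken from the paper.

```python
# The scientist reports Pr(evidence | suspect at scene) and
# Pr(evidence | suspect not at scene); the court combines their ratio with its
# own prior odds via Bayes' theorem.  All numbers are hypothetical.
p_evidence_given_present = 0.9
p_evidence_given_absent = 0.01
likelihood_ratio = p_evidence_given_present / p_evidence_given_absent   # 90

prior_odds = 1 / 100                 # the court's prior odds, not the scientist's concern
posterior_odds = likelihood_ratio * prior_odds
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"LR = {likelihood_ratio:.0f}, posterior Pr(at scene) = {posterior_prob:.3f}")
```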

Journal ArticleDOI
TL;DR: A methods volume covering covariance structure models, Q technique, nonparametric multidimensional scaling, logit and probit models for analyses with qualitative dependent variables, categorical regression for multivariate contingency tables, and dynamic techniques such as interrupted time series and transfer function analysis.
Abstract: PART ONE: MEASUREMENT STRATEGIES Introduction The Analysis of Covariance Structure Models Q Technique and Method Nonparametric Multidimensional Scaling and Individual Difference Scaling PART TWO: TECHNIQUES FOR NONINTERVAL DATA Introduction Logit and Probit Models for Multivariance Analysis with Qualitative Dependent Variables Using Categorical Regression to Analyze Multivariate Contingency Tables PART THREE: DYNAMIC ANALYSIS Introduction Interrupted Time Series Transfer Function Analysis The Uses of Limits and Possibility in Social Science Research

Journal ArticleDOI
TL;DR: A collection covering bimeasures and nonstationary processes (Chang & Rao), Besicovitch-Orlicz spaces of almost periodic functions, diffusion processes in Hilbert space and likelihood ratios, chains in Banach spaces, and two-parameter stochastic differential equations.
Abstract: Bimeasures and Nonstationary Processes (Derek K. Chang & M. M. Rao). Bimeasures. Stochastic Analysis. Applications. Besicovitch-Orlicz Spaces of Almost Periodic Functions (Theodore R. Hillmann). Diffusion Processes in Hilbert Space and Likelihood Ratios (Alan Krinik). Chains in Banach Spaces (Stephen V. Noltie). Two Parameter Stochastic Differential Equations (J. Yeh). Index.


Journal ArticleDOI
TL;DR: In this article, new dynamic Bayesian models for survival data analysis are applied in a study of contributory factors to unemployment, which treat survival data as time series data, reflecting the need to model time varying relationships with explanatory variables that cannot be accommodated within standard proportional hazard models.
Abstract: New dynamic Bayesian models for survival data analysis are applied in a study of contributory factors to unemployment. The models treat survival data as time series data, reflecting the need to model time-varying relationships with explanatory variables that cannot be accommodated within standard proportional hazard models. In the present study, prior expectations that the effects of various socio-economic variables on unemployment spells are time dependent are verified. Assessments of these effects are given and uses of the model in predicting unemployment are illustrated.


Journal ArticleDOI
TL;DR: The purpose and environment of Bayesian forecasting systems are described and reviewed, stressing foundational concepts, component models, the discount concept and intervention, and interactive analyses using a purpose-built suite of APL functions.
Abstract: We describe and review the purpose and environment of Bayesian forecasting systems, stressing foundational concepts, component models, the discount concept and intervention, and interactive analyses using a purpose-built suite of APL functions.

Journal ArticleDOI
TL;DR: In this article, the generalized least squares estimator was used to test linear hypotheses confidence regions for linear parameters and regression functions using Bayesian methods and structural inference experimental design methods, and the confidence regions were used to model causal relationships.
Abstract: Statistical problems in modelling causal relationships; estimating linear parameters; estimating linear parameters using additional information; admissibility and improvements of the generalized least squares estimator; testing linear hypotheses; confidence regions for linear parameters and regression functions; Bayesian methods and structural inference; experimental design methods.
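Since the generalized least squares estimator features in the chapter list above, a small self-contained Python illustration is given below, with a made-up heteroscedastic error covariance; it simply evaluates $\hat\beta = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y$.

```python
import numpy as np

# Generalized least squares with a known (here invented) diagonal error
# covariance Omega: beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y.
rng = np.random.default_rng(4)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
omega_diag = rng.uniform(0.5, 3.0, size=n)            # unequal error variances
y = X @ beta_true + rng.normal(0.0, np.sqrt(omega_diag))

Omega_inv = np.diag(1.0 / omega_diag)
beta_gls = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
print("GLS estimate:", beta_gls.round(3))
```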

Journal ArticleDOI
TL;DR: Problems are considered in which observations can be obtained only from certain selected portions of the population.
Abstract: Problems are considered in which observations can be obtained only from certain selected portions of the population.


Journal ArticleDOI
TL;DR: A summary of Dr Deming's crucial 14 Points for Management is presented, abstracted and adapted from a number of versions which have appeared over the years.
Abstract: Dr W. Edwards Deming modestly describes himself as a 'consultant in statistical studies'. Others have called him the father of the third wave of the Industrial Revolution. It is now becoming widely accepted that the dramatic turnround in Japan's industrial fortunes dates from Dr Deming's visit, at the invitation of JUSE, in mid-1950. His philosophy combines widespread use of statistical ideas and methods throughout organisations with an approach to management which is, for the most part, diametrically opposed to traditional and current practice in the Western world. The management approach creates an environment where the importance of statistical practice is recognised to an otherwise unprecedented extent. This approach is not normally taught in management and business schools, and so the statistical consultant needs to become familiar with, and to encourage the adoption of, the management philosophy as much as the statistical aspects. In this paper, a summary of Dr Deming's crucial 14 Points for Management is presented, abstracted and adapted from a number of versions which have appeared over the years.

Journal ArticleDOI
TL;DR: A proceedings volume on laser interactions with surfaces, covering topics such as electronic structure at semiconductor surfaces and interfaces, Monte Carlo simulations of surface reactions, laser-induced desorption from insulators and compound semiconductors, and laser-driven processes including ablation, oxidation and electroplating.
Abstract: Electronic Structure at Semiconductor Surfaces and Interfaces.- Molecule-Surface Interaction: Vibrational Excitations.- Melting and Surfaces.- Short-Pulse Surface Interactions.- Nonequilibrium Phase Transitions.- Dislocation Microstructures in Nonequilibrium Materials.- Transport Properties of Laser-Generated Non-Equilibrium Plasmas in Semiconductors.- Nonequilibrium Phases and Phase Transitions in the Surface Melt Morphology of Laser Irradiated Silicon.- Adsorption, Desorption, and Surface Reactions.- Theory of Spectroscopy and Dynamics in Laser-Irradiated Adspecies-Surface Systems.- Monte-Carlo Simulations of Surface Reactions.- Mechanisms of Laser-Induced Desorption from Insulators and Compound Semiconductors.- Gas-Surface Interactions Stimulated by Laser Radiation: Bases and Applications.- Photochemistry of Transition Metal Complexes.- Kinetics of Laser-Induced Pyrolytic Chemical Processes and the Problem of Temperature Measurements.- Diffusion in Liquids.- The Solid-Solid Interface Under Laser-Irradiation.- Photochemistry with Particulate Semiconductors and Electrodes.- Laser Enhanced Electroplating.- UV Laser Ablation of Polymers.- Thermochemical Laser Lithography on the Basis of Local Oxidation of Thin Metal Films.- Laser Induced Metal Oxidation.- Optically Enhanced Oxidation.- U.V. Light Induced Oxidation of GaAs.- Participants.

Journal ArticleDOI
TL;DR: In this paper, it was shown that the coefficients of observation-specific dummy variables cannot be estimated in a large class of dichotomous choice models, which includes probit as well as logit models.
Abstract: In an article recently published in this journal, Oksanen demonstrated that the coefficients of observation-specific dummy variables are not estimable in dichotomous logit models. This paper extends Oksanen's result in two ways. First, it is shown that the coefficients of observation-specific dummy variables cannot be estimated in a large class of dichotomous choice models, which includes probit as well as logit models. Second, it is shown that the dummy variable need not be observation-specific. The dummy variable can describe a group with more than one member as long as each member of the group makes the same choice. Oksanen's result is thus a special case in which the dichotomous choice model is the logit model, and the observation-specific dummy variable represents a group with one member. In an article recently published in this journal, Oksanen (1986) explores the use of observation-specific dummy variables in linear probability and logit models. An observation-specific dummy variable takes on the value 1 for one observation and 0 for all others. Oksanen demonstrates that, while the coefficients of observation-specific dummy variables can be estimated in linear probability models, the coefficients are undefined and cannot be estimated in logit models. The purpose of this paper is to extend Oksanen's result in two ways. The first is to demonstrate that the coefficients of observation-specific dummy variables cannot be estimated for an entire class of dichotomous choice models which includes logit models as a special case. Second, it is shown that the coefficients of more generally defined dummy variables cannot be estimated in dichotomous choice models. Oksanen's result that coefficients of observation-specific dummy variables cannot be estimated is extended to a large class of dichotomous choice models. The class includes all dichotomous choice models in which the probability of choice can be expressed as the integral of a continuous density function, f(t), that is positive for all t. This stipulation can also be stated in terms of the cumulative distribution function F: no finite numbers a and b can exist such that F(a) = 0 and F(b) = 1. This class includes probit models as well as logit models.
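A small numerical check of the non-estimability result, with made-up data: for a logit model containing an observation-specific dummy, the log-likelihood keeps rising as the dummy coefficient grows, so no finite maximum exists.

```python
import numpy as np

# Why an observation-specific dummy has no finite ML estimate in a logit model:
# as its coefficient gamma grows, the fitted probability for that single
# observation approaches its observed outcome and the log-likelihood keeps
# increasing, so no maximum is attained.  Data are invented for illustration.
rng = np.random.default_rng(5)
n = 20
x = rng.normal(size=n)
y = (x + rng.normal(size=n) > 0).astype(float)
d = np.zeros(n)
d[0], y[0] = 1.0, 1.0            # dummy marks observation 0, which chose y = 1

def logit_loglik(beta, gamma):
    eta = beta * x + gamma * d
    return np.sum(y * eta - np.log1p(np.exp(eta)))    # logit log-likelihood

for gamma in [0.0, 2.0, 5.0, 10.0, 20.0]:
    print(gamma, round(logit_loglik(1.0, gamma), 4))  # strictly increasing in gamma
```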