Author

D. V. Hinkley

Bio: D. V. Hinkley is an academic researcher from Stanford University. The author has contributed to research in topics including Sampling distribution and Asymptotic distribution. The author has an h-index of 8 and has co-authored 13 publications receiving 1842 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, the problem of making inference about the point in a sequence of zero-one variables at which the binomial parameter changes is discussed, and the asymptotic distribution of the maximum likelihood estimate of the change-point is derived in computable form using random walk results.
Abstract: The report discusses the problem of making inference about the point in a sequence of zero-one variables at which the binomial parameter changes. The asymptotic distribution of the maximum likelihood estimate of the change-point is derived in computable form using random walk results. The asymptotic distributions of likelihood ratio statistics are obtained for testing hypotheses about the change-point. Some exact numerical results for these asymptotic distributions are given and their accuracy as finite sample approximations is discussed.
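The estimator described in this abstract maximizes the likelihood over candidate change-points. Below is a minimal sketch of that profile-likelihood search for a single change in a Bernoulli (zero-one) sequence; the function names and the simulated data are illustrative assumptions, not the paper's own code or numerical results.

```python
# Sketch: profile-likelihood estimation of a change-point in a 0/1 sequence.
# The success probabilities before and after the change are profiled out at
# their segment MLEs; tau is the index after which the parameter changes.
import numpy as np

def bernoulli_loglik(x):
    """Maximized Bernoulli log-likelihood for a segment of 0/1 values."""
    n, s = len(x), x.sum()
    p = s / n
    if p in (0.0, 1.0):          # degenerate segment contributes 0
        return 0.0
    return s * np.log(p) + (n - s) * np.log(1.0 - p)

def changepoint_mle(x):
    """Return tau maximizing the profile log-likelihood of a single change."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_tau, best_ll = None, -np.inf
    for tau in range(1, n):      # change occurs after observation tau
        ll = bernoulli_loglik(x[:tau]) + bernoulli_loglik(x[tau:])
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau, best_ll

# Example: success probability jumps from 0.2 to 0.7 after observation 60
rng = np.random.default_rng(0)
x = np.concatenate([rng.binomial(1, 0.2, 60), rng.binomial(1, 0.7, 40)])
print(changepoint_mle(x))
```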

766 citations

Journal ArticleDOI
TL;DR: In this paper, the authors examine a secondary aspect of cumulative sum schemes: estimation of the point in a sequence of normal random variables at which departure from initial conditions has taken place, where initially the mean θ₀ and the variance σ² are known.
Abstract: SUMMARY The point of change in mean in a sequence of normal random variables can be estimated from a cumulative sum test scheme. The asymptotic distribution of this estimate and associated test statistics are derived and numerical results given. The relation to likelihood inference is emphasized. Asymptotic results are compared with empirical sequential results, and some practical implications are discussed. The cumulative sum scheme for detecting distributional change in a sequence of random variables is a well-known technique in quality control, dating from the paper of Page (1954) to the recent expository account by van Dobben de Bruyn (1968). Throughout the literature on cumulative sum schemes the emphasis is placed on tests of departure from initial conditions. The purpose of this paper is to examine a secondary aspect: estimation of the index τ in a sequence {x_t} at which the departure from initial conditions has taken place. The work is closely related to an earlier paper by Hinkley (1970), in which maximum likelihood estimation and inference were discussed. We consider specifically sequences of normal random variables x_1, ..., x_T, say, where initially the mean θ₀ and the variance σ² are known. A cumulative sum (cusum) scheme is used to detect possible change in mean from θ₀, and for simplicity suppose that it is a one-sided scheme for detecting decrease in mean. Then the procedure is to compute the cumulative sums.
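As a rough illustration of the idea in this abstract, the sketch below cumulates deviations from the known initial mean θ₀, signals a decrease when the drop from the running maximum exceeds a threshold h, and estimates the change-point as the index at which that running maximum was attained. The alarm rule and the names (theta0, h) follow a standard one-sided cusum formulation and are assumptions here, not the paper's exact scheme.

```python
# Sketch: one-sided cusum for a decrease in mean, with a change-point estimate.
import numpy as np

def cusum_decrease(x, theta0, h):
    s = np.cumsum(x - theta0)                    # cumulative sums about the initial mean
    running_max = np.maximum.accumulate(s)
    alarm = np.flatnonzero(running_max - s >= h) # times at which the scheme signals
    if alarm.size == 0:
        return None, None                        # no change signalled
    t_alarm = int(alarm[0])
    tau_hat = int(np.argmax(s[: t_alarm + 1]))   # index where the cusum peaked before the alarm
    return t_alarm, tau_hat

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(-1.0, 1.0, 50)])
print(cusum_decrease(x, theta0=0.0, h=5.0))
```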

473 citations

Journal ArticleDOI
TL;DR: In this paper, a procedure for obtaining maximum likelihood estimates and likelihood confidence regions in the intersecting two-phase linear regression model is presented, illustrated on a small set of data, and the distributional properties are examined empirically.
Abstract: Procedures are outlined for obtaining maximum likelihood estimates and likelihood confidence regions in the intersecting two-phase linear regression model. The procedures are illustrated on a small set of data, and the distributional properties are examined empirically.
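Under normal errors, maximum likelihood estimation in a two-phase linear regression amounts to minimizing the residual sum of squares over candidate split points. The sketch below profiles over the split but fits the two lines separately, without imposing the continuity (intersection) constraint of the model in the paper; treat it as a simplified, assumed variant rather than the paper's procedure.

```python
# Sketch: two-phase linear regression fitted by searching over the split point.
import numpy as np

def two_phase_fit(x, y):
    """Return (split index, total residual sum of squares, [coefs phase 1, coefs phase 2])."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    best = None
    for k in range(2, n - 2):                    # at least two points in each phase
        rss, coefs = 0.0, []
        for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
            X = np.column_stack([np.ones_like(xs), xs])
            beta, res, *_ = np.linalg.lstsq(X, ys, rcond=None)
            rss += float(res[0]) if res.size else float(np.sum((ys - X @ beta) ** 2))
            coefs.append(beta)
        if best is None or rss < best[1]:
            best = (k, rss, coefs)
    return best

# Example: slope 0.5 up to x = 4, flat at 3.0 afterwards, plus noise
x = np.linspace(0, 10, 40)
y = np.where(x < 4, 1.0 + 0.5 * x, 3.0) + np.random.default_rng(2).normal(0, 0.2, 40)
split, rss_val, (beta1, beta2) = two_phase_fit(x, y)
print(split, beta1, beta2)
```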

306 citations

Journal ArticleDOI
Abstract: Statistical Aspects of ARCH and Stochastic Volatility; Likelihood-Based Inference for Cointegration of Some Non-Stationary Time Series; Forecasting in Macroeconomics; Longitudinal Panel Data: An Overview of Current Methodology.

176 citations

Journal ArticleDOI
TL;DR: In this paper, a classification for the special situation where observations after a certain time no longer come from the initial population is considered, and the discussion focuses on inference about that time, often called the change-point.
Abstract: SUMMARY Classification for the special situation where observations after a certain time no longer come from the initial population is considered. The discussion focuses on inference about that time, often called the change-point. Some useful approximations are derived for the distributions of change-point statistics, and large-sample results are established for nuisance-parameter situations.

72 citations


Cited by
Journal ArticleDOI
TL;DR: In this article, the authors considered tests for parameter instability and structural change with unknown change point, and the results apply to a wide class of parametric models that are suitable for estimation by generalized method of moments procedures.
Abstract: This paper considers tests for parameter instability and structural change with unknown change point. The results apply to a wide class of parametric models that are suitable for estimation by generalized method of moments procedures. The asymptotic distributions of the test statistics considered here are nonstandard because the change point parameter only appears under the alternative hypothesis and not under the null. The tests considered here are shown to have nontrivial asymptotic local power against all alternatives for which the parameters are nonconstant. The tests are found to perform quite well in a Monte Carlo experiment reported elsewhere. Copyright 1993 by The Econometric Society.
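To make the test idea concrete, here is a minimal sketch of a sup-F (sup-Wald) statistic for a single break at an unknown date in a linear regression: for each candidate break within a trimmed range, a Chow-type F statistic compares the split fit with the pooled fit, and the test statistic is the supremum over candidates. The trimming fraction and data are assumptions for illustration, and the nonstandard critical values discussed in the paper are not computed here.

```python
# Sketch: sup-F statistic over candidate break dates in a linear regression.
import numpy as np

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def sup_f(X, y, trim=0.15):
    """Return (sup-F statistic, estimated break index) over the trimmed range."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    n, k = X.shape
    rss_pooled = rss(X, y)
    lo, hi = int(n * trim), int(n * (1 - trim))
    stats = []
    for b in range(lo, hi):
        rss_split = rss(X[:b], y[:b]) + rss(X[b:], y[b:])
        stats.append(((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k)))
    return max(stats), lo + int(np.argmax(stats))

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = np.r_[X[:120] @ np.array([0.0, 1.0]), X[120:] @ np.array([1.0, 1.5])]
y = y + rng.normal(size=n)
print(sup_f(X, y))
```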

4,348 citations

Journal ArticleDOI
TL;DR: In this paper, the stability over time of regression relationships is investigated using recursive residuals, defined to be uncorrelated with zero means and constant variance, and tests based on the cusum and cusum of squares of recursive residuals are developed.
Abstract: Methods for studying the stability over time of regression relationships are considered. Recursive residuals, defined to be uncorrelated with zero means and constant variance, are introduced and tests based on the cusum and cusum of squares of recursive residuals are developed. Further techniques based on moving regressions, in which the regression model is fitted from a segment of data which is moved along the series, and on regression models whose coefficients are polynomials in time are studied. The Quandt log-likelihood ratio statistic is considered. Emphasis is placed on the use of graphical methods. The techniques proposed have been embodied in a comprehensive computer program, TIMVAR. Use of the techniques is illustrated by applying them to three sets of data.
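The sketch below illustrates the construction described in this abstract: recursive residuals computed by refitting the regression on each growing initial segment, followed by their cusum. The variance estimate and the simulated data are assumptions; the significance bounds and the TIMVAR program itself are not reproduced.

```python
# Sketch: recursive residuals and the cusum of recursive residuals.
import numpy as np

def recursive_residuals(X, y):
    X, y = np.asarray(X, float), np.asarray(y, float)
    n, k = X.shape
    w = []
    for t in range(k, n):
        Xt, yt = X[:t], y[:t]                      # fit on the first t observations
        beta, *_ = np.linalg.lstsq(Xt, yt, rcond=None)
        xt = X[t]
        v = 1.0 + xt @ np.linalg.inv(Xt.T @ Xt) @ xt
        w.append((y[t] - xt @ beta) / np.sqrt(v))  # standardized one-step prediction error
    return np.asarray(w)

def cusum_statistic(X, y):
    w = recursive_residuals(X, y)
    sigma = w.std(ddof=1)                          # crude scale estimate (assumption)
    return np.cumsum(w) / sigma                    # plotted against t and compared with bounds

rng = np.random.default_rng(4)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
y[60:] += 1.5                                      # intercept shift after t = 60
print(cusum_statistic(X, y)[-5:])
```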

4,125 citations

Journal ArticleDOI
TL;DR: A joinpoint regression model is applied to describe continuous changes in the recent trend and the grid-search method is used to fit the regression function with unknown joinpoints assuming constant variance and uncorrelated errors.
Abstract: The identification of changes in the recent trend is an important issue in the analysis of cancer mortality and incidence data. We apply a joinpoint regression model to describe such continuous changes and use the grid-search method to fit the regression function with unknown joinpoints, assuming constant variance and uncorrelated errors. We find the number of significant joinpoints by performing several permutation tests, each of which has a correct significance level asymptotically. Each p-value is found using Monte Carlo methods, and the overall asymptotic significance level is maintained through a Bonferroni correction. These tests are extended to the situation with non-constant variance to handle rates with Poisson variation and possibly autocorrelated errors. The performance of these tests is studied via simulations, and the tests are applied to U.S. prostate cancer incidence and mortality rates.
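The building block of this model is a continuous piecewise-linear fit with the joinpoint located by grid search. A minimal sketch for a single joinpoint under constant variance follows; the grid, the broken-line basis, and the simulated rates are assumptions, and the permutation test for the number of joinpoints is not reproduced.

```python
# Sketch: grid-search fit of one joinpoint in a continuous piecewise-linear model.
import numpy as np

def fit_one_joinpoint(x, y, grid):
    """Search candidate joinpoints on `grid`, minimizing the residual sum of squares."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    best = None
    for tau in grid:
        # continuous broken-line basis: intercept, x, and (x - tau)_+
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - tau, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        if best is None or rss < best[0]:
            best = (rss, tau, beta)
    return best

# Example: trend rises until 1995, then declines, with a little noise
years = np.arange(1980, 2011, dtype=float)
rate = np.where(years < 1995, 2.0 + 0.10 * (years - 1980), 3.5 - 0.05 * (years - 1995))
rate = rate + np.random.default_rng(5).normal(0, 0.05, years.size)
print(fit_one_joinpoint(years, rate, grid=years[2:-2]))
```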

3,950 citations

Journal ArticleDOI
TL;DR: A unified framework for the design and performance analysis of algorithms for solving change detection problems is described, and links with the analytical redundancy approach to fault detection in linear systems are established.
Abstract: This book is downloadable from http://www.irisa.fr/sisthem/kniga/. Many monitoring problems can be stated as the problem of detecting a change in the parameters of a static or dynamic stochastic system. The main goal of this book is to describe a unified framework for the design and the performance analysis of algorithms for solving these change detection problems. The book also contains the key mathematical background necessary for this purpose. Finally, links with the analytical redundancy approach to fault detection in linear systems are established. We call an abrupt change any change in the parameters of the system that occurs either instantaneously or at least very fast with respect to the sampling period of the measurements. Abrupt changes by no means refer to changes with large magnitude; on the contrary, in most applications the main problem is to detect small changes. Moreover, in some applications, the early warning of small - and not necessarily fast - changes is of crucial interest in order to avoid the economic or even catastrophic consequences that can result from an accumulation of such small changes. For example, small faults arising in the sensors of a navigation system can result, through the underlying integration, in serious errors in the estimated position of the plane. Another example is the early warning of small deviations from the normal operating conditions of an industrial process. The early detection of slight changes in the state of the process makes it possible to plan more adequately the periods during which the process should be inspected and possibly repaired, and thus to reduce the exploitation costs.

3,830 citations

Journal ArticleDOI
TL;DR: In this paper, the first-order term is removed from the asymptotic bias of maximum likelihood estimates by a suitable modification of the score function, and the effect is to penalize the likelihood by the Jeffreys invariant prior.
Abstract: SUMMARY It is shown how, in regular parametric problems, the first-order term is removed from the asymptotic bias of maximum likelihood estimates by a suitable modification of the score function. In exponential families with canonical parameterization the effect is to penalize the likelihood by the Jeffreys invariant prior. In binomial logistic models, Poisson log linear models and certain other generalized linear models, the Jeffreys prior penalty function can be imposed in standard regression software using a scheme of iterative adjustments to the data.
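As an illustration of the bias-reduction idea summarized here, the sketch below fits a binomial logistic regression with the Jeffreys-prior penalty via a Newton iteration on the modified score U*(β) = X'(y - p + h(1/2 - p)), where h are the leverages of the weighted hat matrix. This follows the standard presentation of the method; the paper's own route is through iterative adjustments to the data in ordinary regression software, and the data here are simulated assumptions.

```python
# Sketch: Jeffreys-prior (Firth-type) penalized logistic regression.
import numpy as np

def firth_logistic(X, y, n_iter=50, tol=1e-8):
    X, y = np.asarray(X, float), np.asarray(y, float)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                          # binomial variance weights
        XtWX = X.T @ (W[:, None] * X)
        XtWX_inv = np.linalg.inv(XtWX)
        # leverages h_i = w_i x_i' (X'WX)^{-1} x_i of the weighted hat matrix
        h = np.einsum('ij,jk,ik->i', X * W[:, None], XtWX_inv, X)
        score = X.T @ (y - p + h * (0.5 - p))      # Jeffreys-penalized score
        step = XtWX_inv @ score
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

rng = np.random.default_rng(6)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
p_true = 1.0 / (1.0 + np.exp(-(X @ np.array([-0.5, 1.0]))))
y = rng.binomial(1, p_true)
print(firth_logistic(X, y))
```

Unlike ordinary maximum likelihood, this penalized fit always returns finite coefficient estimates, even under separation, which is one practical motivation for the modification.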

3,362 citations