Author

# Kenneth D. West

Other affiliations: University of Pittsburgh, Alcatel-Lucent, University of Wisconsin-Madison

Bio: Kenneth D. West is an academic researcher from Princeton University. The author has contributed to research in topics: Physics & Polariton. The author has an h-index of 59 and has co-authored 245 publications receiving 44,956 citations. Previous affiliations of Kenneth D. West include University of Pittsburgh & Alcatel-Lucent.

Topics: Physics, Polariton, Monetary policy, Exchange rate, Interest rate

##### Papers

TL;DR: In this article, a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction is described.

Abstract: This paper describes a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction. It also establishes consistency of the estimated covariance matrix under fairly general conditions.

18,117 citations
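The construction described above is short enough to sketch directly: weight the autocovariances of the OLS scores with Bartlett-kernel weights, which makes the estimate positive semi-definite by construction. The function below is an illustrative numpy sketch under a fixed, user-chosen truncation lag; the names are mine, not the paper's notation.

```python
import numpy as np

def newey_west_cov(X, resid, lags):
    # HAC covariance of OLS coefficients: Bartlett-weighted score
    # autocovariances sandwiched between (X'X/T)^{-1} terms.
    T, k = X.shape
    u = X * resid[:, None]            # score contributions x_t * e_t
    S = u.T @ u / T                   # lag-0 term
    for l in range(1, lags + 1):
        w = 1.0 - l / (lags + 1)      # Bartlett weight, declines to zero
        G = u[l:].T @ u[:-l] / T      # lag-l autocovariance of the scores
        S += w * (G + G.T)
    A = np.linalg.inv(X.T @ X / T)
    return A @ S @ A / T
```

Because the Bartlett weights decline linearly to zero at the truncation point, the weighted sum is positive semi-definite, which a raw truncated sum of autocovariances is not guaranteed to be.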

TL;DR: In this article, a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction is described.

Abstract: This paper describes a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction. It also establishes consistency of the estimated covariance matrix under fairly general conditions.

5,822 citations

TL;DR: A nonparametric method for automatically selecting the number of autocovariances to use in computing a heteroskedasticity and autocorrelation consistent covariance matrix is proposed and proved to be asymptotically equivalent to one that is optimal under a mean squared error loss function.

Abstract: We propose a nonparametric method for automatically selecting the number of autocovariances to use in computing a heteroskedasticity and autocorrelation consistent covariance matrix. For a given kernel for weighting the autocovariances, we prove that our procedure is asymptotically equivalent to one that is optimal under a mean squared error loss function. Monte Carlo simulations suggest that our procedure performs tolerably well, although it does result in size distortions.

2,798 citations

TL;DR: In this paper, a nonparametric method for automatically selecting the number of autocovariances to use in computing a heteroskedasticity and autocorrelation consistent covariance matrix was proposed.

Abstract: We propose a nonparametric method for automatically selecting the number of autocovariances to use in computing a heteroskedasticity and autocorrelation consistent covariance matrix. For a given kernel for weighting the autocovariances, we prove that our procedure is asymptotically equivalent to one that is optimal under a mean-squared error loss function. Monte Carlo simulations suggest that our procedure performs tolerably well, although it does result in size distortions.

2,515 citations

TL;DR: In this paper, the mean squared prediction error (MSPE) from the parsimonious model is adjusted to account for the noise the larger model introduces into its forecasts. The authors draw on the nonstandard limiting distributions derived in Clark and McCracken (2001, 2005a) to argue that use of standard normal critical values will yield actual sizes close to, but a little less than, nominal size.

Abstract: Forecast evaluation often compares a parsimonious null model to a larger model that nests the null model. Under the null that the parsimonious model generates the data, the larger model introduces noise into its forecasts by estimating parameters whose population values are zero. We observe that the mean squared prediction error (MSPE) from the parsimonious model is therefore expected to be smaller than that of the larger model. We describe how to adjust MSPEs to account for this noise. We propose applying standard methods (West (1996)) to test whether the adjusted mean squared error difference is zero. We refer to nonstandard limiting distributions derived in Clark and McCracken (2001, 2005a) to argue that use of standard normal critical values will yield actual sizes close to, but a little less than, nominal size. Simulation evidence supports our recommended procedure.

1,540 citations
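The adjustment described above is straightforward to compute once both forecast series are in hand: subtract the squared forecast difference (the estimation noise term) from the larger model's squared errors, then t-test the adjusted loss differential. A sketch, with hypothetical names `f_small` (nested null model's forecasts) and `f_big` (larger model's forecasts):

```python
import numpy as np

def mspe_adjusted_t(y, f_small, f_big):
    # Adjusted MSPE comparison for nested models: the term
    # (f_small - f_big)^2 estimates the noise that the larger model's
    # extra parameters add under the null, and is netted out before
    # testing whether the loss differential has mean zero.
    e1 = y - f_small
    e2 = y - f_big
    f = e1 ** 2 - (e2 ** 2 - (f_small - f_big) ** 2)
    P = len(f)
    return f.mean() / (f.std(ddof=1) / np.sqrt(P))
```

Per the paper's recommendation, the statistic is compared one-sided against standard normal critical values, which the authors argue gives actual size close to, but slightly below, nominal size.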

##### Cited by

01 Jan 2001

TL;DR: This is the essential companion to Jeffrey Wooldridge's widely-used graduate text Econometric Analysis of Cross Section and Panel Data (MIT Press, 2001).

Abstract: The second edition of this acclaimed graduate text provides a unified treatment of two methods used in contemporary econometric research, cross section and panel data methods. By focusing on assumptions that can be given behavioral content, the book maintains an appropriate level of rigor while emphasizing intuitive thinking. The analysis covers both linear and nonlinear models, including models with dynamics and/or individual heterogeneity. In addition to general estimation frameworks (particularly methods of moments and maximum likelihood), specific linear and nonlinear methods are covered in detail, including probit and logit models and their multivariate extensions, Tobit models, models for count data, censored and missing data schemes, causal (or treatment) effects, and duration analysis. Econometric Analysis of Cross Section and Panel Data was the first graduate econometrics text to focus on microeconomic data structures, allowing assumptions to be separated into population and sampling assumptions. This second edition has been substantially updated and revised. Improvements include a broader class of models for missing data problems; more detailed treatment of cluster problems, an important topic for empirical researchers; expanded discussion of "generalized instrumental variables" (GIV) estimation; new coverage (based on the author's own recent research) of inverse probability weighting; a more complete framework for estimating treatment effects with panel data; and a firmly established link between econometric approaches to nonlinear panel data and the "generalized estimating equation" literature popular in statistics and other fields. New attention is given to explaining when particular econometric methods can be applied; the goal is not only to tell readers what does work, but why certain "obvious" procedures do not. The numerous included exercises, both theoretical and computer-based, allow the reader to extend methods covered in the text and discover new insights.

28,298 citations

TL;DR: In this article, a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction is described.

Abstract: This paper describes a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction. It also establishes consistency of the estimated covariance matrix under fairly general conditions.

18,117 citations

TL;DR: In this article, the authors propose new tests for detecting the presence of a unit root in quite general time series models. The tests accommodate models with a fitted drift and a time trend, so they may be used to discriminate between unit root nonstationarity and stationarity about a deterministic trend.

Abstract: This paper proposes new tests for detecting the presence of a unit root in quite general time series models. Our approach is nonparametric with respect to nuisance parameters and thereby allows for a very wide class of weakly dependent and possibly heterogeneously distributed data. The tests accommodate models with a fitted drift and a time trend so that they may be used to discriminate between unit root nonstationarity and stationarity about a deterministic trend. The limiting distributions of the statistics are obtained under both the unit root null and a sequence of local alternatives. The latter noncentral distribution theory yields local asymptotic power functions for the tests and facilitates comparisons with alternative procedures due to Dickey & Fuller. Simulations are reported on the performance of the new tests in finite samples.

16,874 citations
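The nonparametric correction that distinguishes this approach from Dickey-Fuller can be illustrated for the constant-only case: compute an ordinary Dickey-Fuller t-ratio, then correct it using a Bartlett-kernel long-run variance of the residuals. This is a sketch of the idea with a fixed lag choice, not the paper's exact estimator or notation.

```python
import numpy as np

def ztau_unit_root(y, lags=4):
    # Z_tau-style unit-root statistic (constant, no trend): a DF t-ratio
    # corrected for serial correlation via a Bartlett long-run variance.
    y = np.asarray(y, float)
    T = len(y) - 1
    X = np.column_stack([np.ones(T), y[:-1]])
    b, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    e = y[1:] - X @ b
    s2 = e @ e / (T - 2)                                 # regression variance
    se_rho = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    t_rho = (b[1] - 1.0) / se_rho                        # DF t-ratio
    gamma0 = e @ e / T
    lam2 = gamma0 + 2 * sum((1 - l / (lags + 1)) * (e[l:] @ e[:-l]) / T
                            for l in range(1, lags + 1))  # long-run variance
    return (np.sqrt(gamma0 / lam2) * t_rho
            - (lam2 - gamma0) * T * se_rho / (2 * np.sqrt(lam2 * s2)))
```

When the residuals are serially uncorrelated, the long-run and short-run variances coincide and the statistic collapses to the plain Dickey-Fuller t-ratio; the correction matters precisely for the weakly dependent data the paper targets.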

TL;DR: In this paper, the estimation and testing of long-run relations in economic modeling are addressed. Starting with a vector autoregressive (VAR) model, the hypothesis of cointegration is formulated as the hypothesis of reduced rank of the long-run impact matrix.

Abstract: The estimation and testing of long-run relations in economic modeling are addressed. Starting with a vector autoregressive (VAR) model, the hypothesis of cointegration is formulated as the hypothesis of reduced rank of the long-run impact matrix. This is given in a simple parametric form that allows the application of the method of maximum likelihood and likelihood ratio tests. In this way, one can derive estimates and test statistics for the hypothesis of a given number of cointegration vectors, as well as estimates and tests for linear hypotheses about the cointegration vectors and their weights. The asymptotic inferences concerning the number of cointegrating vectors involve nonstandard distributions. Inference concerning linear restrictions on the cointegration vectors and their weights can be performed using the usual chi squared methods. In the case of linear restrictions on beta, a Wald test procedure is suggested. The proposed methods are illustrated by money demand data from the Danish and Finnish economies.

12,449 citations
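The reduced-rank step amounts to a generalized eigenvalue problem in product-moment matrices. The sketch below handles only the simplest error-correction form dY_t = Pi Y_{t-1} + eps_t, with no deterministic terms or lagged differences (real applications include both, and the trace statistics are compared against nonstandard critical values, which are omitted here).

```python
import numpy as np

def johansen_trace(y):
    # Trace statistics for each cointegration rank r = 0..k-1:
    # the eigenvalues of S11^{-1} S10 S00^{-1} S01 are squared canonical
    # correlations between dY_t and Y_{t-1}; the trace statistic for
    # rank r sums -T*log(1 - lambda_i) over the k - r smallest ones.
    dy = np.diff(y, axis=0)
    ylag = y[:-1]
    T, k = dy.shape
    S00 = dy.T @ dy / T
    S01 = dy.T @ ylag / T
    S11 = ylag.T @ ylag / T
    M = np.linalg.solve(S11, S01.T) @ np.linalg.solve(S00, S01)
    lam = np.sort(np.linalg.eigvals(M).real)[::-1]
    return np.array([-T * np.log(1 - lam[r:]).sum() for r in range(k)])
```

Rank is chosen by testing r = 0, 1, ... in sequence and stopping at the first trace statistic that fails to reject.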

TL;DR: In this article, the authors show that strategies that buy stocks that have performed well in the past and sell stocks that have performed poorly in the past generate significant positive returns over 3- to 12-month holding periods.

Abstract: This paper documents that strategies which buy stocks that have performed well in the past and sell stocks that have performed poorly in the past generate significant positive returns over 3- to 12-month holding periods. We find that the profitability of these strategies is not due to their systematic risk or to delayed stock price reactions to common factors. However, part of the abnormal returns generated in the first year after portfolio formation dissipates in the following two years. A similar pattern of returns around the earnings announcements of past winners and losers is also documented.

10,806 citations
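A toy illustration of the strategy's mechanics: rank assets on their formation-period returns, then go long the winner and short the loser in the next period. The paper forms decile portfolios of many stocks with overlapping 3- to 12-month holding periods; this sketch holds only the single top and bottom asset for one period, so it shows the selection rule rather than the paper's full design.

```python
import numpy as np

def momentum_long_short(returns, lookback=6):
    # returns: T x N matrix of per-period asset returns (hypothetical data).
    # Each period, sum returns over the past `lookback` periods, then hold
    # past-winner minus past-loser for one period.
    T, N = returns.shape
    out = []
    for t in range(lookback, T):
        past = returns[t - lookback:t].sum(axis=0)   # formation-period return
        out.append(returns[t, past.argmax()] - returns[t, past.argmin()])
    return np.array(out)
```

On i.i.d. simulated returns the long-short series averages roughly zero; the paper's point is that on historical stock data it does not.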