
Showing papers by "Neil Shephard" published in 2011


Journal ArticleDOI
TL;DR: In this article, a multivariate realised kernel is proposed to estimate the ex-post covariation of log-prices; the estimator is guaranteed to be positive semi-definite and is robust to measurement noise of certain types.

441 citations
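To make the construction concrete, here is a minimal sketch of a realised kernel of the kind described in the TL;DR: a weighted sum of realised autocovariance matrices of high-frequency returns, with Parzen weights so the estimate is positive semi-definite by construction. The function names, the given bandwidth H, and the omission of refresh-time synchronisation and end-point jittering are simplifications for this sketch, not the paper's full estimator.

```python
import numpy as np

def parzen(x):
    """Parzen weight function; this choice makes the kernel
    estimate positive semi-definite by construction."""
    x = abs(x)
    if x <= 0.5:
        return 1 - 6 * x**2 + 6 * x**3
    if x <= 1.0:
        return 2 * (1 - x)**3
    return 0.0

def realised_kernel(returns, H):
    """Multivariate realised kernel from an (n, d) array of
    high-frequency return vectors: a weighted sum of realised
    autocovariance matrices.  The bandwidth H is taken as given
    here; the paper derives how to choose it."""
    n, d = returns.shape
    K = np.zeros((d, d))
    for h in range(-H, H + 1):
        w = parzen(h / (H + 1))
        if h >= 0:   # h-th realised autocovariance matrix
            gamma = returns[h:].T @ returns[:n - h]
        else:        # negative lags enter via the transpose
            gamma = returns[:n + h].T @ returns[-h:]
        K += w * gamma
    return K
```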


Journal ArticleDOI
TL;DR: In this article, it was shown that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm, which is perhaps surprising given the celebrated results on maximum simulated likelihood estimation.
Abstract: Suppose we wish to carry out likelihood-based inference but we solely have an unbiased simulation-based estimator of the likelihood. We note that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm. This result was recently introduced into the statistics literature by Andrieu, Doucet, and Holenstein (2007) and is perhaps surprising given the celebrated results on maximum simulated likelihood estimation. Bayesian inference based on simulated likelihood can be widely applied in microeconomics, macroeconomics and financial econometrics. One way of generating unbiased estimates of the likelihood is by the use of a particle filter. We illustrate these methods on four problems in econometrics, producing rather generic methods. Taken together, these methods imply that if we can simulate from an economic model we can carry out likelihood-based inference using its simulations.

197 citations
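The key mechanism is easy to state in code: run an otherwise standard random-walk Metropolis-Hastings chain, but plug in the log of an unbiased likelihood estimate and carry the current state's estimate forward rather than recomputing it. A minimal sketch follows; `log_lhat`, `log_prior`, and the random-walk proposal are illustrative placeholders (in practice `log_lhat` would wrap a particle filter).

```python
import numpy as np

def pseudo_marginal_mh(log_lhat, log_prior, theta0, rw_scale, n_iter, seed=0):
    """Random-walk Metropolis-Hastings where the likelihood is replaced
    by an unbiased estimate.  Crucially, the estimate attached to the
    current state is stored and reused, not recomputed; this is what
    makes the chain target the exact posterior despite estimation noise."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    ll = log_lhat(theta)                 # noisy but unbiased estimate
    draws = []
    for _ in range(n_iter):
        prop = theta + rw_scale * rng.standard_normal(theta.shape)
        ll_prop = log_lhat(prop)
        log_alpha = (ll_prop + log_prior(prop)) - (ll + log_prior(theta))
        if np.log(rng.uniform()) < log_alpha:
            theta, ll = prop, ll_prop    # accept and keep the estimate
        draws.append(theta.copy())
    return np.array(draws)
```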


Journal ArticleDOI
TL;DR: It is found that subsampling is highly advantageous for estimators based on discontinuous kernels, such as the truncated kernel. For kinked kernels, however, subsampling is shown to be impotent, in the sense that it has no effect on the asymptotic distribution.

114 citations
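A subsampled estimator of this kind averages the base estimator over shifted sparse sampling grids. A minimal sketch with illustrative names, assuming a one-dimensional log-price series and a generic base estimator such as a truncated-kernel realised variance:

```python
import numpy as np

def subsampled_estimator(log_prices, base_estimator, step):
    """Average a base estimator over `step` shifted sparse grids.
    `log_prices` is a 1-d array; `base_estimator` maps a return
    series to a scalar."""
    values = [base_estimator(np.diff(log_prices[s::step]))
              for s in range(step)]
    return np.mean(values)

# Illustrative base estimator: plain realised variance on the sparse grid.
rv = lambda r: np.sum(r**2)
```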


01 Jan 2011
TL;DR: In this article, a composite realized kernel is proposed to estimate the ex-post covariation of asset prices, where the covariance estimate is composed of univariate realized kernels to estimate variances and bivariate realized kernels to estimate correlations.
Abstract: We propose a composite realized kernel to estimate the ex-post covariation of asset prices. Composite realized kernels are a data-efficient method where the covariance estimate is composed of univariate realized kernels to estimate variances and bivariate realized kernels to estimate correlations. We analyze the merits of our composite realized kernels in an ultra-high-dimensional environment, making economic decisions every day solely based on the previous day's data. The first application is a minimum variance portfolio exercise and this is followed by an investigation of portfolio tracking. The data set is tick-by-tick data comprising 473 US equities over the sample period 2006-2009. We show that our estimator is able to deliver a significantly lower portfolio variance than its competitors.

50 citations
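The composition step itself is simple: take variance estimates from univariate realised kernels and correlation estimates from bivariate realised kernels, then assemble Sigma_ij = rho_ij * sqrt(v_i * v_j). A sketch of that assembly follows, plus an eigenvalue-clipping repair for the fact that a composite estimate need not be positive semi-definite; the clipping step is an assumed fix, not necessarily the paper's regularisation.

```python
import numpy as np

def composite_covariance(variances, corr):
    """Assemble a covariance matrix from univariate variance estimates
    v_i and a matrix of pairwise correlation estimates rho_ij:
    Sigma_ij = rho_ij * sqrt(v_i * v_j)."""
    sd = np.sqrt(np.asarray(variances, dtype=float))
    return np.asarray(corr, dtype=float) * np.outer(sd, sd)

def project_psd(sigma):
    """A composite estimate need not be positive semi-definite;
    clipping negative eigenvalues is one simple (assumed) repair."""
    w, V = np.linalg.eigh(sigma)
    return (V * np.clip(w, 0.0, None)) @ V.T
```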


Posted Content
TL;DR: In this article, the authors investigate the properties of the composite likelihood (CL) method for (T × N_T) GARCH panels and show that when T is reasonably large, CL performs well.
Abstract: We investigate the properties of the composite likelihood (CL) method for (T × N_T) GARCH panels. The defining feature of a GARCH panel with time-series length T is that, while nuisance parameters are allowed to vary across the N_T series, other parameters of interest are assumed to be common. CL pools information across the panel instead of using information available in a single series only. Simulations and empirical analysis illustrate that when T is reasonably large CL performs well. However, due to the presence of nuisance parameters, CL is subject to the "incidental parameter" problem for small T.

31 citations
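A minimal sketch of the pooling idea: each series keeps its own nuisance intercept omega_i, while (alpha, beta) are common, and the composite objective simply sums the per-series Gaussian quasi-log-likelihoods. This is one simple pooling scheme consistent with the abstract; the paper's exact CL construction may differ.

```python
import numpy as np

def garch_quasi_loglik(y, omega, alpha, beta):
    """Gaussian quasi-log-likelihood of a single GARCH(1,1) series,
    with the variance recursion initialised at the sample variance."""
    h, ll = np.var(y), 0.0
    for yt in y:
        ll += -0.5 * (np.log(h) + yt**2 / h)
        h = omega + alpha * yt**2 + beta * h
    return ll

def composite_loglik(panel, omegas, alpha, beta):
    """Composite likelihood for a GARCH panel: the intercepts omega_i
    are series-specific nuisance parameters, while (alpha, beta) are
    common and estimated by pooling across all series."""
    return sum(garch_quasi_loglik(y, w, alpha, beta)
               for y, w in zip(panel, omegas))
```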


Posted Content
TL;DR: In this article, a new class of multivariate volatility models that utilize high-frequency data is introduced, and the HEAVY model outperforms the multivariate GARCH model out-of-sample, with the gains being particularly significant at short forecast horizons.
Abstract: This paper introduces a new class of multivariate volatility models that utilizes high-frequency data. We discuss the models' dynamics and highlight their differences from multivariate GARCH models. We also discuss their covariance targeting specification and provide closed-form formulas for multi-step forecasts. Estimation and inference strategies are outlined. Empirical results suggest that the HEAVY model outperforms the multivariate GARCH model out-of-sample, with the gains being particularly significant at short forecast horizons. Forecast gains are obtained for both forecast variances and correlations.

1 citation
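As a rough illustration of the multi-step forecasts mentioned above, here is a sketch of iterating a HEAVY-type system forward: the conditional covariance of returns is driven by the realised measure, and the realised measure has its own recursion, so k-step forecasts come from iterating both equations. The scalar-parameter form, the parameter names, and the iterative (rather than closed-form) computation are assumptions for this sketch, not the paper's specification.

```python
import numpy as np

def heavy_multistep(H_t, RM_t, pars, horizon):
    """k-step covariance forecasts from a scalar HEAVY-type system:
        H_{t+1}     = C_h + a_h * RM_t + b_h * H_t        (returns equation)
        E[RM_{t+k}] = C_m + (a_m + b_m) * E[RM_{t+k-1}]   (measure equation)
    H_t and RM_t are (d, d) matrices; under covariance targeting the
    intercepts C_h, C_m would be pinned to unconditional moments."""
    C_h, a_h, b_h, C_m, a_m, b_m = pars
    H, M = np.asarray(H_t, float), np.asarray(RM_t, float)
    forecasts = []
    for _ in range(horizon):
        H = C_h + a_h * M + b_h * H      # covariance forecast
        forecasts.append(H)
        M = C_m + (a_m + b_m) * M        # forecast of the realised measure
    return forecasts
```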