
Showing papers by "Neil Shephard" published in 2008


Journal ArticleDOI
TL;DR: In this article, realised kernels are used to carry out efficient feasible inference on the ex-post variation of underlying equity prices in the presence of simple models of market frictions; the kernel weights can be chosen to achieve the best possible rate of convergence and an asymptotic variance close to that of the maximum likelihood estimator in the parametric version of this problem.
Abstract: This paper shows how to use realised kernels to carry out efficient feasible inference on the ex-post variation of underlying equity prices in the presence of simple models of market frictions. The issue is subtle with only estimators which have symmetric weights delivering consistent estimators with mixed Gaussian limit theorems. The weights can be chosen to achieve the best possible rate of convergence and to have an asymptotic variance which is close to that of the maximum likelihood estimator in the parametric version of this problem. Realised kernels can also be selected to (i) be analysed using endogenously spaced data such as that in databases on transactions, (ii) allow for market frictions which are endogenous, (iii) allow for temporally dependent noise. The finite sample performance of our estimators is studied using simulation, while empirical work illustrates their use in practice.

1,269 citations
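The abstract above describes a kernel-weighted estimator built from realised autocovariances of high-frequency returns. As a rough, non-authoritative sketch of that structure, the Python snippet below computes a realised kernel estimate; the Parzen weight function and the fixed bandwidth H are illustrative assumptions, not the paper's recommended choices.

```python
import numpy as np

def parzen(x):
    """Parzen weight function, one common kernel choice (an assumption here)."""
    x = abs(x)
    if x <= 0.5:
        return 1.0 - 6.0 * x**2 + 6.0 * x**3
    if x <= 1.0:
        return 2.0 * (1.0 - x)**3
    return 0.0

def realised_kernel(returns, H):
    """Realised kernel estimate of one day's integrated variance.

    returns : high-frequency log returns for the day
    H       : bandwidth, i.e. the number of autocovariance lags included
    """
    r = np.asarray(returns, dtype=float)
    rk = np.sum(r * r)                          # gamma_0, the realised variance
    for h in range(1, H + 1):
        gamma_h = np.sum(r[h:] * r[:-h])        # h-th realised autocovariance
        rk += 2.0 * parzen(h / (H + 1)) * gamma_h
    return rk
```

In the paper the bandwidth is tied to the noise level and the sample size; the fixed H above is only meant to show the structure of the estimator.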


Journal ArticleDOI
TL;DR: In this article, the authors compare realised kernel estimates based on trade and quote data for the same stock and find a remarkable level of agreement; they also identify local trends in the data, often associated with high volumes, that are challenging for realised kernels and may be due to non-trivial liquidity effects.
Abstract: Realised kernels use high frequency data to estimate daily volatility of individual stock prices. They can be applied to either trade or quote data. Here we provide the details of how we suggest implementing them in practice. We compare the estimates based on trade and quote data for the same stock and find a remarkable level of agreement. We identify some features of the high frequency data which are challenging for realised kernels. These arise when there are local trends in the data, over periods of around 10 minutes, during which the prices and quotes are driven up or down. Such trends can be associated with high volumes. One explanation is that they are due to non-trivial liquidity effects.

543 citations


Posted Content
TL;DR: In this paper, a multivariate realised kernel is proposed to estimate the ex-post covariation of log-prices, which is guaranteed to be positive semi-definite and robust to measurement noise of certain types.
Abstract: We propose a multivariate realised kernel to estimate the ex-post covariation of log-prices. We show this new consistent estimator is guaranteed to be positive semi-definite, is robust to measurement noise of certain types, and can handle non-synchronous trading. It is the first estimator with all three of these properties, which are essential for empirical work in this area. We derive the large sample asymptotics of this estimator and assess its accuracy using a Monte Carlo study. We implement the estimator on some US equity data, comparing our results to previous work which has used returns measured over 5 or 10 minute intervals. We show the new estimator is substantially more precise.

399 citations
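A minimal multivariate counterpart of the kernel construction, assuming the returns have already been synchronised (for example by refresh-time sampling, which the paper relies on but which is omitted here). The Parzen weights and the fixed bandwidth are again illustrative rather than the paper's exact specification, and the end-point treatment that underpins the positive semi-definiteness result is left out.

```python
import numpy as np

def parzen(x):
    """Same Parzen weight as in the univariate sketch above (an illustrative choice)."""
    x = abs(x)
    if x <= 0.5:
        return 1.0 - 6.0 * x**2 + 6.0 * x**3
    if x <= 1.0:
        return 2.0 * (1.0 - x)**3
    return 0.0

def multivariate_realised_kernel(R, H):
    """Kernel-weighted estimate of the daily covariation matrix of d assets.

    R : (n, d) array of already-synchronised high-frequency return vectors
    H : bandwidth
    """
    R = np.asarray(R, dtype=float)
    K = R.T @ R                                  # Gamma_0
    for h in range(1, H + 1):
        Gamma_h = R[h:].T @ R[:-h]               # sum_j x_j x_{j-h}'
        K += parzen(h / (H + 1)) * (Gamma_h + Gamma_h.T)
    return K
```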


Journal ArticleDOI
TL;DR: In this paper, a new measure of risk based entirely on downward moves measured using high frequency data is proposed; the theory of these new measures is spelt out, drawing on some new results from probability theory.
Abstract: We propose a new measure of risk, based entirely on downward moves measured using high frequency data. Realised semivariances are shown to have important predictive qualities for future market volatility. The theory of these new measures is spelt out, drawing on some new results from probability theory.

287 citations
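The realised semivariance itself is simple to compute: it cumulates squared high-frequency returns over downward moves only. A short sketch:

```python
import numpy as np

def realised_semivariances(returns):
    """Split realised variance into downside and upside components.

    returns : high-frequency log returns for one day
    Returns (RS_minus, RS_plus); the two add up to ordinary realised variance.
    """
    r = np.asarray(returns, dtype=float)
    rs_minus = np.sum(r[r < 0] ** 2)   # downside (negative-return) semivariance
    rs_plus = np.sum(r[r > 0] ** 2)    # upside semivariance
    return rs_minus, rs_plus
```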


Posted Content
TL;DR: In this paper, the authors propose a novel and fast way of estimating models of time-varying covariances that overcomes an undiagnosed incidental parameter problem which has troubled existing methods when applied to hundreds or even thousands of assets.
Abstract: Building models for high dimensional portfolios is important in risk management and asset allocation. Here we propose a novel and fast way of estimating models of time-varying covariances that overcomes an undiagnosed incidental parameter problem which has troubled existing methods when applied to hundreds or even thousands of assets. Indeed we can handle the case where the cross-sectional dimension is larger than the time series one. The theory of this new strategy is developed in some detail, allowing formal hypothesis testing to be carried out on these models. Simulations are used to explore the performance of this inference strategy while empirical examples are reported which show the strength of this method. The out-of-sample hedging performance of various models estimated using this method is compared.

211 citations
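The abstract does not spell out the estimation device, so the sketch below is only a loosely related illustration: it shows how a pairwise composite likelihood replaces one high-dimensional likelihood with a sum of bivariate ones, which is one way to keep estimation feasible when the cross-section exceeds the time series length. The equicorrelated toy model, the function names, and the use of scipy are assumptions made for illustration, not the paper's method.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def pair_loglik(x1, x2, rho):
    """Bivariate Gaussian log-likelihood with unit variances and correlation rho."""
    q = (x1**2 - 2.0 * rho * x1 * x2 + x2**2) / (1.0 - rho**2)
    return np.sum(-np.log(2.0 * np.pi) - 0.5 * np.log(1.0 - rho**2) - 0.5 * q)

def composite_correlation_mle(X):
    """Estimate one common correlation by maximising the sum of bivariate
    log-likelihoods over all asset pairs (a toy equicorrelation model).
    No d x d covariance matrix is ever inverted, so the approach remains
    usable when the number of assets exceeds the number of observations."""
    T, d = X.shape
    def neg_cl(rho):
        return -sum(pair_loglik(X[:, i], X[:, j], rho)
                    for i in range(d) for j in range(i + 1, d))
    res = minimize_scalar(neg_cl, bounds=(-0.99, 0.99), method="bounded")
    return res.x
```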


Posted Content
TL;DR: In this paper, a new measure of risk based entirely on downwards moves measured using high frequency data is proposed; these realised semivariances are shown to have important predictive qualities for future market volatility, and their theory draws on some new results from probability theory.
Abstract: We propose a new measure of risk, based entirely on downwards moves measured using high frequency data. Realised semivariances are shown to have important predictive qualities for future market volatility. The theory of these new measures is spelt out, drawing on some new results from probability theory.

153 citations


Journal ArticleDOI
TL;DR: In this paper, realised kernels are used to carry out efficient feasible inference on the ex-post variation of underlying equity prices in the presence of simple models of market frictions; the kernel weights can be chosen to achieve the best possible rate of convergence and an asymptotic variance close to that of the maximum likelihood estimator in the parametric version of this problem.
Abstract: This paper shows how to use realised kernels to carry out efficient feasible inference on the ex-post variation of underlying equity prices in the presence of simple models of market frictions. The issue is subtle with only estimators which have symmetric weights delivering consistent estimators with mixed Gaussian limit theorems. The weights can be chosen to achieve the best possible rate of convergence and to have an asymptotic variance which is close to that of the maximum likelihood estimator in the parametric version of this problem. Realised kernels can also be selected to (i) be analysed using endogenously spaced data such as that in databases on transactions, (ii) allow for market frictions which are endogenous, (iii) allow for temporally dependent noise. The finite sample performance of our estimators is studied using simulation, while empirical work illustrates their use in practice.

109 citations


01 Jan 2008
TL;DR: SsfPack™ (Extended) version 3.0 is a suite of C routines for carrying out computations involving the statistical analysis of univariate and multivariate models in state space form with easy-to-use functions for Ox.
Abstract: SsfPack™ (Extended) version 3.0 is a suite of C routines for carrying out computations involving the statistical analysis of univariate and multivariate models in state space form with easy-to-use functions for Ox. SsfPack requires Ox 4 or above to run. SsfPack allows for a full range of different state space forms: from a simple time-invariant model to a complicated multivariate time-varying model. Functions are provided to put standard models such as SARIMA, unobserved components, time-varying regressions and cubic spline models into state space form. Basic functions are available for filtering, moment smoothing and simulation smoothing. Ready-to-use functions are provided for standard tasks such as likelihood evaluation, forecasting and signal extraction. SsfPack can be used for implementing, fitting and analysing Gaussian models relevant to many areas of econometrics and statistics. It provides all relevant tools for the treatment of non-Gaussian and nonlinear state space models. In particular, tools are available to implement simulation based estimation methods such as importance sampling and Markov chain Monte Carlo (MCMC) methods.

77 citations
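To make the state space terminology concrete, here is a minimal Kalman filter log-likelihood for the local level model, the simplest of the forms such a package handles. This is plain Python for illustration only and does not use SsfPack's Ox API; the diffuse-style initialisation is an assumption.

```python
import numpy as np

def local_level_loglik(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter log-likelihood for the local level model
        y_t = alpha_t + eps_t,   alpha_{t+1} = alpha_t + eta_t,
    with eps_t ~ N(0, sigma_eps2) and eta_t ~ N(0, sigma_eta2)."""
    a, p, loglik = a0, p0, 0.0
    for yt in np.asarray(y, dtype=float):
        f = p + sigma_eps2                       # one-step prediction variance
        v = yt - a                               # one-step prediction error
        loglik += -0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        k = p / f                                # Kalman gain
        a = a + k * v                            # filtered, then predicted, state
        p = p * (1.0 - k) + sigma_eta2           # predicted state variance for t+1
    return loglik
```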


Journal ArticleDOI
TL;DR: In this article, the authors propose and analyse the Autoregressive Conditional Root (ACR) time series model, a multivariate dynamic mixture autoregression that allows for nonstationary epochs, and establish consistency and asymptotic normality of the maximum likelihood estimators in the ACR model.
Abstract: In this paper we propose and analyse the Autoregressive Conditional Root (ACR) time series model. It is a multivariate dynamic mixture autoregression which allows for nonstationary epochs. It proves to be an appealing alternative to existing nonlinear models, such as the threshold autoregressive or Markov switching classes of models, which are commonly used to describe nonlinear dynamics as implied by arbitrage in the presence of transaction costs. Simple conditions on the parameters of the ACR process and its innovations are shown to imply geometric ergodicity, stationarity and existence of moments. Furthermore, we establish consistency and asymptotic normality of the maximum likelihood estimators in the ACR model. An application to real exchange rate data illustrates the conclusions and analysis.

53 citations
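A rough univariate sketch of the mixture idea: at each date the autoregressive root is drawn either as a stationary root or as a unit root, with a switching probability that depends on the lagged level, so the simulated path alternates between mean-reverting and random-walk epochs. The logistic switching function and all parameter values are assumptions for illustration, not the paper's specification.

```python
import numpy as np

def simulate_acr(n, rho=0.7, beta0=-1.0, beta1=0.5, sigma=1.0, seed=0):
    """Simulate a univariate dynamic mixture autoregression in the spirit of ACR."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        # probability of the stationary (mean-reverting) regime, rising with |x|
        p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * abs(x[t - 1]))))
        root = rho if rng.random() < p else 1.0   # otherwise a unit-root epoch
        x[t] = root * x[t - 1] + rng.normal(scale=sigma)
    return x
```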


Posted Content
TL;DR: In this article, the authors provide an asymptotic distribution theory for some nonparametric tests of the hypothesis that asset prices have continuous sample paths and apply the tests to exchange rate data and show that the null of a continuous sample path is frequently rejected.
Abstract: In this article we provide an asymptotic distribution theory for some nonparametric tests of the hypothesis that asset prices have continuous sample paths. We study the behaviour of the tests using simulated data and find that certain versions of the tests have good finite sample properties. We also apply the tests to exchange rate data and show that the null of a continuous sample path is frequently rejected. Most of the jumps the statistics identify are associated with governmental macroeconomic announcements.

46 citations
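One well-known test of the continuous-sample-path null compares realised variance with bipower variation, which is robust to jumps. The ratio-style statistic below is a sketch of that idea and may differ in detail from the exact studentisations analysed in the paper.

```python
import numpy as np

def bipower_jump_test(returns):
    """Ratio-style jump test based on realised and bipower variation.

    Under the null of a continuous sample path the statistic is approximately
    standard normal; large positive values point towards jumps."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    mu1 = np.sqrt(2.0 / np.pi)

    rv = np.sum(r ** 2)                                        # realised variance
    bv = mu1 ** -2 * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))    # bipower variation
    # quadpower variation, an estimate of integrated quarticity
    qp = n * mu1 ** -4 * np.sum(np.abs(r[3:]) * np.abs(r[2:-1])
                                * np.abs(r[1:-2]) * np.abs(r[:-3]))

    theta = np.pi ** 2 / 4.0 + np.pi - 5.0
    z = (1.0 - bv / rv) / np.sqrt((theta / n) * max(1.0, qp / bv ** 2))
    return z
```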


Posted Content
TL;DR: In this article, the authors propose a novel and fast way of estimating models of time-varying covariances that overcome an undiagnosed incidental parameter problem which has troubled existing methods when applied to hundreds or even thousands of assets.
Abstract: Building models for high dimensional portfolios is important in risk management and asset allocation. Here we propose a novel and fast way of estimating models of time-varying covariances that overcome an undiagnosed incidental parameter problem which has troubled existing methods when applied to hundreds or even thousands of assets. Indeed we can handle the case where the cross-sectional dimension is larger than the time series one. The theory of this new strategy is developed in some detail, allowing formal hypothesis testing to be carried out on these models. Simulations are used to explore the performance of this inference strategy while empirical examples are reported which show the strength of this method. The out of sample hedging performance of various models estimated using this method are compared.

Posted Content
01 Jan 2008
TL;DR: In this article, the authors review the history and recent developments of stochastic volatility, which is the main way financial economists and mathematical finance specialists model time varying volatility.
Abstract: In this paper we review the history and recent developments of stochastic volatility, which is the main way financial economists and mathematical finance specialists model time varying volatility. (This abstract was borrowed from another version of this item.)
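For readers new to the area, the canonical discrete-time log-normal stochastic volatility model can be simulated in a few lines; the parameter values below are illustrative only.

```python
import numpy as np

def simulate_sv(n, mu=-9.0, phi=0.97, sigma_eta=0.2, seed=0):
    """Simulate y_t = exp(h_t/2) eps_t with h_{t+1} = mu + phi (h_t - mu) + sigma_eta eta_t,
    where eps_t and eta_t are i.i.d. standard normal."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    h[0] = mu + sigma_eta / np.sqrt(1.0 - phi ** 2) * rng.normal()  # stationary start
    for t in range(n - 1):
        h[t + 1] = mu + phi * (h[t] - mu) + sigma_eta * rng.normal()
    y = np.exp(h / 2.0) * rng.normal(size=n)
    return y, h
```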

Posted Content
TL;DR: In this paper, it was shown that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm, which is perhaps surprising given the celebrated results on maximum simulated likelihood estimation.
Abstract: Suppose we wish to carry out likelihood based inference but we solely have an unbiased simulation based estimator of the likelihood. We note that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm. This result has recently been introduced in the statistics literature by Andrieu, Doucet, and Holenstein (2007) and is perhaps surprising given the celebrated results on maximum simulated likelihood estimation. Bayesian inference based on simulated likelihood can be widely applied in microeconomics, macroeconomics and financial econometrics. One way of generating unbiased estimates of the likelihood is by the use of a particle filter. We illustrate these methods on four problems in econometrics, producing rather generic methods. Taken together, these methods imply that if we can simulate from an economic model we can carry out likelihood based inference using its simulations.
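A compact sketch of the idea: a bootstrap particle filter delivers an unbiased estimate of the likelihood of a simple stochastic volatility model, and that noisy estimate is plugged straight into a random-walk Metropolis-Hastings acceptance ratio; because the estimator is unbiased, the chain still targets the exact posterior. The choice of model, the flat prior on the persistence parameter, the fixed remaining parameters, and all tuning constants are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def particle_loglik(y, phi, sigma_eta, mu=-9.0, n_particles=500, rng=None):
    """Bootstrap particle filter log-likelihood estimate for the SV model
    y_t = exp(h_t/2) eps_t, h_{t+1} = mu + phi (h_t - mu) + sigma_eta eta_t.
    The implied estimate of the likelihood (not its log) is unbiased."""
    rng = rng or np.random.default_rng()
    h = mu + sigma_eta / np.sqrt(1.0 - phi ** 2) * rng.normal(size=n_particles)
    loglik = 0.0
    for yt in np.asarray(y, dtype=float):
        var = np.exp(h)                                   # measurement variance per particle
        logw = -0.5 * (np.log(2.0 * np.pi * var) + yt ** 2 / var)
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())                    # log of the average weight
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())  # resample
        h = mu + phi * (h[idx] - mu) + sigma_eta * rng.normal(size=n_particles)
    return loglik

def pseudo_marginal_mh(y, n_iter=2000, step=0.05, seed=1):
    """Random-walk Metropolis-Hastings on phi (other parameters held fixed),
    accepting with the estimated likelihood in place of the exact one."""
    rng = np.random.default_rng(seed)
    phi = 0.9
    ll = particle_loglik(y, phi, sigma_eta=0.2, rng=rng)
    draws = []
    for _ in range(n_iter):
        phi_new = phi + step * rng.normal()
        if abs(phi_new) < 1.0:                            # flat prior on (-1, 1)
            ll_new = particle_loglik(y, phi_new, sigma_eta=0.2, rng=rng)
            if np.log(rng.random()) < ll_new - ll:        # MH acceptance with estimated likelihoods
                phi, ll = phi_new, ll_new
        draws.append(phi)
    return np.array(draws)
```

Note that the stored log-likelihood estimate for the current state is only refreshed on acceptance; holding it fixed between proposals is what makes the estimated-likelihood chain target the exact posterior.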

Posted Content
01 Jan 2008
TL;DR: In this article, a new measure of risk based entirely on downwards moves measured using high frequency data is proposed; these realised semivariances are shown to have important predictive qualities for future market volatility, and their theory draws on some new results from probability theory.
Abstract: We propose a new measure of risk, based entirely on downwards moves measured using high frequency data. Realised semivariances are shown to have important predictive qualities for future market volatility. The theory of these new measures is spelt out, drawing on some new results from probability theory.