
Showing papers by "Neil Shephard published in 2009"


Journal ArticleDOI
TL;DR: In this paper, the authors detail how to implement realized kernels in practice, compare estimates based on trade and quote data for the same stock, and find a remarkable level of agreement; the data features that challenge the estimator, local trends in prices and quotes, may be due to non-trivial liquidity effects.
Abstract: Realized kernels use high-frequency data to estimate the daily volatility of individual stock prices. They can be applied to either trade or quote data. Here we provide the details of how we suggest implementing them in practice. We compare the estimates based on trade and quote data for the same stock and find a remarkable level of agreement. We identify some features of the high-frequency data which are challenging for realized kernels: local trends in the data, over periods of around 10 minutes, during which prices and quotes are driven up or down. These can be associated with high volumes. One explanation is that they are due to non-trivial liquidity effects.

459 citations
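As a concrete illustration, here is a minimal sketch of a Parzen-weighted realized kernel of the kind this literature studies, assuming evenly spaced high-frequency log returns and a user-chosen bandwidth H; the paper's data cleaning, end-point treatment, and bandwidth selection are all omitted.

```python
import numpy as np

def parzen(x):
    """Parzen weight function commonly used for realized kernels."""
    x = abs(x)
    if x <= 0.5:
        return 1 - 6 * x**2 + 6 * x**3
    if x <= 1.0:
        return 2 * (1 - x)**3
    return 0.0

def realized_kernel(returns, H):
    """Realized-kernel estimate of daily integrated variance.

    returns : evenly spaced high-frequency log returns for one day
    H       : bandwidth, i.e. number of autocovariance lags to smooth over
    """
    r = np.asarray(returns, dtype=float)
    n = len(r)

    def gamma(h):  # h-th realized autocovariance of the returns
        return np.dot(r[h:], r[:n - h])

    est = gamma(0)
    for h in range(1, H + 1):
        est += 2 * parzen(h / (H + 1)) * gamma(h)
    return est

# toy usage: a latent random-walk price observed with microstructure noise
rng = np.random.default_rng(0)
efficient = np.cumsum(rng.normal(0.0, 0.0005, 2000))
observed = efficient + rng.normal(0.0, 0.0002, 2000)
print(realized_kernel(np.diff(observed), H=30))
```

Damping the higher-order return autocovariances is what gives the estimator its robustness to market microstructure noise, which plain realized variance lacks.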



Book ChapterDOI
Abstract: Stochastic volatility is the main way time-varying volatility is modelled in financial markets. The development of stochastic volatility is reviewed, placing it in a modelling and historical context. Some recent trends in the literature are highlighted.

119 citations
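For readers new to the area, the canonical discrete-time SV model underlying much of this literature treats the log-variance as a latent Gaussian AR(1). A minimal simulation sketch, with purely illustrative parameter values:

```python
import numpy as np

def simulate_sv(n, mu=-9.0, phi=0.97, sigma_eta=0.15, seed=0):
    """Simulate the canonical discrete-time stochastic volatility model:
        y_t     = exp(h_t / 2) * eps_t,                       eps_t ~ N(0, 1)
        h_{t+1} = mu + phi * (h_t - mu) + sigma_eta * eta_t,  eta_t ~ N(0, 1)
    """
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    # initialise the log-variance at its stationary distribution
    h[0] = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal()
    for t in range(n - 1):
        h[t + 1] = mu + phi * (h[t] - mu) + sigma_eta * rng.standard_normal()
    y = np.exp(h / 2) * rng.standard_normal(n)  # returns with time-varying vol
    return y, h

returns, log_variance = simulate_sv(2500)
```

Because h_t is latent, the likelihood involves an integral over the volatility path, which is why simulation-based methods such as importance sampling and particle filtering (see the entries below) feature so heavily in this literature.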


Journal ArticleDOI
TL;DR: In this article, the authors use extreme value theory to empirically assess the appropriateness of importance sampling in the stochastic volatility model, where it is commonly used for maximum likelihood estimation of the model's parameters.

102 citations
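The concern such diagnostics target is that importance sampling estimators behave well only if the importance weights have finite variance, a tail property that extreme value theory is designed to measure. Below is a rough sketch of that idea using a Hill-type tail-index estimate; the function name and the tuning constant k are illustrative, not the paper's procedure.

```python
import numpy as np

def hill_tail_index(weights, k):
    """Hill estimator of the tail index of positive importance weights.

    A tail index above 2 (equivalently, a generalized Pareto shape below 1/2)
    is consistent with the weights having finite variance, which the usual
    sqrt(n) asymptotics of importance sampling require.
    """
    w = np.sort(np.asarray(weights, dtype=float))[::-1]  # descending order
    logs = np.log(w[:k]) - np.log(w[k])   # k largest exceedances over w[k]
    xi_hat = logs.mean()                  # estimate of the Pareto shape
    return 1.0 / xi_hat                   # tail index alpha = 1 / xi

# toy usage: Pareto weights with tail index 1.5, so their variance is infinite
rng = np.random.default_rng(1)
w = rng.pareto(1.5, 100_000) + 1.0
print(f"estimated tail index: {hill_tail_index(w, k=500):.2f} (need > 2)")
```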


Journal Article
TL;DR: This edited volume collects chapters ranging from an analysis of the indicator saturation estimator as a robust regression estimator, through Autometrics and forecasting in dynamic factor models, to a reconsideration of U.S. natural rate dynamics.
Abstract:
1. An analysis of the indicator saturation estimator as a robust regression estimator
2. Empirical Identification of the Vector Autoregression: The Causes and Effects of U.S. M2
3. Retrospective Estimation of Causal Effects Through Time
4. Autometrics
5. High Dimension Dynamic Correlations
6. Pitfalls in Modeling Dependence Structures: Explorations with Copulas
7. Forecasting in Dynamic Factor Models Subject to Structural Instability
8. Internal consistency of survey respondents' forecasts: Evidence based on the Survey of Professional Forecasters
9. Factor-augmented Error Correction Models
10. In Praise of Pragmatic Econometrics
11. On Efficient Simulations in Dynamic Models
12. Simple Wald Tests of the Fractional Integration Parameter: An Overview of New Results
13. When is a Time Series I(0)?
14. Model Identification and Non-unique Structure
15. Does it matter how to measure aggregates? The case of monetary transmission mechanisms in the Euro area
16. U.S. natural rate dynamics reconsidered
17. Constructive Data Mining: Modeling Argentine Broad Money Demand

72 citations


Posted Content
TL;DR: It is shown how jittering can be designed to improve the performance of the SIR algorithm, and its performance is illustrated in the context of three filtering problems.
Abstract: A key ingredient of many particle filters is the sampling importance resampling (SIR) algorithm, which transforms a sample of weighted draws from a prior distribution into equally weighted draws from a posterior distribution. We give a novel analysis of the SIR algorithm and analyse the jittered generalisation of SIR, showing that existing implementations of jittering lead to markedly inferior behaviour relative to the base SIR algorithm. We show how jittering can instead be designed to improve the performance of the SIR algorithm, and we illustrate its performance in practice in the context of three filtering problems.

22 citations
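For reference, the base SIR step that jittering modifies can be sketched as follows; the paper's redesigned jittering itself is not reproduced here.

```python
import numpy as np

def sir_resample(particles, log_weights, rng):
    """One sampling importance resampling (SIR) step: turn weighted draws
    from a proposal into (approximately) equally weighted posterior draws."""
    lw = log_weights - np.max(log_weights)   # stabilise before exponentiating
    w = np.exp(lw)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]                    # equally weighted draws

# toy usage: N(0, 4) prior draws reweighted by a N(1, 1) likelihood in theta
rng = np.random.default_rng(2)
prior_draws = rng.normal(0.0, 2.0, 10_000)
log_lik = -0.5 * (1.0 - prior_draws) ** 2
posterior_draws = sir_resample(prior_draws, log_lik, rng)
```

Multinomial resampling duplicates high-weight particles; jittering perturbs the resampled draws with small noise to restore diversity, and the abstract's point is that the design of that perturbation matters greatly.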


Posted Content
01 Jan 2009
TL;DR: In this article, the authors investigate the properties of the composite likelihood (CL) method for (T × N_T) GARCH panels with time series length T, showing that CL performs well for reasonably large T but, owing to the estimation error introduced through nuisance parameter estimation, suffers from the "incidental parameter" problem for small T.
Abstract: We investigate the properties of the composite likelihood (CL) method for (T × N_T) GARCH panels. The defining feature of a GARCH panel with time series length T is that, while nuisance parameters are allowed to vary across the N_T series, the other parameters of interest are assumed to be common. CL pools information across the panel instead of using only the information available in a single series. Simulations and empirical analysis illustrate that CL performs well for reasonably large T. However, due to the estimation error introduced through nuisance parameter estimation, CL is subject to the "incidental parameter" problem for small T.

15 citations
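A stylised sketch of the pooling idea: each series contributes a Gaussian GARCH(1,1) log-likelihood with common dynamics (alpha, beta), while the series-specific intercepts are handled here by variance targeting. The targeting step is an illustrative choice for the nuisance parameters, not necessarily the paper's exact treatment.

```python
import numpy as np
from scipy.optimize import minimize

def garch_loglik(y, omega, alpha, beta):
    """Gaussian GARCH(1,1) log-likelihood for one series (constants dropped)."""
    h = np.empty_like(y)
    h[0] = y.var()                       # initialise at the sample variance
    for t in range(1, len(y)):
        h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    return -0.5 * np.sum(np.log(h) + y ** 2 / h)

def composite_loglik(params, panel):
    """Pool the panel: common (alpha, beta); series-specific omega_i fixed
    by variance targeting, omega_i = var(y_i) * (1 - alpha - beta)."""
    alpha, beta = params
    if alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -np.inf                   # enforce covariance stationarity
    return sum(garch_loglik(y, y.var() * (1 - alpha - beta), alpha, beta)
               for y in panel)

def fit_cl(panel):
    """Maximise the composite likelihood over the common parameters.
    panel: list of arrays of demeaned returns, one per series."""
    res = minimize(lambda p: -composite_loglik(p, panel),
                   x0=[0.05, 0.90], method="Nelder-Mead")
    return res.x
```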


Posted Content
TL;DR: The economic basis of the current system makes sense, as discussed in this paper: education creates positive externalities, so it should be supported by the state; education creates private benefits, so graduates should contribute towards the cost of their tuition; and returns to education are highly uncertain, so someone (e.g. the state) should provide insurance (income contingency) in case a graduate's earnings turn out to be modest.
Abstract:
- The system allows access by UK-based students to full-time undergraduate education in the UK irrespective of ability to pay, subject to getting a place at a university. This should be kept.
- The economic basis of the current system makes sense and should also be kept:
  - Education creates positive externalities, so education should be supported by the state;
  - Education creates private benefit, so graduates should contribute towards the cost of their tuition;
  - Returns to education are highly uncertain, so someone (e.g. the state) should provide insurance (income contingency) in case the graduate's earnings turn out to be modest.

7 citations


Posted Content
TL;DR: In this article, a class of high frequency based volatility (HEAVY) models is studied; these are direct models of daily asset return volatility based on realized measures constructed from high-frequency data.
Abstract: This paper studies in some detail a class of high frequency based volatility (HEAVY) models. These models are direct models of daily asset return volatility based on realized measures constructed from high-frequency data. Our analysis identifies that the models have momentum and mean reversion effects, and that they adjust quickly to structural breaks in the level of the volatility process. We study how to estimate the models and how they perform through the credit crunch, comparing their fit to more traditional GARCH models. We analyse a model-based bootstrap which allows us to estimate the entire predictive distribution of returns. We also provide an analysis of missing data in the context of these models.

6 citations
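A minimal sketch of the return-variance recursion that distinguishes HEAVY models from GARCH: the lagged realized measure, rather than the lagged squared return, drives the conditional variance of daily returns. The parameter values below are purely illustrative.

```python
import numpy as np

def heavy_filter(rm, omega, alpha, beta, h0):
    """HEAVY recursion for the conditional variance of daily returns:
        h_t = omega + alpha * RM_{t-1} + beta * h_{t-1},
    where RM_t is a realized measure (e.g. a realized kernel) for day t."""
    h = np.empty(len(rm))
    h[0] = h0
    for t in range(1, len(rm)):
        h[t] = omega + alpha * rm[t - 1] + beta * h[t - 1]
    return h

# toy usage with synthetic daily realized measures
rng = np.random.default_rng(3)
rm = np.abs(rng.normal(1.0, 0.3, 500))
h = heavy_filter(rm, omega=0.05, alpha=0.4, beta=0.55, h0=rm.mean())
```

Because the realized measure reflects information from within the day, the recursion can adjust to shifts in the volatility level faster than a GARCH update driven by the previous squared return, consistent with the quick adjustment to structural breaks noted in the abstract.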


Posted Content
TL;DR: In this article, the author argues that the fiscal position of the UK means it will be very hard for the next government to allow the undergraduate fee cap to increase beyond the rate of inflation.
Abstract: I show that the fiscal position of the UK means it will be very hard for the next government to allow the undergraduate fee cap to increase beyond the rate of inflation. The funding position of the higher education sector can be improved by the government removing the interest rate subsidy it currently gives to students. However, even this does not really allow the fee cap to increase markedly, as any increase would lead to the Government's loan book expanding. I suggest each university should be allowed to introduce its own income contingent fee, on top of the existing national funding structure. Each graduate would only have to pay these fees to their university if their income rises beyond the point of paying off their maintenance and state tuition loans. I show these new fees are fiscally neutral and have no impact on the loan book or the financial position of the universities which do not introduce such fees. Such fees have the potential to provide a long-run solution to the repeated underfunding of undergraduate education at a number of English universities and to reduce the fiscal pressure the state is under.

1 citation