
Papers by Neil Shephard published in 1999


Journal ArticleDOI
TL;DR: This article analyses the recently suggested particle approach to filtering time series and suggests that the algorithm is not robust to outliers for two reasons: the design of the simulators and the use of the discrete support to represent the sequentially updating prior distribution.
Abstract: This article analyses the recently suggested particle approach to filtering time series. We suggest that the algorithm is not robust to outliers for two reasons: the design of the simulators and the use of the discrete support to represent the sequentially updating prior distribution. Here we tackle the first of these problems.

2,608 citations
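As a point of reference for the particle approach analysed above, here is a minimal bootstrap (sampling/importance-resampling) particle filter for a Gaussian local level model, written in Python. The model, parameter values and function name are illustrative assumptions; the paper's auxiliary particle filter modifies the proposal design rather than using this blind scheme.

import numpy as np

def bootstrap_filter(y, n_part=1000, sig_state=0.1, sig_obs=1.0, seed=0):
    """Plain SIR filter for x_t = x_{t-1} + eta_t, y_t = x_t + eps_t (a sketch)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_part)                 # initial particle cloud
    means = []
    for yt in y:
        x = x + rng.normal(0.0, sig_state, n_part)   # blind propagation from the prior
        logw = -0.5 * ((yt - x) / sig_obs) ** 2      # Gaussian measurement weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                  # filtered mean E(x_t | y_1..t)
        x = x[rng.choice(n_part, n_part, p=w)]       # multinomial resampling
    return np.array(means)

When y_t is an outlier, the blind propagation step places almost no particles where the measurement density is concentrated, so the weights degenerate; that is the first of the two robustness problems the article identifies and tackles.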


Journal ArticleDOI
TL;DR: It is shown that SsfPack can be easily used for implementing, fitting and analysing Gaussian models relevant to many areas of econometrics and statistics.
Abstract: This paper discusses and documents the algorithms of SsfPack 2.2. SsfPack is a suite of C routines for carrying out computations involving the statistical analysis of univariate and multivariate models in state space form. The emphasis is on documenting the link we have made to the Ox computing environment. SsfPack allows for a full range of different state space forms: from a simple time-invariant model to a complicated time-varying model. Functions can be used which put standard models such as ARMA and cubic spline models in state space form. Basic functions are available for filtering, moment smoothing and simulation smoothing. Ready-to-use functions are provided for standard tasks such as likelihood evaluation, forecasting and signal extraction. We show that SsfPack can be easily used for implementing, fitting and analysing Gaussian models relevant to many areas of econometrics and statistics. Some Gaussian illustrations are given.

456 citations
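SsfPack itself is a suite of C routines linked to Ox, so its functions are not reproduced here. As a sketch of the core recursion such a package implements, the following Python runs the Kalman filter and evaluates the Gaussian likelihood for the local level model y_t = mu_t + eps_t, mu_{t+1} = mu_t + xi_t. The variances and the crude diffuse initialization are illustrative assumptions, and none of this is SsfPack's API.

import numpy as np

def kalman_local_level(y, var_eps=1.0, var_xi=0.1):
    """Kalman filter for the local level model; returns filtered means and loglik."""
    a, p = 0.0, 1e7                  # crude approximation to a diffuse initial state
    loglik, filtered = 0.0, []
    for yt in y:
        f = p + var_eps              # prediction-error variance
        v = yt - a                   # one-step prediction error
        loglik += -0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        a += p / f * v               # updated state mean
        p -= p * p / f               # updated state variance
        filtered.append(a)
        p += var_xi                  # predict next state (random walk transition)
    return np.array(filtered), loglik

Moment smoothing, simulation smoothing and forecasting are further passes built on exactly these filtered quantities.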


Journal ArticleDOI
TL;DR: In this article, the authors proposed using compound Poisson processes to model trade-by-trade financial data and developed specific types of Cox processes in order to accurately depict the trading process, and studied the implication for price changes over pre-specified intervals of times, such as 30 seconds, 20 minutes or a day.
Abstract: In this chapter we propose using compound Poisson processes to model trade-by-trade financial data. Our main focus will be on developing specific types of Cox processes in order to accurately depict the trading process. We study the problem of signal extracting the intensity of the trading process. We finish by studying the implications for price changes over pre-specified intervals of time, such as 30 seconds, 20 minutes or a day, and by assessing the empirical plausibility of OU-based models for the intensity of the trading process.

67 citations
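The flavour of such a Cox process can be conveyed by simulating trade times whose intensity is the exponential of an Ornstein-Uhlenbeck process. Everything below (the Euler discretization, the parameter values, the Bernoulli approximation to the point process on a fine grid) is an illustrative assumption rather than the chapter's exact specification.

import numpy as np

def simulate_cox_ou(T=1.0, dt=1e-3, mu=np.log(50.0), theta=2.0, sigma=0.5, seed=0):
    """Trade times from a Cox process with intensity lam(t) = exp(z(t)),
    where dz = theta*(mu - z) dt + sigma dW (Euler scheme on a grid)."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    z = np.empty(n)
    z[0] = mu                                        # start at the mean log-intensity
    for i in range(1, n):
        z[i] = z[i-1] + theta * (mu - z[i-1]) * dt + sigma * np.sqrt(dt) * rng.normal()
    lam = np.exp(z)
    hits = rng.random(n) < lam * dt                  # at most one trade per grid cell
    return dt * np.nonzero(hits)[0], lam

Signal extraction, as studied in the chapter, runs the other way: given only the observed trade times, infer the latent intensity path lam(t).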


Journal ArticleDOI
TL;DR: In this paper, a decomposition of the joint distribution of price changes of assets recorded trade-by-trade is introduced, which can be easily extended in a great number of directions, including using durations and volume as explanatory variables.
Abstract: In this paper we introduce a decomposition of the joint distribution of price changes of assets recorded trade-by-trade. Our decomposition means that we can model the dynamics of price changes using quite simple and interpretable models which are easily extended in a great number of directions, including using durations and volume as explanatory variables. Thus we provide an econometric basis for empirical work on micro market structure using time series of transactions data. We use maximum likelihood estimation and testing methods to assess the fit of the model to a year of IBM stock price data taken from the New York Stock Exchange.

55 citations
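The decomposition factors each trade-by-trade price change as Y_i = A_i * D_i * S_i: an activity indicator A_i in {0,1}, a direction D_i in {-1,+1} given a move, and a size S_i in {1,2,...} measured in ticks. The Python sketch below shows how the likelihood then factorizes, with i.i.d. Bernoulli and geometric components standing in (purely as an assumption) for the paper's dynamic, covariate-driven component models.

import numpy as np

def ads_loglik(dy, p_act=0.4, p_up=0.5, p_size=0.7):
    """Log-likelihood of tick-level price changes dy under an i.i.d. ADS model:
    A ~ Bernoulli(p_act); D = +1 w.p. p_up given a move;
    S geometric on {1,2,...} with P(S=k) = p_size * (1-p_size)**(k-1)."""
    ll = 0.0
    for y in dy:
        if y == 0:
            ll += np.log(1.0 - p_act)                    # no price move
        else:
            ll += np.log(p_act)                          # activity term
            ll += np.log(p_up if y > 0 else 1.0 - p_up)  # direction term
            k = abs(y)                                   # size in ticks
            ll += np.log(p_size) + (k - 1) * np.log(1.0 - p_size)
    return ll

In the paper each of the three component probabilities is itself a dynamic model, which is exactly where durations and volume enter as explanatory variables.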


Journal ArticleDOI
TL;DR: In this article, the convergence rate of the Gibbs sampler applied to the unobserved states of a first-order autoregression plus noise model is derived in terms of the parameters of the model, which are regarded as fixed.
Abstract: In this paper we obtain a closed form expression for the convergence rate of the Gibbs sampler applied to the unobserved states of a first-order autoregression plus noise model. The rate is expressed in terms of the parameters of the model, which are regarded as fixed. For the case where the unconditional mean of the states is a parameter of interest we provide evidence that a ‘centred’ parameterization of a state space model is preferable for the performance of the Gibbs sampler. These two results provide guidance when the Gaussianity or linearity of the state space form is lost. We illustrate this by examining the performance of a Markov chain Monte Carlo sampler for the stochastic volatility model.

44 citations
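To make the object of study concrete, the following Python performs one single-site Gibbs sweep over the states of the model x_t = phi*x_{t-1} + eta_t, y_t = x_t + eps_t with the parameters held fixed, as in the paper's setting; each full conditional p(x_t | x_{-t}, y) is Gaussian. The parameter values and the stationary initialization of x_0 are illustrative assumptions.

import numpy as np

def gibbs_states_sweep(x, y, phi=0.9, var_eta=0.1, var_eps=1.0, rng=None):
    """One single-site Gibbs sweep over the states of an AR(1)-plus-noise model."""
    rng = rng or np.random.default_rng()
    n = len(y)
    for t in range(n):
        prec = 1.0 / var_eps                    # information from y_t
        num = y[t] / var_eps
        if t > 0:                               # prior term from x_{t-1}
            prec += 1.0 / var_eta
            num += phi * x[t-1] / var_eta
        else:                                   # stationary distribution of x_0
            prec += (1.0 - phi**2) / var_eta
        if t < n - 1:                           # term from x_{t+1}
            prec += phi**2 / var_eta
            num += phi * x[t+1] / var_eta
        x[t] = rng.normal(num / prec, np.sqrt(1.0 / prec))
    return x

Running repeated sweeps and tracking the autocorrelation of, say, the sampled state mean exhibits the geometric convergence whose rate the paper expresses in closed form; mixing slows markedly as phi approaches one, the regime where the choice of parameterization matters most.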


Posted Content
TL;DR: In this article, the authors consider the fitting and comparison of high-dimensional multivariate time series models with time varying correlations, and propose an estimation, filtering and model choice algorithm.
Abstract: This paper is concerned with the fitting and comparison of high dimensional multivariate time series models with time varying correlations. The models considered here combine features of the classical factor model with those of the univariate stochastic volatility model. Specifically, a set of unobserved time-dependent factors, along with an associated loading matrix, are used to model the contemporaneous correlation while, conditioned on the factors, the noise in each factor and each series is assumed to follow independent three-parameter univariate stochastic volatility processes. A complete analysis of these models, and their special cases, is developed that encompasses estimation, filtering and model choice. The centerpieces of our estimation algorithm (which relies on MCMC methods) are (1) a reduced blocking scheme for sampling the free elements of the loading matrix and the factors and (2) a special method for sampling the parameters of the univariate SV process. The sampling of the loading matrix (containing typically many hundreds of parameters) is done via a highly tuned Metropolis-Hastings step. The resulting algorithm is completely scalable in the number of series and factors and very simulation-efficient. We also provide methods for estimating the log-likelihood function and the filtered values of the time-varying volatilities and correlations. We pay special attention to the problem of comparing one version of the model with another and of determining the number of factors. For this purpose we use MCMC methods to find the marginal likelihood and associated Bayes factors of each fitted model. In sum, these procedures lead to the first unified and practical likelihood-based analysis of truly high dimensional models of stochastic volatility. We apply our methods in detail to two datasets. The first is the return vector on 20 exchange rates against the US Dollar. The second is the return vector on 40 common stocks quoted on the New York Stock Exchange.

33 citations
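The basic structure of the model can be sketched by simulation: y_t = B f_t + e_t, where every factor and every idiosyncratic error follows its own univariate SV process with three parameters (level mu, persistence phi and volatility-of-volatility s). The dimensions and parameter values below are illustrative assumptions, and the paper's MCMC machinery is not reproduced.

import numpy as np

def simulate_factor_sv(n=500, p=20, k=2, seed=0):
    """Simulate y_t = B f_t + e_t with univariate SV factors and errors."""
    rng = np.random.default_rng(seed)
    B = rng.normal(0.0, 0.5, (p, k))                 # loading matrix
    def sv_path(mu=-1.0, phi=0.97, s=0.15):
        h = np.empty(n)
        h[0] = mu
        for t in range(1, n):                        # log-volatility AR(1)
            h[t] = mu + phi * (h[t-1] - mu) + s * rng.normal()
        return np.exp(h / 2.0) * rng.normal(size=n)  # SV returns
    f = np.column_stack([sv_path() for _ in range(k)])   # factors
    e = np.column_stack([sv_path() for _ in range(p)])   # idiosyncratic errors
    return f @ B.T + e                               # n x p panel of returns

Because the factor volatilities move through time, the implied correlations among the p series are time-varying even though the loading matrix B is constant; that is the feature the estimation, filtering and model-choice machinery is built to recover.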


Journal ArticleDOI
TL;DR: In this article, the authors extend Rydberg-Shephard's activity, direction and size decomposition of trade-by-trade price movements to the multivariate case, and illustrate their ideas using a bivariate modelling problem.
Abstract: In this paper we extend Rydberg-Shephard's activity, direction and size decomposition of trade-by-trade price movements to the multivariate case. We illustrate our ideas using a bivariate modelling problem: modelling the evolution of the prices of Ford and GM shares. Throughout we use the continuous record of trades made in the first five months of 1997 on the New York Stock Exchange (NYSE).

11 citations
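In the multivariate extension, the component models for one asset can condition on the other's contemporaneous and lagged components. A minimal illustration of one such cross-asset link, with a logistic model for one stock's activity probability (the function name, covariates and coefficients are all hypothetical):

import numpy as np

def activity_prob(other_active, own_lag_active, b0=-0.5, b1=1.2, b2=0.8):
    """Logistic probability that the next trade in stock 1 moves its price,
    given stock 2's contemporaneous activity and stock 1's own lagged activity."""
    eta = b0 + b1 * other_active + b2 * own_lag_active
    return 1.0 / (1.0 + np.exp(-eta))

Analogous links can be placed on the direction and size components, which is what allows the bivariate Ford and GM dependence to be modelled trade by trade.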