Journal ArticleDOI

The econometrics of ultra-high-frequency data

01 Jan 2000 - Econometrica (Blackwell Publishers Ltd) - Vol. 68, Iss. 1, pp. 1-22
TL;DR: In this article, the ACD point process was applied to IBM transaction arrival times to develop semiparametric hazard estimates and conditional intensities, and combined with a GARCH model of prices produces ultra-high-frequency measures of volatility.
Abstract: Ultra-high-frequency data is defined to be a full record of transactions and their associated characteristics. The transaction arrival times and accompanying measures can be analyzed as marked point processes. The ACD point process developed by Engle and Russell (1998) is applied to IBM transaction arrival times to develop semiparametric hazard estimates and conditional intensities. Combining these intensities with a GARCH model of prices produces ultra-high-frequency measures of volatility. Both returns and variances are found to be negatively influenced by long durations, as suggested by asymmetric information models of market microstructure.
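The ACD recursion at the heart of the paper is simple to sketch. The simulation below generates durations from an ACD(1,1) with exponential errors, under which the conditional intensity is the reciprocal of the expected duration; the parameter values are illustrative, not estimates from the IBM data.

```python
import numpy as np

# ACD(1,1) duration model (Engle & Russell, 1998):
#   expected duration  psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}
#   observed duration  x_i   = psi_i * eps_i,  eps_i ~ Exp(1)
# Parameters below are hypothetical, chosen only so the model is stationary.
rng = np.random.default_rng(0)
omega, alpha, beta = 0.1, 0.1, 0.8
n = 10_000

psi = np.empty(n)
x = np.empty(n)
psi[0] = omega / (1 - alpha - beta)   # unconditional mean duration (= 1.0 here)
x[0] = psi[0] * rng.exponential(1.0)
for i in range(1, n):
    psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
    x[i] = psi[i] * rng.exponential(1.0)

# With exponential errors the conditional intensity is 1 / psi_i, so long
# expected durations correspond to low transaction arrival intensity.
print(x.mean())  # should hover near omega / (1 - alpha - beta)
```

The persistence alpha + beta = 0.9 produces the clustering of short and long durations that motivates treating the arrival times as a marked point process.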


Citations
Journal ArticleDOI
TL;DR: In this article, the authors provide a framework for non-parametric measurement of the jump component in asset return volatility and find that jumps are both highly prevalent and distinctly less persistent than the continuous sample path variation process.
Abstract: A rapidly growing literature has documented important improvements in financial return volatility measurement and forecasting via use of realized variation measures constructed from high-frequency returns coupled with simple modeling procedures. Building on recent theoretical results in Barndorff-Nielsen and Shephard (2004a, 2005) for related bi-power variation measures, the present paper provides a practical and robust framework for non-parametrically measuring the jump component in asset return volatility. In an application to the DM/$ exchange rate, the S&P500 market index, and the 30-year U.S. Treasury bond yield, we find that jumps are both highly prevalent and distinctly less persistent than the continuous sample path variation process. Moreover, many jumps appear directly associated with specific macroeconomic news announcements. Separating jump from non-jump movements in a simple but sophisticated volatility forecasting model, we find that almost all of the predictability in daily, weekly, and monthly return volatilities comes from the non-jump component. Our results thus set the stage for a number of interesting future econometric developments and important financial applications by separately modeling, forecasting, and pricing the continuous and jump components of the total return variation process.
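The jump measure described above is straightforward to compute: realized variance (RV) picks up both continuous and jump variation, while bipower variation (BV) is robust to jumps, so max(RV − BV, 0) estimates the jump component. A small sketch on simulated one-minute returns with one injected jump; all magnitudes are illustrative, not taken from the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 390                                   # e.g. one-minute returns in a trading day
sigma = 0.01 / np.sqrt(n)                 # diffusive volatility per step (assumed)
r = rng.normal(0.0, sigma, n)
r[200] += 0.01                            # inject a single jump

rv = np.sum(r ** 2)                                         # realized variance
bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))   # bipower variation
jump = max(rv - bv, 0.0)                  # nonparametric jump component

print(rv, bv, jump)
```

Because BV multiplies adjacent absolute returns, a single large return contributes to BV only through its (small) neighbors, which is why the difference RV − BV isolates the jump.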

1,167 citations

Journal ArticleDOI
TL;DR: The authors explore mixed data sampling (henceforth MIDAS) regression models, which involve time series data sampled at different frequencies, and provide empirical evidence on microstructure noise and volatility forecasting.
Abstract: We explore mixed data sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Volatility and related processes are our prime focus, though the regression method has wider applications in macroeconomics and finance, among other areas. The regressions combine recent developments regarding estimation of volatility and a not-so-recent literature on distributed lag models. We study various lag structures to parameterize the regressions parsimoniously and relate them to existing models. We also propose several new extensions of the MIDAS framework. The paper concludes with an empirical section where we provide further evidence and new results on the risk–return trade-off. We also report empirical evidence on microstructure noise and volatility forecasting.
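A typical MIDAS ingredient is a parsimonious lag polynomial that maps many high-frequency observations into a single regressor. The sketch below uses the exponential Almon weighting common in this literature; the parameter values and the 22-day window are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag polynomial used in MIDAS regressions:
    w_k proportional to exp(theta1*k + theta2*k**2), normalized to sum to one."""
    k = np.arange(1, n_lags + 1, dtype=float)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

# Hypothetical example: roughly one month of daily squared returns
# aggregated into a single low-frequency volatility regressor.
rng = np.random.default_rng(2)
daily_sq_returns = rng.chisquare(1, size=22) * 1e-4
w = exp_almon_weights(0.0, -0.05, 22)     # theta2 < 0 gives declining weights
midas_regressor = w @ daily_sq_returns

print(w[:3], midas_regressor)
```

Two parameters govern an arbitrarily long lag profile, which is exactly the parsimony the abstract refers to: the same weights can span hundreds of high-frequency lags without adding free coefficients.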

807 citations

Journal ArticleDOI
TL;DR: In the 20 years following the publication of the ARCH model, a vast quantity of research has uncovered the properties of competing volatility models; this paper identifies promising new frontiers, including high-frequency volatility models, large-scale multivariate ARCH models, and derivatives pricing models.
Abstract: In the 20 years following the publication of the ARCH model, there has been a vast quantity of research uncovering the properties of competing volatility models. Wide-ranging applications to financial data have discovered important stylized facts and illustrated both the strengths and weaknesses of the models. There are now many surveys of this literature. This paper looks forward to identify promising areas of new research. The paper lists five new frontiers. It briefly discusses three—high-frequency volatility models, large-scale multivariate ARCH models, and derivatives pricing models. Two further frontiers are examined in more detail—application of ARCH models to the broad class of non-negative processes, and use of Least Squares Monte Carlo to examine non-linear properties of any model that can be simulated. Using this methodology, the paper analyses more general types of ARCH models, stochastic volatility models, long-memory models and breaking volatility models. The volatility of volatility is defined, estimated and compared with option-implied volatilities. Copyright © 2002 John Wiley & Sons, Ltd.

702 citations

Journal ArticleDOI
TL;DR: In this article, a generalized pre-averaging approach for estimating the integrated volatility is presented, which can generate rate-optimal estimators with convergence rate n^{1/4}, although not every variant of the approach attains this optimal rate.
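The idea can be sketched as follows: observed prices are the efficient price plus i.i.d. microstructure noise, and averaging returns over local windows of length of order √n damps the noise before squaring. This toy version omits the finite-sample bias correction of the full theory, and all scales (volatility, noise size, sampling frequency) are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 23_400                                # e.g. one-second observations in a day
iv = (0.2 / np.sqrt(252)) ** 2            # true integrated variance (assumed)
eff = np.cumsum(rng.normal(0, np.sqrt(iv / n), n))   # efficient log-price
noise = rng.normal(0, 1e-4, n)            # i.i.d. microstructure noise (assumed)
p = eff + noise

r = np.diff(p)
rv_naive = np.sum(r ** 2)                 # realized variance: biased upward by noise

k = int(np.sqrt(n))                       # pre-averaging window, order n**0.5
j = np.arange(1, k)
g = np.minimum(j / k, 1 - j / k)          # triangular weight function
pre = np.convolve(r, g[::-1], mode="valid")   # weighted return sums, sliding windows
rv_pre = np.sum(pre ** 2) / np.sum(g ** 2)    # bias correction of the full theory omitted

print(iv, rv_naive, rv_pre)
```

At this noise level the naive realized variance is several times the true integrated variance, while the pre-averaged estimate lands close to it, at the cost of the slower n^{1/4} convergence rate the TL;DR mentions.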

525 citations

Journal ArticleDOI
TL;DR: We must take risks to achieve rewards, but not all risks are equally rewarded; we therefore optimize our behavior, and in particular our portfolios, to maximize rewards and minimize risks.
Abstract: The advantage of knowing about risks is that we can change our behavior to avoid them. Of course, it is easily observed that to avoid all risks would be impossible; it might entail no flying, no driving, no walking, eating and drinking only healthy foods and never being touched by sunshine. Even a bath could be dangerous. I could not receive this prize if I sought to avoid all risks. There are some risks we choose to take because the benefits from taking them exceed the possible costs. Optimal behavior takes risks that are worthwhile. This is the central paradigm of finance; we must take risks to achieve rewards but not all risks are equally rewarded. Both the risks and the rewards are in the future, so it is the expectation of loss that is balanced against the expectation of reward. Thus we optimize our behavior, and in particular our portfolio, to maximize rewards and minimize risks.

524 citations

References
Journal ArticleDOI
TL;DR: In this paper, a cyclic metropolis algorithm is used to construct a Markov-chain simulation tool for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model.
Abstract: New techniques for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model are developed. A cyclic Metropolis algorithm is used to construct a Markov-chain simulation tool. Simulations from this Markov chain converge in distribution to draws from the posterior distribution, enabling exact finite-sample inference. The exact solution to the filtering/smoothing problem of inferring about the unobserved variance states is a by-product of our Markov-chain method. In addition, multistep-ahead predictive densities can be constructed that reflect both inherent model variability and parameter uncertainty. We illustrate our method by analyzing both daily and weekly data on stock returns and exchange rates. Sampling experiments are conducted to compare the performance of Bayes estimators to method-of-moments and quasi-maximum-likelihood estimators proposed in the literature. In both parameter estimation and filtering, the Bayes estimators outperform ...
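The model class being estimated here is easy to simulate: log conditional variance follows a Gaussian AR(1), and returns are scaled by its exponential. A sketch with hypothetical parameter values (not estimates from the paper's stock or exchange-rate data):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5_000
mu, phi, sig_eta = -9.0, 0.95, 0.2    # hypothetical SV parameters

# log-variance h_t follows an AR(1); returns are r_t = exp(h_t / 2) * eps_t
h = np.empty(n)
h[0] = mu
for t in range(1, n):
    h[t] = mu + phi * (h[t - 1] - mu) + sig_eta * rng.normal()
r = np.exp(h / 2) * rng.normal(size=n)

# Taking log(r_t**2) = h_t + log(eps_t**2) linearizes the model in the latent
# state h_t, which is what makes MCMC filtering/smoothing of h_t tractable.
print(np.std(r))
```

Unlike ARCH/GARCH, the variance here is driven by its own shock, so the likelihood involves an integral over the latent path of h; the paper's cyclic Metropolis sampler is one way to carry out exact Bayesian inference on it.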

1,711 citations

Book
06 Apr 1995
TL;DR: The book surveys market microstructure theory, covering inventory models, information-based models, and strategic trader models for two types of traders: informed traders, who act on private information, and uninformed traders.
Abstract: Foreword. 1. Markets and Market-Making. 2. Inventory Models. 3. Information-Based Models. 4. Strategic Trader Models I: Informed Traders. 5. Strategic Trader Models II: Uninformed Traders. 6. Information and the Price Process. 7. Market Viability and Stability. 8. Liquidity and the Relationships between Markets. 9. Issues in Market Performance.

1,567 citations

Journal ArticleDOI
TL;DR: In this article, the relationship between the variability of the daily price change and the daily volume of trading on the speculative markets was investigated and the results of the estimation can reconcile a conflict between the price variability-volume relationship for this market and the relationship obtained by previous investigators for other speculative markets.
Abstract: This paper concerns the relationship between the variability of the daily price change and the daily volume of trading on the speculative markets. Our work extends the theory of speculative markets in two ways. First, we derive from economic theory the joint probability distribution of the price change and the trading volume over any interval of time within the trading day. And second, we determine how this joint distribution changes as more traders enter (or exit from) the market. The model's parameters are estimated by FIML using daily data from the 90-day T-bills futures market. The results of the estimation can reconcile a conflict between the price variability-volume relationship for this market and the relationship obtained by previous investigators for other speculative markets. This paper concerns the relationship between the variability of the daily price change and the volume of trading on speculative markets. Previous empirical studies [2, 3, 6, 12, 14, 16] of both futures and equity markets always find a positive association between price variability (as measured by the squared price change Δp²) and the trading volume. There are two explanations for the relationship. Clark's [2] explanation, which is secondary to his effort to explain why the probability distribution of the daily price change is leptokurtic, emphasizes randomness in the number of within-day transactions. In Clark's model the daily price change is the sum of a random number of within-day price changes. The variance of the daily price change is thus a random variable with a mean proportional to the mean number of daily transactions. Clark argues that the trading volume is related positively to the number of within-day transactions, and so the trading volume is related positively to the variability of the price change. The second explanation is due to Epps and Epps [6]. Their model examines the mechanics of within-day trading.
The change in the market price on each within-day transaction or market clearing is the average of the changes in all of the traders' reservation prices. Epps and Epps assume there is a positive relationship between the extent to which traders disagree when they revise their reservation prices and the absolute value of the change in the market price. That is, an increase in the extent to which traders disagree is associated with a larger absolute price change. The price variability-volume relationship arises, then, because the volume of trading is positively related to the extent to which traders disagree when they revise their reservation prices.
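Clark's explanation can be illustrated directly: if the daily price change is the sum of a random number of within-day changes and volume is proportional to the number of transactions, then squared price changes and volume are positively correlated. A simulation under these assumptions (all distributions and scales are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
days = 2_000

# Random number of within-day transactions (a mixed Poisson, so that the
# transaction count varies substantially across days).
lam = rng.lognormal(4.0, 0.5, days)
n_trades = rng.poisson(lam)

# Daily price change = sum of n_trades i.i.d. within-day price changes,
# so its variance is proportional to the transaction count.
dp = np.array([rng.normal(0.0, 0.01, k).sum() for k in n_trades])
volume = n_trades * 10          # volume proportional to transaction count

corr = np.corrcoef(dp ** 2, volume)[0, 1]
print(corr)                     # positive, per Clark's mixture argument
```

The same construction makes the unconditional distribution of dp leptokurtic, which is the fact Clark's model was originally built to explain.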

1,558 citations

Journal ArticleDOI
TL;DR: In this paper, a comprehensive investigation of price and volume co-movement using daily New York Stock Exchange data from 1928 to 1987 is conducted, where the authors adjust the data to take into account well-known calendar effects and long-run trends.
Abstract: The authors undertake a comprehensive investigation of price and volume co-movement using daily New York Stock Exchange data from 1928 to 1987. They adjust the data to take into account well-known calendar effects and long-run trends. To describe the process, they use a seminonparametric estimate of the joint density of current price change and volume conditional on past price changes and volume. Four empirical regularities are found: (1) positive correlation between conditional volatility and volume; (2) large price movements are followed by high volume; (3) conditioning on lagged volume substantially attenuates the "leverage" effect, and (4) after conditioning on lagged volume, there is a positive risk-return relation. Article published by Oxford University Press on behalf of the Society for Financial Studies in its journal, The Review of Financial Studies.

1,418 citations

Journal ArticleDOI
TL;DR: This article presents a formal integration of standard volatility models with market microstructure variables, allowing a more comprehensive empirical investigation of the fundamental determinants behind the volatility-clustering phenomenon.

1,388 citations