Journal Article

The econometrics of ultra-high-frequency data

01 Jan 2000 - Econometrica (Blackwell Publishers Ltd) - Vol. 68, Iss. 1, pp. 1-22
TL;DR: The ACD point process is applied to IBM transaction arrival times to develop semiparametric hazard estimates and conditional intensities, which, combined with a GARCH model of prices, produce ultra-high-frequency measures of volatility.
Abstract: Ultra-high-frequency data is defined to be a full record of transactions and their associated characteristics. The transaction arrival times and accompanying measures can be analyzed as marked point processes. The ACD point process developed by Engle and Russell (1998) is applied to IBM transaction arrival times to develop semiparametric hazard estimates and conditional intensities. Combining these intensities with a GARCH model of prices produces ultra-high-frequency measures of volatility. Both returns and variances are found to be negatively influenced by long durations, as suggested by asymmetric information models of market microstructure.
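
To make the combination described above concrete, here is a minimal Python sketch (not Engle's exact UHF-GARCH specification; all parameter values are invented) of how a conditional expected duration from an ACD(1,1) recursion and a conditional variance from a GARCH(1,1) recursion can be combined into a per-unit-time volatility measure h_i / psi_i. In the paper the two components are estimated from the IBM transaction record; the simulation here only illustrates the bookkeeping.

```python
# Sketch (made-up parameters, not Engle's estimated model): combine an
# ACD(1,1) expected duration psi_i with a GARCH(1,1) conditional variance h_i
# into a volatility measure per unit of calendar time, h_i / psi_i.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# ACD(1,1) durations: x_i = psi_i * eps_i, psi_i = w + a*x_{i-1} + b*psi_{i-1}
w, a, b = 0.1, 0.1, 0.8
x = np.empty(n)            # durations between successive trades
psi = np.empty(n)          # conditional expected durations
psi[0] = w / (1 - a - b)
x[0] = psi[0] * rng.exponential()
for i in range(1, n):
    psi[i] = w + a * x[i - 1] + b * psi[i - 1]
    x[i] = psi[i] * rng.exponential()

# GARCH(1,1) per-trade returns: r_i = sqrt(h_i) * z_i
omega, alpha, beta = 0.05, 0.05, 0.9
r = np.empty(n)
h = np.empty(n)
h[0] = omega / (1 - alpha - beta)
r[0] = np.sqrt(h[0]) * rng.standard_normal()
for i in range(1, n):
    h[i] = omega + alpha * r[i - 1] ** 2 + beta * h[i - 1]
    r[i] = np.sqrt(h[i]) * rng.standard_normal()

# Variance per trade divided by the expected duration gives an
# ultra-high-frequency variance per unit of calendar time.
vol_per_time = h / psi
print(vol_per_time[:5])
```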

Citations
Journal Article
TL;DR: In this article, the authors provide a framework for non-parametric measurement of the jump component in asset return volatility and find that jumps are both highly prevalent and distinctly less persistent than the continuous sample path variation process.
Abstract: A rapidly growing literature has documented important improvements in financial return volatility measurement and forecasting via use of realized variation measures constructed from high-frequency returns coupled with simple modeling procedures. Building on recent theoretical results in Barndorff-Nielsen and Shephard (2004a, 2005) for related bi-power variation measures, the present paper provides a practical and robust framework for non-parametrically measuring the jump component in asset return volatility. In an application to the DM/$ exchange rate, the S&P500 market index, and the 30-year U.S. Treasury bond yield, we find that jumps are both highly prevalent and distinctly less persistent than the continuous sample path variation process. Moreover, many jumps appear directly associated with specific macroeconomic news announcements. Separating jump from non-jump movements in a simple but sophisticated volatility forecasting model, we find that almost all of the predictability in daily, weekly, and monthly return volatilities comes from the non-jump component. Our results thus set the stage for a number of interesting future econometric developments and important financial applications by separately modeling, forecasting, and pricing the continuous and jump components of the total return variation process.
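
A minimal sketch of the realized-variance / bi-power-variation decomposition described above, assuming equally spaced intraday returns and omitting the finite-sample corrections and the formal jump test used in the paper; the toy data are invented:

```python
# Realized variance vs. bi-power variation on a toy day of 5-minute returns;
# the difference (truncated at zero) is the estimated jump contribution.
import numpy as np

def jump_component(intraday_returns):
    """Return (realized variance, bi-power variation, non-negative jump part)."""
    r = np.asarray(intraday_returns, dtype=float)
    rv = np.sum(r ** 2)                                         # total variation
    bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))   # jump-robust part
    return rv, bv, max(rv - bv, 0.0)

rng = np.random.default_rng(1)
r = 0.001 * rng.standard_normal(288)   # diffusive 5-minute returns, one day
r[100] += 0.01                         # add a single jump
print(jump_component(r))
```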

1,167 citations

Journal Article
TL;DR: The authors explore mixed data sampling (henceforth MIDAS) regression models, which involve time series data sampled at different frequencies, and provide empirical evidence on microstructure noise and volatility forecasting.
Abstract: We explore mixed data sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Volatility and related processes are our prime focus, though the regression method has wider applications in macroeconomics and finance, among other areas. The regressions combine recent developments regarding estimation of volatility and a not-so-recent literature on distributed lag models. We study various lag structures to parameterize parsimoniously the regressions and relate them to existing models. We also propose several new extensions of the MIDAS framework. The paper concludes with an empirical section where we provide further evidence and new results on the risk-return trade-off. We also report empirical evidence on microstructure noise and volatility forecasting.
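
The core mechanics can be illustrated with a toy version of a MIDAS regression (not the authors' specification; the data, lag length, and parameter values below are invented): a low-frequency target is regressed on many high-frequency lags aggregated with a parsimonious exponential Almon weight function, and the parameters are estimated by nonlinear least squares.

```python
# Toy MIDAS regression: y_t = beta0 + beta1 * sum_k w_k(theta) * x_{t,k} + e_t,
# with exponential Almon weights w_k(theta); estimated by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

def almon_weights(theta1, theta2, K):
    k = np.arange(1, K + 1)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()                       # normalized to sum to one

def residuals(params, y, X):
    beta0, beta1, theta1, theta2 = params
    w = almon_weights(theta1, theta2, X.shape[1])
    return y - (beta0 + beta1 * X @ w)

# Invented data: 200 low-frequency observations, 22 high-frequency lags each
# (think monthly volatility regressed on 22 lagged daily squared returns).
rng = np.random.default_rng(2)
K, T = 22, 200
X = rng.standard_normal((T, K)) ** 2
y = 0.5 + 2.0 * X @ almon_weights(0.1, -0.02, K) + 0.1 * rng.standard_normal(T)

fit = least_squares(residuals, x0=[0.0, 1.0, 0.0, -0.01], args=(y, X),
                    bounds=([-np.inf, -np.inf, -1.0, -1.0],
                            [np.inf, np.inf, 1.0, 0.0]))
print(fit.x)    # recovered (beta0, beta1, theta1, theta2)
```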

807 citations

Journal Article
TL;DR: In the 20 years following the publication of the ARCH model there has been a vast quantity of research uncovering the properties of competing volatility models; this paper looks forward to promising new frontiers, including high-frequency volatility models, large-scale multivariate ARCH models, and derivatives pricing models.
Abstract: In the 20 years following the publication of the ARCH model, there has been a vast quantity of research uncovering the properties of competing volatility models. Wide-ranging applications to financial data have discovered important stylized facts and illustrated both the strengths and weaknesses of the models. There are now many surveys of this literature. This paper looks forward to identify promising areas of new research. The paper lists five new frontiers. It briefly discusses three—high-frequency volatility models, large-scale multivariate ARCH models, and derivatives pricing models. Two further frontiers are examined in more detail—application of ARCH models to the broad class of non-negative processes, and use of Least Squares Monte Carlo to examine non-linear properties of any model that can be simulated. Using this methodology, the paper analyses more general types of ARCH models, stochastic volatility models, long-memory models and breaking volatility models. The volatility of volatility is defined, estimated and compared with option-implied volatilities. Copyright © 2002 John Wiley & Sons, Ltd.
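
As a rough illustration of the Least Squares Monte Carlo idea mentioned above (my own toy setup, not the paper's application), one can simulate many paths of any model that can be simulated, here a GARCH(1,1), and approximate a non-linear conditional expectation by a cross-path least-squares regression on functions of the current state:

```python
# Toy Least Squares Monte Carlo: simulate a GARCH(1,1) forward over a short
# horizon from many different starting variances h0, then approximate the
# non-linear mapping E[cumulative variance | h0] by a cross-path polynomial
# least-squares regression. All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.05, 0.08, 0.9
n_paths, horizon = 20_000, 10

h0 = omega / (1 - alpha - beta) * rng.uniform(0.5, 2.0, n_paths)
h = h0.copy()
future_var = np.zeros(n_paths)               # realized variance over the horizon
for _ in range(horizon):
    r = np.sqrt(h) * rng.standard_normal(n_paths)
    future_var += r ** 2
    h = omega + alpha * r ** 2 + beta * h

# E[future_var | h0] is approximated by c0 + c1*h0 + c2*h0^2 across paths.
Z = np.column_stack([np.ones(n_paths), h0, h0 ** 2])
coef, *_ = np.linalg.lstsq(Z, future_var, rcond=None)
print(coef)
```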

702 citations

Journal Article
TL;DR: In this article, a generalized pre-averaging approach for estimating the integrated volatility is presented, which can generate rate-optimal estimators with convergence rate n^(-1/4).
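
A simplified sketch of a pre-averaging estimator in this spirit, assuming the common weight function g(s) = min(s, 1 - s) with asymptotic constants psi1 = 1 and psi2 = 1/12, and omitting the finite-sample adjustments and the asymptotic theory behind the n^(-1/4) rate; the toy data are invented:

```python
# Simplified pre-averaging estimator of integrated volatility from noisy
# log-prices, with weight g(s) = min(s, 1-s) and asymptotic constants
# psi1 = 1, psi2 = 1/12 (finite-sample corrections omitted).
import numpy as np

def preaveraged_iv(y, theta=1.0):
    dy = np.diff(y)
    n = dy.size
    kn = int(np.ceil(theta * np.sqrt(n)))              # pre-averaging window
    j = np.arange(1, kn)
    g = np.minimum(j / kn, 1 - j / kn)
    psi1, psi2 = 1.0, 1.0 / 12.0
    ybar = np.convolve(dy, g[::-1], mode="valid")      # pre-averaged returns
    noise_var = np.sum(dy ** 2) / (2 * n)              # crude noise-variance proxy
    return np.sum(ybar ** 2) / (kn * psi2) - psi1 * noise_var / (theta ** 2 * psi2)

# Toy check: constant-volatility diffusion plus i.i.d. microstructure noise.
rng = np.random.default_rng(4)
n, sigma, omega = 23_400, 0.2, 0.0005
x = np.cumsum(sigma * np.sqrt(1.0 / n) * rng.standard_normal(n))
y = np.concatenate([[0.0], x]) + omega * rng.standard_normal(n + 1)
print(preaveraged_iv(y), sigma ** 2)                   # estimate vs. true value
```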

525 citations

Journal Article
TL;DR: Taking risk is necessary to achieve rewards, but not all risks are equally rewarded; agents therefore optimize their behavior, and in particular their portfolios, to maximize rewards and minimize risks.
Abstract: The advantage of knowing about risks is that we can change our behavior to avoid them. Of course, it is easily observed that to avoid all risks would be impossible; it might entail no flying, no driving, no walking, eating and drinking only healthy foods and never being touched by sunshine. Even a bath could be dangerous. I could not receive this prize if I sought to avoid all risks. There are some risks we choose to take because the benefits from taking them exceed the possible costs. Optimal behavior takes risks that are worthwhile. This is the central paradigm of finance; we must take risks to achieve rewards but not all risks are equally rewarded. Both the risks and the rewards are in the future, so it is the expectation of loss that is balanced against the expectation of reward. Thus we optimize our behavior, and in particular our portfolio, to maximize rewards and minimize risks.

524 citations

References
Journal Article
TL;DR: In this paper, a model of the postwar U.S. economy is presented, estimated on fifty quarterly observations from the third quarter of 1947 to the fourth quarter of 1959; the structural equations are presented in Appendix A, with the variables and the sources of data listed in Appendix B.
Abstract: [...] of the postwar U.S. economy in the Tinbergen-Klein [32, 15, and 16] tradition, and to apply the model to an analysis of certain types of monetary and fiscal policy. The model is crude and exploratory, and the analysis is in aggregate terms. Needless to say, the U.S. economy cannot be adequately described by such a simple model, and the findings are necessarily highly tentative. The model and the simulations presented here are the initial result of a limited attempt to gain some knowledge about certain aspects of the economy in quantitative terms and to throw some light on the problems involved. The lag in effect in monetary and fiscal policy (Baumol [3], Culbertson [7 and 8], and Friedman [12]) is a case in point. Such a problem cannot be settled by theoretical analysis. An indication of the answer to the question can only be obtained through a quantitative study, however crude the approach and the indication may be. The model is constructed on fifty quarterly observations from the third quarter of 1947 to the fourth quarter of 1959. The structural equations are presented in Appendix A, with the variables and the sources of data listed in Appendix B. Sections 1-3 describe the model. The investment and consumption functions are presented in Section 1. A sub-model on inventory and price movements is formulated in Section 2. Functions for monetary and certain other variables are presented in Section 3. Extrapolations for the magnitudes of the major components of GNP are computed for 1960 and the first quarter of 1961 from the respective individual equations in Sections 1 [...]. (Acknowledgments footnote: The author wishes to express appreciation to the Ford Foundation whose faculty research fellowship made this research possible, to the Social Science Research Center of Cornell University for grants to support the initial computations, and to his colleagues at the Cornell Computing Center for their untiring cooperation. He is indebted to Marc Nerlove and A. S. Goldberger for discussion at various stages of the preparation [...])

56 citations

Journal Article
TL;DR: In this article, a new model for discrete valued time series is proposed in the context of generalized linear models, which is called the Autoregressive Conditional Multinomial (ACM) model.
Abstract: This paper proposes a new approach to modeling financial transactions data. A new model for discrete-valued time series is proposed in the context of generalized linear models. Since the model is specified conditional on both the previous state and the historic distribution, we call it the Autoregressive Conditional Multinomial (ACM) model. When the data are viewed as a marked point process, combining the ACM model with the ACD model proposed in Engle and Russell (1998) allows for joint modeling of the price transition probabilities and the arrival times of the transactions. In this marked point process context, the transition probabilities vary continuously through time and are therefore duration dependent. Finally, variations of the model allow for volume and spreads to impact the conditional distribution of price changes. Impulse response studies show that the long-run price impact of a transaction can be very sensitive to volume but is less sensitive to the spread and transaction rate.
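
A stripped-down illustration of the conditional-multinomial idea: discrete price-change states whose transition probabilities depend on the previous state and on the duration since the last trade. The full ACM model also carries an autoregressive structure in the conditional probabilities, which is omitted here, and the simulated data and coefficients below are invented.

```python
# Toy conditional-multinomial illustration: price-change states in {-1, 0, +1}
# whose transition probabilities depend on the previous state and the duration
# since the last trade. (No autoregressive term in the probabilities, unlike
# the full ACM model; data and coefficients are invented.)
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 10_000
states = np.array([-1, 0, 1])

dur = rng.exponential(1.0, n)                 # toy durations between trades
s = np.zeros(n, dtype=int)
for t in range(1, n):
    # Long durations favour "no change"; a previous up-tick favours a reversal.
    logits = np.array([0.5 * s[t - 1], 0.5 * dur[t], -0.5 * s[t - 1]])
    p = np.exp(logits) / np.exp(logits).sum()
    s[t] = rng.choice(states, p=p)

prev = s[:-1]
X = np.column_stack([(prev == k).astype(float) for k in states] + [np.log(dur[1:])])
y = s[1:]

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict_proba(X[:3]))             # estimated transition probabilities
```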

50 citations

Report
TL;DR: In this paper, the authors proposed a new statistical model for the analysis of data that does not arrive in equal time intervals such as financial transactions data, telephone calls, or sales data on commodities that are tracked electronically.
Abstract: This paper will propose a new statistical model for the analysis of data that does not arrive in equal time intervals, such as financial transactions data, telephone calls, or sales data on commodities that are tracked electronically. In contrast to fixed interval analysis, the model treats the time between observation arrivals as a stochastic time-varying process and therefore is in the spirit of the models of time deformation initially proposed by Tauchen and Pitts (1983), Clark (1973) and more recently discussed by Stock (1988), Lamoureux and Lastrapes (1992), Muller et al. (1990) and Ghysels and Jasiak (1994), but does not require auxiliary data or assumptions on the causes of time flow. Strong evidence is provided for duration clustering beyond a deterministic component for the financial transactions data analyzed. We will show that a very simple version of the model can successfully account for the significant autocorrelations in the observed durations between trades of IBM stock on the consolidated market. A simple transformation of the duration data allows us to include volume in the model.
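
A minimal sketch of an exponential ACD(1,1) of the kind described above, with the expected duration following psi_i = w + a*x_{i-1} + b*psi_{i-1} and parameters estimated by exponential quasi-maximum likelihood; the diurnal adjustment and the volume extension discussed in the paper are omitted, and the durations are simulated rather than taken from the IBM record.

```python
# Exponential ACD(1,1) sketch: psi_i = w + a*x_{i-1} + b*psi_{i-1}, durations
# x_i = psi_i * eps_i with exponential eps_i; parameters recovered from
# simulated durations by exponential quasi-maximum likelihood.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x):
    w, a, b = params
    psi = np.empty_like(x)
    psi[0] = x.mean()                        # initialise at the sample mean
    for i in range(1, x.size):
        psi[i] = w + a * x[i - 1] + b * psi[i - 1]
    if np.any(psi <= 0):
        return np.inf
    return np.sum(np.log(psi) + x / psi)     # exponential QML objective

# Simulate durations from a "true" EACD(1,1), then estimate (w, a, b).
rng = np.random.default_rng(6)
true_w, true_a, true_b = 0.1, 0.1, 0.8
n = 5_000
x = np.empty(n)
psi = true_w / (1 - true_a - true_b)
for i in range(n):
    x[i] = psi * rng.exponential()
    psi = true_w + true_a * x[i] + true_b * psi

fit = minimize(neg_loglik, x0=[0.2, 0.2, 0.6], args=(x,), method="Nelder-Mead")
print(fit.x)                                  # should be close to (0.1, 0.1, 0.8)
```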

47 citations

Posted Content
TL;DR: In this paper, the authors combine elements from Clark (1973), Dacorogna et al. (1993) and Ghysels and Jasiak (1994) and present a stochastic volatility model for foreign exchange markets with time deformation, based on daily patterns of arrivals of quotes and bid-ask spreads as well as returns.
Abstract: Globalization of trading in foreign exchange markets is a principal source of the daily and weekly seasonality in market volatility. One way to model such phenomena is to adopt a framework where market volatility is tied to the intensity of (world) trading through a subordinated stochastic process representation. In this paper we combine elements from Clark (1973), Dacorogna et al. (1993) and Ghysels and Jasiak (1994), and present a stochastic volatility model for foreign exchange markets with time deformation. The time deformation is based on daily patterns of arrivals of quotes and bid-ask spreads as well as returns. For empirical estimation we use the QMLE algorithm of Harvey et al. (1994), adapted by Ghysels and Jasiak for time-deformed processes, and applied to the Olsen and Associates high-frequency data set.
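
A toy sketch of the time-deformation idea (purely illustrative, not the paper's specification or its QMLE estimation; all parameter values are invented): returns evolve in an operational time whose speed is tied to quote-arrival intensity, so calendar-time volatility inherits the intraday activity pattern.

```python
# Toy time-deformation simulation: operational time advances with quote
# arrivals (a U-shaped intraday intensity), log-volatility follows an AR(1)
# in operational time, and returns are subordinated to that clock.
import numpy as np

rng = np.random.default_rng(7)
n = 288                                        # one day of 5-minute intervals
t = np.linspace(0.0, 1.0, n)

intensity = 20 * (1 + 0.8 * np.cos(2 * np.pi * t))   # busier open and close
quotes = rng.poisson(intensity)
dtau = quotes / quotes.mean()                  # operational-time increments

log_sigma = np.empty(n)
log_sigma[0] = -4.0
for i in range(1, n):
    rho = 0.98 ** dtau[i]                      # persistence per operational-time unit
    log_sigma[i] = (-4.0 * (1 - rho) + rho * log_sigma[i - 1]
                    + 0.1 * np.sqrt(dtau[i]) * rng.standard_normal())
sigma = np.exp(log_sigma)
returns = sigma * np.sqrt(dtau) * rng.standard_normal(n)
print(returns.std(), sigma.mean())
```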

36 citations