Author

Yingying Li

Other affiliations: University of Chicago
Bio: Yingying Li is an academic researcher from Hong Kong University of Science and Technology. The author has contributed to research in topics: Volatility (finance) & Estimator. The author has an h-index of 19 and has co-authored 38 publications receiving 1,487 citations. Previous affiliations of Yingying Li include University of Chicago.

Papers
Journal ArticleDOI
TL;DR: In this article, a generalized pre-averaging approach for estimating the integrated volatility is presented, which can generate rate-optimal estimators with convergence rate n^{1/4}.
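For intuition, the pre-averaging recipe can be sketched in a few lines: average the noisy returns over overlapping windows of length k_n ≈ θ√n to damp the microstructure noise, then rescale and bias-correct. The sketch below is illustrative only, assuming the weight g(x) = min(x, 1−x) (for which ψ1 = 1 and ψ2 = 1/12) and the simplified asymptotic constants; the function name is hypothetical.

```python
import numpy as np

def preaveraged_iv(prices, theta=0.5):
    """Illustrative pre-averaging estimator of integrated volatility
    from noisy high-frequency prices (asymptotic constants only;
    finite-sample refinements are omitted)."""
    dy = np.diff(np.log(prices))             # noisy log-returns
    n = dy.size
    kn = int(np.ceil(theta * np.sqrt(n)))    # window length k_n ~ theta * sqrt(n)
    j = np.arange(1, kn)
    g = np.minimum(j / kn, 1 - j / kn)       # weight g(x) = min(x, 1 - x)
    psi1, psi2 = 1.0, 1.0 / 12.0             # constants associated with this g
    # pre-averaged returns: weighted sums of returns over overlapping windows
    ybar = np.convolve(dy, g[::-1], mode="valid")
    # rescaled sum of squares minus a correction for the noise variance
    return (np.sum(ybar**2) / (kn * psi2)
            - psi1 / (theta**2 * psi2) * np.sum(dy**2) / (2 * n))
```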

525 citations

Journal ArticleDOI
TL;DR: In this paper, the leverage effect refers to the generally negative correlation between an asset's return and the changes in its volatility; a natural estimate consists of the empirical correlation between daily returns and the changes of daily volatility estimated from high-frequency data.
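The "natural estimate" described above is simple to write down: correlate daily returns with day-over-day changes of a high-frequency volatility estimate. A minimal sketch follows; the function name and inputs are hypothetical.

```python
import numpy as np

def naive_leverage_estimate(daily_returns, daily_vol):
    """Empirical correlation between daily returns and the changes of
    daily volatility (the latter estimated from high-frequency data):
    the 'natural' leverage-effect estimate described in the summary."""
    dvol = np.diff(daily_vol)                  # day-over-day volatility changes
    return np.corrcoef(daily_returns[1:], dvol)[0, 1]
```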

198 citations

Posted Content
TL;DR: In this paper, the authors proposed the "pairwise-refresh time" and "all-refresh time" methods to estimate the high-dimensional covariance matrix and compared their merits in portfolio selection.
Abstract: Portfolio allocation with gross-exposure constraint is an effective method to increase the efficiency and stability of selected portfolios among a vast pool of assets, as demonstrated in Fan et al (2008). The required high-dimensional volatility matrix can be estimated by using high frequency financial data. This enables us to better adapt to the local volatilities and local correlations among a vast number of assets and to increase significantly the sample size for estimating the volatility matrix. This paper studies the volatility matrix estimation using high-dimensional high-frequency data from the perspective of portfolio selection. Specifically, we propose the use of the "pairwise-refresh time" and "all-refresh time" methods, based on the concept of "refresh time" proposed by Barndorff-Nielsen et al (2008), for the estimation of the vast covariance matrix, and compare their merits in portfolio selection. We also establish concentration inequalities for the estimates, which guarantee desirable properties of the estimated volatility matrix in vast asset allocation with gross-exposure constraints. Extensive numerical studies are made via carefully designed simulations. Compared with methods based on low-frequency daily data, our methods can capture the most recent trend of the time-varying volatility and correlation, and hence provide more accurate guidance for portfolio allocation in the next time period. The advantage of using high-frequency data is significant in our simulation and empirical studies, which consist of 50 simulated assets and the 30 constituent stocks of the Dow Jones Industrial Average index.
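The "refresh time" scheme underlying both methods is easy to sketch: the first refresh time is the first instant by which every asset has traded at least once, and each subsequent refresh time is the first instant by which every asset has traded again. A minimal sketch, assuming each asset's trade times arrive as a sorted array (the function name is illustrative):

```python
import numpy as np

def refresh_times(times_by_asset):
    """Refresh-time synchronisation of asynchronous trade times.
    `times_by_asset`: list of sorted 1-D NumPy arrays of trade times."""
    pointers = [0] * len(times_by_asset)
    refresh = []
    while all(p < t.size for p, t in zip(pointers, times_by_asset)):
        # next refresh time: the slowest asset's next trade time
        tau = max(t[p] for p, t in zip(pointers, times_by_asset))
        refresh.append(tau)
        # advance every asset strictly past tau
        pointers = [int(np.searchsorted(t, tau, side="right"))
                    for t in times_by_asset]
    return np.array(refresh)
```

"All-refresh" applies this to all assets at once; "pairwise-refresh" applies it to each pair of assets separately, which retains far more data per pair at the cost of an estimated covariance matrix that is not automatically positive semidefinite.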

115 citations

Journal ArticleDOI
TL;DR: This article proposes the use of “pairwise-refresh time” and “all-refresh time” methods based on the concept of “refresh time” proposed by Barndorff-Nielsen, Hansen, Lunde, and Shephard for the estimation of the vast covariance matrix and compares their merits in portfolio selection.
Abstract: Portfolio allocation with gross-exposure constraint is an effective method to increase the efficiency and stability of portfolio selection among a vast pool of assets, as demonstrated by Fan, Zhang, and Yu. The required high-dimensional volatility matrix can be estimated by using high-frequency financial data. This enables us to better adapt to the local volatilities and local correlations among a vast number of assets and to increase significantly the sample size for estimating the volatility matrix. This article studies the volatility matrix estimation using high-dimensional, high-frequency data from the perspective of portfolio selection. Specifically, we propose the use of “pairwise-refresh time” and “all-refresh time” methods based on the concept of “refresh time” proposed by Barndorff-Nielsen, Hansen, Lunde, and Shephard for the estimation of vast covariance matrix and compare their merits in the portfolio selection. We establish the concentration inequalities of the estimates, which guarantee desirable properties of the estimated volatility matrix in vast asset allocation with gross-exposure constraints.

111 citations

Journal ArticleDOI
TL;DR: In this paper, the authors consider microstructure as an arbitrary contamination of the underlying latent securities price, through a Markov kernel Q. They show that, subject to smoothness conditions, the two scales realized volatility is robust to the form of contamination Q.
Abstract: We consider microstructure as an arbitrary contamination of the underlying latent securities price, through a Markov kernel Q. Special cases include additive error, rounding and combinations thereof. Our main result is that, subject to smoothness conditions, the two scales realized volatility is robust to the form of contamination Q. To push the limits of our result, we show what happens for some models that involve rounding (which is not, of course, smooth) and see in this situation how the robustness deteriorates with decreasing smoothness. Our conclusion is that under reasonable smoothness, one does not need to consider too closely how the microstructure is formed, while if severe non-smoothness is suspected, one needs to pay attention to the precise structure and also the use to which the estimator of volatility will be put.
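For reference, the two scales realized volatility whose robustness is studied combines a subsampled-and-averaged realized variance on a slow scale K with a bias correction computed on the fast scale. A minimal sketch (the scale K and the function name are illustrative):

```python
import numpy as np

def tsrv(prices, K=30):
    """Two scales realized volatility from noisy high-frequency prices."""
    y = np.log(prices)
    n = y.size - 1
    rv_all = np.sum(np.diff(y)**2)              # fast scale: dominated by noise
    rv_avg = np.sum((y[K:] - y[:-K])**2) / K    # average of K subsampled RVs
    nbar = (n - K + 1) / K                      # average subsample size
    return rv_avg - (nbar / n) * rv_all         # subtract the noise-induced bias
```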

78 citations


Cited by
Journal ArticleDOI
TL;DR: In this article, realised kernels are used to carry out efficient feasible inference on the ex post variation of underlying equity prices in the presence of simple models of market frictions; the weights can be chosen to achieve the best possible rate of convergence and an asymptotic variance close to that of the maximum likelihood estimator in the parametric version of the problem.

Abstract: This paper shows how to use realised kernels to carry out efficient feasible inference on the ex post variation of underlying equity prices in the presence of simple models of market frictions. The issue is subtle: only estimators with symmetric weights deliver consistent estimators with mixed Gaussian limit theorems. The weights can be chosen to achieve the best possible rate of convergence and to have an asymptotic variance which is close to that of the maximum likelihood estimator in the parametric version of this problem. Realised kernels can also be selected to (i) be analysed using endogenously spaced data such as that in databases on transactions, (ii) allow for market frictions which are endogenous, and (iii) allow for temporally dependent noise. The finite-sample performance of our estimators is studied using simulation, while empirical work illustrates their use in practice.
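In code, a realised kernel is a weighted sum of realised autocovariances of the high-frequency returns; the paper's point is that the weight function and bandwidth can be chosen to attain the best possible rate and a near-efficient variance. A minimal sketch using the Parzen weight function (the bandwidth H is a placeholder; in practice it would be chosen from the data as the paper describes):

```python
import numpy as np

def parzen(x):
    """Parzen weight function k(x) on [0, 1]."""
    if x <= 0.5:
        return 1 - 6 * x**2 + 6 * x**3
    return 2 * (1 - x)**3

def realised_kernel(prices, H=20):
    """Realised-kernel estimate: lag-0 realised variance plus
    symmetrically weighted realised autocovariances up to lag H."""
    r = np.diff(np.log(prices))                  # high-frequency returns
    rk = np.sum(r * r)                           # gamma_0: realised variance
    for h in range(1, H + 1):
        gamma_h = np.sum(r[h:] * r[:-h])         # h-th realised autocovariance
        rk += 2 * parzen(h / (H + 1)) * gamma_h  # symmetric weights
    return rk
```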

1,269 citations

Posted Content
TL;DR: In this paper, the authors test parametric models by comparing their implied parametric density to the same density estimated nonparametrically, and do not replace the continuous-time model by discrete approximations, even though the data are recorded at discrete intervals.
Abstract: Different continuous-time models for interest rates coexist in the literature. We test parametric models by comparing their implied parametric density to the same density estimated nonparametrically. We do not replace the continuous-time model by discrete approximations, even though the data are recorded at discrete intervals. The principal source of rejection of existing models is the strong nonlinearity of the drift. Around its mean, where the drift is essentially zero, the spot rate behaves like a random walk. The drift then mean-reverts strongly when far away from the mean. The volatility is higher when away from the mean.
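The comparison the abstract describes can be sketched as follows: form a nonparametric (kernel) estimate of the spot rate's marginal density and measure its distance to the density implied by the parametric model. The sketch simplifies the actual test statistic; `model_pdf` is assumed to be a vectorised callable giving the model-implied density.

```python
import numpy as np

def density_distance(rates, model_pdf, bandwidth):
    """Integrated squared difference between a Gaussian kernel density
    estimate of the spot rate's marginal density and a model-implied
    density; large values suggest rejecting the parametric model."""
    grid = np.linspace(rates.min(), rates.max(), 400)
    # Gaussian kernel density estimate on the grid
    u = (grid[:, None] - rates[None, :]) / bandwidth
    kde = np.exp(-0.5 * u**2).sum(axis=1) / (rates.size * bandwidth * np.sqrt(2 * np.pi))
    diff = kde - model_pdf(grid)
    return np.trapz(diff**2, grid)
```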

830 citations

Journal ArticleDOI
TL;DR: In this article, the authors compare realised-kernel estimates based on trade and quote data for the same stock and find a remarkable level of agreement; they also identify local trends in the data, possibly due to non-trivial liquidity effects, that are challenging for realised kernels.

Abstract: Realised kernels use high frequency data to estimate the daily volatility of individual stock prices. They can be applied to either trade or quote data. Here we provide the details of how we suggest implementing them in practice. We compare the estimates based on trade and quote data for the same stock and find a remarkable level of agreement. We also identify some features of the high frequency data which are challenging for realised kernels: local trends in the data, over periods of around 10 minutes, in which the prices and quotes are driven up or down. These can be associated with high volumes. One explanation is that they are due to non-trivial liquidity effects.

543 citations

Journal ArticleDOI
TL;DR: This paper shows that future volatility is more strongly related to the volatility of past negative returns than to that of positive returns, and that the impact of a price jump on volatility depends on the sign of the jump, with negative (positive) jumps leading to higher (lower) future volatility.

Abstract: Using estimators of the variation of positive and negative returns (realized semivariances) and high-frequency data for the S&P 500 Index and 105 individual stocks, this paper sheds new light on the predictability of equity price volatility. We show that future volatility is more strongly related to the volatility of past negative returns than to that of positive returns and that the impact of a price jump on volatility depends on the sign of the jump, with negative (positive) jumps leading to higher (lower) future volatility. We show that models exploiting these findings lead to significantly better out-of-sample forecast performance.
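The paper's building blocks, realized semivariances, split the realized variance of intraday returns by the sign of the return; a minimal sketch:

```python
import numpy as np

def realized_semivariances(prices):
    """Realized semivariances: RS- aggregates squared negative intraday
    returns, RS+ squared positive ones; per the abstract, RS- is the
    stronger predictor of future volatility."""
    r = np.diff(np.log(prices))
    rs_neg = np.sum(r[r < 0]**2)   # downside semivariance
    rs_pos = np.sum(r[r > 0]**2)   # upside semivariance
    return rs_neg, rs_pos
```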

533 citations
