
Showing papers in "Journal of Business & Economic Statistics in 2002"


Journal ArticleDOI
TL;DR: In this article, a new class of multivariate models called dynamic conditional correlation models is proposed; these have the flexibility of univariate generalized autoregressive conditional heteroskedasticity (GARCH) models coupled with parsimonious parametric models for the correlations.
Abstract: Time varying correlations are often estimated with multivariate generalized autoregressive conditional heteroskedasticity (GARCH) models that are linear in squares and cross products of the data. A new class of multivariate models called dynamic conditional correlation models is proposed. These have the flexibility of univariate GARCH models coupled with parsimonious parametric models for the correlations. They are not linear but can often be estimated very simply with univariate or two-step methods based on the likelihood function. It is shown that they perform well in a variety of situations and provide sensible empirical results.

5,695 citations
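To make the recursion concrete, here is a minimal numerical sketch of the DCC correlation updating step, assuming the residuals have already been standardized by fitted univariate GARCH volatilities; the parameter values and all names are illustrative, not the authors' code.

```python
import numpy as np

def dcc_correlations(eps, a=0.05, b=0.93):
    """Dynamic conditional correlation recursion (sketch).

    eps : (T, k) residuals standardized by their own univariate
          GARCH conditional standard deviations.
    a, b: illustrative DCC parameters with a + b < 1.
    """
    T, k = eps.shape
    Q_bar = eps.T @ eps / T              # unconditional correlation target
    Q = Q_bar.copy()
    R = np.empty((T, k, k))
    for t in range(T):
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)        # rescale Q_t to a correlation matrix
        e = eps[t]
        Q = (1 - a - b) * Q_bar + a * np.outer(e, e) + b * Q
    return R
```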


Journal ArticleDOI
TL;DR: Weak instruments arise when the instruments in linear instrumental variables (IV) regression are weakly correlated with the included endogenous variables; in generalized method of moments (GMM), weak instruments correspond to weak identification of some or all of the unknown parameters.
Abstract: Weak instruments arise when the instruments in linear instrumental variables (IV) regression are weakly correlated with the included endogenous variables. In generalized method of moments (GMM), more generally, weak instruments correspond to weak identification of some or all of the unknown parameters. Weak identification leads to GMM statistics with nonnormal distributions, even in large samples, so that conventional IV or GMM inferences are misleading. Fortunately, various procedures are now available for detecting and handling weak instruments in the linear IV model and, to a lesser degree, in nonlinear GMM.

3,038 citations
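As a rough companion to the detection issue, the sketch below computes the usual first-stage F statistic for instrument strength in the single-endogenous-regressor case; comparing it against a threshold around 10 is the familiar rule of thumb, a simplification of the diagnostics this literature develops.

```python
import numpy as np

def first_stage_F(x_endog, Z, X_exog=None):
    """First-stage F statistic for instrument strength (sketch).

    Regress the endogenous regressor x_endog on the instruments Z
    (plus a constant and any exogenous controls) and test the joint
    significance of the instruments.
    """
    n = len(x_endog)
    ones = np.ones((n, 1))
    X0 = ones if X_exog is None else np.column_stack([ones, X_exog])
    X1 = np.column_stack([X0, Z])

    def rss(X):
        beta = np.linalg.lstsq(X, x_endog, rcond=None)[0]
        return np.sum((x_endog - X @ beta) ** 2)

    rss0, rss1 = rss(X0), rss(X1)        # without / with the instruments
    q = Z.shape[1]                       # number of excluded instruments
    return ((rss0 - rss1) / q) / (rss1 / (n - X1.shape[1]))
```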


Journal ArticleDOI
TL;DR: This paper forecasts a macroeconomic time series variable using a large number of predictors, which are summarized by a small number of indexes constructed via principal component analysis (PCA).
Abstract: This article studies forecasting a macroeconomic time series variable using a large number of predictors. The predictors are summarized using a small number of indexes constructed by principal component analysis. An approximate dynamic factor model serves as the statistical framework for the estimation of the indexes and construction of the forecasts. The method is used to construct 6-, 12-, and 24-month-ahead forecasts for eight monthly U.S. macroeconomic time series using 215 predictors in simulated real time from 1970 through 1998. During this sample period these new forecasts outperformed univariate autoregressions, small vector autoregressions, and leading indicator models.

2,686 citations
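A compact sketch of the diffusion-index idea under simplifying assumptions (balanced panel, factors from a plain SVD, a single OLS forecasting regression); function and variable names are illustrative.

```python
import numpy as np

def diffusion_index_forecast(X, y, h=12, n_factors=3):
    """Diffusion-index forecast (sketch).

    X : (T, N) panel of predictors; y : (T,) target series.
    Factors are estimated by principal components of the
    standardized predictors; y_{t+h} is then regressed on F_t.
    """
    Xs = (X - X.mean(0)) / X.std(0)              # standardize predictors
    U, S, _ = np.linalg.svd(Xs, full_matrices=False)
    F = U[:, :n_factors] * S[:n_factors]         # principal component factors
    W = np.column_stack([np.ones(len(y)), F])
    beta = np.linalg.lstsq(W[:-h], y[h:], rcond=None)[0]  # y_{t+h} on F_t
    return W[-1] @ beta                          # forecast of y_{T-1+h}
```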


Journal ArticleDOI
TL;DR: In this article, a multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) model with time-varying correlations is proposed, which adopts the vech representation based on the conditional variances and the conditional correlations.
Abstract: In this article we propose a new multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) model with time-varying correlations. We adopt the vech representation based on the conditional variances and the conditional correlations. Whereas each conditional-variance term is assumed to follow a univariate GARCH formulation, the conditional-correlation matrix is postulated to follow an autoregressive moving average type of analog. Our new model retains the intuition and interpretation of the univariate GARCH model and yet satisfies the positive-definite condition as found in the constant-correlation and Baba–Engle–Kraft–Kroner models. We report some Monte Carlo results on the finite-sample distributions of the maximum likelihood estimate of the varying-correlation MGARCH model. The new model is applied to some real data sets.

1,087 citations


Journal ArticleDOI
TL;DR: The authors examined the performance of regime-switching models for interest rate data from the United States, Germany, and the United Kingdom, and found that the regimes in interest rates correspond reasonably well with business cycles.
Abstract: We examine the econometric performance of regime-switching models for interest rate data from the United States, Germany, and the United Kingdom. Regime-switching models forecast better out-of-sample than single-regime models, including an affine multifactor model, but do not always match moments very well. Regime-switching models incorporating international short-rate and term spread information forecast better, match sample moments better, and classify regimes better than univariate regime-switching models. Finally, the regimes in interest rates correspond reasonably well with business cycles, at least in the United States.

688 citations


Journal ArticleDOI
TL;DR: In this paper, a Cox-Ingersoll-Ross model with parameters calibrated to match monthly observations of the U.S. short-term interest rate is used as a test case.
Abstract: Stochastic differential equations often provide a convenient way to describe the dynamics of economic and financial data, and a great deal of effort has been expended searching for efficient ways to estimate models based on them. Maximum likelihood is typically the estimator of choice; however, since the transition density is generally unknown, one is forced to approximate it. The simulation-based approach suggested by Pedersen (1995) has great theoretical appeal, but previously available implementations have been computationally costly. We examine a variety of numerical techniques designed to improve the performance of this approach. Synthetic data generated by a Cox-Ingersoll-Ross model with parameters calibrated to match monthly observations of the U.S. short-term interest rate are used as a test case. Since the likelihood function of this process is known, the quality of the approximations can be easily evaluated. On datasets with 1,000 observations, we are able to approximate the maximum likelihood e...

439 citations
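A heavily simplified sketch of the Pedersen-style simulated transition density for the CIR test case; the Euler subdivision and Monte Carlo size are placeholders, and the article's contribution is precisely the numerical refinements this bare version omits.

```python
import numpy as np

def simulated_transition_density(x0, x1, dt, kappa, theta, sigma,
                                 n_sub=8, n_sim=256, seed=0):
    """Pedersen-type simulated transition density for the CIR model
    dr = kappa*(theta - r) dt + sigma*sqrt(r) dW (sketch).

    Euler-simulate over the first n_sub - 1 subintervals, then
    average the one-step Gaussian (Euler) density of x1 given each
    simulated endpoint.
    """
    rng = np.random.default_rng(seed)
    h = dt / n_sub
    x = np.full(n_sim, float(x0))
    for _ in range(n_sub - 1):
        x = x + kappa * (theta - x) * h \
            + sigma * np.sqrt(np.maximum(x, 0.0) * h) * rng.standard_normal(n_sim)
        x = np.maximum(x, 1e-8)                  # keep the rate positive
    mean = x + kappa * (theta - x) * h
    var = sigma ** 2 * x * h
    dens = np.exp(-(x1 - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return dens.mean()                           # Monte Carlo average
```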


Journal ArticleDOI
TL;DR: In this paper, a new conditional jump model is developed to study jump dynamics in stock market returns; the conditional jump dynamics contribute to good in-sample and out-of-sample fits to stock market volatility and capture the rally often observed in equity markets following a significant downturn.
Abstract: This article develops a new conditional jump model to study jump dynamics in stock market returns. We propose a simple filter to infer ex post the distribution of jumps. This permits construction of the shock affecting the time t conditional jump intensity and is the main input into an autoregressive conditional jump intensity model. The model allows the conditional jump intensity to be time-varying and follows an approximate autoregressive moving average (ARMA) form. The time series characteristics of 72 years of daily stock returns are analyzed using the jump model coupled with a generalized autoregressive conditional heteroscedasticity (GARCH) specification of volatility. We find significant time variation in the conditional jump intensity and evidence of time variation in the jump size distribution. The conditional jump dynamics contribute to good in-sample and out-of-sample fits to stock market volatility and capture the rally often observed in equity markets following a significant downturn.

341 citations
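The sketch below illustrates the flavor of a conditional jump intensity filter: Bayes' rule over the number of jumps yields an intensity residual that feeds an ARMA-like intensity recursion. The Gaussian jump sizes and the constant diffusion volatility are simplifications for illustration, not the authors' exact specification.

```python
import numpy as np
from scipy.stats import norm, poisson

def jump_intensity_filter(returns, omega, rho, gamma,
                          mu_j, sig_j, sig, max_jumps=10):
    """Conditional jump intensity filter (sketch).

    lam_{t+1} = omega + rho*lam_t + gamma*xi_t, where the intensity
    residual xi_t = E[n_t | data up to t] - lam_t comes from Bayes'
    rule over the number of jumps n_t. Diffusion shocks are N(0, sig^2)
    and jump sizes N(mu_j, sig_j^2); both are simplifications.
    """
    lam = omega / (1.0 - rho)                    # start at the mean intensity
    n = np.arange(max_jumps + 1)
    lams = []
    for r in returns:
        prior = poisson.pmf(n, lam)
        like = norm.pdf(r, loc=n * mu_j,
                        scale=np.sqrt(sig ** 2 + n * sig_j ** 2))
        post = prior * like
        post /= post.sum()                       # ex post jump distribution
        xi = post @ n - lam                      # intensity residual
        lams.append(lam)
        lam = omega + rho * lam + gamma * xi
    return np.array(lams)
```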


Journal ArticleDOI
TL;DR: In this paper, a semiparametric smooth coefficient model is proposed for estimating the production function of the nonmetal mineral industry in China, where the intermediate production and management expense has played a vital role and is an unbalanced determinant of the labor and capital elasticities of output in production.
Abstract: In this article, we propose a semiparametric smooth coefficient model as a useful yet flexible specification for studying a general regression relationship with varying coefficients. The article proposes a local least squares method with a kernel weight function to estimate the smooth coefficient function. The consistency of the estimator and its asymptotic normality are established. A simple statistic for testing a parametric model versus the semiparametric smooth coefficient model is proposed. An empirical application of the proposed method is presented with an estimation of the production function of the nonmetal mineral industry in China. The empirical findings show that the intermediate production and management expense has played a vital role and is an unbalanced determinant of the labor and capital elasticities of output in production.

291 citations
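A minimal sketch of the local least squares step with a kernel weight function; the Gaussian kernel and bandwidth handling are illustrative choices.

```python
import numpy as np

def smooth_coef(z0, z, X, y, h):
    """Local least squares estimate of beta(z0) in the model
    y_i = X_i' beta(z_i) + e_i (sketch).

    Gaussian kernel weights shrink the influence of observations
    whose z_i is far from the evaluation point z0; h is the bandwidth.
    """
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)   # kernel weights
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)
```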


Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the empirical success of a variety of financial market instruments in predicting the future path of monetary policy and find that federal funds futures dominate all the other securities in forecasting monetary policy at horizons out to six months.
Abstract: A number of recent articles have used different financial market instruments to measure near-term expectations of the federal funds rate and the high-frequency changes in these instruments around Federal Open Market Committee announcements to measure monetary policy shocks. This article evaluates the empirical success of a variety of financial market instruments in predicting the future path of monetary policy. All of the instruments we consider provide forecasts that are clearly superior to those of standard time series models at all of the horizons considered. Among financial market instruments, we find that federal funds futures dominate all the other securities in forecasting monetary policy at horizons out to six months. For longer horizons, the predictive power of many of the instruments we consider is very similar. In addition, we present evidence that monetary policy shocks computed using the current-month federal funds futures contract are influenced by changes in the timing of policy actions tha...

281 citations


Journal ArticleDOI
TL;DR: This article developed a new bias-corrected estimator for the fixed-effects dynamic panel data model and derived its limiting distribution for finite number of time periods and large number of cross-section units.
Abstract: This study develops a new bias-corrected estimator for the fixed-effects dynamic panel data model and derives its limiting distribution for finite number of time periods, T, and large number of cross-section units, N. The bias-corrected estimator is derived as a bias correction of the least squares dummy variable (within) estimator. It does not share some of the drawbacks of recently developed instrumental variables and generalized method-of-moments estimators and is relatively easy to compute. Monte Carlo experiments provide evidence that the bias-corrected estimator performs well even in small samples. The proposed technique is applied in an empirical analysis of unemployment dynamics at the U.S. state level for the 1991–2000 period.

183 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose extensions of the continuous record asymptotic analysis for rolling sample variance estimators developed for estimating the quadratic variation of asset returns, referred to as integrated or realized volatility.
Abstract: We propose extensions of the continuous record asymptotic analysis for rolling sample variance estimators developed for estimating the quadratic variation of asset returns, referred to as integrated or realized volatility. We treat integrated volatility as a continuous time stochastic process sampled at high frequencies and suggest rolling sample estimators which share many features with spot volatility estimators. We discuss asymptotically efficient window lengths and weighting schemes for estimators of the quadratic variation and establish links between various spot and integrated volatility estimators. Theoretical results are complemented with extensive Monte Carlo simulations and an empirical investigation.
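A toy version of a rolling sample estimator of integrated variance over high-frequency returns; the flat default weights stand in for the asymptotically efficient weighting schemes the article derives.

```python
import numpy as np

def rolling_realized_variance(returns, window, weights=None):
    """Rolling sample estimator of integrated variance (sketch).

    returns : high-frequency returns; window : number of returns in
    the rolling window; weights : optional weights across the window
    (flat by default), standing in for the weighting schemes whose
    efficiency the article analyzes.
    """
    w = np.ones(window) if weights is None else np.asarray(weights)
    r2 = np.asarray(returns) ** 2
    # sliding weighted sum of squared returns
    return np.convolve(r2, w[::-1], mode="valid")
```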

Journal ArticleDOI
TL;DR: In this article, the authors rely on subjective expectations available in the 1995 Survey of Household Income and Wealth, a large random sample representative of Italian households, to understand how individual uncertainty evolves over the life cycle and if attitudes toward risk affect occupational choices and income riskiness.
Abstract: The mean and higher moments of the distribution of future income are crucial determinants of individual choices. These moments are usually estimated in panel data from past income realizations. Inthis article we rely instead on subjective expectations available in the 1995 Survey of Household Income and Wealth, a large random sample representative of Italian households. The survey elicits information on the distribution of future earnings and on the probability of unemployment. Analysis of this distribution helps us understand how individual uncertainty evolves over the life cycle and if attitudes toward risk affect occupational choices and income riskiness.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the class of continuous-time stochastic processes commonly known as affine diffusions and affine jump diffusions (AJD's) and developed an efficient estimation technique based on empirical characteristic functions and a generalized method of moments (GMM) estimation procedure based on exact moment conditions.
Abstract: This article examines the class of continuous-time stochastic processes commonly known as affine diffusions (AD's) and affine jump diffusions (AJD's). By deriving the joint characteristic function, we are able to examine the statistical properties as well as develop an efficient estimation technique based on empirical characteristic functions (ECF's) and a generalized method of moments (GMM) estimation procedure based on exact moment conditions. We demonstrate that our methods are particularly useful when the diffusions involve latent variables. Our approach is illustrated with a detailed examination of a continuous-time stochastic volatility (SV) model, along with an empirical application using S&P 500 index returns.

Journal ArticleDOI
TL;DR: In this paper, the authors empirically compare the Markov-switching and stochastic volatility diffusion models of the short rate, and conclude that volatility depends on the level of the short rate.
Abstract: This article empirically compares the Markov-switching and stochastic volatility diffusion models of the short rate. The evidence supports the Markov-switching diffusion model. Estimates of the elasticity of volatility parameter for single-regime models unanimously indicate an explosive volatility process, whereas the Markov-switching model's estimates are reasonable. It is found that either Markov switching or stochastic volatility, but not both, is needed to adequately fit the data. A robust conclusion is that volatility depends on the level of the short rate. Finally, the Markov-switching model is the best for forecasting. A technical contribution of this article is a presentation of quasi-maximum likelihood estimation techniques for the Markov-switching stochastic-volatility model.

Journal ArticleDOI
TL;DR: This paper provides a brief overview of applications of the generalized method of moments in finance, where models typically imply more moment conditions than parameters, yielding overidentifying restrictions that can be used to test model specifications.
Abstract: We provide a brief overview of applications of generalized method of moments in finance. The models examined in the empirical finance literature, especially in the asset pricing area, often imply moment conditions that can be used in a straightforward way to estimate the model parameters without making strong assumptions regarding the stochastic properties of variables observed by the econometrician. Typically the number of moment conditions available to the econometrician would exceed the number of model parameters. This gives rise to overidentifying restrictions that can be used to test the validity of the model specifications. These advantages have led to the widespread use of the generalized method of moments in the empirical finance literature.
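To illustrate how overidentifying restrictions are used, here is a toy two-step GMM estimation of a linear one-factor stochastic discount factor with Hansen's J statistic; the factor model and weighting scheme are illustrative, not drawn from the article.

```python
import numpy as np
from scipy.optimize import minimize

def gmm_sdf(R, f):
    """Two-step GMM for a linear SDF m_t = 1 - b*f_t with pricing
    moments E[m_t R_t - 1] = 0 (sketch).

    R : (T, N) gross returns; f : (T,) pricing factor. Returns the
    estimate of b and Hansen's J statistic (N - 1 degrees of freedom).
    """
    T, N = R.shape

    def gbar(b):
        return ((1.0 - b * f)[:, None] * R - 1.0).mean(0)  # pricing errors

    b1 = minimize(lambda p: gbar(p[0]) @ gbar(p[0]), x0=[0.0]).x[0]
    g = (1.0 - b1 * f)[:, None] * R - 1.0
    W = np.linalg.inv(g.T @ g / T)               # optimal weight matrix
    b2 = minimize(lambda p: gbar(p[0]) @ W @ gbar(p[0]), x0=[b1]).x[0]
    J = T * gbar(b2) @ W @ gbar(b2)              # overidentification statistic
    return b2, J
```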

Journal ArticleDOI
TL;DR: A novel method of bootstrapping for GMM based on resampling from the empirical likelihood distribution that imposes the moment restrictions is presented, showing that this approach yields a large-sample improvement and is efficient.
Abstract: Generalized method of moments (GMM) has been an important innovation in econometrics. Its usefulness has motivated a search for good inference procedures based on GMM. This article presents a novel method of bootstrapping for GMM based on resampling from the empirical likelihood distribution that imposes the moment restrictions. We show that this approach yields a large-sample improvement and is efficient, and give examples. We also discuss the development of GMM and other recent work on improved inference.
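A bare-bones sketch of the proposed resampling idea: compute empirical likelihood implied probabilities that impose the moment restrictions, then bootstrap from that reweighted distribution. The Nelder-Mead solution of the EL dual is a convenience choice for the sketch.

```python
import numpy as np
from scipy.optimize import minimize

def el_probabilities(g):
    """Empirical likelihood implied probabilities for moment data
    g (n, m), imposing the restriction E[g_i] = 0 (sketch).

    The dual problem minimizes -sum_i log(1 + lam'g_i); then
    p_i = 1 / (n * (1 + lam'g_i)).
    """
    n, m = g.shape

    def neg_dual(lam):
        t = 1.0 + g @ lam
        return np.inf if np.any(t <= 1e-10) else -np.log(t).sum()

    lam = minimize(neg_dual, np.zeros(m), method="Nelder-Mead").x
    p = 1.0 / (n * (1.0 + g @ lam))
    return p / p.sum()

def el_bootstrap_sample(data, g, seed=0):
    """One bootstrap draw from the EL distribution that imposes the
    moment restrictions, in the spirit of the article's proposal."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(data), size=len(data), p=el_probabilities(g))
    return np.asarray(data)[idx]
```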

Posted Content
TL;DR: In this article, the authors propose a procedure to introduce skewness in multivariate symmetric distributions; applied to the multivariate Student density, it leads to a "multivariate skew-Student" density for which each marginal has a different asymmetry coefficient.
Abstract: We propose a practical and flexible solution to introduce skewness in multivariate symmetrical distributions. Applying this procedure to the multivariate Student density leads to a "multivariate skew-Student" density, for which each marginal has a different asymmetry coefficient. Similarly, when applied to the product of independent univariate Student densities, it provides a "multivariate skew density with independent Student components" for which each marginal has a different asymmetry coefficient and number of degrees of freedom. Combined with a multivariate GARCH model, this new family of distributions (that generalizes the work of Fernandez and Steel, 1998) is potentially useful for modelling stock returns, which are known to be conditionally heteroskedastic, fat-tailed, and often skew. In an application to the daily returns of the CAC40, NASDAQ, NIKKEI and the SMI, it is found that this density suits the data well and clearly outperforms its symmetric competitors.
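For intuition, the sketch below implements the univariate Fernandez and Steel skewing device that the article extends to the multivariate Student case; gamma is the asymmetry coefficient.

```python
import numpy as np
from scipy.stats import t as student_t

def skew_student_pdf(z, nu, gamma):
    """Fernandez-Steel skewing of a Student t density (sketch).

    gamma > 1 skews right, gamma < 1 skews left, and gamma = 1
    recovers the symmetric Student t with nu degrees of freedom.
    The article applies this device marginal by marginal.
    """
    z = np.asarray(z, dtype=float)
    scale = 2.0 / (gamma + 1.0 / gamma)
    return scale * np.where(z >= 0,
                            student_t.pdf(z / gamma, nu),
                            student_t.pdf(z * gamma, nu))
```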

Journal ArticleDOI
TL;DR: Generalized method of moments (GMM) estimation has become an important unifying framework for inference in econometrics in the last 20 years as discussed by the authors, and much work has been done on these methods since the seminal article by Hansen.
Abstract: Generalized method of moments (GMM) estimation has become an important unifying framework for inference in econometrics in the last 20 years. It can be thought of as encompassing almost all of the common estimation methods, such as maximum likelihood, ordinary least squares, instrumental variables, and two-stage least squares, and nowadays is an important part of all advanced econometrics textbooks. The GMM approach links nicely to economic theory where orthogonality conditions that can serve as such moment functions often arise from optimizing behavior of agents. Much work has been done on these methods since the seminal article by Hansen, and much remains in progress. This article discusses some of the developments since Hansen's original work. In particular, it focuses on some of the recent work on empirical likelihood–type estimators, which circumvent the need for a first step in which the optimal weight matrix is estimated and have attractive information theoretic interpretations.

Journal ArticleDOI
TL;DR: In this article, the authors consider the contribution to the analysis of economic time series of the generalized method-of-moments estimator introduced by Hansen, conduct a small-scale literature survey, and discuss some ongoing theoretical research.
Abstract: We consider the contribution to the analysis of economic time series of the generalized method-of-moments estimator introduced by Hansen. We outline the theoretical contribution, conduct a small-scale literature survey, and discuss some ongoing theoretical research.

Journal ArticleDOI
TL;DR: This paper surveys J. D. Sargan's work on instrumental variables (IV) estimation and its connections with the generalized method of moments (GMM), presenting the modeling context in which Sargan motivated IV estimation and describing his results for nonlinear-in-parameters IV models.
Abstract: This article surveys J. D. Sargan's work on instrumental variables (IV) estimation and its connections with the generalized method of moments (GMM). First the modeling context in which Sargan motivated IV estimation is presented. Then the theory of IV estimation as developed by Sargan is discussed. His approach to efficiency, his minimax estimator, tests of overidentification and underidentification, and his later work on the finite-sample properties of IV estimators are reviewed. Next, his approach to modeling IV equations with serial correlation is discussed and compared with the GMM approach. Finally, Sargan's results for nonlinear-in-parameters IV models are described.

Journal ArticleDOI
TL;DR: In this article, the authors propose tests for (seasonal) unit roots based on recursive (seasonal) de-meaning and de-trending of a univariate time-series process, and derive the limiting distributions of the proposed statistics under the (seasonal) unit root null and under near seasonal integration.
Abstract: This article considers tests for (seasonal) unit roots in a univariate time-series process that are similar with respect to both the initial values of the process and the possibility of (differential seasonal) drift under the (seasonal) unit root null. In contrast to existing approaches, the technique of recursive (seasonal) de-meaning and (seasonal) de-trending of the process is adopted. Representations are derived for the limiting distributions of the proposed statistics under the (seasonal) unit root null and under near (seasonal) integration. In the nonseasonal case the asymptotic local power of the proposed test is shown to exceed that of existing tests when the initial observation is drawn from the stationary distribution of the process. The proposed tests also display superior finite sample size and power properties to conventional seasonal unit root tests and variants of such tests constructed using simple symmetric least squares and weighted symmetric least squares estimation.
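The core device is easy to state in code: recursive de-meaning subtracts from each observation the mean of only the data observed up to that date (the seasonal variant de-means season by season). A minimal nonseasonal sketch:

```python
import numpy as np

def recursive_demean(y):
    """Recursive de-meaning (sketch): subtract from each observation
    the mean of the data observed up to that date only, which makes
    the resulting test statistics similar with respect to the
    initial value of the process."""
    y = np.asarray(y, dtype=float)
    t = np.arange(1, len(y) + 1)
    return y - np.cumsum(y) / t
```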

Journal ArticleDOI
TL;DR: In this paper, the authors apply a new methodology that recognizes the cumulative proportional nature of the Lorenz curve data by assuming that the income proportions are distributed as a Dirichlet distribution.
Abstract: The Lorenz curve relates the cumulative proportion of income to the cumulative proportion of population. When a particular functional form of the Lorenz curve is specified, it is typically estimated by linear or nonlinear least squares estimation techniques that have good properties when the error terms are independently and normally distributed. Observations on cumulative proportions are clearly neither independent nor normally distributed. This article proposes and applies a new methodology that recognizes the cumulative proportional nature of the Lorenz curve data by assuming that the income proportions are distributed as a Dirichlet distribution. Five Lorenz curve specifications are used to demonstrate the technique. Maximum likelihood estimates under the Dirichlet distribution assumption provide better fitting Lorenz curves than nonlinear least squares and another estimation technique that has appeared in the literature.
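A sketch of the Dirichlet likelihood idea, using a deliberately simple one-parameter Lorenz curve L(p) = p^theta as a stand-in for the five specifications the article considers; lam is an assumed Dirichlet precision parameter.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def lorenz(p, theta):
    """Deliberately simple one-parameter Lorenz curve L(p) = p**theta."""
    return p ** theta

def neg_loglik(params, p, shares):
    """Negative Dirichlet log-likelihood for income shares (sketch).

    p      : interior cumulative population proportions.
    shares : income share increments (length len(p) + 1, summing to 1).
    The increment between p_{i-1} and p_i has Dirichlet parameter
    alpha_i = lam * (L(p_i) - L(p_{i-1})).
    """
    theta, lam = params
    if theta <= 1.0 or lam <= 0.0:
        return np.inf                      # keep the Lorenz curve convex
    q = np.concatenate([[0.0], np.asarray(p), [1.0]])
    alpha = lam * np.diff(lorenz(q, theta))
    return -(gammaln(alpha.sum()) - gammaln(alpha).sum()
             + ((alpha - 1.0) * np.log(shares)).sum())

# usage: minimize(neg_loglik, x0=[1.5, 10.0], args=(p, shares),
#                 method="Nelder-Mead")
```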

Journal ArticleDOI
TL;DR: In this paper, a stochastic volatility model of exchange rates is proposed that links both the level of volatility and its instantaneous covariance with returns to pathwise properties of the currency.
Abstract: This article tests a stochastic volatility model of exchange rates that links both the level of volatility and its instantaneous covariance with returns to pathwise properties of the currency. In particular, the model implies that the return–volatility covariance behaves like a weighted average of recent returns and hence switches signs according to the direction of trends in the data. This implies that the skewness of the finite-horizon return distribution likewise switches sign, leading to time-varying implied volatility “smiles” in options prices. The model is fit and assessed using Bayesian techniques. Some previously reported volatility results are accounted for by the fitted models. The predicted pattern of skewness dynamics accords well with that found in historical options prices.

Journal ArticleDOI
TL;DR: In this paper, the authors define an auxiliary model and find the value of the parameters that minimizes a criterion based either on the pseudoscore (efficient method of moments) or on the difference between the pseudotrue value and the quasi-maximum likelihood estimator (indirect inference).
Abstract: The method of moments is based on a relation $E_{\theta_0}[h(X_t, \theta)] = 0$, from which an estimator of $\theta$ is deduced. In many econometric models, the moment restrictions cannot be evaluated numerically due to, for instance, the presence of a latent variable. Monte Carlo simulation methods make it possible to evaluate the generalized method of moments (GMM) criterion. This is the basis for the simulated method of moments. Another approach involves defining an auxiliary model and finding the value of the parameters that minimizes a criterion based either on the pseudoscore (efficient method of moments) or on the difference between the pseudotrue value and the quasi-maximum likelihood estimator (indirect inference). If the auxiliary model is sufficiently rich to encompass the true model, then these two methods deliver an estimator that is asymptotically as efficient as the maximum likelihood estimator.
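A compact sketch of indirect inference for a scalar parameter under these definitions; simulate and fit_aux are user-supplied placeholders, and the quadratic distance with common random numbers is the simplest version of the criterion described above.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def indirect_inference(y, simulate, fit_aux, n_sim=10, seed=0):
    """Indirect inference for a scalar structural parameter (sketch).

    simulate(theta, T, rng) must draw a path of length T from the
    structural model; fit_aux(x) returns the auxiliary-model estimate
    (for instance, an AR(1) coefficient). The estimator picks theta so
    that the auxiliary estimate on simulated data matches the one
    computed on the observed data.
    """
    beta_data = fit_aux(y)

    def distance(theta):
        # common random numbers across theta keep the criterion smooth
        rng = np.random.default_rng(seed)
        sims = [fit_aux(simulate(theta, len(y), rng)) for _ in range(n_sim)]
        return (np.mean(sims) - beta_data) ** 2

    # illustrative bounds for a stationarity-type restriction on theta
    return minimize_scalar(distance, bounds=(-0.99, 0.99),
                           method="bounded").x
```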

Journal ArticleDOI
TL;DR: In this article, the authors established the asymptotic distributions of generalized method of moments estimators when the true parameter lies on the boundary of the parameter space, and discussed three examples: instrumental variables (IV) estimation of a regression model with nonlinear equality and/or inequality restrictions on the parameters.
Abstract: This article establishes the asymptotic distributions of generalized method of moments (GMM) estimators when the true parameter lies on the boundary of the parameter space. The conditions allow the estimator objective function to be nonsmooth and to depend on preliminary estimators. The boundary of the parameter space may be curved and/or kinked. The article discusses three examples: (1) instrumental variables (IV) estimation of a regression model with nonlinear equality and/or inequality restrictions on the parameters; (2) method of simulated moments estimation of a multinomial discrete response model with some random coefficient variances equal to 0, some random effect variances equal to 0, or some measurement error variances equal to 0; and (3) semiparametric least squares estimation of a partially linear regression model with nonlinear equality and/or inequality restrictions on the parameters.

Journal ArticleDOI
TL;DR: The authors develop an alternative approach that exploits the fact that the ratio of nondurables and services consumption to total consumption outlays has historically been stable in nominal terms, and demonstrate that the choice of deflation methodology has important implications for wealth effect estimation and tests of the permanent income hypothesis.
Abstract: Many studies relate real nondurables and services consumption to real income and wealth, with the latter measures obtained by deflating with a price index for total consumption expenditures. This procedure is appropriate only if real nondurables and services consumption is a constant multiple of aggregate real consumption outlays, which is not the case in U.S. data. We develop an alternative approach that exploits the fact that the ratio of these series has historically been stable in nominal terms, and demonstrate that the choice of deflation methodology has important implications for wealth effect estimation and tests of the permanent income hypothesis.

Journal ArticleDOI
TL;DR: This paper is the editors' introduction to the Twentieth Anniversary Commemorative Issue of the Journal of Business & Economic Statistics (JBES).
Abstract: (2002). Editors' Introduction to Twentieth Anniversary Commemorative Issue of the Journal of Business and Economic Statistics. Journal of Business & Economic Statistics, Vol. 20, No. 1, pp. 1–4.

Journal ArticleDOI
TL;DR: In this article, a threshold autoregressive (TAR) process with the threshold effect only in the intercept term is considered, and a specification test for testing their stability is derived based on the idea that if (near) integratedness is really caused by level shifts, the series purged of these shifts should be stable so that known stationarity tests can be applied to this series.
Abstract: In some cases the unit root or near unit root behavior of linear autoregressive models fitted to economic time series is not in accordance with the underlying economic theory. To accommodate this feature we consider a threshold autoregressive (TAR) process with the threshold effect only in the intercept term. Although these processes are stationary, their realizations switch between different regimes and can therefore closely resemble those of (near) integrated processes for sample sizes relevant in many economic applications. Estimation and inference of these TAR models are discussed, and a specification test for testing their stability is derived. Testing is based on the idea that if (near) integratedness is really caused by level shifts, the series purged of these shifts should be stable so that known stationarity tests can be applied to this series. Simulation results indicate that in certain cases these tests, like several linearity tests, can have low power. The proposed model is applied to interest...
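A short simulation sketch of the process in question, useful for seeing how a stationary TAR model with regime-switching intercepts can mimic a (near) integrated series:

```python
import numpy as np

def simulate_tar_intercept(T, a1, a2, c, rho, sigma=1.0, seed=0):
    """Simulate a TAR process with a threshold effect only in the
    intercept (sketch):

        y_t = a1 + rho*y_{t-1} + e_t   if y_{t-1} <= c
        y_t = a2 + rho*y_{t-1} + e_t   otherwise.

    With rho below but close to one, realizations switch between level
    regimes and can closely resemble a (near) integrated series.
    """
    rng = np.random.default_rng(seed)
    y = np.zeros(T)
    for t in range(1, T):
        a = a1 if y[t - 1] <= c else a2
        y[t] = a + rho * y[t - 1] + sigma * rng.standard_normal()
    return y
```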

Journal ArticleDOI
TL;DR: In this article, the authors apply a generalized version of the friction model to costly reversible investment with fixed costs of investment and show that the investment model with costly reversibility and fixed costs is the best among the models.
Abstract: The analysis presented here applies a generalized version of the friction model to costly reversible investment with fixed costs of investment. The analysis investigates three U.S. industries: the computer and office equipment industry, the automobile industry, and the airline industry. Five different investment models are compared. Because some models are nonnested, the analysis employs Vuong's test of model selection. The analysis shows that the investment model with costly reversibility and fixed costs is the best among the models. In addition, the analysis suggests the existence of convex adjustment costs of investment.

Journal ArticleDOI
Jack Porter
TL;DR: This paper shows that the conditional information matrix estimator attains the semiparametric efficiency bound for the variance estimation problem, and proposes two simulation variance estimators to approximate the integral involved in its computation.
Abstract: When econometric models are estimated by maximum likelihood, the conditional information matrix variance estimator is usually avoided in choosing a method for estimating the variance of the parameter estimate. However, the conditional information matrix estimator attains the semiparametric efficiency bound for the variance estimation problem. Unfortunately, for even moderately complex models, the integral involved in computation of the conditional information matrix estimator is prohibitively difficult to solve. Simulation is suggested to approximate the integral, and two simulation variance estimators are proposed. Monte Carlo results suggest these estimators are attractive in providing accurate confidence interval coverage rates compared to the standard maximum likelihood variance estimators.
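A sketch of the simulation idea for one familiar case, a probit model: for each observation, outcomes are drawn from the fitted model and outer products of the score are averaged to approximate the conditional information matrix. The probit choice and all names are illustrative, not the article's construction.

```python
import numpy as np
from scipy.stats import norm

def simulated_conditional_information(X, beta_hat, n_sim=200, seed=0):
    """Simulation approximation to the conditional information matrix,
    illustrated for a probit model (sketch).

    For each observation, draw n_sim outcomes from the fitted model
    and average outer products of the score to approximate
    E[s s' | x_i] at beta_hat; the sum over observations is the
    middle matrix of the variance estimator.
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    info = np.zeros((k, k))
    for i in range(n):
        xb = X[i] @ beta_hat
        prob = norm.cdf(xb)
        y = rng.random(n_sim) < prob                  # simulated outcomes
        # probit score for obs i: x_i * phi(xb) * (y - p) / (p * (1 - p))
        w = norm.pdf(xb) * (y - prob) / (prob * (1.0 - prob))
        s = w[:, None] * X[i]
        info += s.T @ s / n_sim
    return info
```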