
Showing papers in "Econometric Theory in 1999"


Journal ArticleDOI
TL;DR: In this paper, a generalized least squares (GLS) procedure is proposed as a weighted least squares that can handle a wide range of unequally spaced panel data patterns and provides natural estimates of the serial correlation and variance components parameters.
Abstract: This paper deals with the estimation of unequally spaced panel data regression models with AR(1) remainder disturbances. A feasible generalized least squares (GLS) procedure is proposed as a weighted least squares that can handle a wide range of unequally spaced panel data patterns. This procedure is simple to compute and provides natural estimates of the serial correlation and variance components parameters. The paper also provides a locally best invariant test for zero first-order serial correlation against positive or negative serial correlation in case of unequally spaced panel data.
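As a rough illustration of the covariance structure such a model implies (not the paper's own weighted-least-squares transformation), the sketch below builds, for each individual, the covariance matrix of a random effect plus a stationary AR(1) remainder observed at irregular dates and applies GLS directly; the function name and interface are illustrative, and in the feasible version rho and the variance components would be replaced by first-stage estimates.

```python
import numpy as np

def panel_ar1_gls(y_list, X_list, t_list, rho, sig2_mu, sig2_eps):
    """GLS for an unequally spaced panel with a random individual effect
    and AR(1) remainder disturbances.  y_list[i], X_list[i] hold person i's
    data observed at the (possibly irregular) integer dates t_list[i]."""
    XOX, XOy = 0.0, 0.0
    for y, X, t in zip(y_list, X_list, t_list):
        t = np.asarray(t, dtype=float)
        lags = np.abs(t[:, None] - t[None, :])
        # random effect + stationary AR(1):
        # Cov(u_s, u_r) = sig2_mu + sig2_eps * rho**|s-r| / (1 - rho**2)
        Omega = sig2_mu + sig2_eps * rho**lags / (1.0 - rho**2)
        Oinv = np.linalg.inv(Omega)
        XOX += X.T @ Oinv @ X
        XOy += X.T @ Oinv @ y
    return np.linalg.solve(XOX, XOy)
```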

838 citations


Journal ArticleDOI
TL;DR: In this article, an asymptotic theory for stochastic processes generated from nonlinear transformations of nonstationary integrated time series is developed; when the transformation is exponentially explosive, the convergence rate depends not only on the size of the sample but also on the realized sample path.
Abstract: An asymptotic theory for stochastic processes generated from nonlinear transformations of nonstationary integrated time series is developed. Various nonlinear functions of integrated series such as ARIMA time series are studied, and the asymptotic distributions of sample moments of such functions are obtained and analyzed. The transformations considered in the paper include a variety of functions that are used in practical nonlinear statistical analysis. It is shown that their asymptotic theory is quite different from that of integrated processes and stationary time series. When the transformation function is exponentially explosive, for instance, the convergence rate of sample functions is path dependent. In particular, the convergence rate depends not only on the size of the sample but also on the realized sample path. Some brief applications of these asymptotics are given to illustrate the effects of nonlinearly transformed integrated processes on regression. The methods developed in the paper are useful in a project of greater scope concerned with the development of a general theory of nonlinear regression for nonstationary time series. Nonstationary time series arising from autoregressive models with roots on the unit circle have been an intensive subject of recent research. The asymptotic behavior of regression statistics based on integrated time series (those for which one or more of the autoregressive roots are unity) has received the most attention, and a fairly complete theory is now available for linear time series regressions. The resulting limit theory forms the basis of much ongoing empirical econometric work, especially on the subject of unit root testing and cointegration model
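A small simulation (an illustration of the point, not the paper's formal asymptotics) makes the path dependence visible: for an exponentially explosive transformation, the sample moment is dominated by the neighbourhood of the path's maximum, so its magnitude varies widely across realizations with the same sample size.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
for rep in range(3):
    y = np.cumsum(rng.standard_normal(n))    # an integrated (I(1)) path
    log_sum = np.log(np.sum(np.exp(y)))      # sample moment of exp(y_t)
    # the sum is dominated by observations near max(y), so its size is
    # governed by the realized path, not just by the sample size n
    print(rep, round(log_sum, 1), round(y.max(), 1))
```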

318 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider a scalar I(d) process where the integration order d is any real number and assume that d is known and is greater than or equal to ½.
Abstract: This paper deals with a scalar I(d) process {yj}, where the integration order d is any real number. Under this setting, we first explore asymptotic properties of various statistics associated with {yj}, assuming that d is known and is greater than or equal to ½. Note that {yj} becomes stationary when d < ½. We then consider, under the normality assumption, testing and estimation for d, allowing for any value of d. The tests suggested here are asymptotically uniformly most powerful invariant, whereas the maximum likelihood estimator is asymptotically efficient. The asymptotic theory for these results does not assume normality. Unlike in the usual unit root problem based on autoregressive models, standard asymptotic results hold for test statistics and estimators, where d need not be restricted to d ≥ ½. Simulation experiments are conducted to examine the finite sample performance of both the tests and estimators.
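For concreteness, a (type-II) I(d) series can be simulated from the binomial expansion of (1-L)^{-d}; this sketch is only meant to make the d ≥ ½ versus d < ½ dichotomy in the abstract tangible and is not the paper's testing or estimation procedure.

```python
import numpy as np

def simulate_Id(n, d, rng):
    """Simulate a (type-II) I(d) series y_t = (1-L)^{-d} eps_t using the
    binomial expansion psi_0 = 1, psi_j = psi_{j-1} * (j - 1 + d) / j."""
    eps = rng.standard_normal(n)
    psi = np.empty(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    return np.convolve(psi, eps)[:n]          # y_t = sum_j psi_j eps_{t-j}

rng = np.random.default_rng(0)
y_nonstat = simulate_Id(500, 0.7, rng)        # d >= 1/2: nonstationary
y_stat = simulate_Id(500, 0.3, rng)           # d <  1/2: stationary
```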

237 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide a theoretical framework to study the accuracy of bootstrap P values, which may be based on a parametric or nonparametric bootstrap, and they show that, in many circumstances, the error in rejection probability of a bootstrap test will be one whole order of magnitude smaller than that of the corresponding asymptotic test.
Abstract: We provide a theoretical framework in which to study the accuracy of bootstrap P values, which may be based on a parametric or nonparametric bootstrap. In the parametric case, the accuracy of a bootstrap test will depend on the shape of what we call the critical value function. We show that, in many circumstances, the error in rejection probability of a bootstrap test will be one whole order of magnitude smaller than that of the corresponding asymptotic test. We also propose a simulation method for estimating this error that requires the calculation of only two test statistics per replication.
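Mechanically, the bootstrap P value whose accuracy the paper studies is simple to compute; the sketch below is a generic parametric version in which simulate_null and statistic are user-supplied placeholders, purely for illustration.

```python
import numpy as np

def bootstrap_pvalue(tau_hat, simulate_null, statistic, B=999, seed=None):
    """Parametric bootstrap P value: draw B samples from the model estimated
    under the null, recompute the test statistic on each, and report the
    proportion of bootstrap statistics at least as extreme as tau_hat."""
    rng = np.random.default_rng(seed)
    tau_star = np.array([statistic(simulate_null(rng)) for _ in range(B)])
    return (1 + np.sum(tau_star >= tau_hat)) / (B + 1)
```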

230 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that a leading semiparametric estimate, the Gaussian or local Whittle one, can be consistent and have the same limiting distribution under conditional heteroskedasticity as under the conditional homoskedasticity assumed by Robinson (1995, Annals of Statistics 23, 1630–61).
Abstract: Semiparametric estimates of long memory seem useful in the analysis of long financial time series because they are consistent under much broader conditions than parametric estimates. However, recent large sample theory for semiparametric estimates forbids conditional heteroskedasticity. We show that a leading semiparametric estimate, the Gaussian or local Whittle one, can be consistent and have the same limiting distribution under conditional heteroskedasticity as under the conditional homoskedasticity assumed by Robinson (1995, Annals of Statistics 23, 1630–61). Indeed, noting that long memory has been observed in the squares of financial time series, we allow, under regularity conditions, for conditional heteroskedasticity of the general form introduced by Robinson (1991, Journal of Econometrics 47, 67–84), which may include long memory behavior for the squares, such as the fractional noise and autoregressive fractionally integrated moving average form, and also standard short memory ARCH and GARCH specifications.
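For reference, the Gaussian (local Whittle) estimate referred to in the abstract minimizes Robinson's (1995) objective over the memory parameter d using only the first m Fourier frequencies; a minimal sketch, with the bandwidth m and the search bounds left as illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle(x, m):
    """Gaussian / local Whittle estimate of the long-memory parameter d
    from the first m periodogram ordinates (Robinson, 1995)."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n          # Fourier frequencies
    I = np.abs(np.fft.fft(x - np.mean(x))[1:m + 1])**2 / (2 * np.pi * n)
    def R(d):
        return np.log(np.mean(lam**(2 * d) * I)) - 2 * d * np.mean(np.log(lam))
    return minimize_scalar(R, bounds=(-0.49, 0.99), method="bounded").x
```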

187 citations


Journal ArticleDOI
TL;DR: In this paper, the authors considered cointegrating regressions with time varying coefficients, where coefficients are modeled as smooth functions evolving over time, and they can be estimated nonparametrically, using suitably modified series estimators.
Abstract: This paper considers cointegrating regressions with time varying coefficients. The coefficients are modeled as smooth functions evolving over time. It is shown that they can be estimated nonparametrically, using suitably modified series estimators. An efficient method of estimation is presented, which relies on simple prefiltering of the data and preestimation of the model. A test for the adequacy of the model specification is also developed. Our model and statistical methods are applied to analyze the US automobile demand function.
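A bare-bones version of the series idea (ignoring the prefiltering and efficiency refinements described in the abstract) approximates the time-varying coefficient by a low-order basis in t/T and estimates the basis coefficients by least squares; the function name and the choice of a polynomial basis are illustrative assumptions.

```python
import numpy as np

def tv_coef_series(y, x, K):
    """Series estimate of a smooth time-varying coefficient in
    y_t = beta(t/T) * x_t + u_t, using a polynomial basis in t/T."""
    T = len(y)
    tau = np.arange(1, T + 1) / T
    Z = np.column_stack([x * tau**k for k in range(K + 1)])   # x_t * basis terms
    c, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return sum(c[k] * tau**k for k in range(K + 1))           # fitted beta(t/T)
```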

161 citations


Journal ArticleDOI
TL;DR: In this article, a necessary and sufficient condition for the existence of the unconditional fourth moment of the GARCH(p,q) process is given and also an expression for the moment itself.
Abstract: In this paper, a necessary and sufficient condition for the existence of the unconditional fourth moment of the GARCH(p,q) process is given and also an expression for the moment itself. Furthermore ...
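The classical special case may help fix ideas: for a GARCH(1,1) with standard normal innovations, the unconditional fourth moment exists if and only if (alpha + beta)^2 + 2*alpha^2 < 1 (Bollerslev, 1986), with kurtosis 3(1 - (alpha + beta)^2)/(1 - (alpha + beta)^2 - 2*alpha^2); the paper's contribution is the necessary and sufficient condition and the moment expression for the general GARCH(p,q) case. A quick check of the special case:

```python
def garch11_kurtosis(alpha, beta):
    """Fourth-moment condition and implied kurtosis for a GARCH(1,1) with
    standard normal innovations (the classical special case, not the
    paper's general GARCH(p,q) result)."""
    s = (alpha + beta)**2 + 2 * alpha**2
    if s >= 1:
        return None                          # unconditional 4th moment does not exist
    return 3 * (1 - (alpha + beta)**2) / (1 - s)

print(garch11_kurtosis(0.10, 0.85))          # ~3.77: fourth moment exists
print(garch11_kurtosis(0.25, 0.74))          # None: covariance stationary, no 4th moment
```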

160 citations


Journal ArticleDOI
TL;DR: In this article, the authors considered vector valued autoregressive models with fractionally integrated errors and derived quadratic approximation to the log-likelihood ratios in the vicinity of auxiliary estimators of the parameters.
Abstract: Vector valued autoregressive models with fractionally integrated errors are considered. The possibility of the coefficient matrix of the model having eigenvalues with absolute values equal or close to unity is included. Quadratic approximation to the log-likelihood ratios in the vicinity of auxiliary estimators of the parameters is obtained and used to make a rough identification of the approximate unit eigenvalues, including complex ones, together with their multiplicities. Using the identification thus obtained, the stationary linear combinations (cointegrating relationships) and the trends that induce the nonstationarity are identified, and Wald-type inference procedures for the parameters associated with them are constructed. As in the situation in which the errors are independent and identically distributed (i.i.d.), the limiting behaviors are nonstandard in the sense that they are neither normal nor

73 citations


Journal ArticleDOI
TL;DR: In this paper, a method is presented to estimate correlated discrete random variables with known univariate distribution functions up to some parameters, and an empirical illustration on Dutch recreational data is presented.
Abstract: In this paper a method is presented to estimate correlated discrete random variables with known univariate distribution functions up to some parameters. We also present an empirical illustration on Dutch recreational data.

67 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide a general framework for deriving the local power properties of the likelihood ratio (LR) tests for the cointegrating rank of a vector autoregressive process under different assumptions regarding deterministic terms.
Abstract: Likelihood ratio (LR) tests for the cointegrating rank of a vector autoregressive process have been developed under different assumptions regarding deterministic terms. For instance, nonzero mean terms and linear trends have been accounted for in some of the tests. In this paper we provide a general framework for deriving the local power properties of these tests. Thereby it is possible to assess the virtue of utilizing varying amounts of prior information by making assumptions regarding the deterministic terms. One interesting result from this analysis is that if no assumptions regarding the specific form of the mean term are made whereas a linear trend is excluded then a test is available that has the same local power as an LR test derived under a zero mean assumption.

60 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed new estimators whose pivotal statistics have the standard normal limiting distribution for all ranges of the autoregressive parameters, and the proposed estimators are approximately median unbiased.
Abstract: For autoregressive processes, we propose new estimators whose pivotal statistics have the standard normal limiting distribution for all ranges of the autoregressive parameters. The proposed estimators are approximately median unbiased. For seasonal time series, the new estimators give us unit root tests that have limiting normal distribution regardless of period of the seasonality. Using the estimators, confidence intervals of the autoregressive parameters are constructed. A Monte-Carlo simulation for first-order autoregressions shows that the proposed tests for unit roots are locally more powerful than the tests based on the ordinary least squares estimators. It also shows that the proposed confidence intervals have shorter average lengths than those of Andrews (1993, Econometrica 61, 139–165) based on the ordinary least squares estimators when the autoregressive coefficient is close to one.

Journal ArticleDOI
TL;DR: In this paper, the authors consider a class of multivariate processes that, when differenced enough, yields covariance stationary processes whose determinant of the matrix series associated with their Wold representation has various unit roots with various orders of multiplicity.
Abstract: Following the approach proposed by Gregoir and Laroque (1993, Econometric Theory 9, 329–342), we consider a class of multivariate processes that, when differenced enough, yield covariance stationary processes whose determinant of the matrix series associated with their Wold representation has various unit roots with various orders of multiplicity, which we restrict to be integers. A representation theorem is provided that involves different polynomial error correction terms at each frequency associated with each unit root. An identification criterion for each set of error correction terms is proposed.

Journal ArticleDOI
TL;DR: In this paper, the authors extend the statistical results obtained by Gregoir and Laroque (1994, Journal of Econometrics 63, 183–214) to analyze multivariate time series that can be represented under an autoregressive equation of finite order with various polynomial error correction terms at various frequencies with possibly a non-null deterministic part as introduced by Gregoir (1999, Econometric Theory 15, 435–468).
Abstract: This paper extends the statistical results obtained by Gregoir and Laroque (1994, Journal of Econometrics 63, 183–214). It develops statistical tools to analyze multivariate time series that can be represented under an autoregressive equation of finite order with various polynomial error correction terms at various frequencies with possibly a non-null deterministic part as introduced by Gregoir (1999, Econometric Theory 15, 435–468). We propose an estimation procedure that proceeds through repeated applications of principal component analysis and a specification test for the omission of a polynomial relation of cointegration at each frequency.

Journal ArticleDOI
TL;DR: In this paper, the authors considered adaptive maximum likelihood estimators of unit roots in autoregressive processes with possibly non-Gaussian innovations and constructed unit root tests based on the adaptive estimators.
Abstract: Adaptive maximum likelihood estimators of unit roots in autoregressive processes with possibly non-Gaussian innovations are considered. Unit root tests based on the adaptive estimators are constructed. Limiting distributions of the test statistics are derived, which are linear combinations of two functionals of Brownian motions. A Monte Carlo simulation reveals that the proposed tests have improved powers over the classical Dickey–Fuller tests when the distribution of the innovation is not close to normal. We also compare the proposed tests with those of Lucas (1995, Econometric Theory 11, 331–346) based on M-estimators.

Journal ArticleDOI
TL;DR: The authors ranked institutions and researchers based on a standardized page count of their econometric theory publications over the last 11 years (1986-1996) in 11 economics and statistics journals and revealed Yale University to be the leading academic institution, enjoying a large lead over the other top institutions: University of Chicago, M.I.T., and London School of Economics.
Abstract: We rank institutions and researchers based on a standardized page count of their econometric theory publications over the last 11 years (1986–1996) in 11 economics and statistics journals. Our ranking criteria differ from those employed by Hall (1987, Econometric Theory 3, 171–194; 1990, Econometric Theory 6, 1–16) and Baltagi (1998, Econometric Theory 14, 1–43). We weight the standardized page count of a publication by the publishing journal's “impact factor,” which measures a journal's impact on the profession. We also depart from the previous rankings by focusing only on publications in theoretical econometrics. Our rankings reveal Yale University to be the leading academic institution, enjoying a large lead over the other top institutions: University of Chicago, M.I.T., and London School of Economics. Our rankings also reveal that Peter Phillips and Donald Andrews (both affiliated with Yale University) are the leading researchers in theoretical econometrics. We also provide rankings of countries and Ph.D. programs.

Journal ArticleDOI
TL;DR: In this article, the authors construct properly scaled functions of R^p-valued partial sums of demeaned data and derive bounds via the functional law of the iterated logarithm for strong mixing processes.
Abstract: We construct properly scaled functions of R^p-valued partial sums of demeaned data and derive bounds via the functional law of the iterated logarithm for strong mixing processes. If we obtain a value below or equal to the bound we decide in favor of I(0); otherwise we decide in favor of I(1). This provides a consistent rule for classifying time series as being I(1) or I(0). The nice feature of the procedure lies in the almost sure nature of the bound, guaranteeing a lim sup–type result. We finally provide conditions for the strong consistency of estimators of the variance in the dependent and heterogeneous case.
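The decision rule can be caricatured as follows (a simplified scalar sketch; the paper's functional, scaling, and bound are constructed more carefully and cover the R^p-valued strong mixing case): scale the demeaned partial sums by a long-run variance estimate and by sqrt(2 T log log T), and compare with a fixed bound.

```python
import numpy as np

def classify_I0_I1(x, bound=1.0):
    """Illustrative LIL-based classification of a scalar series as I(0) or I(1)."""
    T = len(x)
    e = x - x.mean()
    S = np.cumsum(e)
    # Bartlett-kernel long-run variance estimate
    q = int(4 * (T / 100.0) ** 0.25)
    gamma = [e[k:] @ e[:T - k] / T for k in range(q + 1)]
    lrv = gamma[0] + 2 * sum((1 - k / (q + 1)) * gamma[k] for k in range(1, q + 1))
    stat = np.max(np.abs(S)) / np.sqrt(2 * lrv * T * np.log(np.log(T)))
    return "I(0)" if stat <= bound else "I(1)"
```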

Journal ArticleDOI
TL;DR: In this article, the authors studied efficient detrending in cointegrating regression and developed modified tests for cointegration that use efficient detending procedures, and derived asymptotics for these tests.
Abstract: This paper studies efficient detrending in cointegrating regression and develops modified tests for cointegration that use efficient detrending procedures. Asymptotics for these tests are derived. Monte Carlo experiments are conducted to evaluate the detrending procedures in finite samples and to compare tests for cointegration based on different detrending procedures. The limit theory allows for increasingly remote initial condition effects as the sample size goes to infinity.

Journal ArticleDOI
TL;DR: In this paper, the authors considered a set of test statistics, namely, the likelihood ratio, efficient score, and Wald test statistic, for econometric models under simulation estimation and showed that they are asymptotically equivalent.
Abstract: This paper considers classical test statistics, namely, the likelihood ratio, efficient score, and Wald statistics, for econometric models under simulation estimation. The simulated likelihood ratio, simulated efficient score, and simulated Wald test statistics are shown to be asymptotically equivalent. Because the simulated score vector can be asymptotically biased, limiting distributions of these simulated statistics can be asymptotically noncentral χ² distributed. This paper studies inference issues with various simulated test statistics. Monte Carlo results are also provided to compare and demonstrate finite sample properties of simulated test statistics.

Journal ArticleDOI
TL;DR: In this paper, the moments of the asymptotic distribution of the test statistics for testing the unit root were obtained in the null case, when the true drift or trend was lacking.
Abstract: For three models of linear autoregression the moments of the asymptotic distributions of the test statistics for testing the unit root are obtained in the null case, when the true drift or trend is lacking.

Journal ArticleDOI
TL;DR: This paper develops asymptotic approximations to the distribution of forecast errors from an estimated AR(1) model with no drift when the true process is nearly I(1) and both the forecast horizon and the sample size are allowed to increase at the same rate.
Abstract: We develop asymptotic approximations to the distribution of forecast errors from an estimated AR(1) model with no drift when the true process is nearly I(1) and both the forecast horizon and the sample size are allowed to increase at the same rate. We find that the forecast errors are the sums of two components that are asymptotically independent. The first is asymptotically normal whereas the second is asymptotically nonnormal. This throws doubt on the suitability of a normal approximation to the forecast error distribution. We then perform a Monte Carlo study to quantify further the effects on the forecast errors of sampling variability in the parameter estimates as we allow both forecast horizon and sample size to increase.
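The setting is easy to mimic in a small Monte Carlo (illustrative only; the parameter names and values below are assumptions, not the paper's design): fit an AR(1) without drift to T observations of a nearly integrated process and forecast a horizon that is a fixed fraction of T ahead.

```python
import numpy as np

def forecast_error(T, c=-5.0, frac=0.5, rng=None):
    """One replication: OLS AR(1) fit on T observations of a nearly
    integrated process (rho = 1 + c/T), forecast h = frac*T steps ahead,
    and return the forecast error."""
    rng = rng or np.random.default_rng()
    h = int(frac * T)
    rho = 1.0 + c / T
    e = rng.standard_normal(T + h + 1)
    y = np.zeros_like(e)
    for t in range(1, len(e)):
        y[t] = rho * y[t - 1] + e[t]
    sample = y[: T + 1]
    rho_hat = (sample[1:] @ sample[:-1]) / (sample[:-1] @ sample[:-1])
    return y[T + h] - rho_hat**h * sample[-1]

rng = np.random.default_rng(0)
errors = np.array([forecast_error(200, rng=rng) for _ in range(2000)])
```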

Journal ArticleDOI
TL;DR: The integrated conditional moment (ICM) test as discussed by the authors is based on the fact that a random function based on a correctly specified model should have zero mean, whereas any misspecification in the conditional mean implies a divergent mean for the random function.
Abstract: This paper proposes a version of the integrated conditional moment (ICM) test that is optimal for a class of composite alternatives. The ICM test is built on the fact that a random function based on a correctly specified model should have zero mean, whereas any misspecification in the conditional mean implies a divergent mean for the random function. We derive test statistics that are optimal for each basis element of an orthonormal decomposition of the function space for which the random function is an element. We then use a weighted summation of these test statistics to compose the single test statistic that is optimal for any pair of alternatives that are symmetric about zero. This test is equivalent to using a particular measure in the ICM test of Bierens and Ploberger (1997, Econometrica 65, 1129–1152).

Journal ArticleDOI
TL;DR: In this article, the authors investigated the asymptotic properties of the maximum marginal likelihood estimator for a regression model with a stochastic trend component when the signal-to-noise ratio is near zero.
Abstract: This paper investigates the asymptotic properties of the maximum marginal likelihood estimator for a regression model with a stochastic trend component when the signal-to-noise ratio is near zero. In particular, the local level model in Harvey (1989, Forecasting, Structural Time Series Models and the Kalman Filter, Cambridge: Cambridge University Press) and its variants where a time trend or an intercept is included are considered. A local-to-zero parameterization is adopted. Two sets of asymptotic properties are presented for the local maximizer: consistency and the limiting distribution. The estimator is found to be super-consistent. The limit distribution is derived and found to possess a long tail and a mass point at zero. It yields a good approximation for samples of moderate size. Simulation also documents that the empirical distribution converges less rapidly to the limit distribution as the number of regression parameters increases. The results could be viewed as a transition step toward establishing new likelihood ratio–type or Wald-type tests for the stationarity null.

Journal ArticleDOI
TL;DR: James Tobin is one of the major developers of modern macroeconomic theory and has contributed fundamental knowledge to the theory of investment, of consumption, of money and banking, and of economic growth.
Abstract: Professor James Tobin is a figure of truly historic significance in the economics profession. He is one of the major developers of modern macroeconomic theory. He has contributed fundamental knowledge to the theory of investment, of consumption, of money and banking, and of economic growth. His theoretical work made possible the development of the capital asset pricing model that has been a central paradigm in modern finance. His work on limited-dependent variable models has started a field within econometrics.

Journal ArticleDOI
TL;DR: In this paper, the asymptotic behavior of the least-squares estimators of the coefficients of AR(p) processes with characteristic roots near the unit circle is studied.
Abstract: In this paper nearly unstable AR(p) processes (in other words, models with characteristic roots near the unit circle) are studied. Our main aim is to describe the asymptotic behavior of the least-squares estimators of the coefficients. A convergence result is presented for the general complex-valued case. The limit distribution is given with the help of some continuous time AR processes. We apply the results to real-valued nearly unstable AR(p) models. In this case the limit distribution can be identified with that of the maximum likelihood estimator of the coefficients of the corresponding continuous time AR processes.

Journal ArticleDOI
TL;DR: In this article, the estimation of a location parameter in the binary choice model with some weak distributional assumptions imposed on the error term in the latent regression model is considered, and two estimators are proposed, both of which are two-step estimators.
Abstract: This paper considers the estimation of a location parameter in the binary choice model with some weak distributional assumptions imposed on the error term in the latent regression model. Two estimators are proposed here, both of which are two-step estimators; in the first step, the slope parameters are consistently estimated by existing methods; in the second step, the location parameter is consistently estimated based on a moment condition. The estimators are shown to be consistent and asymptotically normal. A small Monte Carlo study illustrates the usefulness of the estimators. We also point out that the location and slope parameters can be estimated simultaneously.

Journal ArticleDOI
TL;DR: In this article, simple expressions are given for the bias of the AIC under misspecification that can be used to construct improved estimators, although for the examples considered in detail, model selection procedures based on such improved estimators are nearly equivalent to model selection procedures based on severely biased estimators.
Abstract: In case of misspecification, the Akaike information criterion (AIC; Akaike, 1973, in Petrov & Csaki, eds., Second International Symposium on Information Theory, pp. 267-281. Budapest: Akademia Kiado) is an asymptotically biased estimator of the expected Kullback-Leibler discrepancy. This paper gives simple expressions for the bias that can be used to construct improved estimators. However, for the examples that are considered in detail it turns out that model selection procedures based on such improved estimators are nearly equivalent to model selection procedures based on severely biased estimators.
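For orientation, the best-known correction of this kind replaces the AIC penalty k by tr(J⁻¹I) (Takeuchi's criterion); the paper derives its own, model-specific bias expressions, so the generic sketch below should be read only as an example of the genre, with the function name and interface assumed for illustration.

```python
import numpy as np

def tic(loglik, scores, hessians):
    """Takeuchi-style corrected criterion: -2*loglik + 2*tr(J^{-1} I), where
    I is the outer-product and J the negative-Hessian estimate of the
    information matrix, built from per-observation contributions.
    scores:   (n, k) array of per-observation score vectors
    hessians: (n, k, k) array of per-observation Hessians of the log-density"""
    n = scores.shape[0]
    I_hat = scores.T @ scores / n
    J_hat = -hessians.mean(axis=0)
    return -2.0 * loglik + 2.0 * np.trace(np.linalg.solve(J_hat, I_hat))
```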


Journal ArticleDOI
TL;DR: In this article, it is shown that a Bartlett-type correction to the likelihood ratio test for a unit root can be an effective tool for controlling size distortions in nonstationary time series, reducing the size distortion without affecting the power too much.
Abstract: Despite the fact that it is not correct to speak of Bartlett corrections in the case of nonstationary time series, this paper shows that a Bartlett-type correction to the likelihood ratio test for a unit root can be an effective tool to control size distortions. Using well-known formulae, we obtain second order (numerical) approximations to the moments and cumulants of the likelihood ratio, which makes it possible to calculate a Bartlett-type factor. It turns out that the cumulants of the corrected statistic are closer to their asymptotic values than those of the original statistic. A simulation study is then carried out to assess the quality of these approximations for the first four moments; the size and power of the original and the corrected statistic are also simulated. Our results suggest that the proposed correction reduces the size distortion without affecting the power too much.
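Mechanically, a correction of the kind described rescales the statistic so that its approximate finite-sample mean matches the asymptotic one; in the paper the finite-sample mean comes from second-order numerical approximations to the moments, whereas in the sketch below it is simply passed in as an argument.

```python
def bartlett_correct(lr_stat, mean_asy, mean_finite):
    """Bartlett-type correction: LR* = LR * E_asy[LR] / E_T[LR], so that the
    corrected statistic's mean matches its asymptotic counterpart."""
    return lr_stat * mean_asy / mean_finite
```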

Journal ArticleDOI
TL;DR: The authors compared several tests for linear and loglinear regression models where both the dependent and independent variables are transformed and showed that the Lagrange multiplier test proposed by Godfrey and Wickens (1981, Review of Economic Studies 48, 487-496) in the framework of the Box-Cox regression model has the highest asymptotic power of the compared tests.
Abstract: This paper compares several tests for linear and loglinear regression models where both the dependent and independent variables are transformed. It is shown that the Lagrange multiplier test proposed by Godfrey and Wickens (1981, Review of Economic Studies 48, 487–496) in the framework of the Box–Cox regression model has the highest asymptotic power of the compared tests. The extended projection test of MacKinnon, White, and Davidson (1983, Journal of Econometrics 11, 53–70), the test of Bera and McAleer (1983, paper presented to the SSRC Econometric Study Group Conference on Model Specification and Testing, Warwick; 1989, Sankhya B 51, 212–224), and the test of Andrews (1971, Biometrika 58, 249–254) are shown to have asymptotically equivalent powers and to have lower powers than the nonnested test of Cox (1961, Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability , Vol. 1, 105–123, Berkeley: University of California Press; 1962, Journal of the Royal Statistical Society B 24, 406–424).

Journal ArticleDOI
TL;DR: In this paper, the authors show that the t-ratios in spurious regressions between integrated processes with stable errors diverge at the rate of √T, identical to what Phillips (1986, Journal of Econometrics 33, 311-340) obtained for the Gaussian case, so that it is the long memory in the dependent variable and regressors, rather than the moment conditions on the error terms, that causes the spurious regression.
Abstract: This paper considers spurious regression between integrated processes with stable errors. Our results show that the t-ratios diverge at the rate of √T, which is identical to what Phillips (1986, Journal of Econometrics 33, 311–340) has obtained for the Gaussian case. Therefore, it is the long memory in the dependent variable and regressors, instead of the moment conditions of the error terms, that causes the spurious regression.
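A quick Monte Carlo (an illustration, not the paper's derivation) makes the √T divergence of the t-ratio visible: regressing one integrated process with symmetric alpha-stable innovations on another, independent one, the average |t| grows with T while |t|/√T stays roughly stable.

```python
import numpy as np
from scipy.stats import levy_stable

def spurious_t(T, alpha=1.5, rng=None):
    """t-ratio on the slope from regressing one integrated alpha-stable
    process on another, independent one (plus an intercept)."""
    rng = rng or np.random.default_rng()
    y = np.cumsum(levy_stable.rvs(alpha, 0.0, size=T, random_state=rng))
    x = np.cumsum(levy_stable.rvs(alpha, 0.0, size=T, random_state=rng))
    X = np.column_stack([np.ones(T), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ b
    s2 = (u @ u) / (T - 2)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return b[1] / se

rng = np.random.default_rng(0)
for T in (100, 400, 1600):
    t_abs = np.mean([abs(spurious_t(T, rng=rng)) for _ in range(200)])
    print(T, round(t_abs, 1), round(t_abs / np.sqrt(T), 2))
```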