
Showing papers in "Journal of Applied Econometrics in 1999"


Journal ArticleDOI
TL;DR: In this article, the authors employ response surface regressions based on simulation experiments to calculate asymptotic distribution functions for the Johansen-type likelihood ratio tests for cointegration.
Abstract: This paper employs response surface regressions based on simulation experiments to calculate asymptotic distribution functions for the Johansen-type likelihood ratio tests for cointegration. These are carried out in the context of the models recently proposed by Pesaran, Shin, and Smith (1997) that allow for the possibility of exogenous variables integrated of order one. The paper calculates critical values that are very much more accurate than those available previously. The principal contributions of the paper are a set of data files that contain estimated asymptotic quantiles obtained from response surface estimation and a computer program for utilizing them. This program, which is freely available via the Internet, can be used to calculate both asymptotic critical values and P-values. Copyright © 1999 John Wiley & Sons, Ltd.

1,971 citations


Journal ArticleDOI
TL;DR: In this article, the use of a new bootstrap method for small-sample inference in long-horizon regressions is illustrated by analysing the long-horizon predictability of four major exchange rates, and the findings are reconciled with those of an earlier study by Mark (1995).
Abstract: The use of a new bootstrap method for small-sample inference in long-horizon regressions is illustrated by analysing the long-horizon predictability of four major exchange rates, and the findings are reconciled with those of an earlier study by Mark (1995). While there is some evidence of exchange rate predictability, contrary to earlier studies, no evidence is found of higher predictability at longer horizons. Additional evidence is presented that the linear VEC model framework underlying the empirical study is likely to be misspecified, and that the methodology for constructing bootstrap p-values for long-horizon regression tests may be fundamentally flawed. Copyright © 1999 John Wiley & Sons, Ltd.

378 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed two simple alternatives to 2SLS and limited-information-maximum-likelihood estimators for models with more instruments than endogenous regressors, which can be interpreted as instrumental variables procedures using an instrument that is independent of disturbances even in finite samples.
Abstract: Two-stage-least-squares (2SLS) estimates are biased towards OLS estimates. This bias grows with the degree of over-identification and can generate highly misleading results. In this paper we propose two simple alternatives to 2SLS and limited-information-maximum-likelihood (LIML) estimators for models with more instruments than endogenous regressors. These estimators can be interpreted as instrumental variables procedures using an instrument that is independent of disturbances even in finite samples. Independence is achieved by using a `leave-one-out' jackknife-type fitted value in place of the usual first-stage equation. The new estimators are first-order equivalent to 2SLS but with finite-sample properties superior to those of 2SLS and similar to LIML when there are many instruments. Moreover, the jackknife estimators appear to be less sensitive than LIML to deviations from the linear reduced form used in classical simultaneous equations models.
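The leave-one-out construction at the heart of these jackknife IV estimators can be sketched in a few lines. The function name `jive1` and the simple one-regressor design are illustrative, not the authors' code; the leave-one-out fitted values are obtained from the standard jackknife identity using the hat-matrix diagonal:

```python
import numpy as np

def jive1(y, X, Z):
    """Jackknife IV (JIVE1): replace the first-stage fitted values with
    leave-one-out fitted values, so the constructed instrument is
    independent of each observation's disturbance."""
    ZtZ_inv = np.linalg.inv(Z.T @ Z)
    Pi = ZtZ_inv @ Z.T @ X                         # first-stage estimates
    h = np.einsum('ij,jk,ik->i', Z, ZtZ_inv, Z)    # leverages h_i
    fitted = Z @ Pi
    # Leave-one-out fitted values via the standard jackknife identity
    X_loo = (fitted - h[:, None] * X) / (1.0 - h[:, None])
    # IV estimator using X_loo as the instrument for X
    return np.linalg.solve(X_loo.T @ X, X_loo.T @ y)
```

Because observation i's own disturbance never enters its instrument, the finite-sample bias toward OLS that grows with over-identification is avoided.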

361 citations


Journal ArticleDOI
TL;DR: In this paper, a generalization of the Dickey–Fuller test procedure, which makes use of the class of Markov regime-switching models, is proposed to identify the collapsing periods from the expanding ones.
Abstract: This paper addresses the problem of testing for the presence of a stochastic bubble in a time series in the case that the bubble is periodically collapsing so that the asset price keeps returning to the level implied by the market fundamentals. As this is essentially a problem of identifying the collapsing periods from the expanding ones, we propose using a generalization of the Dickey–Fuller test procedure which makes use of the class of Markov regime-switching models. The potential of the new methodology is illustrated via simulation, and an empirical example is given. Copyright © 1999 John Wiley & Sons, Ltd.

282 citations


Journal ArticleDOI
TL;DR: In this article, a computationally attractive iterated linear least squares estimator (ILLE) is proposed for large non-linear simultaneous equation systems which are conditionally linear in unknown parameters.
Abstract: Empirical demand systems that do not impose unreasonable restrictions on preferences are typically non-linear. We show, however, that all popular systems possess the property of conditional linearity. A computationally attractive iterated linear least squares estimator (ILLE) is proposed for large non-linear simultaneous equation systems which are conditionally linear in unknown parameters. The estimator is shown to be consistent and its asymptotic efficiency properties are derived. An application is given for a 22-commodity quadratic demand system using household-level data from a time series of repeated cross-sections.

273 citations


Journal ArticleDOI
TL;DR: The authors employ an additive semiparametric partially linear model to uncover the way that initial output and schooling levels affect growth rates; their results suggest the presence of multiple regimes (equilibria).
Abstract: In this paper we employ an additive semiparametric partially linear model to uncover the way that initial output and schooling levels affect growth rates. Our results based on marginal integration allow for graphical representation of the non-linearities that characterize the effects that these variables have on growth rates and suggest the presence of multiple regimes (equilibria). Our findings seem to be in agreement with those of Durlauf and Johnson (1995) and Hansen (1996) who used a different data set. Copyright © 1999 John Wiley & Sons, Ltd.

231 citations


Journal ArticleDOI
TL;DR: In this article, the authors apply an econometric framework which allows for complex non-convex budget sets, highly non-linear labour supply curves and imperfect markets with institutional constraints.
Abstract: This study applies an econometric framework which allows for complex non-convex budget sets, highly non-linear labour supply curves and imperfect markets with institutional constraints. A married couple's version of the model is estimated on Italian microdata. The empirical results show that male labour supply is rather inelastic while labour supply among females, especially participation, is considerably more elastic. The elasticities depend strongly on household income. The largest elasticities are found for females living in poor households. The results of the tax simulations suggest that there are only modest labour supply responses from replacing the 1987 system by proportional taxes. Copyright © 1999 John Wiley & Sons, Ltd.

209 citations


Journal ArticleDOI
In Choi
TL;DR: In this article, the variance ratio test is calculated by using Andrews' (1991) optimal data-dependent methods, and the results of applying these tests to the real exchange rates are occasionally inconsistent.
Abstract: This paper tests the random walk hypothesis for the log-differenced monthly US real exchange rates versus some major currencies. The tests we use are the variance ratio test, Durlauf's (1991) spectral domain tests and Andrews and Ploberger's (1996) optimal tests. The variance ratio test is calculated by using Andrews' (1991) optimal data-dependent methods. Finite sample properties of these tests are also reported. Because the results of applying these tests to the real exchange rates are occasionally inconsistent, tests to synthesize these test results are proposed and applied to the real exchange rates. These tests have often been used in meta-analysis, but have not previously been used to synthesize different test results. Simulation results for these tests are also reported. For the real exchange rate data from the post-Bretton Woods period, these tests reject the null only for the Swiss franc. But when longer-horizon data are used, there is more evidence of serial correlations in the log-differenced real exchange rates. Copyright © 1999 John Wiley & Sons, Ltd.
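The basic variance ratio statistic underlying the first of these tests can be sketched as follows. This is the simple homoskedastic version: under a random walk, the variance of q-period returns is q times the one-period variance, so VR(q) should be near 1. The Andrews (1991) data-dependent bandwidth refinements used in the paper are omitted:

```python
import numpy as np

def variance_ratio(r, q):
    """Variance ratio VR(q) for log returns r: the variance of overlapping
    q-period sums divided by q times the one-period variance.
    Under a homoskedastic random walk, VR(q) converges to 1."""
    r = np.asarray(r, float)
    n = len(r)
    mu = r.mean()
    var1 = np.sum((r - mu) ** 2) / n
    # overlapping q-period sums (length n - q + 1)
    rq = np.convolve(r, np.ones(q), mode='valid')
    varq = np.sum((rq - q * mu) ** 2) / (n * q)
    return varq / var1
```

Values well below 1 indicate mean reversion; values above 1 indicate positive serial correlation.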

186 citations


Journal ArticleDOI
TL;DR: The authors investigated the multi-period forecast performance of a number of empirical self-exciting threshold autoregressive (SETAR) models that have been proposed in the literature for modelling exchange rates and GNP, amongst other variables.
Abstract: In this paper we investigate the multi-period forecast performance of a number of empirical selfexciting threshold autoregressive (SETAR) models that have been proposed in the literature for modelling exchange rates and GNP, amongst other variables. We take each of the empirical SETAR models in turn as the DGP to ensure that the ‘non-linearity’ characterises the future, and compare the forecast performance of SETAR and linear autoregressive models on a number of quantitative and qualitative criteria. Our results indicate that non-linear models have an edge in certain states of nature but not in others, and that this can be highlighted by evaluating forecasts conditional upon the regime.
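A two-regime SETAR(1) data-generating process of the kind used in such forecast comparisons can be sketched as follows; the coefficients, threshold, and function names below are illustrative, not taken from the paper:

```python
import numpy as np

def simulate_setar(n, phi_low, phi_high, threshold=0.0, sigma=1.0, seed=0):
    """Simulate a two-regime SETAR(1) process: the AR coefficient depends
    on whether the previous observation is below or above the threshold."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        phi = phi_low if y[t - 1] <= threshold else phi_high
        y[t] = phi * y[t - 1] + sigma * rng.normal()
    return y

def setar_forecast(y_last, phi_low, phi_high, threshold=0.0):
    """One-step conditional-mean forecast from the SETAR(1) model;
    evaluating accuracy regime by regime, as the paper does, is what
    can reveal the non-linear model's edge in certain states of nature."""
    phi = phi_low if y_last <= threshold else phi_high
    return phi * y_last
```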

134 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the properties of the Lagrange Multiplier (LM) test for autoregressive conditional heteroskedasticity and generalized ARCH in the presence of additive outliers (AO's).
Abstract: In this paper we investigate the properties of the Lagrange Multiplier (LM) test for autoregressive conditional heteroskedasticity (ARCH) and generalized ARCH (GARCH) in the presence of additive outliers (AO's). We show analytically that both the asymptotic size and power are adversely affected if AO's are neglected: the test rejects the null hypothesis of homoskedasticity too often when it is in fact true, while the test has difficulty detecting genuine GARCH effects. Several Monte Carlo experiments show that these phenomena occur in small samples as well. We design and implement a robust test, which has better size and power properties than the conventional test in the presence of AO's. Applications to the French industrial production series and weekly returns of the Spanish peseta/US dollar exchange rate reveal that, sometimes, apparent GARCH effects may be due to only a small number of outliers and, conversely, that genuine GARCH effects can be masked by outliers.
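The conventional LM test whose fragility the paper documents is simple to state: regress the squared residuals on a constant and q of their own lags, and compare T·R² with a χ²(q) critical value. A minimal sketch of that conventional test (not the authors' outlier-robust version):

```python
import numpy as np

def arch_lm_test(e, q):
    """Engle's LM test for ARCH(q): regress e_t^2 on a constant and q of
    its own lags; under homoskedasticity T*R^2 is asymptotically chi2(q)."""
    e2 = np.asarray(e, float) ** 2
    T = len(e2) - q
    y = e2[q:]
    X = np.column_stack(
        [np.ones(T)] + [e2[q - j - 1: len(e2) - j - 1] for j in range(q)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return T * r2   # compare with the chi2(q) critical value
```

A single large additive outlier inflates several entries of e² at once, which is the mechanism behind the size distortion the paper analyses.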

126 citations


Journal ArticleDOI
TL;DR: In this article, the stability and linearity of a German M1 money demand function were investigated using smooth transition regression techniques, and it was found that the money demand equation considered is both linear and stable.
Abstract: Starting from a linear error correction model (ECM) the stability and linearity of a German M1 money demand function are investigated, applying smooth transition regression techniques. Using seasonally unadjusted quarterly data from 1961(1) to 1990(2) it is found that the money demand equation considered is both linear and stable. After extending the sampling period until 1995(4) a clear structural instability due to the monetary unification on 1 July 1990 is found and subsequently modelled. A non-linear specification for the extended period is presented and discussed. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the authors considered nine Swedish macroeconomic time series whose business cycle properties were discussed by Englund, Persson, and Svensson (1992) using frequency domain techniques and found that all but two of the logarithmed and differenced series are nonlinear.
Abstract: This paper considers nine long Swedish macroeconomic time series whose business cycle properties were discussed by Englund, Persson, and Svensson (1992) using frequency domain techniques. It is found by testing that all but two of the logarithmed and differenced series are nonlinear. The observed nonlinearity is characterized by STAR models. The statistical and dynamic properties of the estimated STAR models are investigated using, among other things, parametrically estimated 'local' or 'sliced' spectra. Cyclical variation at business cycle frequencies does not seem to be constant over time for all series, and it is difficult to find a 'Swedish business cycle'. Only two series may be regarded as having genuinely asymmetric cyclical variation. Standard Granger noncausality tests are adapted to the nonlinear (STAR) case, and the null hypothesis of noncausality is tested for pairs of series. The results point at strong temporal interactions between series. They also indicate that the assumption of functional form (linear or STAR) strongly affects the outcome of these pairwise tests.

Journal ArticleDOI
TL;DR: In this article, Monte Carlo simulations were used to compare the performance of the TSLS, the LIML, and four new jackknife IV estimators when the instruments are weak, and they found that the new estimators and LIML have a smaller bias but a larger variance than the traditional TSLS.
Abstract: Using Monte Carlo simulations we study the small sample performance of the traditional TSLS, the LIML and four new jackknife IV estimators when the instruments are weak. We find that the new estimators and LIML have a smaller bias but a larger variance than the TSLS. In terms of root mean square error, neither LIML nor the new estimators perform uniformly better than the TSLS. The main conclusion from the simulations and an empirical application on labour supply functions is that in a situation with many weak instruments, there still does not exist an easy way to obtain reliable estimates in small samples. Better instruments and/or larger samples are the only way to increase precision in the estimates. Since the properties of the estimators are specific to each data-generating process and sample size, it would be wise in empirical work to complement the estimates with a Monte Carlo study of the estimators' properties for the relevant sample size and data-generating process believed to be applicable. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors consider econometric issues related to time-series data that have been subject to abrupt governmental interventions and show a substantial bias in favour of concluding that the series is stationary and that shocks have temporary effects.
Abstract: This paper considers econometric issues related to time-series data that have been subject to abrupt governmental interventions. The motivating example for this study is the Brazilian monthly inflation rate (1974:1–1993:6) which we use throughout for illustration. This series has been heavily influenced by the effect of so-called shock plans implemented by various governments starting in the mid-1980s. The plans act as ‘inliers’ in the sense that the series is temporarily brought down to low levels before returning to its previous trend path. We analyse the effects on standard unit root tests and measures of persistence caused by the presence of these ‘inliers’. We show a substantial bias in favour of concluding that the series is stationary and that shocks have temporary effects. We then construct appropriately corrected statistics which take into account the presence of the plans. These show, unlike the standard tests, that the stochastic behaviour of the inflation rate was indeed unstable over this period. Simulation results are presented to support the adequacy of our corrected statistics. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors test a particular form of interdependent behavior, namely the hypothesis that individuals' choices of hours of work are influenced by the average hours of work in a social reference group.
Abstract: In this paper we test a particular form of interdependent behavior, namely the hypothesis that individuals' choices of hours of work are influenced by the average hours of work in a social reference group. There are problems to empirically disentangle the effects of interdependent behavior and preference variation across groups. We show that panel data or data from several points in time are needed. In the empirical analysis we combine cross-section data from 1973, 1980 and 1990. Our results support the hypothesis of interdependent behavior. The implication is that conventional tax policy predictions, in which preference interdependencies are neglected, will tend to underestimate the effect of a tax reform on hours of work. Our point estimates suggest that conventional calculations would capture only about a third of the actual change in hours of work.

Journal ArticleDOI
TL;DR: Using kernel density estimation, the authors describe the distribution of household size-adjusted real income and how it changed over the business cycle of the 1980s in the United States and the United Kingdom.
Abstract: Using kernel density estimation we describe the distribution of household size-adjusted real income and how it changed over the business cycle of the 1980s in the United States and the United Kingdom. We confirm previous studies that show income inequality increased in the two countries and the middle of the distribution was squashed down. Using a series of statistical tests, however, we find that while the mass in both tails of the distribution increased significantly in both countries over the period, by far the greatest gains were in the upper tail. Copyright © 1999 John Wiley & Sons, Ltd.
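A minimal Gaussian kernel density estimator of the kind used in such distributional studies can be sketched as follows. The default bandwidth here (Silverman's rule of thumb) is an assumption for illustration; the authors' actual bandwidth choice is not given in the abstract:

```python
import numpy as np

def gaussian_kde(x, grid, h=None):
    """Gaussian kernel density estimate of the data x, evaluated on grid.
    h defaults to Silverman's rule-of-thumb bandwidth (an assumption;
    the paper's exact bandwidth choice may differ)."""
    x = np.asarray(x, float)
    if h is None:
        h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)
    u = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(x) * h * np.sqrt(2 * np.pi))
```

Comparing the estimated densities for two years makes changes in the tails and the "squashing" of the middle directly visible.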


Journal ArticleDOI
TL;DR: In this paper, it was shown that common cycles are present in the Hylleberg-Engle-Granger-Yoo decomposition of these series when there exists a linear combination of their seasonal differences which follows an MA process of order at most three.
Abstract: This paper extends the notion of common cycles to quarterly time series having unit roots both at the zero and seasonal frequencies. It is shown that common cycles are present in the Hylleberg–Engle–Granger–Yoo decomposition of these series when there exists a linear combination of their seasonal differences which follows an MA process of order, at most, three. The pitfalls of seasonal adjustment for common cycles analysis are also documented. Inference on common cycles in seasonally cointegrated series is derived from existing statistical methods for codependence. Concepts and methods are illustrated with an empirical analysis of the comovements between consumption and output using Italian data. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: R, an open-source S-like high-level matrix programming language that can be used for econometric simulations and data analysis is reviewed.
Abstract: This article reviews R, an open-source S-like high-level matrix programming language that can be used for econometric simulations and data analysis. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: Since volatility is a latent variable in stochastic volatility (SV) models, the exact likelihood is difficult to evaluate; in this paper, a non-linear filter which yields the exact likelihood of SV models is employed, and a smoothing algorithm for volatility estimation is also constructed.
Abstract: This paper develops a new method for the analysis of stochastic volatility (SV) models. Since volatility is a latent variable in SV models, it is difficult to evaluate the exact likelihood. In this paper, a non-linear filter which yields the exact likelihood of SV models is employed. Solving a series of integrals in this filter by piecewise linear approximations with randomly chosen nodes produces the likelihood, which is maximized to obtain estimates of the SV parameters. A smoothing algorithm for volatility estimation is also constructed. Monte Carlo experiments show that the method performs well with respect to both parameter estimates and volatility estimates. We illustrate our method by analysing daily stock returns on the Tokyo Stock Exchange. Since the method can be applied to more general models, the SV model is extended so that several characteristics of daily stock returns are allowed, and this more general model is also estimated. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the natural rate is treated as an unobserved state variable in a system that includes measurement equations for the unemployment rate, the rate of wage growth and the rate of inflation.
Abstract: How should one measure the natural rate of unemployment? This paper proposes a systems procedure as an alternative to NAIRU. The natural rate is treated as an unobserved state variable in a system that includes measurement equations for the unemployment rate, the rate of wage growth and the rate of inflation. The model is derived from a version of the wage bargaining model of Blanchard and embodies a version of the natural rate hypothesis. The model is estimated by embedding the Kalman filter within the full-information maximum likelihood procedure. For US data, the estimated model implies substantial post-war variation in the natural rate and a negative, but weak, effect of inflation surprises on unemployment. Copyright © 1999 John Wiley & Sons, Ltd.
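The unobserved-state idea can be illustrated with a stripped-down local-level model: treat the natural rate as a random walk and observed unemployment as a noisy measurement of it. This is a deliberate simplification of the paper's full system (which adds wage-growth and inflation equations and estimates by FIML); the variance parameters below are illustrative:

```python
import numpy as np

def kalman_local_level(y, q, r, a0=5.0, p0=10.0):
    """Minimal Kalman filter for a local-level model:
       state: a_t = a_{t-1} + w_t,  w ~ N(0, q)  (natural rate as random walk)
       obs:   y_t = a_t + v_t,      v ~ N(0, r)  (observed unemployment)
    Returns the filtered state estimates."""
    a, p = a0, p0
    out = []
    for yt in y:
        p = p + q                    # predict: state variance grows
        k = p / (p + r)              # Kalman gain
        a = a + k * (yt - a)         # update toward the new observation
        p = (1 - k) * p
        out.append(a)
    return np.array(out)
```

In the full model the prediction errors feed the likelihood, so q, r, and the structural parameters can all be estimated by maximum likelihood.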

Journal ArticleDOI
TL;DR: In this article, the authors ranked academic institutions by publication activity in applied econometrics over the period 1989-1995, based on standardized page counts of articles published in these journals over the stated period.
Abstract: This paper ranks academic institutions by publication activity in applied econometrics over the period 1989-1995. Fourteen leading international journals that publish applied econometrics articles are used to provide the database. The rankings are based on standardized page counts of articles published in these journals over the stated period. A 'Hall of Fame' is developed listing the top 100 individual producers of applied econometrics in the fourteen journals considered. To control for quality differences among the applied journals, separate rankings are provided both for institutions and for individuals according to econometrics publications by journal.


Journal ArticleDOI
TL;DR: In this paper, a two-sided Type II Tobit model is proposed to explain the sign and magnitude of the Federal Reserve's discount rate changes, and a procedure for its estimation is developed, considering the discrete and censored nature of the changes.
Abstract: This paper estimates a policy rule that explains the sign and the magnitude of the Federal Reserve's (Fed's) discount rate changes. It sets out a two-sided Type II Tobit model and develops a procedure for its estimation, considering the discrete and censored nature of the changes. The results suggest that the Fed has conducted discount rate policy counter-cyclically to influence output and to curb inflation, and that the Fed's response to policy indicators varies over monetary regimes. Furthermore, consistency is found between the model prediction of the discount rate change and a classification based on whether the change is technical or non-technical. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the author shows that new tests that are robust to negative MA roots allow a reliable test for a unit root in the volatility process to be conducted; applying these tests to exchange rate and stock returns yields strong rejections of non-stationarity in volatility.
Abstract: It is now well established that the volatility of asset returns is time varying and highly persistent. One leading model that is used to represent these features of the data is the stochastic volatility model. The researcher may test for non-stationarity of the volatility process by testing for a unit root in the log-squared time series. This strategy for inference has many advantages, but is not followed in practice because these unit root tests are known to have very poor size properties. In this paper I show that new tests that are robust to negative MA roots allow a reliable test for a unit root in the volatility process to be conducted. In applying these tests to exchange rate and stock returns, strong rejections of non-stationarity in volatility are obtained. Copyright © 1999 John Wiley & Sons, Ltd.
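The testing strategy described above amounts to running a Dickey-Fuller-type regression on the log-squared return series. A minimal sketch of an augmented Dickey-Fuller t-statistic follows; the MA-robust tests the paper actually advocates are not reproduced here, and the lag choice is illustrative:

```python
import numpy as np

def adf_stat(x, lags=4):
    """Augmented Dickey-Fuller t-statistic (constant, `lags` lagged
    differences) for a unit root in x. Applied to log(r_t^2), this tests
    non-stationarity of the volatility process; the negative MA root in
    log-squared returns is what distorts this standard version."""
    x = np.asarray(x, float)
    dx = np.diff(x)
    T = len(dx) - lags
    y = dx[lags:]
    cols = [np.ones(T), x[lags:-1]]                       # constant, level
    cols += [dx[lags - j - 1: len(dx) - j - 1] for j in range(lags)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (T - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se   # compare with DF critical values (about -2.86 at 5%)
```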

Journal ArticleDOI
TL;DR: In this paper, the author attempts to decide, using the posterior odds ratio, whether the symmetric common-value paradigm or the symmetric independent-private-values paradigm is a more probable explanation of the low-price, sealed-bid auctions conducted by the Indian Oil Corporation to purchase crude-oil from the international market.
Abstract: I attempt to decide, using the posterior odds ratio, whether the symmetric common-value paradigm or the symmetric independent-private-values paradigm is a more probable explanation of the low-price, sealed-bid auctions conducted by the Indian Oil Corporation to purchase crude-oil from the international market. The estimation approach is structural parametric. The auctions are modelled as static non-cooperative games of incomplete information with risk neutral bidders. I conclude that the symmetric independent-private-values paradigm is more probable. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, a method is derived for estimating and testing the linear quadratic adjustment cost (LQAC) model when the target and forcing variables follow I(2) processes; in an application to UK money demand data, a non-linear multicointegrating regression delivers an economically plausible estimate of the adjustment cost parameter.
Tom Engsted (Department of Information Science, Aarhus School of Business, DK-8210 Aarhus V, and Centre for Non-linear Modelling in Economics, Aarhus University, Denmark) and Niels Haldrup (Department of Economics, and Centre for Non-linear Modelling in Economics, Aarhus University, DK-8000 Aarhus C, Denmark)
Abstract: This paper derives a method for estimating and testing the Linear Quadratic Adjustment Cost (LQAC) model when the target variable and some of the forcing variables follow I(2) processes. Based on a forward-looking error-correction formulation of the model, it is shown how to obtain strongly consistent estimates of the structural long-run parameters and the adjustment cost parameter from both a linear and a non-linear cointegrating regression, where first-differences of the I(2) variables are included as regressors (multicointegration). Further, based on the estimated parameter values, it is shown how to test and evaluate the LQAC model using a VAR approach. In an empirical application using UK money demand data, the non-linear multicointegrating regression delivers an economically plausible estimate of the adjustment cost parameter. However, the exact restrictions implied by the LQAC model under rational expectations are strongly rejected. JEL codes: C10, C12, C32, E24. Keywords: adjustment costs, error correction, money demand, rational expectations, multicointegration. (This work was completed while the second author was visiting the Department of Economics at the University of California, San Diego, during fall 1995. The hospitality of the Department is gratefully acknowledged. We would like to thank Pierre Siklos for discussions.)


Journal ArticleDOI
TL;DR: In this paper, a time-varying parameter model with Markov-switching conditional heteroscedasticity is employed to investigate two sources of shifts in real interest rates: (1) shifts in the coefficients relating the ex ante real rate to the nominal rate, the inflation rate and a supply shock variable and (2) unconditional shifts in variance of the stochastic process.
Abstract: A time-varying parameter model with Markov-switching conditional heteroscedasticity is employed to investigate two sources of shifts in real interest rates: (1) shifts in the coefficients relating the ex ante real rate to the nominal rate, the inflation rate and a supply shock variable and (2) unconditional shifts in the variance of the stochastic process. The results underscore the importance of modelling continual change in the ex ante real rate in terms of other economic variables rather than relying on a statistical characterization that permits only a limited number of discrete jumps in the mean of the process. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this review, a list of qualities that a Bayesian software package should have is outlined, and whether BACC has these qualities is discussed in the context of a brief description and an extended example.
Abstract: Bayesian Analysis, Computation and Communication (BACC) is a new Bayesian software package which is linked to Gauss and takes the form of a set of Gauss commands. In this review, I outline a list of qualities that a Bayesian software package should have. I then discuss whether BACC has these qualities in the context of a brief description and an extended example. Copyright © 1999 John Wiley & Sons, Ltd.