
Showing papers in "Journal of Applied Econometrics in 2002"


Journal ArticleDOI
TL;DR: In this paper, a simple approach to handling the initial conditions problem in dynamic, nonlinear unobserved effects models is proposed: rather than attempting to obtain the joint distribution of all outcomes of the endogenous variables, the distribution is obtained conditional on the initial value (and the observed history of strictly exogenous explanatory variables).
Abstract: I study a simple, widely applicable approach to handling the initial conditions problem in dynamic, nonlinear unobserved effects models. Rather than attempting to obtain the joint distribution of all outcomes of the endogenous variables, I propose finding the distribution conditional on the initial value (and the observed history of strictly exogenous explanatory variables). The approach is flexible, and results in simple estimation strategies for at least three leading dynamic, nonlinear models: probit, Tobit and Poisson regression. I treat the general problem of estimating average partial effects, and show that simple estimators exist for important special cases. Copyright © 2005 John Wiley & Sons, Ltd.

1,613 citations
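The conditioning device can be illustrated with a small simulation. The sketch below uses a linear dynamic panel (a linear-probability-style simplification, not the probit, Tobit or Poisson cases treated in the paper) with illustrative parameters; it shows that conditioning on the initial value absorbs much of the bias that arises when heterogeneity correlated with the lagged outcome is ignored.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, rho = 5000, 5, 0.3

# Unobserved heterogeneity c_i; the initial condition y_i0 is correlated with it.
c = rng.normal(0.0, 1.0, N)
y0 = c / (1 - rho) + rng.normal(0.0, 1.0, N)   # roughly the stationary level

# Dynamic panel: y_it = rho * y_{i,t-1} + c_i + e_it
Y = np.empty((N, T + 1))
Y[:, 0] = y0
for t in range(1, T + 1):
    Y[:, t] = rho * Y[:, t - 1] + c + rng.normal(0.0, 1.0, N)

ycur = Y[:, 1:].ravel()
ylag = Y[:, :-1].ravel()
y0rep = np.repeat(y0, T)               # each individual's initial value, repeated
ones = np.ones_like(ylag)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Pooled OLS ignoring c_i: the state-dependence coefficient is biased upward.
b_naive = ols(np.column_stack([ones, ylag]), ycur)[1]

# Wooldridge-style device: condition on y_i0 (a proxy for c_i) rather than
# modelling the joint distribution of all outcomes.
b_cond = ols(np.column_stack([ones, ylag, y0rep]), ycur)[1]
```

With the true coefficient at 0.3, `b_cond` lands much closer to it than `b_naive` does.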


Journal ArticleDOI
TL;DR: In the 20 years following the publication of the ARCH model, a vast quantity of research has uncovered the properties of competing volatility models; this paper looks forward to promising new frontiers, including high-frequency volatility models, large-scale multivariate ARCH models, and derivatives pricing models.
Abstract: In the 20 years following the publication of the ARCH model, there has been a vast quantity of research uncovering the properties of competing volatility models. Wide-ranging applications to financial data have discovered important stylized facts and illustrated both the strengths and weaknesses of the models. There are now many surveys of this literature. This paper looks forward to identify promising areas of new research. The paper lists five new frontiers. It briefly discusses three—high-frequency volatility models, large-scale multivariate ARCH models, and derivatives pricing models. Two further frontiers are examined in more detail—application of ARCH models to the broad class of non-negative processes, and use of Least Squares Monte Carlo to examine non-linear properties of any model that can be simulated. Using this methodology, the paper analyses more general types of ARCH models, stochastic volatility models, long-memory models and breaking volatility models. The volatility of volatility is defined, estimated and compared with option-implied volatilities. Copyright © 2002 John Wiley & Sons, Ltd.

702 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the RV is sometimes a quite noisy estimator of integrated variance, even with large values of M, and illustrate the limit theory on exchange rate data and stock data.
Abstract: This paper looks at some recent work on estimating quadratic variation using realised variance (RV) — that is sums of M squared returns. This econometrics has been motivated by the advent of the common availability of high frequency financial return data. When the underlying process is a semimartingale we recall the fundamental result that RV is a consistent (as M → ∞) estimator of quadratic variation (QV). We express concern that without additional assumptions it seems difficult to give any measure of uncertainty of the RV in this context. The position dramatically changes when we work with a rather general SV model — which is a special case of the semimartingale model. Then QV is integrated variance and we can derive the asymptotic distribution of the RV and its rate of convergence. These results do not require us to specify a model for either the drift or volatility functions, although we have to impose some weak regularity assumptions. We illustrate the use of the limit theory on some exchange rate data and some stock data. We show that even with large values of M the RV is sometimes a quite noisy estimator of integrated variance.

626 citations
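Both the consistency result (RV converges to QV as M grows) and the finite-M noise the authors emphasize can be seen in a small simulation. The SV specification below (a discretized OU process for log-variance) and all parameter values are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)
days, n = 200, 2340          # 200 simulated "days", fine grid of 2340 steps each
dt = 1.0 / n

# Spot variance: discretized OU process for log-variance (illustrative SV model).
logv = np.zeros((days, n))
for i in range(1, n):
    logv[:, i] = logv[:, i - 1] * (1 - 2.0 * dt) \
                 + 0.5 * np.sqrt(dt) * rng.normal(size=days)
sig2 = np.exp(logv)

r = np.sqrt(sig2 * dt) * rng.normal(size=(days, n))   # fine-grid returns
iv = (sig2 * dt).sum(axis=1)                          # integrated variance

def rv(returns, M):
    """Realised variance from M intra-day returns: sum of M squared block returns."""
    return (returns.reshape(days, M, -1).sum(axis=2) ** 2).sum(axis=1)

# The measurement error shrinks as M grows but remains non-trivial at moderate M.
mse = {M: float(np.mean((rv(r, M) - iv) ** 2)) for M in (13, 78, 2340)}
```

The mean squared error of RV around integrated variance falls roughly in proportion to 1/M, matching the asymptotic theory.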


Journal ArticleDOI
TL;DR: The authors developed an estimable micro model of consumption growth allowing for constraints on factor mobility and externalities, whereby geographic capital can influence the productivity of a household's own capital and found robust evidence of geographic poverty traps in farm-household panel data from post-reform rural China.
Abstract: How important are neighbourhood endowments of physical and human capital in explaining diverging fortunes over time for otherwise identical households in a developing rural economy? To answer this question we develop an estimable micro model of consumption growth allowing for constraints on factor mobility and externalities, whereby geographic capital can influence the productivity of a household's own capital. Our statistical test has considerable power in detecting geographic effects given that we control for latent heterogeneity in measured consumption growth rates at the micro level. We find robust evidence of geographic poverty traps in farm-household panel data from post-reform rural China. Our results strengthen the equity and efficiency case for public investment in lagging poor areas in this setting. Copyright © 2002 John Wiley & Sons, Ltd.

530 citations


Journal ArticleDOI
TL;DR: In this article, a new type of multivariate GARCH model is proposed, in which potentially large covariance matrices can be parameterized with a fairly large degree of freedom while estimation of the parameters remains feasible.
Abstract: Multivariate GARCH specifications are typically determined by means of practical considerations such as the ease of estimation, which often results in a serious loss of generality. A new type of multivariate GARCH model is proposed, in which potentially large covariance matrices can be parameterized with a fairly large degree of freedom while estimation of the parameters remains feasible. The model can be seen as a natural generalization of the O-GARCH model, while it is nested in the more general BEKK model. In order to avoid convergence difficulties of estimation algorithms, we propose to exploit unconditional information first, so that the number of parameters that need to be estimated by means of conditional information is more than halved. Both artificial and empirical examples are included to illustrate the model. Copyright © 2002 John Wiley & Sons, Ltd.

431 citations
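The two-step logic of the model (exploit unconditional information first via an eigendecomposition of the sample covariance, then fit univariate GARCH dynamics to the resulting components) can be sketched as follows. This is the orthogonal (O-GARCH) special case with fixed illustrative GARCH parameters, not the paper's full estimator, which allows a more general linear map fitted by maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated stand-in for a panel of asset returns.
T = 3000
A = np.array([[1.0, 0.3], [0.3, 1.0]])
x = (rng.normal(size=(T, 2)) * np.array([1.0, 0.5])) @ A.T

# Step 1: unconditional information -- eigendecomposition of the sample covariance.
S = np.cov(x, rowvar=False)
vals, vecs = np.linalg.eigh(S)
comps = x @ vecs                  # unconditionally (near-)uncorrelated components

# Step 2: univariate GARCH(1,1) variance filter on each component
# (fixed illustrative parameters; in the model these are estimated).
def garch_var(z, omega, alpha, beta):
    h = np.empty_like(z)
    h[0] = z.var()
    for t in range(1, len(z)):
        h[t] = omega + alpha * z[t - 1] ** 2 + beta * h[t - 1]
    return h

H = np.column_stack([garch_var(comps[:, j], 0.05 * vals[j], 0.05, 0.90)
                     for j in range(comps.shape[1])])

# Step 3: rotate the component variances back into conditional covariance matrices.
Sigma_t = np.einsum('ij,tj,kj->tik', vecs, H, vecs)
```

Because only univariate GARCH recursions are fitted after the unconditional step, the number of parameters estimated from conditional information stays small even as the cross-section grows, which is the feasibility point the abstract makes.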


Journal ArticleDOI
TL;DR: This paper examined the spatial patterns of unemployment in Chicago between 1980 and 1990 and found that there is a strong positive and statistically significant degree of spatial dependence in the distribution of raw unemployment rates.
Abstract: This paper examines the spatial patterns of unemployment in Chicago between 1980 and 1990. We study unemployment clustering with respect to different social and economic distance metrics that reflect the structure of agents' social networks. Specifically, we use physical distance, travel time, and differences in ethnic and occupational distribution between locations. Our goal is to determine whether our estimates of spatial dependence are consistent with models in which agents' employment status is affected by information exchanged locally within their social networks. We present non-parametric estimates of correlation across Census tracts as a function of each distance metric as well as pairs of metrics, both for unemployment rate itself and after conditioning on a set of tract characteristics. Our results indicate that there is a strong positive and statistically significant degree of spatial dependence in the distribution of raw unemployment rates, for all our metrics. However, once we condition on a set of covariates, most of the spatial autocorrelation is eliminated, with the exception of physical and occupational distance. Racial and ethnic composition variables are the single most important factor in explaining the observed correlation patterns. Copyright © 2002 John Wiley & Sons, Ltd.

394 citations
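A non-parametric correlation-by-distance estimate of the kind described can be sketched on simulated data. The locations, the exponential spatial covariance, and the bin edges below are all illustrative assumptions standing in for the Census tract data and the paper's distance metrics:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated "tracts": random locations and a spatially correlated variable.
n = 400
xy = rng.uniform(0, 10, size=(n, 2))
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # pairwise distances
cov = np.exp(-d / 2.0)                                       # spatial covariance
u = np.linalg.cholesky(cov + 1e-8 * np.eye(n)) @ rng.normal(size=n)

# Non-parametric correlogram: average product of standardized values for
# pairs whose distance falls in each bin.
z = (u - u.mean()) / u.std()
iu = np.triu_indices(n, k=1)
pair_d = d[iu]
pair_p = (z[:, None] * z[None, :])[iu]

edges = np.linspace(0.0, 8.0, 9)
corr = [float(pair_p[(pair_d >= lo) & (pair_d < hi)].mean())
        for lo, hi in zip(edges[:-1], edges[1:])]
```

Conditioning on covariates, as in the paper, amounts to replacing `u` with the residuals from a regression on tract characteristics before computing the correlogram.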


Journal ArticleDOI
TL;DR: In this paper, the authors provide both qualitative and quantitative measures of the precision of measuring integrated volatility by realized volatility for a fixed frequency of observation and propose a simple approach to capture the information about integrated volatility contained in the returns through the leverage effect.
Abstract: In this paper we provide both qualitative and quantitative measures of the precision of measuring integrated volatility by realized volatility for a fixed frequency of observation. We start by characterizing for a general diffusion the difference between realized and integrated volatility for a given frequency of observation. Then we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has as special cases log-normal, affine and GARCH diffusion models. Using previous empirical results, we show that the noise is substantial compared with the unconditional mean and variance of integrated volatility, even if one employs five-minute returns. We also propose a simple approach to capture the information about integrated volatility contained in the returns through the leverage effect. We show that in practice, the leverage effect does not matter. Copyright © 2002 John Wiley & Sons, Ltd.

307 citations


Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the performance of several recently proposed tests for structural breaks in the conditional variance dynamics of asset returns, which apply to the class of ARCH and SV type processes as well as data-driven volatility estimators using high-frequency data.
Abstract: The paper evaluates the performance of several recently proposed tests for structural breaks in the conditional variance dynamics of asset returns. The tests apply to the class of ARCH and SV type processes as well as data-driven volatility estimators using high-frequency data. In addition to testing for the presence of breaks, the statistics identify the number and location of multiple breaks. We study the size and power of the new tests for detecting breaks in the conditional variance under various realistic univariate heteroscedastic models, change-point hypotheses and sampling schemes. The paper concludes with an empirical analysis using data from the stock and FX markets for which we find multiple breaks associated with the Asian and Russian financial crises. These events resulted in changes in the dynamics of volatility of asset returns in the samples prior and post the breaks. Copyright © 2002 John Wiley & Sons, Ltd.

272 citations
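As one concrete example of a CUSUM-type variance-break statistic, the classic Inclan and Tiao statistic is shown below. It is simpler than, and not identical to, the tests evaluated in the paper, but it conveys the idea of locating a break from the cumulative sum of squared returns:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated returns with a variance break at t = 1000 (sd jumps from 1 to 3).
n = 2000
r = np.concatenate([rng.normal(0, 1, 1000), rng.normal(0, 3, 1000)])

# Inclan-Tiao CUSUM of squares: D_k = C_k / C_n - k / n, with C_k the
# cumulative sum of squared returns; a large |D_k| signals a variance break.
C = np.cumsum(r ** 2)
k = np.arange(1, n + 1)
D = C / C[-1] - k / n
stat = np.sqrt(n / 2.0) * np.abs(D).max()
k_hat = int(np.abs(D).argmax()) + 1       # estimated break location
```

The argmax of |D| estimates the break date; iterating the procedure on sub-samples is one standard way to identify multiple breaks, as the tests in the paper do.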


Journal ArticleDOI
TL;DR: A stochastic frontier model with random coefficients is proposed to separate technical inefficiency from technological differences across firms, and free the frontier model from the restrictive assumption that all firms must share exactly the same technological possibilities.
Abstract: The paper proposes a stochastic frontier model with random coefficients to separate technical inefficiency from technological differences across firms, and free the frontier model from the restrictive assumption that all firms must share exactly the same technological possibilities. Inference procedures for the new model are developed based on Bayesian techniques, and computations are performed using Gibbs sampling with data augmentation to allow finite-sample inference for underlying parameters and latent efficiencies. An empirical example illustrates the procedure. Copyright © 2002 John Wiley & Sons, Ltd.

237 citations


Journal ArticleDOI
TL;DR: The Stochastic Volatility in Mean (SVM) model presented here is a practical alternative to the generalized autoregressive conditional heteroskedasticity (GARCH) type models that have been used so widely in empirical financial research and which have relied on simultaneous modelling of the first and second moment.
Abstract: The Stochastic Volatility (SV) models we present in this paper are a practical alternative to the Generalised Autoregressive Conditional Heteroskedasticity (GARCH) type models that have been used so widely in empirical financial research and which have relied on simultaneous modelling of the first and second moment. For certain financial time series such as stock index returns, which have been shown to display high positive first-order autocorrelations, this constitutes an improvement in terms of efficiency; see Campbell, Lo and MacKinlay (1997, Chapter 2). The volatility of daily stock index returns has been estimated with SV models but usually results have relied on extensive pre-modelling of these series, thus avoiding the problem of simultaneous estimation of the mean and variance. New estimation techniques now enable us to include explanatory variables in the mean equation and estimate their coefficients simultaneously with the parameters of the volatility process. One of the explanatory variables in our model is the variance process itself, hence its name: Stochastic Volatility in Mean (SVM).

204 citations


Journal ArticleDOI
TL;DR: In this article, a new nonlinear time series model that captures a post-recession "bounce-back" in the level of aggregate output is presented. When the model is applied to US real GDP, the authors find that the Markov-switching regimes are closely related to NBER-dated recessions and expansions.
Abstract: This paper presents a new nonlinear time series model that captures a post-recession ‘bounce-back’ in the level of aggregate output. While a number of studies have examined this type of business cycle asymmetry using recession-based dummy variables and threshold models, we relate the ‘bounce-back’ effect to an endogenously estimated unobservable Markov-switching state variable. When the model is applied to US real GDP, we find that the Markov-switching regimes are closely related to NBER-dated recessions and expansions. Also, the Markov-switching form of nonlinearity is statistically significant and the ‘bounce-back’ effect is large, implying that the permanent effects of recessions are small. Meanwhile, having accounted for the ‘bounce-back’ effect, we find little or no remaining serial correlation in the data, suggesting that our model is sufficient to capture the defining features of US business cycle dynamics. When the model is applied to other countries, we find larger permanent effects of recessions. Copyright © 2005 John Wiley & Sons, Ltd.
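The "bounce-back" mechanism can be illustrated by simulating a stylized version of the growth equation: a two-state Markov regime lowers growth during recessions, while a term in the lagged regimes temporarily raises growth once a recession ends. All parameter values below are illustrative, not estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-state Markov chain for the regime (0 = expansion, 1 = recession).
T, m = 400, 6
p_stay = (0.95, 0.80)            # P(stay in expansion), P(stay in recession)
S = np.zeros(T, dtype=int)
for t in range(1, T):
    S[t] = S[t - 1] if rng.random() < p_stay[S[t - 1]] else 1 - S[t - 1]

# Growth: a recession lowers growth while it lasts, and the lagged-regime
# ("bounce-back") term temporarily raises growth after the recession ends.
mu0, mu1, lam, sd = 0.8, -2.0, 0.15, 0.6
dy = np.empty(T)
for t in range(T):
    bounce = lam * S[max(0, t - m):t].sum()
    dy[t] = mu0 + mu1 * S[t] + bounce + rng.normal(0.0, sd)

level = np.cumsum(dy)            # implied (log) level of output
```

Because the bounce-back term offsets much of the recessionary loss in the level series, the permanent effect of a recession is small, which is the paper's central empirical finding for US data.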

Journal ArticleDOI
TL;DR: This paper showed that the realized volatilities constructed from the summation of the high-frequency intraday squared returns conditional on the lagged squared daily returns are approximately Inverse Gaussian (IG) distributed.
Abstract: This paper bridges the gap between traditional ARCH modelling and recent advances on realized volatilities. Based on a ten-year sample of five-minute returns for the ECU basket currencies versus the US dollar, we find that the realized volatilities constructed from the summation of the high-frequency intraday squared returns conditional on the lagged squared daily returns are approximately Inverse Gaussian (IG) distributed, while the distribution of the daily returns standardized by their realized volatilities is approximately normal. Moreover, the implied daily GARCH model with Normal Inverse Gaussian (NIG) errors estimated for the ECU returns results in very accurate out-of-sample predictions for the three years of actual daily Euro/US dollar exchange rates. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
Guy Laroque, Bernard Salanié
TL;DR: In this article, the authors use individual data to study how the minimum wage and the welfare system combine to affect employment in France. Using the 1997 Labour Force Survey, they decompose non-employment of married women into three components: voluntary, classical (due to the minimum wage), and a residual "other" category.
Abstract: The purpose of this paper is to use individual data to study how the minimum wage and the welfare system combine to affect employment in France. Using the 1997 Labour Force Survey, we decompose non-employment of married women into three components: voluntary, classical (due to the minimum wage) and ‘other’ (a residual category). We find that the minimum wage explains close to 15% of non-employment for these women and that the disincentive effects of some welfare policy measures may be large. Our approach also allows us to evaluate various labour and welfare policy experiments in their effects on participation and employment. Copyright © 2002 John Wiley & Sons, Ltd.


Journal ArticleDOI
TL;DR: In this paper, a new estimation method is proposed that, in addition to accounting for endogeneity, explicitly allows for the inherent non-linearity of the underlying regression structure and accommodates the likely possibility that alcohol abuse effects are heterogeneous with respect to the observed and unobserved characteristics of individuals.
Abstract: Based on data from the 1988 Alcohol Supplement of the National Health Interview Survey, Mullahy and Sindelar (1996) (M&S) find, for both men and women, that alcohol abuse results in reduced employment and increased unemployment. The estimates from which they drew these inferences were obtained via the instrumental variables (IV) method, which was implemented in order to account for the potential endogeneity of problem drinking. Though these IV estimates qualitatively supported the prior expectation that problem drinking damages individuals' labour market prospects, they were not found to be statistically significant. The present paper revisits this research and offers a new estimation method which, in addition to accounting for endogeneity, explicitly allows for the inherent non-linearity of the underlying regression structure. The new method is applied to the same data and variable specifications as those used by M&S for the male subpopulation. Consistent with their results, problem drinking is found to have a positive effect on the probability of unemployment and negative effect on the likelihood of being employed. Unlike their result, however, the latter estimate is statistically significant. An appealing feature of the new method is that it accommodates the likely possibility that alcohol abuse effects are heterogeneous with respect to the observed and unobserved characteristics of individuals in the population. To illustrate this fact, abuse effects are computed for two widely differing subgroups of the population. The large differential between the estimated effects for these two subpopulations demonstrates the potential importance of accounting for heterogeneity. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, a conceptual model is developed for a trader hedging the crack spread, and various hedge ratio estimation techniques are compared to a multivariate GARCH model that directly incorporates the time to maturity effect often found in futures markets.
Abstract: Crude oil, heating oil, and unleaded gasoline futures contracts are simultaneously analysed for their effectiveness in reducing price volatility for an energy trader. A conceptual model is developed for a trader hedging the ‘crack spread’. Various hedge ratio estimation techniques are compared to a Multivariate GARCH model that directly incorporates the time to maturity effect often found in futures markets. Modelling of the time-variation in hedge ratios via the Multivariate GARCH methodology, and thus taking into account volatility spillovers between markets is shown to result in significant reductions in uncertainty even while accounting for trading costs. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, a Bayesian framework is proposed to quantify the uncertainty about the half-life of deviations from purchasing power parity; based on the responses to a survey study, the authors propose a prior probability distribution for the half-life under the recent float intended to capture widely held views among economists.
Abstract: We propose a Bayesian framework in which the uncertainty about the half-life of deviations from purchasing power parity can be quantified. Based on the responses to a survey study, we propose a prior probability distribution for the half-life under the recent float intended to capture widely held views among economists. We derive the posterior probability distribution of the half-life under this consensus prior and confirm the presence of substantial uncertainty about the half-life. We provide for the first time a comprehensive formal evaluation of several nonnested hypotheses of economic interest, including Rogoff's (1996) claim that the half-life is contained in the range of 3 to 5 years. We find that no hypothesis receives strong support from the data. Copyright © 2002 John Wiley & Sons, Ltd.
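For intuition, under an AR(1) approximation with autoregressive root rho, the half-life h solves rho^h = 1/2, so h = ln(1/2)/ln(rho). A minimal sketch of the mapping itself (not the paper's Bayesian machinery):

```python
import numpy as np

def half_life(rho):
    """Half-life of PPP deviations for an AR(1) with autoregressive root rho."""
    return np.log(0.5) / np.log(rho)

# With annual data, Rogoff's 3-5 year consensus range corresponds to
# autoregressive roots of roughly 0.79 to 0.87:
hl_3 = half_life(0.7937)   # about 3 years
hl_5 = half_life(0.8706)   # about 5 years
```

Because the half-life is a sharply nonlinear function of rho near one, modest uncertainty about rho translates into large uncertainty about h, which is one way to see why the paper finds no hypothesis strongly supported.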

Journal ArticleDOI
TL;DR: A new approach for sample selection problems in parametric duration models is developed; applying the resulting Flexible Parametric Selection (FPS) model to hospitalization data indicates that it may be preferred even in cases in which other parametric approaches are available.
Abstract: I examine the effects of insurance status and managed care on hospitalization spells, and develop a new approach for sample selection problems in parametric duration models. MLE of the Flexible Parametric Selection (FPS) model does not require numerical integration or simulation techniques. I discuss application to the exponential, Weibull, log-logistic and gamma duration models. Applying the model to the hospitalization data indicates that the FPS model may be preferred even in cases in which other parametric approaches are available. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the structural and statistical properties of the STAR-GARCH model and the finite sample properties of its maximum likelihood estimator are investigated.
Abstract: Theoretical and practical interest in non-linear time series models, particularly regime switching models, has increased substantially in recent years. Given the abundant research activity in analysing time-varying volatility through Generalized Autoregressive Conditional Heteroscedasticity (GARCH) processes (see Engle, 1982; Bollerslev, 1986), it is important to analyse regime switching models with GARCH errors. A popular specification in this class is the (stationary) Smooth Transition Autoregressive–GARCH (STAR-GARCH) model. Little is presently known about the structure of the model, or the consistency, asymptotic normality and finite sample properties of the estimators. The paper develops the structural and statistical properties of the STAR-GARCH model, and investigates the finite sample properties of maximum likelihood estimation (MLE) of STAR and STAR-GARCH models through numerical simulation. The effects of fixing the threshold value and/or the transition rate for the STAR model, misspecification of the conditional mean and the transition function of the STAR-GARCH model, and the finite sample properties of the MLE for the STAR-GARCH model, are also examined. These numerical results are used as a guide in empirical research, with an application to Standard and Poor's Composite 500 Index returns for alternative STAR-GARCH models. Copyright © 2002 John Wiley & Sons, Ltd.
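The smooth-transition conditional mean at the heart of the STAR specification is easy to write down. A minimal sketch of the logistic (LSTAR) transition function and a one-lag, two-regime mean, with illustrative parameters only:

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """LSTAR transition function G(s; gamma, c), increasing from 0 to 1 in s."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def lstar_mean(y_lag, phi_low, phi_high, gamma, c):
    """One-lag LSTAR conditional mean: a smooth mix of two AR(1) regimes."""
    G = logistic_transition(y_lag, gamma, c)
    return (1.0 - G) * phi_low * y_lag + G * phi_high * y_lag
```

A large gamma makes the transition near-instantaneous, recovering a threshold model; fixing gamma and/or the threshold c, whose effects the paper studies, removes those parameters from the likelihood optimization.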

Journal ArticleDOI
TL;DR: In this paper, the authors examined the impact of income on the transitions between home, living independently and first marriage of young Americans and found that income has a strong and significant effect.
Abstract: The paper examines the impact of income on the transitions between home, living independently and first marriage of young Americans. A matching model is outlined, similar to that used in theories of job search, to explain the probability of marriage and living alone. A multiple-state, multiple-transition model which allows for correlated heterogeneity on the first and subsequent transitions is estimated. The results show that income has a strong and significant effect. The impact of unobserved heterogeneity is examined in detail. The impact of the young person's earnings on the transitions is explored through simulation.

Journal ArticleDOI
TL;DR: In this paper, the authors develop new tests of the capital asset pricing model that take account of and are valid under the assumption that the distribution generating returns is elliptically symmetric; this assumption is necessary and sufficient for the validity of the CAPM.
Abstract: We develop new tests of the capital asset pricing model that take account of and are valid under the assumption that the distribution generating returns is elliptically symmetric; this assumption is necessary and sufficient for the validity of the CAPM. Our test is based on semiparametric efficient estimation procedures for a seemingly unrelated regression model where the multivariate error density is elliptically symmetric, but otherwise unrestricted. The elliptical symmetry assumption allows us to avoid the curse of dimensionality problem that typically arises in multivariate semiparametric estimation procedures, because the multivariate elliptically symmetric density function can be written as a function of a scalar transformation of the observed multivariate data. The elliptically symmetric family includes a number of thick-tailed distributions and so is potentially relevant in financial applications. Our estimated betas are lower than the OLS estimates, and our parameter estimates are much less consistent with the CAPM restrictions than the corresponding OLS estimates. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, an optimal filter is proposed to transform the Conference Board Composite Leading Index (CLI) into recession probabilities in the US economy, and the predictive performance of linear models, VAR extensions of smooth transition regression and switching regimes, probit models and non-parametric models is compared.
Abstract: We propose an optimal filter to transform the Conference Board Composite Leading Index (CLI) into recession probabilities in the US economy. We also analyse the CLI's accuracy at anticipating US output growth. We compare the predictive performance of linear, VAR extensions of smooth transition regression and switching regimes, probit, non-parametric models and conclude that a combination of the switching regimes and non-parametric forecasts is the best strategy at predicting both the NBER business cycle schedule and GDP growth. This confirms the usefulness of CLI, even in a real-time analysis. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This review focuses on using R, an open-source programming environment for data analysis and graphics, for teaching econometrics; in only a decade R has grown to become a de facto standard for statistical analysis against which many popular commercial programs may be measured.
Abstract: R, an open-source programming environment for data analysis and graphics, has in only a decade grown to become a de-facto standard for statistical analysis against which many popular commercial programs may be measured. The use of R for the teaching of econometric methods is appealing. It provides cutting-edge statistical methods which are, by R’s open-source nature, available immediately. The software is stable, available at no cost, and exists for a number of platforms, including various flavours of Unix and Linux, Windows (9x/NT/2000), and the MacOS. Manuals are also available for download at no cost, and there is extensive on-line information for the novice user. This review focuses on using R for teaching econometrics. Since R is an extremely powerful environment, this review should also be of interest to researchers.

Journal ArticleDOI
TL;DR: In this article, it is argued that investor risk is not well captured by measures of volatility, and that a meta-analysis can be used to compare alternative volatility measures in terms of their forecasting utility.
Abstract: Investor risk is a complicated concept in practice and is not well captured by measures of volatility as is well understood by uncertainty theory. Rather than asking statisticians to attempt to measure risk, it may be better to listen to decision theorists, but their suggestions are not very practical. Diversification is clearly helpful in reducing risk but the risk level of one portfolio cannot be measured without knowing the risks of other major portfolios. A meta-analysis can be used to compare alternative volatility measures in terms of their forecasting utility. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the modified version of the time reversibility (TR) test of Chen, Chou and Kuan (2000) was used as a complementary diagnostic test for time series models.
Abstract: In this paper we suggest using a modified version of the time reversibility (TR) test of Chen, Chou and Kuan (2000) as a complementary diagnostic test for time series models. The modified CCK test is easy to compute and requires weaker moment conditions than existing tests. Our simulations demonstrate that this test is powerful against asymmetry in volatility but the BDS test is not. In the empirical study of US stock index returns, we first find that these returns are all time irreversible. Applying the GARCH and EGARCH models to these returns, we also find that the Q and BDS tests always accept the null hypothesis and cannot distinguish between these models. By contrast, the modified CCK test accepts only the EGARCH model with an order that can accommodate the underlying asymmetry in volatility. Our results suggest that the detected time irreversibility in these return series may be attributed to volatility asymmetry and that such asymmetry may be captured using a proper EGARCH model. Copyright © 2002 John Wiley & Sons, Ltd.
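The idea behind moment-based time-reversibility diagnostics can be sketched with the simple lag-k contrast E[x_t^2 x_{t-k}] - E[x_t x_{t-k}^2], which is zero in expectation for a time-reversible series. This is the underlying moment only, not the modified CCK statistic itself, which standardizes such differences and treats their joint behaviour across lags:

```python
import numpy as np

def tr_stat(x, k):
    """Lag-k moment contrast E[x_t^2 x_{t-k}] - E[x_t x_{t-k}^2];
    zero in expectation for a time-reversible series."""
    a, b = x[k:], x[:-k]
    return float(np.mean(a ** 2 * b) - np.mean(a * b ** 2))

rng = np.random.default_rng(6)
x = rng.standard_normal(500)      # a time-reversible (i.i.d. Gaussian) benchmark
s = tr_stat(x, 1)                 # should be close to zero here
```

Reversing the series flips the sign of the statistic exactly, which is one way to see that it measures directional (irreversible) structure such as volatility asymmetry.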

Journal ArticleDOI
TL;DR: In this article, the relationship between wealth and labour market transitions is investigated in a reduced-form model, in which the authors allow for random effects, initial conditions, and measurement error in wealth.
Abstract: We study the relationship between wealth and labour market transitions. A lifecycle model, in which individuals are faced by uncertainty about the availability of jobs, serves as a basis for a reduced-form specification for the probabilities of labour market transitions, which depend on wealth according to the model. Theory implies a negative effect of wealth on the probability of becoming or staying employed. This implication is tested for in a reduced-form model of labour market transitions, in which we allow for random effects, initial conditions, and measurement error in wealth. Elasticities of transition probabilities with respect to wealth are presented. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A special issue on modelling and forecasting financial volatility, published in 2002 to mark the Twentieth Anniversary of the publication of the first ARCH paper, is introduced; financial econometrics has become a mature discipline over the last two decades, and one of its major research objects is the modelling and forecasting of volatility.
Abstract: It is now 20 years since the publication of Engle's (1982) seminal paper, which introduced ARCH to the world. The ARCH paper had an enormous influence on both theoretical and applied econometrics, and was influential in the establishment of the discipline of Financial Econometrics. In this paper we provide an introduction to the special issue on modelling and forecasting financial volatility, which commemorates the Twentieth Anniversary of the publication of ARCH. Financial econometrics has become a mature discipline over the last two decades, and one of its major research objects is the modelling and forecasting of volatility. This special issue presents ten papers, all of which focus on volatility and risk. The papers examine issues such as the new frontiers of volatility, the selection of models for observed and unobserved volatility, the potential long-memory property of volatility, and the measurement of volatility. The commonality of papers is that they do not examine the extant literature, which has been reviewed elsewhere, but rather outline a number of important issues that are not only of current interest, but are likely to remain so for many years to come. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The performance of stimulus response and belief-based learning models is compared using data from extensive form games played in a population setting; both models fit reasonably well in common interest games with history, while stimulus response is accepted and belief-based learning rejected in games with no history and in all but one of the divergent interest games.
Abstract: This paper compares the performance of stimulus response (SR) and belief-based learning (BBL) using data from game theory experiments. The environment, extensive form games played in a population setting, is novel in the empirical literature on learning in games. Both the SR and BBL models fit the data reasonably well in common interest games with history while the test results accept SR and reject BBL in games with no history and in all but one of the divergent interest games. Estimation is challenging since the likelihood function is not globally concave and the results may be subject to convergence bias.


Journal ArticleDOI
TL;DR: In this article, the authors show that failure to represent stochastic trend and stochastic seasonality in an AIDS model leads to a misspecified and possibly structurally unstable model.
Abstract: The argument that is put forward in this paper is that failure to represent stochastic trend and stochastic seasonality in an AIDS model leads to a misspecified and possibly structurally unstable model. This proposition is verified by estimating an AIDS model of the demand for alcoholic beverages in the United Kingdom. Three versions of the model are estimated, and it is demonstrated that the version allowing for stochastic trend and stochastic seasonality performs better than the other two versions of the model in terms of the diagnostic tests and goodness of fit measures. The best estimated model turns out to possess the properties of having common components and being homogeneous. Further empirical testing reveals the presence of stochastic trends and cointegration between the budget shares of beer and wine. The results clearly indicate that there has been a shift away from the consumption of beer towards wine. Copyright © 2002 John Wiley & Sons, Ltd.