
Showing papers in "Journal of Applied Econometrics in 2008"


Journal ArticleDOI
TL;DR: In this article, a matching estimator applied to data for two Italian regions shows that temporary work agency (TWA) assignments can be a springboard to permanent employment, and a simulation-based sensitivity analysis is proposed to assess the robustness of such results to failures of the conditional independence assumption.
Abstract: The diffusion of temporary work agency (TWA) jobs has led to a harsh policy debate and ambiguous empirical evidence. Results for the USA, based on quasi-experimental evidence, suggest that a TWA assignment decreases the probability of finding a stable job, while results for Europe, based on the conditional independence assumption (CIA), typically reach opposite conclusions. Using data for two Italian regions, we rely on a matching estimator to show that TWA assignments can be an effective springboard to permanent employment. We also propose a simulation-based sensitivity analysis, which highlights that only for one of these two regions are our results robust to specific failures of the CIA. We conclude that European studies based on the CIA should not be automatically discarded, but should be put under the scrutiny of a sensitivity analysis like the one we propose. Copyright © 2008 John Wiley & Sons, Ltd.

460 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyzed the effects of public R&D subsidies on R&D expenditure in the German manufacturing sector and found that public funding increases firms' R&D expenditure.
Abstract: This paper analyzes the effects of public R&D subsidies on R&D expenditure in the German manufacturing sector. The focus is on the question whether public R&D funding stimulates or crowds out private investment. Cross-sectional data at the firm level are used. By applying parametric and semiparametric selection models, it turns out that public funding increases firms' R&D expenditure, although the magnitude of the treatment effect depends on the assumptions imposed by the particular selection model.

292 citations


Journal ArticleDOI
TL;DR: This article takes a fresh look at Africa's growth experience using the Bayesian model averaging (BMA) methodology, which makes it possible to consider a large number of potential explanatory variables and sort out which of them can effectively explain the African growth experience.
Abstract: This paper takes a fresh look at Africa's growth experience by using the Bayesian model averaging (BMA) methodology. BMA enables us to consider a large number of potential explanatory variables and sort out which of these variables can effectively explain Africa's growth experience. Posterior coefficient estimates reveal that key engines of growth in Africa are substantially different from those in the rest of the world. More precisely, it is shown that mining, primary exports and initial primary education exerted a differential effect on African growth. These results are examined in relation to the existing literature. Copyright © 2008 John Wiley & Sons, Ltd.
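The core of BMA can be sketched with a BIC approximation to the marginal likelihood, averaging over all subsets of candidate regressors under equal model priors. This is a generic illustration on simulated data, not the authors' prior structure, and the regressor names are hypothetical stand-ins.

```python
# Minimal BIC-based Bayesian model averaging sketch (equal model priors).
from itertools import combinations

import numpy as np

def bma_inclusion_probs(y, X, names):
    """Approximate posterior inclusion probabilities via BIC model weights."""
    n, k = X.shape
    bics, subsets = [], []
    for r in range(k + 1):
        for subset in combinations(range(k), r):
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            bics.append(n * np.log(resid @ resid / n) + Z.shape[1] * np.log(n))
            subsets.append(set(subset))
    bics = np.array(bics)
    w = np.exp(-0.5 * (bics - bics.min()))   # BIC weights, numerically stabilized
    w /= w.sum()
    return {names[j]: float(sum(wi for wi, s in zip(w, subsets) if j in s))
            for j in range(k)}

rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 3))
y = 1.0 + 2.0 * X[:, 0] + rng.standard_normal(n)  # only the first regressor matters
probs = bma_inclusion_probs(y, X, ["mining", "primary_exports", "education"])
```

The relevant regressor receives an inclusion probability near one, while the irrelevant ones are penalized by the BIC's complexity term.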

259 citations


Journal ArticleDOI
TL;DR: This article investigates the empirical relevance of structural breaks for GARCH models of exchange rate volatility using both in-sample and out-of-sample tests, finding significant evidence of structural breaks in the unconditional variance of seven of eight US dollar exchange rate return series over the 1980–2005 period, which implies unstable GARCH processes for these exchange rates, with parameter estimates often varying substantially across the subsamples defined by the breaks.
Abstract: We investigate the empirical relevance of structural breaks for GARCH models of exchange rate volatility using both in-sample and out-of-sample tests. We find significant evidence of structural breaks in the unconditional variance of seven of eight US dollar exchange rate return series over the 1980–2005 period—implying unstable GARCH processes for these exchange rates—and GARCH(1,1) parameter estimates often vary substantially across the subsamples defined by the structural breaks. We also find that it almost always pays to allow for structural breaks when forecasting exchange rate return volatility in real time. Combining forecasts from different models that accommodate structural breaks in volatility in various ways appears to offer a reliable method for improving volatility forecast accuracy given the uncertainty surrounding the timing and size of the structural breaks. Copyright © 2008 John Wiley & Sons, Ltd.
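A common first step in this kind of analysis is locating breaks in the unconditional variance, for example with an Inclán–Tiao-style centered CUSUM-of-squares statistic; GARCH(1,1) models are then re-estimated on each subsample. The sketch below uses simulated returns, not the paper's exchange rate series.

```python
# Locate a single variance break via the centered CUSUM-of-squares statistic.
import numpy as np

def variance_break(returns):
    """Return the candidate break date and a normalized test statistic."""
    r2 = returns ** 2
    n = len(r2)
    ck = np.cumsum(r2)
    dk = ck / ck[-1] - np.arange(1, n + 1) / n   # centered CUSUM of squares
    khat = int(np.argmax(np.abs(dk)))            # most likely break date
    return khat, float(np.abs(dk[khat]) * np.sqrt(n / 2))

rng = np.random.default_rng(1)
# volatility doubles halfway through the simulated sample
r = np.concatenate([rng.normal(0, 1.0, 500), rng.normal(0, 2.0, 500)])
khat, stat = variance_break(r)
```

A statistic above the asymptotic 5% critical value (about 1.36) signals a break near the estimated date.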

208 citations


Journal ArticleDOI
TL;DR: In this article, the authors explore some distributional properties of aggregate output growth-rate time series and show that, in the majority of OECD countries, output growth rate distributions are well-approximated by symmetric exponential power densities with tails much fatter than those of a Gaussian.
Abstract: This work explores some distributional properties of aggregate output growth-rate time series. We show that, in the majority of OECD countries, output growth-rate distributions are well-approximated by symmetric exponential-power densities with tails much fatter than those of a Gaussian. Fat tails robustly emerge in output growth rates independently of: (i) the way we measure aggregate output; (ii) the family of densities employed in the estimation; (iii) the length of time lags used to compute growth rates. We also show that fat tails still characterize output growth-rate distributions even after one washes away outliers, autocorrelation and heteroscedasticity.
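The symmetric exponential-power family is available in scipy as `gennorm`: shape parameter beta = 2 recovers the Gaussian, while beta < 2 gives fatter tails. A sketch of the fitting step, on a simulated fat-tailed stand-in series rather than actual OECD output data:

```python
# Fit the symmetric exponential-power (generalized normal) density by MLE.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Laplace draws are exponential-power with beta = 1: a fat-tailed stand-in
growth = rng.laplace(loc=0.0, scale=1.0, size=5000)

beta, loc, scale = stats.gennorm.fit(growth)   # beta well below 2 => fat tails
```

A fitted beta near 1 (rather than 2) is the fat-tails signature the paper documents for output growth rates.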

201 citations


Journal ArticleDOI
TL;DR: This article investigates the effect of survey mode on respondent learning and fatigue during repeated choice experiments, finding that mail respondents answer questions consistently throughout the series of choice tasks, while the quality of the online respondents' answers declines.
Abstract: SUMMARY This study investigates the effect of survey mode on respondent learning and fatigue during repeated choice experiments. Stated preference data are obtained from an experiment concerning high-speed Internet service conducted on samples of mail respondents and online respondents. We identify and estimate aspects of the error components for different subsets of the choice questions, for both mail and online respondents. Results show mail respondents answer questions consistently throughout a series of choice experiments, but the quality of the online respondents' answers declines. Therefore, while the online survey provides lower survey administration costs and reduced time between implementation and data analysis, such benefits come at the cost of less precise responses. Copyright © 2008 John Wiley & Sons, Ltd. Stated preference (SP) data are used extensively by economists, marketers, and policy makers to estimate individuals' willingness to pay for multidimensional goods not traded in markets. SP data are often obtained from respondent choices in experiments administered through informal 'pencil and paper' surveys mailed to sample populations. When designing choice experiments, researchers pay careful attention to the number of alternatives the respondent can choose from, the number of attributes used to describe alternatives, appropriate wording of attribute descriptions, and the number of choice scenarios (or question replications) per respondent. Given the relatively high cost of developing and administering a statistically appropriate mail survey instrument, researchers often trade off aspects of choice task complexity with sample size and survey response quality. For instance, a relatively small sample of respondents may be asked to answer repeated choice questions to simultaneously reduce data collection costs and increase the number of observations available for estimation of marginal utilities.
If respondents learn about their preferences and become more proficient at the choice task as they move through more question occasions, the quality of the data improves. Alternatively, multiple question occasions may induce fatigue or boredom. If respondents become tired or bored as they move through the repeated choice questions, the quality of the data deteriorates. Given the tradeoffs described above it is not surprising that recent rapid growth in US Internet penetration has corresponded with greater interest in online or 'web' surveys administered through the Internet and other networks. Compared to traditional telephone and mail survey modes, online surveys have low marginal costs of providing completed surveys and reduced time between survey implementation and data analysis.

176 citations


Journal ArticleDOI
TL;DR: This paper proposed a new dynamic copula model where the parameter characterizing dependence follows an autoregressive process, which can be viewed as a generalization of multivariate stochastic volatility models.
Abstract: We propose a new dynamic copula model where the parameter characterizing dependence follows an autoregressive process. As this model class includes the Gaussian copula with stochastic correlation process, it can be viewed as a generalization of multivariate stochastic volatility models. Despite the complexity of the model, the decoupling of marginals and dependence parameters facilitates estimation. We propose estimation in two steps, where first the parameters of the marginal distributions are estimated, and then those of the copula. Parameters of the latent processes (volatilities and dependence) are estimated using efficient importance sampling (EIS). We discuss goodness-of-fit tests and ways to forecast the dependence parameter. For two bivariate stock index series, we show that the proposed model outperforms standard competing models.
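The model class can be illustrated by simulating a bivariate Gaussian copula whose correlation is driven by a latent AR(1) process, mapped into (-1, 1) via tanh. Parameter values below are illustrative, not estimates from the paper.

```python
# Simulate a Gaussian copula with stochastic (AR(1)-driven) correlation.
import numpy as np

def simulate_dynamic_copula(T, phi=0.95, sigma=0.15, seed=3):
    """Return standard-normal margin pairs and the latent correlation path."""
    rng = np.random.default_rng(seed)
    x = 0.0
    z = np.empty((T, 2))
    rho_path = np.empty(T)
    for t in range(T):
        x = phi * x + sigma * rng.standard_normal()  # latent AR(1) state
        rho = np.tanh(x)                             # correlation in (-1, 1)
        rho_path[t] = rho
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal()
        z[t] = (z1, z2)
    return z, rho_path

z, rho_path = simulate_dynamic_copula(1000)
```

In estimation the latent path would be integrated out (the paper uses efficient importance sampling); this sketch only shows the data-generating side.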

144 citations


Journal ArticleDOI
TL;DR: In this article, an alternative approach to identify the wage effects of private-sector training is proposed, which is to narrow down the comparison group by only taking into consideration the workers who wanted to participate in training but did not do so because of some random event.
Abstract: SUMMARY This paper follows an alternative approach to identify the wage effects of private-sector training. The idea is to narrow down the comparison group by only taking into consideration the workers who wanted to participate in training but did not do so because of some random event. This makes the comparison group increasingly similar to the group of participants in terms of observed individual characteristics and the characteristics of (planned) training events. At the same time, the point estimate of the average return to training consistently drops from a large and significant return to a point estimate close to zero. Copyright © 2008 John Wiley & Sons, Ltd.

137 citations


Journal ArticleDOI
TL;DR: In this article, the authors estimate gravity models in levels and logs using two data sets via nonparametric methods and show that parametric models based on these assumptions offer equally or more reliable in-sample predictions and out-of-sample forecasts in the majority of cases, particularly in the levels model.
Abstract: Despite the solid theoretical foundation on which the gravity model of bilateral trade is based, empirical implementation requires several assumptions which do not follow directly from the underlying theory. First, unobserved trade costs are assumed to be a (log-)linear function of observables. Second, the effects of trade costs on trade flows are assumed to be constant across country pairs. Maintaining consistency with the underlying theory, but relaxing these assumptions, we estimate gravity models—in levels and logs—using two data sets via nonparametric methods. The results are striking. Despite the added flexibility of the nonparametric models, parametric models based on these assumptions offer equally or more reliable in-sample predictions and out-of-sample forecasts in the majority of cases, particularly in the levels model. Moreover, formal statistical tests fail to reject either parametric functional form. Thus, concerns in the gravity literature over functional form appear unwarranted, and estimation of the gravity model in levels is recommended. Copyright © 2008 John Wiley & Sons, Ltd.

132 citations


Journal ArticleDOI
TL;DR: In this article, the authors employ analytic methods to understand the economics of the NKPC identification problem in the canonical three-equation, new Keynesian model and revisit the empirical evidence for the USA, UK, and Canada by constructing tests and confidence intervals based on the Anderson and Rubin (1949) statistic, which is robust to weak identification.
Abstract: Phillips curves are central to discussions of inflation dynamics and monetary policy. The hybrid new Keynesian Phillips curve (NKPC) describes how past inflation, expected future inflation, and a measure of real aggregate demand drive the current inflation rate. This paper studies the (potential) weak identification of the NKPC under Generalized Method of Moments and traces this syndrome to a lack of higher-order dynamics in exogenous variables. We employ analytic methods to understand the economics of the NKPC identification problem in the canonical three-equation, new Keynesian model. We revisit the empirical evidence for the USA, the UK, and Canada by constructing tests and confidence intervals based on the Anderson and Rubin (1949) statistic, which is robust to weak identification. We also apply the Guggenberger and Smith (2008) LM test to the underlying NKPC pricing parameters. Both tests yield little evidence of forward-looking inflation dynamics. Copyright © 2008 John Wiley & Sons, Ltd.

120 citations


Journal ArticleDOI
TL;DR: In this article, a Bayesian hierarchical model is developed that specifies region-specific latent effects parameters modeled using a connectivity structure between regions that can reflect geographical proximity in conjunction with technological and other types of proximity.
Abstract: This study investigates the pattern of knowledge spillovers arising from patent activity between European regions. A Bayesian hierarchical model is developed that specifies region-specific latent effects parameters modeled using a connectivity structure between regions that can reflect geographical proximity in conjunction with technological and other types of proximity. This approach exploits the fact that interregional relationships may exhibit industry-specific technological linkages or transportation network linkages, which is in contrast to traditional studies relying exclusively on geographical proximity. We also allow for both symmetric and asymmetric knowledge spillovers between regions, and for heterogeneity across the regional sample. A series of formal Bayesian model comparisons provides support for a model based on technological proximity combined with spatial proximity, asymmetric knowledge spillovers, and heterogeneity in the disturbances. Estimates of region-specific latent effects parameters structured in this fashion are produced by the model and used to draw inferences regarding the character of knowledge spillovers across the regions. The method is illustrated using sample data on patent activity covering 323 regions in nine European countries. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors take advantage of bidding data from two auction designs to identify nonparametrically the bidders' utility function within a private value framework, which leads to a nonparametric estimator.
Abstract: Estimating bidders’ risk aversion in auctions is a challenging problem because of identification issues. This paper takes advantage of bidding data from two auction designs to identify nonparametrically the bidders’ utility function within a private value framework. In particular, ascending auction data allow us to recover the latent distribution of private values, while first-price sealed-bid auction data allow us to recover the bidders’ utility function. This leads to a nonparametric estimator. An application to the US Forest Service timber auctions is proposed. Estimated utility functions display concavity, which can be partly captured by constant relative risk aversion.

Journal ArticleDOI
TL;DR: The authors applied extreme value analysis to US sectoral stock indices in order to assess whether tail risk measures like value-at-risk and extremal linkages were significantly altered by 9/11.
Abstract: We apply extreme value analysis to US sectoral stock indices in order to assess whether tail risk measures like value-at-risk and extremal linkages were significantly altered by 9/11. We test whether semi-parametric quantile estimates of ‘downside risk’ and ‘upward potential’ have increased after 9/11. The same methodology allows one to estimate probabilities of joint booms and busts for pairs of sectoral indices or for a sectoral index and a market portfolio. The latter probabilities measure the sectoral response to macro shocks during periods of financial stress (so-called ‘tail-βs’). Taking 9/11 as the sample midpoint we find that tail-βs often increase in a statistically and economically significant way. This might be due to perceived risk of new terrorist attacks. Copyright © 2008 John Wiley & Sons, Ltd.
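Semi-parametric tail quantile estimates of the kind used above typically start from an estimate of the tail index. The classic Hill estimator is sketched below on simulated Pareto losses, not the sectoral index data.

```python
# Hill estimator of the tail index from the k largest order statistics.
import numpy as np

def hill_tail_index(x, k):
    """Hill estimator: reciprocal mean of log-spacings above the k-th largest value."""
    order = np.sort(x)[::-1]                      # descending order statistics
    return k / np.sum(np.log(order[:k] / order[k]))

rng = np.random.default_rng(4)
alpha_true = 3.0
losses = rng.pareto(alpha_true, 20000) + 1.0      # classical Pareto on [1, inf)
alpha_hat = float(hill_tail_index(losses, 500))
```

The estimated index feeds into extreme quantile (value-at-risk) formulas; comparing pre- and post-9/11 estimates is the kind of test the paper performs.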

Journal ArticleDOI
TL;DR: The authors generalize the specifications used in previous studies of the effect of body mass index (BMI) on earnings by allowing the potentially endogenous BMI variable to enter the log wage equation nonparametrically.
Abstract: We generalize the specifications used in previous studies of the effect of body mass index (BMI) on earnings by allowing the potentially endogenous BMI variable to enter the log wage equation nonparametrically. We introduce a Bayesian posterior simulator for fitting our model that permits a nonparametric treatment of the endogenous BMI variable, flexibly accommodates skew in the BMI distribution, and whose implementation requires only Gibbs steps. Using data from the 1970 British Cohort Study, our results indicate the presence of nonlinearities in the relationships between BMI and log wages that differ across men and women, and also suggest the importance of unobserved confounding for our sample of males. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors provide a general methodology for forecasting in the presence of structural breaks induced by unpredictable changes to model parameters, using Bayesian methods of learning and model comparison to derive a predictive density that takes into account the possibility that a break will occur before the next observation.
Abstract: We provide a general methodology for forecasting in the presence of structural breaks induced by unpredictable changes to model parameters. Bayesian methods of learning and model comparison are used to derive a predictive density that takes into account the possibility that a break will occur before the next observation. Estimates for the posterior distribution of the most recent break are generated as a by-product of our procedure. We discuss the importance of using priors that accurately reflect the econometrician's opinions as to what constitutes a plausible forecast. Several applications to macroeconomic time-series data demonstrate the usefulness of our procedure. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors apply recent results from the statistics literature to test for multimodality of worldwide distributions of several (unweighted and population-weighted) measures of labor productivity.
Abstract: We apply recent results from the statistics literature to test for multimodality of worldwide distributions of several (unweighted and population-weighted) measures of labor productivity. Specifically, we employ Silverman (Bump) and Dip modality tests, calibrated to correct for their incorrect asymptotic levels. We show that test results are sensitive to the test statistic employed and to population weighting. But regardless of the statistical criterion used, multimodality is present throughout, or emerges during, our sample period (1960-2000). We also examine (a) movements of economies between modal clusters and (b) relationships between certain key development factors and multimodality of the productivity distribution.
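Silverman-type bump tests rest on counting modes of a kernel density estimate as the bandwidth varies: the test statistic is the smallest bandwidth at which the KDE has at most k modes. A sketch of the mode-counting step, on simulated bimodal data rather than the actual productivity sample:

```python
# Count KDE modes on a grid, the building block of Silverman's bump test.
import numpy as np
from scipy.stats import gaussian_kde

def count_modes(data, bandwidth, grid_size=512):
    """Count local maxima of a Gaussian KDE with the given kernel bandwidth."""
    # scipy's scalar bw_method is a factor multiplying the data std
    kde = gaussian_kde(data, bw_method=bandwidth / data.std(ddof=1))
    grid = np.linspace(data.min() - 1.0, data.max() + 1.0, grid_size)
    dens = kde(grid)
    peaks = (dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])
    return int(peaks.sum())

rng = np.random.default_rng(5)
# stand-in for a bimodal cross-country productivity distribution
prod = np.concatenate([rng.normal(-2.0, 0.5, 400), rng.normal(2.0, 0.5, 400)])
```

A small bandwidth reveals both modes, while a large one smooths the estimate to a single mode; the critical bandwidth between the two regimes is what Silverman's test calibrates.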

Journal ArticleDOI
TL;DR: In this article, the authors define a bivariate mixture model to test whether economic growth can be considered exogenous in the Solovian sense, and apply it to the Bernanke and Gurkaynak extension of the Solow model.
Abstract: We define a bivariate mixture model to test whether economic growth can be considered exogenous in the Solovian sense. For this purpose, the multivariate mixture approach proposed by Alfo and Trovato is applied to the Bernanke and Gurkaynak extension of the Solow model. We find that the explanatory power of the Solow growth model is enhanced, since growth rates are not statistically significantly associated with investment rates, when cross-country heterogeneity is considered. Moreover, no sign of convergence to a single equilibrium is found. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, logistic smooth transition and Markov switching autoregressive models of the monthly US unemployment rate are estimated by Markov chain Monte Carlo methods; although both models provide very similar descriptions, Bayes factors and predictive efficiency tests (both Bayesian and classical) favor the smooth transition model.
Abstract: Logistic smooth transition and Markov switching autoregressive models of a logistic transform of the monthly US unemployment rate are estimated by Markov chain Monte Carlo methods. The Markov switching model is identified by constraining the first autoregression coefficient to differ across regimes. The transition variable in the LSTAR model is the lagged seasonal difference of the unemployment rate. Out-of-sample forecasts are obtained from Bayesian predictive densities. Although both models provide very similar descriptions, Bayes factors and predictive efficiency tests (both Bayesian and classical) favor the smooth transition model. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This work compares the linearized AIDS model not only with the Rotterdam model, but also with the full nonlinear AIDS.
Abstract: The Rotterdam model and the Almost Ideal Demand System (AIDS) are often applied in consumer demand systems modeling. Using Monte Carlo techniques, we determine which model performs best in recovering the true elasticities of demand. The AIDS model is usually used in linearized form. Since the Rotterdam model is also linear in a very similar form, comparison of the Rotterdam model and the AIDS model has been the subject of much speculation in the literature. We not only compare the linearized AIDS model with the Rotterdam model, but also with the full nonlinear AIDS. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the authors investigate what can be learned about the prevalence of work disability under various assumptions on the reporting error process and provide tight inferences under their strongest assumptions but then find that identification deteriorates rapidly as the assumptions are relaxed.
Abstract: In light of widespread concerns about the reliability of self-reported disability, we investigate what can be learned about the prevalence of work disability under various assumptions on the reporting error process. Developing a nonparametric bounding framework, we provide tight inferences under our strongest assumptions but then find that identification deteriorates rapidly as the assumptions are relaxed. For example, we find that inferences are highly sensitive to how one models potential inconsistencies between subjective self-assessments of work limitation and more objective measures of functional limitation. These two indicators appear to measure markedly different aspects of health status. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A straightforward algorithm based on sequential Gaussian quadrature is suggested, which performs very well both in the empirical application and a Monte Carlo study for ordered logit and binary probit models with an AR(1) error component.
Abstract: This paper discusses the estimation of a class of nonlinear state space models including nonlinear panel data models with autoregressive error components. A health economics example illustrates the usefulness of such models. For the approximation of the likelihood function, nonlinear filtering algorithms developed in the time-series literature are considered. Because of the relatively simple structure of these models, a straightforward algorithm based on sequential Gaussian quadrature is suggested. It performs very well both in the empirical application and a Monte Carlo study for ordered logit and binary probit models with an AR(1) error component. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: An application of these semiparametric models to rural districts indicates that pollution resulting from intensive livestock farming has a significant nonlinear impact on house prices.
Abstract: In the area of environmental analysis using hedonic price models, we investigate the performance of various nonparametric and semiparametric specifications. The proposed model specifications are made up of two parts: a linear component for house characteristics and a non-(semi)parametric component representing the nonlinear influence of environmental indicators on house prices. We adopt a general-to-specific search procedure, based on recent specification tests comparing the proposed specifications with a fully nonparametric benchmark model, to select the best model specification. An application of these semiparametric models to rural districts indicates that pollution resulting from intensive livestock farming has a significant nonlinear impact on house prices. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the efficiency externalities of trade and various forms of foreign investment are measured for a sample of 20 OECD countries between 1982 and 2000 using a stochastic frontier approach.
Abstract: The literature on the spillover effects of trade and inflows of foreign direct investment (FDI) has concentrated on technological externalities. Little effort has been directed towards identifying their efficiency externalities. This paper measures the efficiency externalities of trade and various forms of foreign investment for a sample of 20 OECD countries between 1982 and 2000 using a stochastic frontier approach. Trade and all foreign investment inflows are found to enhance efficiency, whereas outflows of FDI are found to exacerbate inefficiency. The efficiency externalities from foreign investment are contingent on the absorptive capacity of the host economies. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: It is shown that a sequence of CSEs approximates an equilibrium under standard conditions, and the flexibility of the CSE approximation is illustrated with a series of auction examples, including a complex multi-unit auction.
Abstract: We define a new concept of constrained strategic equilibrium (CSE) for Bayesian games. We show that a sequence of CSEs approximates an equilibrium under standard conditions. We also provide an algorithm to implement the CSE approximation method numerically in a broad class of Bayesian games, including games without analytically tractable solutions. Finally, we illustrate the flexibility of the CSE approximation with a series of auction examples, including a complex multi-unit auction. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, the authors use counterfactual experiments to investigate the sources of the large volatility reduction in US real GDP growth in the 1980s, and find strong statistical support for the idea that a change in the size of structural shocks alone, with no corresponding change in propagation of these shocks, would have produced the same overall volatility reduction as what actually occurred.
Abstract: We use counterfactual experiments to investigate the sources of the large volatility reduction in US real GDP growth in the 1980s. Contrary to an existing literature that conducts counterfactual experiments based on classical estimation and point estimates, we consider Bayesian analysis that provides a straightforward measure of estimation uncertainty for the counterfactual quantity of interest. Using Blanchard and Quah's (1989) structural VAR model of output growth and the unemployment rate, we find strong statistical support for the idea that a counterfactual change in the size of structural shocks alone, with no corresponding change in the propagation of these shocks, would have produced the same overall volatility reduction as what actually occurred. Looking deeper, we find evidence that a counterfactual change in the size of aggregate supply shocks alone would have generated a larger volatility reduction than a counterfactual change in the size of aggregate demand shocks alone. We show that these results are consistent with a standard monetary VAR, for which counterfactual analysis also suggests the importance of shocks in generating the volatility reduction, but with the counterfactual change in monetary shocks alone generating a small reduction in volatility. Copyright © 2007 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This article shows that average wage gap decompositions between any two groups of workers can be carried out using nonparametric wage structures, and proposes an algorithm to correct for sample selection in nonparametric models known as tree structures.
Abstract: This paper shows that average wage gap decompositions between any two groups of workers can be carried out using nonparametric wage structures. It also proposes an algorithm to correct for sample selection in nonparametric models known as tree structures. This paper studies the wage gap between third-generation Mexican American and non-Hispanic white workers in the southwest. It is shown that the decomposition heavily depends on functional assumptions, and that different approaches to flexibility may render sufficiently good and similar results. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the authors investigate using the method of ordinary least squares (OLS) on auction data and find that for parameterizations of the valuation distribution that are common in empirical practice, an adaptation of OLS provides unbiased estimators of structural parameters.
Abstract: I investigate using the method of ordinary least squares (OLS) on auction data. I find that for parameterizations of the valuation distribution that are common in empirical practice, an adaptation of OLS provides unbiased estimators of structural parameters. Under symmetric independent private values, adapted OLS is a specialization of the method of moments strategy of Laffont, Ossard and Vuong (1995). In contrast to their estimator, here simulation is not required, leading to a computationally simpler procedure. The paper also discusses using estimation results for inference on the shape of the valuation distribution, and applicability outside the symmetric independent private values framework. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, a utility-consistent static labor supply model with flexible preferences and a nonlinear and possibly non-convex budget set is considered and stochastic error terms are introduced to represent optimization and reporting errors.
Abstract: We consider a utility-consistent static labor supply model with flexible preferences and a nonlinear and possibly non-convex budget set. Stochastic error terms are introduced to represent optimization and reporting errors, stochastic preferences, and heterogeneity in wages. Coherency conditions on the parameters and the support of error distributions are imposed for all observations. The complexity of the model makes it impossible to write down the probability of participation. Hence we use simulation techniques in the estimation. We compare our approach with various simpler alternatives proposed in the literature. Both in Monte Carlo experiments and for real data the various estimation methods yield very different results. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, semiparametric Bayesian methods are developed for dynamic Tobit panel data models, requiring that the conditional mean dependence of the unobserved heterogeneity on the initial conditions and the strictly exogenous variables be specified; the method is applied to female labor supply using a panel data set from the National Longitudinal Survey of Youth 1979.
Abstract: SUMMARY This paper develops semiparametric Bayesian methods for inference of dynamic Tobit panel data models. Our approach requires that the conditional mean dependence of the unobserved heterogeneity on the initial conditions and the strictly exogenous variables be specified. Important quantities of economic interest such as the average partial effect and average transition probabilities can be readily obtained as a by-product of the Markov chain Monte Carlo run. We apply our method to study female labor supply using a panel data set from the National Longitudinal Survey of Youth 1979. Copyright © 2008 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the pitfalls of the conventional heteroskedasticity and autocorrelation robust (HAR) Wald test and the advantages of new HAR tests developed by Kiefer and Vogelsang in 2005 and by Phillips, Sun and Jin in 2003 and 2006 are discussed.
Abstract: This paper illustrates the pitfalls of the conventional heteroskedasticity and autocorrelation robust (HAR) Wald test and the advantages of new HAR tests developed by Kiefer and Vogelsang in 2005 and by Phillips, Sun and Jin in 2003 and 2006. The illustrations use the 1993 Fama–French three-factor model. The null that the intercepts are zero is tested for 5-year, 10-year and longer sub-periods. The conventional HAR test with asymptotic P-values rejects the null for most 5-year and 10-year sub-periods. By contrast, the null is not rejected by the new HAR tests. This conflict is explained by showing that inferences based on the conventional HAR test are misleading for the sample sizes used in this application. Copyright © 2007 John Wiley & Sons, Ltd.