
Showing papers in "Econometric Reviews in 2014"


Posted Content
TL;DR: The authors investigated how the recent boom in the US natural gas industry has affected local economies in the central United States and found a modest positive impact on local labor market outcomes in counties where natural gas production has increased, and little evidence of a resource curse.
Abstract: The extraction of natural gas from shale and tight gas formations is one of the largest innovations in the US energy sector in several decades. According to the Energy Information Agency's (EIA) 2013 Annual Energy Outlook, total US recoverable natural gas resources were estimated to be 2,327 trillion cubic feet, up from 1,259 trillion cubic feet in 2000. Using projected annual growth in US natural gas consumption, current US reserves of natural gas represent an estimated 70 years' worth of supply. This energy boom has reversed a long downward trend in US natural gas production. In the 1970s the US energy sector seemingly conceded its decline and began investing in global markets to survive. That trend reversed course in the mid-2000s.

A key question is whether this now abundant and accessible natural resource has positive effects on local economic conditions. Some theories suggest resource abundance may increase local economic development through higher demand for labor in the energy sector and spillover spending in the local economy. Other theories, though, suggest industries not closely related to the resource extraction industry may be harmed as energy production expands. For example, labor demand by the extraction industry may be high enough to bid up local wages, which in turn could pull employees from other lower-paying jobs and make it difficult for other industries to survive. At the national and international level, this phenomenon has been referred to as the "natural resource curse," but the topic has received much less attention at the local level.

This article investigates how the recent boom in the US natural gas industry has affected local economies in the central United States. Labor market conditions at the county level in a nine-state region are analyzed using econometric models to determine how employment and wages have responded to the rapid expansion of natural gas production from 2001 to 2011. The article finds a modest positive impact on local labor market outcomes in counties where natural gas production has increased, and little evidence of a natural resource curse.

Section I discusses factors leading to the shale boom in the natural gas industry and potential opportunities for the US economy. Section II highlights factors that can lead to a natural resource curse or "blessing" and how those factors look different at the local and national levels. Section III describes the study region and discusses the empirical findings and evidence of a resource curse.

I. SHALE BOOM: A TALE OF TWO TECHNOLOGIES

Technologies pursued initially by two independent energy companies that were eventually brought together have forever changed the oil and gas industry. Production and proven reserves of natural gas have increased significantly since the mid-2000s. This increase has opened new possibilities for the US economy.

Hydraulic fracturing and horizontal drilling

In the early 1980s, Mitchell Energy & Development Corporation, led by George P. Mitchell, drilled the first well in the Barnett Shale field in western Texas. Instead of encountering the typical, highly porous rock of conventional formations, Mitchell Energy encountered shale. Shale has the potential to hold vast amounts of natural gas; however, it is highly nonporous, which causes the gas to be trapped in the rock. Mitchell Energy experimented over 20 years with different techniques, and found that by using hydraulic fracturing (commonly referred to as "fracking") it was able to break apart the rock to free natural gas. Fracking consists of shooting a mixture of water, chemicals, and sand into wells to create fissures in rock formations that free the trapped gas.

Over the same period, Devon Energy Corporation of Oklahoma City had been developing horizontal drilling techniques. Advances in controls and measurement allowed operators to drill to a certain depth, then drill further at an angle or even sideways, exposing more of the reservoir and allowing much greater recovery. …

100 citations


Journal ArticleDOI
TL;DR: In this paper, Dynamic Stochastic General Equilibrium (DSGE) models are estimated under a multivariate Student-t distribution for the structural shocks, and a technique for estimating the marginal likelihood of the resulting DSGE Student-t model is provided.
Abstract: This paper deals with Dynamic Stochastic General Equilibrium (DSGE) models under a multivariate student-t distribution for the structural shocks. Based on the solution algorithm of Klein (2000) and the gamma-normal representation of the t-distribution, the TaRB-MH algorithm of Chib and Ramamurthy (2010) is used to estimate the model. A technique for estimating the marginal likelihood of the DSGE student-t model is also provided. The methodologies are illustrated first with simulated data and then with the DSGE model of Ireland (2004) where the results support the t-error model in relation to the Gaussian model.
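As an illustration of the gamma-normal (scale-mixture) representation the abstract refers to: if w ~ Gamma(ν/2, rate ν/2) and z ~ N(0, Σ), then z/√w is multivariate Student-t with ν degrees of freedom. The following is a minimal sketch of drawing such shocks; the values of ν and Σ are illustrative assumptions, not taken from the paper, and the estimation machinery (TaRB-MH, marginal likelihood) is not reproduced here.

```python
# Sketch: drawing multivariate Student-t structural shocks via the
# gamma-normal (scale mixture) representation. Parameter values are assumed.
import numpy as np

rng = np.random.default_rng(0)

nu = 5.0                                   # degrees of freedom (assumed)
Sigma = np.array([[1.0, 0.3],              # shock scale matrix (assumed)
                  [0.3, 0.5]])

def draw_t_shocks(T, nu, Sigma, rng):
    """Draw T multivariate-t shocks: eps_t = z_t / sqrt(w_t)."""
    k = Sigma.shape[0]
    # Mixing variable: w_t ~ Gamma(shape=nu/2, rate=nu/2), i.e. scale = 2/nu.
    w = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=T)
    z = rng.multivariate_normal(np.zeros(k), Sigma, size=T)
    return z / np.sqrt(w)[:, None]

eps = draw_t_shocks(100_000, nu, Sigma, rng)
# For nu > 2 the covariance of the t is Sigma * nu/(nu-2); check empirically.
print(np.cov(eps.T))
print(Sigma * nu / (nu - 2.0))
```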

59 citations


Journal ArticleDOI
TL;DR: In this paper, the authors propose an effective sample size that applies to fairly general linear models and illustrate in a variety of situations, including the use of Zellner-Siow priors in Bayesian model selection.
Abstract: Model selection procedures often depend explicitly on the sample size n of the experiment. One example is the Bayesian information criterion (BIC), and another is the use of Zellner–Siow priors in Bayesian model selection. Sample size is well defined if one has i.i.d. real observations, but is not well defined for vector observations or in non-i.i.d. settings; extensions of criteria such as BIC to such settings thus require a definition of effective sample size that applies also in such cases. A definition of effective sample size that applies to fairly general linear models is proposed and illustrated in a variety of situations. The definition is also used to propose a suitable ‘scale’ for default proper priors for Bayesian model selection.
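To see why the choice of n matters, the sketch below contrasts BIC's explicit log(n) penalty with the familiar design-effect formula for equicorrelated observations, n_eff = n / (1 + (n - 1)ρ). That formula is a textbook special case used here only for illustration; it is not the paper's proposed definition for general linear models.

```python
# Illustration: BIC's explicit dependence on n, and a textbook "effective
# sample size" under equicorrelation (not the paper's general definition).
import numpy as np

def bic(loglik, k, n):
    # BIC = -2 log L + k log n : the penalty depends directly on n.
    return -2.0 * loglik + k * np.log(n)

def n_eff_equicorrelated(n, rho):
    # Design-effect formula: n equally correlated observations carry the
    # same information about a mean as n_eff independent ones.
    return n / (1.0 + (n - 1.0) * rho)

n, rho = 100, 0.2
print(n_eff_equicorrelated(n, rho))          # ~4.8, far below the raw n = 100
print(bic(loglik=-150.0, k=3, n=n))          # penalty with the raw count
print(bic(loglik=-150.0, k=3, n=n_eff_equicorrelated(n, rho)))
```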

59 citations


Journal ArticleDOI
TL;DR: This work considers stochastic frontier models in a panel data setting where there is dependence over time and proposes two alternative specifications, one of which applies a copula function to the distribution of the composed error term, permitting the use of maximum likelihood estimation (MLE) and the generalized method of moments (GMM).
Abstract: We consider stochastic frontier models in a panel data setting where there is dependence over time. Current methods of modelling time dependence in this setting are either unduly restrictive or computationally infeasible. Some impose restrictive assumptions on the nature of dependence such as the "scaling" property. Others involve T-dimensional integration, where T is the number of cross-sections, which may be large. Moreover, no known multivariate distribution has the property of having commonly used, convenient marginals such as normal/half-normal. We show how to use copulas to resolve these issues. The range of dependence we allow for is unrestricted and the computational task involved is easy compared to the alternatives. Also, the resulting estimators are more efficient than those that assume independence over time. We propose two alternative specifications. One applies a copula function to the distribution of the composed error term. This permits the use of MLE and GMM. The other applies a copula to the distribution of the one-sided error term. This allows for a simulated MLE and improved estimation of inefficiencies. An application demonstrates the usefulness of our approach.
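A minimal simulation sketch of the second specification described above: a copula applied to the one-sided inefficiency term across time, with independent normal noise added to form the composed error. The Gaussian copula, equicorrelation structure, and normal/half-normal marginals are assumed here purely for illustration; the paper allows general copulas and covers estimation, which is not attempted here.

```python
# Sketch: simulating panel composed errors e_it = v_it - u_it where the
# half-normal inefficiencies u_i1,...,u_iT are tied together over time by a
# Gaussian copula (an assumed choice for illustration).
import numpy as np
from scipy.stats import norm, halfnorm

rng = np.random.default_rng(1)
N, T = 500, 5
sigma_v, sigma_u, rho = 0.3, 0.5, 0.6          # illustrative parameters

# Equicorrelated Gaussian copula correlation matrix over the T periods.
R = np.full((T, T), rho)
np.fill_diagonal(R, 1.0)

# Step 1: correlated uniforms via the Gaussian copula.
z = rng.multivariate_normal(np.zeros(T), R, size=N)
u_unif = norm.cdf(z)

# Step 2: half-normal marginals for the inefficiency term.
u = halfnorm.ppf(u_unif, scale=sigma_u)        # u_it >= 0, dependent over t

# Step 3: independent two-sided noise and the composed error.
v = rng.normal(0.0, sigma_v, size=(N, T))
e = v - u

print(np.corrcoef(u[:, 0], u[:, 1])[0, 1])     # inefficiencies are dependent
print(e.mean())                                # negative, since E[e] = -E[u]
```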

53 citations


Journal ArticleDOI
TL;DR: The authors employ quantile regression fixed effects models to estimate the income-pollution relationship for NOx (nitrogen oxides) and SO2 (sulfur dioxide) using U.S. data.
Abstract: We employ quantile regression fixed effects models to estimate the income-pollution relationship for NOx (nitrogen oxides) and SO2 (sulfur dioxide) using U.S. data. Conditional median results suggest that conditional mean methods provide overly optimistic estimates of emissions reduction for NOx, while the opposite is found for SO2. Deleting outlier states reverses the absence of a turning point for SO2 in the conditional mean model, while the conditional median model is robust to those outliers. We also document the relationship's sensitivity to including additional covariates for NOx, and undertake simulations to shed light on some estimation issues of the methods employed.
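A compact sketch of a conditional-median regression with unit dummies on simulated data, recovering the income turning point of an inverted-U relationship. Using dummy variables for the fixed effects is a simple implementation choice made here for illustration and is not necessarily the authors' estimator; the data are simulated, not the U.S. emissions data.

```python
# Sketch: conditional-median (q = 0.5) income-pollution curve with unit
# dummies as fixed effects, on simulated data. Turning point = -b1 / (2*b2).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_states, n_years = 20, 15
df = pd.DataFrame({
    "state": np.repeat(np.arange(n_states), n_years),
    "income": rng.uniform(1.0, 10.0, n_states * n_years),
})
state_fe = rng.normal(0.0, 0.5, n_states)[df["state"]]
# True inverted-U: emissions peak at income = 6 (i.e., 2.4 / (2 * 0.2)).
df["emissions"] = (2.4 * df["income"] - 0.2 * df["income"] ** 2
                   + state_fe + rng.normal(0.0, 0.5, len(df)))

fit = smf.quantreg("emissions ~ income + I(income ** 2) + C(state)", df).fit(q=0.5)
b1, b2 = fit.params["income"], fit.params["I(income ** 2)"]
print("estimated turning point:", -b1 / (2.0 * b2))   # should be near 6
```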

50 citations


Journal ArticleDOI
TL;DR: In this article, a rank-based goodness-of-fit test for copulas is proposed, which uses the information matrix equality and so relates to the White (1982) specification test.
Abstract: We propose a new rank-based goodness-of-fit test for copulas. It uses the information matrix equality and so relates to the White (1982) specification test. The test avoids parametric specification of marginal distributions, it does not involve kernel weighting, bandwidth selection, or any other strategic choices, it is asymptotically pivotal with a standard distribution, and it is simple to compute compared to available alternatives. The finite-sample size of this type of test is known to deviate from its nominal size based on asymptotic critical values, so bootstrapping critical values may be a preferred alternative. A power study shows that, in a bivariate setting, the test has reasonable properties compared to its competitors. We conclude with an application in which we apply the test to two stock indices.
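The information matrix equality behind the test says that, at the true parameter of a correctly specified model, the negative expected Hessian equals the expected outer product of the scores. Below is a small numerical check of that equality for a one-parameter Gaussian copula fitted by pseudo-likelihood to rank-based pseudo-observations; it illustrates the ingredient the test is built on, not the authors' test statistic or its asymptotic distribution.

```python
# Sketch: checking the information matrix equality for a bivariate Gaussian
# copula fitted by pseudo-likelihood to rank-based pseudo-observations.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm, rankdata

rng = np.random.default_rng(3)
n, rho_true = 2000, 0.5

# Simulate from a Gaussian copula and form pseudo-observations (scaled ranks).
z = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
u = rankdata(z[:, 0]) / (n + 1)
v = rankdata(z[:, 1]) / (n + 1)
x, y = norm.ppf(u), norm.ppf(v)

def loglik_obs(rho):
    """Per-observation log-density of the Gaussian copula."""
    r2 = rho ** 2
    return (-0.5 * np.log(1 - r2)
            + (2 * rho * x * y - r2 * (x ** 2 + y ** 2)) / (2 * (1 - r2)))

# Pseudo-MLE of rho.
rho_hat = minimize_scalar(lambda r: -loglik_obs(r).sum(),
                          bounds=(-0.99, 0.99), method="bounded").x

# Numerical per-observation score and total Hessian at rho_hat.
h = 1e-4
score = (loglik_obs(rho_hat + h) - loglik_obs(rho_hat - h)) / (2 * h)
hess = (loglik_obs(rho_hat + h) - 2 * loglik_obs(rho_hat)
        + loglik_obs(rho_hat - h)).sum() / h ** 2

# Information matrix equality: -(1/n)*Hessian should be close to mean(score^2).
print(-hess / n, np.mean(score ** 2))
```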

43 citations


Journal ArticleDOI
Abstract: Recently, various studies have used the Poisson Pseudo-Maximum Likelihood (PML) estimator to estimate gravity specifications of trade flows and non-count data models more generally. Some papers also report results based on the Negative Binomial Quasi-Generalised Pseudo-Maximum Likelihood (NB QGPML) estimator, which encompasses the Poisson assumption as a special case. This note shows that the NB QGPML estimators that have been used so far are unappealing when applied to a continuous dependent variable whose unit of measurement is arbitrary, because estimates artificially depend on that choice of units. A new NB QGPML estimator is introduced to overcome this shortcoming.
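The note's point about unit dependence can be reproduced in a few lines: rescaling a continuous dependent variable (say, measuring trade in thousands instead of units) leaves the Poisson PML slope unchanged, since the rescaling is absorbed by the intercept, but shifts the slope of the usual NB2 QGPML estimator, whose weights depend on the level of the fitted mean. The simulated data and the fixed dispersion parameter below are illustrative assumptions, not the note's setup.

```python
# Sketch: Poisson PML slopes are invariant to rescaling a continuous
# dependent variable, while standard NB2 QGPML slopes are not.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)
X = sm.add_constant(x)
# Heteroskedastic, continuous, positive "trade flow" with true slope 1.0.
y = np.exp(0.5 + 1.0 * x) * rng.lognormal(mean=0.0, sigma=0.8, size=n)

def slopes(y):
    pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    nb2 = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
    return pois.params[1], nb2.params[1]

for scale in (1.0, 1000.0):          # same data, different measurement units
    print(scale, slopes(y * scale))  # Poisson slope identical; NB2 slope moves
```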

41 citations


Journal ArticleDOI
TL;DR: Treatment parameters are derived in the framework of a potential outcomes model with a treatment choice equation, where the correlation between the unobservable components of the model is driven by a low-dimensional vector of latent factors.
Abstract: This paper contributes to the emerging Bayesian literature on treatment effects. It derives treatment parameters in the framework of a potential outcomes model with a treatment choice equation, where the correlation between the unobservable components of the model is driven by a low-dimensional vector of latent factors. The analyst is assumed to have access to a set of measurements generated by the latent factors. This approach has attractive features from both theoretical and practical points of view. Not only does it address the fundamental identification problem arising from the inability to observe the same person in both the treated and untreated states, but it also turns out to be straightforward to implement. Formulae are provided to compute mean treatment effects as well as their distributional versions. A Monte Carlo simulation study is carried out to illustrate how the methodology can easily be applied.

40 citations


Journal ArticleDOI
TL;DR: In this paper, a new Cholesky-based prior for the covariance matrix of the errors in instrumental variable regressions is proposed, which is more flexible and robust than the inverted Wishart prior.
Abstract: Instrumental variable (IV) regression presents a number of statistical challenges due to the shape of the likelihood. We review the main Bayesian literature on instrumental variables and highlight these pathologies. We discuss Jeffreys priors, the connection to the errors-in-variables problem, and more general error distributions. We propose, as an alternative to the inverted Wishart prior, a new Cholesky-based prior for the covariance matrix of the errors in IV regressions. We argue that this prior is more flexible and more robust than the inverted Wishart prior since it is not based on only one tightness parameter and therefore can be more informative about certain components of the covariance matrix and less informative about others. We show how prior-posterior inference can be formulated in a Gibbs sampler and compare its performance in the weak instruments case on synthetic data as well as in two illustrations based on well-known real data.

40 citations


Posted Content
TL;DR: In this article, the authors examined attributes of mobile payments that may be a benefit or a concern to U.S. brick-and-mortar merchants and conducted interviews with about 20 large and midsize merchants from various retail categories.
Abstract: The U.S. payment market has attracted increasing attention from technology firms and their investors seeking to capitalize on mobile and cloud technologies and the growing trend in consumer adoption of smartphones. Although consumers in the United States largely have not adopted mobile payments, merchants believe these technologies will address some current barriers to the use of mobile payments. In fact, many merchants are actively developing and implementing mobile payment applications.

Will these new technologies increase the overall value of mobile payments for end users, namely merchants and consumers, and motivate them to use mobile payments? End users' preferences will influence the industry's direction as industry participants consider making investments and policymakers consider payments policies. This article focuses on merchants' mobile payments preferences because, unlike consumer payment preferences, there is little research on the merchant perspective.

The article examines attributes of mobile payments that may be a benefit or a concern to U.S. brick-and-mortar merchants. The analysis is based on phone interviews with about 20 large and midsize merchants from various retail categories. The article finds some attributes have clear effects on merchants. An enhanced customer shopping experience will be a benefit for merchants, while, at least in the near term, fragmented markets-in which several mobile technologies and applications coexist but no one gains enough traction to propel the industry forward-will be a concern. The effect of other attributes, such as cost, customer data control, and security, depends on what technologies will be used, which payment method will be linked to fund the mobile payment transaction, and who will provide the mobile payment application.

Section I reviews the current payment environment for merchants and compares basic features and associated business models of mobile payment technologies. Section II discusses key attributes of mobile payments for merchants-customer shopping experience, cost, customer data control, security, and fragmented markets-and examines how benefits and concerns about these attributes vary by merchant characteristics. Section III summarizes the findings and draws conclusions by discussing the direction of mobile payments in the United States.

I. PAYMENT ENVIRONMENT AND MOBILE PAYMENT TECHNOLOGIES

Merchants view the adoption of mobile payments methods, especially those that use barcodes, quick response (QR) codes, and cloud technology, as an opportunity to improve a payment environment long dominated by cards. Merchants generally have been dissatisfied with card fees and rules that limit payment acceptance practices, such as surcharging and discounting. Although any mobile payment technology theoretically can accommodate a variety of payment methods as a funding source, each mobile technology tends to favor a particular payment method due to business models associated with the technology.

Payment cards-the current payment environment

As the U.S. payments system has evolved from paper-based to electronic, the share of merchants' total sales made with payment cards has increased. The share of consumers that prefers to use a payment card (either a credit, debit, or prepaid card) over other payment methods at brick-and-mortar merchants increased from 49 percent in 2001 to 69 percent in 2010 (Chart 1). Consequently, fees charged to merchants to process payment cards, as well as rules and security standards set by payment card networks, significantly affect merchants' net income.

Fees merchants pay to accept card transactions have risen rapidly in the last two decades because of increased volume and value of card transactions and increased fees per transaction. The increased fees per transaction are attributed to interchange fees, which are paid to card issuers and account for more than 80 percent of all fees merchants pay for card transactions. …

39 citations


Journal ArticleDOI
TL;DR: In this article, Cavaliere et al. developed bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996).
Abstract: In a recent paper, Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates of the underlying vector autoregressive (VAR) model which obtain under the reduced rank null hypothesis. They propose methods based on an independent and identically distributed (i.i.d.) bootstrap resampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap resampling scheme, when time-varying behavior is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically correctly sized and, moreover, that the probability tha...

Posted Content
TL;DR: In this article, the authors examine whether increases and decreases in uncertainty have asymmetric effects on economic activity and conclude that sizable increases in uncertainty have larger effects on the U.S. economy than sizable decreases.
Abstract: In the wake of the financial crisis and severe recession, the U.S. economy's recovery has been sluggish by historical standards. One often-cited explanation for the tepid recovery is that elevated uncertainty about the future has been a drag on economic activity.

Several episodes of heightened uncertainty followed the financial crisis. In May 2010, the European sovereign debt crisis caused financial markets to question the survival of the euro area and how the crisis would be resolved. Similarly, in August 2011, the U.S. debt ceiling crisis cast doubts on the U.S. government's commitment to repay its debts, causing financial turmoil. In June 2013, uncertainty about the Federal Reserve's plans for slowing the pace of ongoing asset purchases after a speech by then-Chairman Ben Bernanke resulted in a brief period of heightened financial market volatility that is popularly called the "taper tantrum." A large increase in uncertainty during each of these episodes may have slowed the recovery. In each case, however, uncertainty declined after a short period.

Examining temporary spikes in uncertainty can help determine its effects on economic activity, specifically whether increases have the same effect as decreases. If uncertainty has symmetric effects-that is, if decreases in uncertainty offset increases-then short-lived spikes in uncertainty should not have long-lasting effects. On the other hand, if uncertainty has asymmetric effects-if increases have more sizable effects than decreases-then short-lived spikes in uncertainty may persistently lower economic activity.

This article examines whether increases and decreases in uncertainty have asymmetric effects. It concludes that sizable increases in uncertainty have larger effects on economic activity than sizable decreases. As a result, the short-lived uncertainty episodes during the current recovery may have had long-lasting effects, leading to lower growth in output and employment. The first section of the article discusses how uncertainty might affect the economy, and documents the high uncertainty during the recovery. Section II shows that changes in uncertainty have asymmetric effects on the U.S. economy, with increases having larger effects than decreases. Section III uses the results on asymmetry to demonstrate how movements in uncertainty caused lower economic activity and employment growth during the recovery. The lower employment growth created substantial cumulative losses in aggregate employment, with the burden falling disproportionately across industries.

I. UNCERTAINTY AND THE ECONOMY

Economic theory suggests that rising uncertainty causes firms to wait before investing and hiring, and causes consumers to wait before purchasing certain consumption goods. These delays can slow the economy. Recovery from the financial crisis has been unusually slow, and measures of stock market volatility suggest uncertainty has been high. These observations are consistent with the view that uncertainty has slowed the recovery.

Why uncertainty might matter

Uncertainty about future economic outcomes can be viewed as a probability distribution. For example, a firm may be uncertain about whether future demand for its product will be higher, lower, or the same as current demand. If the firm's uncertainty increases, it may place greater weight on relatively extreme outcomes-for example, the likelihood that demand will be significantly higher or lower-and thus increase the possible range of outcomes. In many cases, such as the recent financial crisis, uncertainty rises because of negative news, which lowers expectations of future economic activity but also increases the range of possible outcomes. Economic theory, such as that developed by Bernanke, and Bloom and others, predicts that rising uncertainty may lower economic activity as firms postpone their investments and hiring, and consumers postpone their purchases. …

Journal ArticleDOI
TL;DR: In this article, a Two-Stage Bayesian Model Averaging (2SBMA) method is proposed to address model uncertainty at both the instrument and covariate level in economic modeling in the presence of endogeneity.
Abstract: Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed.

Journal ArticleDOI
TL;DR: In this paper, the conditional Gaussian likelihood function is used to avoid the incidental parameters problem induced by the inclusion of individual fixed effects for each cross-sectional unit, and the Conditional Lagrange Multiplier test is applied to both balanced and unbalanced panels.
Abstract: Typical panel data models make use of the assumption that the regression parameters are the same for each individual cross-sectional unit. We propose tests for slope heterogeneity in panel data models. Our tests are based on the conditional Gaussian likelihood function in order to avoid the incidental parameters problem induced by the inclusion of individual fixed effects for each cross-sectional unit. We derive the Conditional Lagrange Multiplier test that is valid in cases where N → ∞ and T is fixed. The test applies to both balanced and unbalanced panels. We expand the test to account for general heteroskedasticity where each cross-sectional unit has its own form of heteroskedasticity. The modification is possible if T is large enough to estimate regression coefficients for each cross-sectional unit by using the MINQUE unbiased estimator for regression variances under heteroskedasticity. All versions of the test have a standard Normal distribution under general assumptions on the error distribution as ...

Journal ArticleDOI
TL;DR: The overall emphasis is that intuition and conventional wisdom need to be examined via critical thinking and theoretical verification before they can be trusted fully.
Abstract: Possibly, but more likely you are merely a victim of conventional wisdom. More data or better models by no means guarantee better estimators (e.g., with a smaller mean squared error), when you are not following probabilistically principled methods such as MLE (for large samples) or Bayesian approaches. Estimating equations are particularly vulnerable in this regard, almost a necessary price for their robustness. These points will be demonstrated via common tasks of estimating regression parameters and correlations, under simple models such as bivariate normal and ARCH(1). Some general strategies for detecting and avoiding such pitfalls are suggested, including checking for self-efficiency (Meng, 1994; Statistical Science) and adopting a guiding working model. Using the example of estimating the autocorrelation ρ under a stationary AR(1) model, we also demonstrate the interaction between model assumptions and observation structures in seeking additional information, as the sampling interval s increases. Fu...

Journal ArticleDOI
TL;DR: In this article, a revised version of bagging bootstrap aggregating is proposed as a forecast combination method for the out-of-sample forecasts in time series models, which can be used to justify the validity of the bagging in the reduction of mean squared forecast error when compared with the unbagged forecasts.
Abstract: In this paper we propose a revised version of bootstrap aggregating (bagging) as a forecast combination method for out-of-sample forecasts in time series models. The revised version explicitly takes into account the dependence in time series data and can be used to justify the validity of bagging in the reduction of mean squared forecast error when compared with the unbagged forecasts. Monte Carlo simulations show that the new method works quite well and outperforms the traditional one-step-ahead linear forecast as well as the nonparametric forecast in general, especially when the in-sample estimation period is small. We also find that the bagging forecasts based on misspecified linear models may work as effectively as those based on nonparametric models, suggesting the robustification property of the bagging method in terms of out-of-sample forecasts. We then reexamine forecasting powers of predictive variables suggested in the literature to forecast the excess returns or equity premium. We find that, co...
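A minimal sketch of bagging a one-step-ahead forecast from an AR(1) working model, using a moving-block bootstrap of the (lag, outcome) pairs as one simple way to respect time-series dependence. The block length, the AR(1) working model, and the simulated data are illustrative choices, not the authors' exact procedure.

```python
# Sketch: bagged one-step-ahead forecast from an AR(1) working model,
# resampling blocks of (y_{t-1}, y_t) pairs to preserve dependence.
import numpy as np

rng = np.random.default_rng(5)

# Simulated AR(1) series (illustrative data).
T, phi = 200, 0.6
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.normal()

pairs = np.column_stack([y[:-1], y[1:]])        # rows are (y_{t-1}, y_t)

def ar1_forecast(pairs, y_last):
    """OLS fit of y_t on a constant and y_{t-1}; one-step-ahead forecast."""
    X = np.column_stack([np.ones(len(pairs)), pairs[:, 0]])
    beta, *_ = np.linalg.lstsq(X, pairs[:, 1], rcond=None)
    return beta[0] + beta[1] * y_last

def bagged_forecast(pairs, y_last, B=200, block=10):
    n = len(pairs)
    n_blocks = int(np.ceil(n / block))
    fcasts = []
    for _ in range(B):
        starts = rng.integers(0, n - block + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block) for s in starts])[:n]
        fcasts.append(ar1_forecast(pairs[idx], y_last))
    return np.mean(fcasts)

print("unbagged:", ar1_forecast(pairs, y[-1]))
print("bagged:  ", bagged_forecast(pairs, y[-1]))
```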

Posted Content
TL;DR: In this paper, the authors focused on the problems regarding working capital, improving entrepreneurship and financial environment in the long term, attracting new financial resources, facilitating dialogue and consultation between governments, SMEs and financial institutions.
Abstract: Small and medium enterprises (SMEs) play a vital role in economic development: they offer the most economical use of capital in relation to job creation and provide the strongest channel for development and innovation. Innovation is recognized as an essential component of the economic growth process, broadly defined as the development, deployment and economic utilization of new products, processes and services. SMEs are crucial for helping economies to restructure quickly in response to changing economic, social and market conditions, under the impact of the international financial crisis. However, SMEs can fulfill this potential only if they obtain the finance necessary to start and develop their businesses. Access to finance is a key determinant of business start-up, development and growth for SMEs, including innovative ones, and they have different needs and face different challenges. Limited market power, lack of management skills, absence of adequate accounting records and insufficient assets, transaction costs and lack of collateral all tend to increase the risk profile of SMEs. Moreover, the uncertainty and informational asymmetries that characterize SMEs are amplified for innovative SMEs, making it more difficult for them to access finance through traditional means. The current economic environment has brought SME needs into particular focus given the significantly tightened credit supply conditions arising from the reduced ability and willingness of banks to provide financing. In order to improve access to finance for SMEs, efforts at the European and national levels should be focused on solving the problems regarding working capital, improving entrepreneurship and the financial environment in the long term, attracting new financial resources, and facilitating dialogue and consultation between governments, SMEs and financial institutions.

Journal ArticleDOI
TL;DR: In this paper, a multiway analysis of variance for non-Gaussian multivariate distributions and a simulation algorithm to estimate the corresponding components of variance is presented, which can be used to estimate both extrinsic and intrinsic variance.
Abstract: This paper develops a multiway analysis of variance for non-Gaussian multivariate distributions and provides a practical simulation algorithm to estimate the corresponding components of variance. It specifically addresses variance in Bayesian predictive distributions, showing that it may be decomposed into the sum of extrinsic variance, arising from posterior uncertainty about parameters, and intrinsic variance, which would exist even if parameters were known. Depending on the application at hand, further decomposition of extrinsic or intrinsic variance (or both) may be useful. The paper shows how to produce simulation-consistent estimates of all of these components, and the method demands little additional effort or computing time beyond that already invested in the posterior simulator. It illustrates the methods using a dynamic stochastic general equilibrium model of the US economy, both before and during the global financial crisis.
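The extrinsic/intrinsic split described above is an application of the law of total variance, Var(y) = Var_θ(E[y|θ]) + E_θ[Var(y|θ)], estimated from posterior draws. A toy simulation-consistent estimate for a scalar predictive distribution is sketched below; the normal model and the stand-in "posterior" are assumptions for illustration, not the paper's DSGE application.

```python
# Sketch: decomposing Bayesian predictive variance into extrinsic variance
# (posterior parameter uncertainty) and intrinsic variance (noise that would
# remain even with known parameters), via the law of total variance.
import numpy as np

rng = np.random.default_rng(6)
M = 20_000                                  # posterior draws (simulated here)

# Stand-in posterior: mean mu ~ N(1, 0.2^2), noise sd sigma ~ |N(1, 0.1^2)|.
mu_draws = rng.normal(1.0, 0.2, M)
sigma_draws = np.abs(rng.normal(1.0, 0.1, M))

cond_mean = mu_draws                        # E[y | theta] for each draw
cond_var = sigma_draws ** 2                 # Var[y | theta] for each draw

extrinsic = cond_mean.var()                 # Var_theta( E[y | theta] )
intrinsic = cond_var.mean()                 # E_theta( Var[y | theta] )

# Check against the variance of draws from the full predictive distribution.
y_pred = rng.normal(mu_draws, sigma_draws)
print(extrinsic, intrinsic, extrinsic + intrinsic, y_pred.var())
```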

Posted Content
TL;DR: In this paper, the authors examine developments in euro area cross-border capital flows, together with benchmark rates and bank lending rates, and show that the process of financial integration among euro area countries over the first years of the monetary union went temporarily into reverse, starting in 2007 and intensifying after 2011.
Abstract: The article examines developments in euro area cross-border capital flows, together with benchmark rates and bank lending rates, and shows that the process of financial integration among euro area countries over the first years of the monetary union went temporarily into reverse, starting in 2007 and intensifying after 2011. This development was characterised by an increasing home bias in the banking sector, a reversal of net capital flows within the euro area, and diverging interest rate developments across national borders. This fragmentation process creates distortions in the monetary policy transmission mechanism in the euro area and weakens the sustainability of the net external positions built up in the past. The article also presents the policy responses to deal with these developments, from the initial substitution of private by official cross-border capital flows and measures to restore monetary transmission in all market segments, to longer-term measures (e.g. banking union) that should contribute to a deeper and more robust form of financial integration in the euro area.

Journal ArticleDOI
TL;DR: In this paper, the authors examined whether or not these results indicate their procedure is useless at such frequencies and concluded that their test severely suffers from quite low power when the noncausality hypothesis is tested at a frequency close to 0 or pi.
Abstract: Breitung and Candelon (2006) in Journal of Econometrics proposed a simple statistical testing procedure for the noncausality hypothesis at a given frequency. In their paper, however, they reported some theoretical results indicating that their test severely suffers from quite low power when the noncausality hypothesis is tested at a frequency close to 0 or pi. This paper examines whether or not these results indicate their procedure is useless at such frequencies.

Journal ArticleDOI
TL;DR: In this article, the authors show that the realized volatility has long-range memory and that the corrected local Whittle estimator of Hurvich et al. (2005) is robust to the choice of the sampling frequency used to compute the realized variance.
Abstract: A stylized fact is that realized variance has long memory. We show that, when the instantaneous volatility is a long memory process of order d, the integrated variance is characterized by the same long-range dependence. We prove that the spectral density of realized variance is given by the sum of the spectral density of the integrated variance plus that of a measurement error, due to the sparse sampling and market microstructure noise. Hence, the realized volatility has the same degree of long memory as the integrated variance. The additional term in the spectral density induces a finite-sample bias in the semiparametric estimates of the long memory. A Monte Carlo simulation provides evidence that the corrected local Whittle estimator of Hurvich et al. (2005) is much less biased than the standard local Whittle estimator and the empirical application shows that it is robust to the choice of the sampling frequency used to compute the realized variance. Finally, the empirical results suggest that the volati...
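For reference, the standard (uncorrected) local Whittle estimator of the long-memory parameter d minimizes R(d) = log((1/m) Σ λ_j^{2d} I(λ_j)) − (2d/m) Σ log λ_j over the first m Fourier frequencies. A compact implementation on a simulated ARFIMA(0, d, 0) series is sketched below; the data-generating choice and bandwidth m are assumptions, and the bias correction of Hurvich et al. (2005) discussed in the paper is not implemented here.

```python
# Sketch: standard local Whittle estimate of the long-memory parameter d
# from the first m periodogram ordinates. The corrected estimator of
# Hurvich et al. (2005) adds a measurement-error term not modeled here.
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle_d(x, m):
    n = len(x)
    x = x - x.mean()
    # Periodogram at Fourier frequencies lambda_j = 2*pi*j/n, j = 1..m.
    dft = np.fft.fft(x)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = (np.abs(dft[1:m + 1]) ** 2) / (2.0 * np.pi * n)

    def objective(d):
        return np.log(np.mean(lam ** (2.0 * d) * I)) - 2.0 * d * np.mean(np.log(lam))

    return minimize_scalar(objective, bounds=(-0.49, 0.99), method="bounded").x

# Illustrative long-memory series: ARFIMA(0, d, 0) with d = 0.4, built from
# truncated fractional-differencing (MA-infinity) weights.
rng = np.random.default_rng(7)
n, d_true = 4096, 0.4
eps = rng.normal(size=2 * n)
k = np.arange(1, 2 * n)
psi = np.concatenate([[1.0], np.cumprod((k - 1 + d_true) / k)])
x = np.convolve(eps, psi)[:2 * n][n:]       # drop burn-in

print(local_whittle_d(x, m=int(n ** 0.6)))  # should be near 0.4
```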

Journal ArticleDOI
TL;DR: In this article, a nonparametric test of conditional independence between variables of interest, based on a generalization of the empirical distribution function, is proposed; the test allows for both discrete variables and estimated parameters.
Abstract: We propose a nonparametric test of the hypothesis of conditional independence between variables of interest based on a generalization of the empirical distribution function. This hypothesis is of interest both for model specification purposes, parametric and semiparametric, and for nonmodel-based testing of economic hypotheses. We allow for both discrete variables and estimated parameters. The asymptotic null distribution of the test statistic is a functional of a Gaussian process. A bootstrap procedure is proposed for calculating the critical values. Our test has power against alternatives at distance n^(-1/2) from the null, this result holding independently of dimension. Monte Carlo simulations provide evidence on size and power.

Journal ArticleDOI
TL;DR: In this article, a double selection problem was used to identify average and quantile treatment effects in the presence of double selection and attrition in a range of treatment evaluation problems such as the estimation of the returns to schooling or training.
Abstract: Sample selection and attrition are inherent in a range of treatment evaluation problems such as the estimation of the returns to schooling or training. Conventional estimators tackling selection bias typically rely on restrictive functional form assumptions that are unlikely to hold in reality. This paper shows identification of average and quantile treatment effects in the presence of the double selection problem into (i) a selective subpopulation (e.g., working—selection on unobservables) and (ii) a binary treatment (e.g., training—selection on observables) based on weighting observations by the inverse of a nested propensity score that characterizes either selection probability. Weighting estimators based on parametric propensity score models are applied to female labor market data to estimate the returns to education.
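A minimal sketch of the nested inverse-probability weighting idea on simulated data: one logit for selection into the observed subpopulation, a second logit for treatment among the selected, and weights given by the inverse of the product of the two fitted probabilities. The simulated design treats both selection and treatment as driven by observables, which is a simplification of the paper's setting; the identification argument and the quantile versions are not reproduced here.

```python
# Sketch: weighting by the inverse of a nested propensity score on simulated
# data, treating both selection and treatment as selection on observables.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 5000
x = rng.normal(size=n)
X = sm.add_constant(x)

s = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 0.8 * x))))   # selection (e.g., working)
d = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.6 * x))))   # treatment (e.g., training)
y = 1.0 + 0.5 * d + 0.7 * x + rng.normal(size=n)          # outcome, true effect 0.5
y = np.where(s == 1, y, np.nan)                           # observed only if selected

# Stage 1: P(S = 1 | X), fitted on everyone.
p_s = sm.Logit(s, X).fit(disp=0).predict(X)
# Stage 2: P(D = 1 | X) among the selected subsample.
sel = s == 1
p_d = sm.Logit(d[sel], X[sel]).fit(disp=0).predict(X[sel])

# Nested inverse-probability weights for treated and untreated selected units.
w1 = 1.0 / (p_s[sel] * p_d)
w0 = 1.0 / (p_s[sel] * (1.0 - p_d))
ys, ds = y[sel], d[sel]

ate = (np.average(ys[ds == 1], weights=w1[ds == 1])
       - np.average(ys[ds == 0], weights=w0[ds == 0]))
print("weighted treatment effect estimate:", ate)          # close to 0.5
```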

Journal ArticleDOI
TL;DR: This paper shows the consequences of breaks in data processes for models that include expected future values and proposes an impulse-indicator saturation test of such specifications, applied to USA and Euro-area new-Keynesian Phillips curves.
Abstract: Many economic models (such as the new-Keynesian Phillips curve, NKPC) include expected future values, often estimated after replacing the expected value by the actual future outcome, using Instrumental Variables (IV) or Generalized Method of Moments (GMM). Although crises, breaks, and regime shifts are relatively common, the underlying theory does not allow for their occurrence. We show the consequences for such models of breaks in data processes, and propose an impulse-indicator saturation test of such specifications, applied to USA and Euro-area NKPCs.

Posted Content
TL;DR: In this paper, the authors evaluate whether the reaction of asset markets on September 18, 2013, was a typical response to Federal Reserve policy and find that the response of asset prices within the United States to monetary policy does not appear to be different at the zero lower bound.
Abstract: The Federal Open Market Committee's (FOMC) announcement following its meeting on September 18, 2013, moved stock and bond markets worldwide. In the United States, the yield on 10-year Treasury bonds fell nearly 20 basis points in the hours following the announcement while stock prices surged higher-the S&P 500 jumped 1.2 percent. The effects of the announcement were not limited to the United States. The value of the dollar dropped during the afternoon of September 18, falling more than 1 percent against the euro and the Japanese yen, and more than 2 percent against emerging economy currencies. The Brazilian stock market added 2 percent on the news, as did Asian markets when they opened the following day. Although market participants were uncertain about what the outcome of the meeting would be, the reaction to the post-meeting announcement was striking, particularly considering that the FOMC did not change policy that day.

This article evaluates whether the reaction of asset markets on September 18 was a typical response to Federal Reserve policy. In a world with free mobility of capital, an unanticipated monetary policy action within the United States will affect asset prices both in the United States and outside of the country, as investors arbitrage away price differentials between assets with similar risk/reward characteristics. A closely related question is whether the reaction of asset prices to monetary policy is different at the zero lower bound. Since 2008, the conventional tool for monetary policy in the United States-the federal funds rate-has been near zero. As a result, the Federal Reserve has turned to unconventional monetary policies to provide additional accommodation. These unconventional policies may have altered the response of asset prices to Fed policy. To that end, the analysis compares the response of international asset price changes to unanticipated monetary policy actions before and after the federal funds rate hit the zero lower bound.

The analysis shows that a change in monetary policy in the United States is associated with movements in a variety of asset prices, both in the United States and abroad. Evidence of a change in the behavior of asset prices at the zero lower bound is mixed. The responses of asset prices within the United States to monetary policy do not appear to be different at the zero lower bound. However, some international asset prices do appear to react differently to policy announcements after 2007. Most notably, the response of exchange rates to monetary policy has been more volatile since the zero lower bound began to constrain conventional policies.

The analysis proceeds in two steps. The first step, described in Section I, develops a measure of monetary policy changes. Importantly, the measure of monetary policy remains valid even when the federal funds rate is constrained by the zero lower bound. The second step, described in Section II, relates the measure of monetary policy changes to movements in international asset prices. The results of the analysis are discussed in Section III.

I. MEASURING MONETARY POLICY SURPRISES

Movements of prices in the federal funds futures and Eurodollar markets on policy announcement days are used to detect unanticipated changes to policy, or monetary policy surprises. This section describes events used to isolate policy surprises and shows how price movements in federal funds futures and Eurodollar markets can be used to extract a markets-based measure of policy surprises.

Identifying monetary policy surprises poses several analytical challenges because the Federal Reserve sets policy contingent on the state of the economy. Since market participants can infer the state of the economy, they can, at least in part, anticipate monetary policy changes. Identifying monetary policy surprises today is further complicated by the fact that policy is constrained by the zero lower bound. Following the financial crisis in the fall of 2008, the FOMC moved the federal funds rate target to near zero. …
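As a concrete example of the futures-based approach, the widely used Kuttner (2001) scaling extracts the surprise from the change in the current-month federal funds futures rate on the announcement day, scaled up because the contract settles on the monthly average rate. The numbers below are made up for illustration; the article's exact measure is not reproduced here and also draws on Eurodollar contracts.

```python
# Sketch: Kuttner-style monetary policy surprise from the current-month
# federal funds futures contract. Illustrative numbers, not actual data.
days_in_month = 30
day_of_meeting = 18                 # announcement on the 18th

rate_before = 2.00                  # implied futures rate just before (percent)
rate_after = 1.92                   # implied futures rate just after (percent)

# The contract settles on the monthly average funds rate, so a target change
# only affects the remaining (days_in_month - day_of_meeting) days; scale up.
remaining = days_in_month - day_of_meeting
surprise = (days_in_month / remaining) * (rate_after - rate_before)

print(f"policy surprise: {surprise:.2f} percentage points")   # -0.20
```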

Posted Content
TL;DR: As discussed in this paper, interest rates charged by lenders to consumers do not change automatically when the Federal Reserve alters the stance of monetary policy, and the extent to which policy actions pass through to consumer interest rates determines, in part, the effectiveness of monetary policy.
Abstract: The economic recovery following the financial crisis and Great Recession of 2007-09 has been slow. Research has shown that recessions following banking crises are typically accompanied by large and persistent declines in output. Contributing factors include sharp declines in asset prices, such as housing prices, that damage the balance sheets of both households and financial institutions. These factors, combined often with a buildup of debt during the bubble years prior to a crisis, cause debt deleveraging to be drawn out. Demand for new credit by households is therefore depressed by the effects of reduced income and wealth, and by the debt overhang. Likewise, the supply of new credit from banks is limited by past liquidity and solvency shocks and by banks' perceptions of higher risk in future lending.

The Federal Reserve has taken steps since the financial crisis to push both short- and long-term interest rates to historically low levels. These steps have aimed to reduce financing costs generally and, more specifically, to lower the interest rates charged to finance consumer spending, which accounts for about 70 percent of all spending in the economy.

However, interest rates charged by lenders to consumers do not change automatically when the Federal Reserve alters the stance of monetary policy. The extent to which policy actions pass through to consumer interest rates determines, in part, the effectiveness of monetary policy. Typically, when the Federal Reserve wants to provide policy stimulus to the economy, it lowers its target for the federal funds rate-its main policy interest rate. But when the short-term rate hits the zero bound as it did in the financial crisis, there are fewer options, and the effects are less certain. Thus, it is particularly important to evaluate this pass-through from monetary policy to consumer loan rates when central banks ease policy through unconventional tools such as purchases of longer-term securities and communication to the public about the future path of policy.

This article examines the extent of pass-through to bank-reported lending rates. The data show that, since unconventional monetary policy was introduced at the end of 2008, this pass-through has weakened. The weaker response is not limited to one group of banks but characterizes both large banks and community banks. This means the effect of monetary policy on consumer spending may have declined.

Section I reviews recent Federal Reserve policy actions and trends in interest rates on Treasuries and other securities. Section II describes banks' role in monetary policy transmission and introduces disaggregated data on consumer rates, which can be used to assess banks' rate-setting behavior. Section III examines the effectiveness of the banking channel of monetary policy transmission by estimating the response of consumer rates to market rates before and after the financial crisis.

I. MONETARY POLICY ACTIONS

In normal times, the policy instrument the Federal Reserve targets to influence economic activity is the federal funds rate, the overnight rate at which banks lend to and borrow from each other. Conventionally, the Federal Reserve eases monetary policy by lowering its target for the federal funds rate. Because markets are integrated, other interest rates-including long-term borrowing costs-also move down. By driving down borrowing rates and increasing interest-sensitive consumption and investment, the Federal Reserve stimulates economic activity.

But recent times have not been normal. The onset of the financial crisis in August 2007 led to disruptions in the normal functioning of credit markets in which financial institutions obtain and provide funding to each other. These disruptions later affected bank borrowers, visible in the sharp plunge in credit to the overall economy (Chart 1). Bank credit contracted more sharply and for longer than during previous recessions in the early 1990s and early 2000s. …

Posted Content
TL;DR: In this article, a "shadow" federal funds rate (based on research by Jing Cynthia Wu and Fan Dora Xia and by Leo Krippner) is used to assess the overall stance of monetary policy; it is a summary measure of the total accommodation provided by conventional and unconventional policies.
Abstract: Evaluating the stance of monetary policy has become very challenging. In the past, policymakers could simply compare the target federal funds rate to the prescriptions from simple policy rules to get a sense of whether the stance of policy was appropriate given economic conditions. However, as the economy fell deep into recession in 2008, the Federal Open Market Committee (FOMC) lowered its target for the federal funds rate to its effective lower bound. Since then, and until recently, many of the simple rules that have guided policy in the past have prescribed a negative federal funds rate target. However, with the federal funds rate constrained by the zero lower bound (ZLB) on nominal interest rates, the FOMC could not move the target federal funds rate below zero.

Instead, the FOMC turned to a number of unconventional policies to provide additional monetary accommodation. These other policies included several large-scale asset purchase programs and the use of "forward guidance." Purchases of longer-term Treasury and agency mortgage-backed securities resulted in an expansion of the Federal Reserve's balance sheet to more than $4 trillion. Forward guidance provided market participants information about how long the federal funds rate target might remain at its effective lower bound and how steep its trajectory might be after liftoff from zero. These policies are widely viewed as having put downward pressure on longer-term interest rates, providing additional monetary accommodation even though short-term rates remained constrained by the ZLB.

With the implementation of unconventional policies, there currently is no single, directly observable indicator that can summarize the stance of policy. Moreover, the economics literature provides no generally accepted rule for how unconventional policies should be adjusted in response to changing economic conditions. As a result, policymakers have had to use considerable judgment and discretion to calibrate the stance of policy in the aftermath of the Great Recession.

This article addresses these challenges by using a "shadow" federal funds rate to assess the overall stance of monetary policy. The shadow federal funds rate-based on research by Jing Cynthia Wu and Fan Dora Xia and by Leo Krippner-is a summary measure of the total accommodation provided by conventional and unconventional policies. It provides an estimate of what the federal funds rate would be, given asset purchases and forward guidance, if the federal funds rate could be negative. More precisely, it represents the policy rate that would generate the observed yield curve if the ZLB were not binding.

This shadow federal funds rate is then compared to the prescriptions from a policy rule estimated over a period of relative macroeconomic stability. The estimated rule shows how monetary policy responded in the past to economic conditions. The specification of the rule reflects the Federal Reserve's dual mandate of price stability and maximum employment. It prescribes a setting for the federal funds rate that depends on the deviation of inflation from the FOMC's medium to long-term objective of 2 percent and on two indicators of labor market activity that summarize a wide range of variables. The labor market indicators replace the unemployment or output gaps traditionally used in policy rules based on the concern that, currently, the unemployment rate may not be a reliable indicator of economic slack and that the output gap is difficult to measure in real time.

Based on deviations of the shadow federal funds rate from the prescriptions of the estimated policy rule, policy was not sufficiently accommodative in the immediate aftermath of the Great Recession but became considerably more accommodative over time. While the unconventional policies adopted by the FOMC were effective in pushing the shadow federal funds rate well below zero, they did not initially lower it sufficiently to reach the level prescribed by the estimated rule. …
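For context, a textbook Taylor (1993) rule prescription is easy to compute and shows why simple rules called for a negative rate during the recession. The sketch below is only a reference point with made-up inputs; the article's estimated rule differs in that it replaces the output gap with two labor market indicators and is fit over a period of macroeconomic stability.

```python
# Sketch: a textbook Taylor (1993) rule prescription. The article's estimated
# rule uses labor market indicators instead of the output gap; inputs here
# are illustrative, not actual data.
def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Prescribed nominal funds rate (percent)."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Deep-recession style inputs: low inflation, large negative output gap.
print(taylor_rule(inflation=1.0, output_gap=-6.0))   # -0.5: below zero, so the ZLB binds
```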

Journal ArticleDOI
TL;DR: In this paper, the authors proposed several new partially adaptive estimators that cover a wide range of distributional characteristics and investigated the estimators' relative efficiency in these settings, showing that the partially adaptive censored regression estimators have little efficiency loss for censored normal errors and may outperform Tobit and semiparametric estimators considered for non-normal distributions.
Abstract: Data censoring causes ordinary least squares estimates of linear models to be biased and inconsistent. Tobit, semiparametric, and partially adaptive estimators have been considered as possible solutions. This paper proposes several new partially adaptive estimators that cover a wide range of distributional characteristics. A simulation study is used to investigate the estimators' relative efficiency in these settings. The partially adaptive censored regression estimators have little efficiency loss for censored normal errors and may outperform the Tobit and semiparametric estimators considered for non-normal distributions. An empirical example of out-of-pocket expenditures for a health insurance plan supports these results.

Journal ArticleDOI
TL;DR: In this article, a method for constructing bootstrap confidence sets based on t statistics has been proposed for the coefficient of the single right-hand-side endogenous variable in a linear equation with weak instruments.
Abstract: We study several methods of constructing confidence sets for the coefficient of the single right-hand-side endogenous variable in a linear equation with weak instruments. Two of these are based on conditional likelihood ratio (CLR) tests, and the others are based on inverting t statistics or the bootstrap P values associated with them. We propose a new method for constructing bootstrap confidence sets based on t statistics. In large samples, the procedures that generally work best are CLR confidence sets using asymptotic critical values and bootstrap confidence sets based on limited-information maximum likelihood (LIML) estimates.

Journal ArticleDOI
TL;DR: In this paper, the authors discuss Bayesian inferential procedures within the family of instrumental variables regression models and focus on two issues: existence conditions for posterior moments of the parameters of interest under a flat prior and the potential of Direct Monte Carlo (DMC) approaches for efficient evaluation of such possibly highly nonelliptical posteriors.
Abstract: We discuss Bayesian inferential procedures within the family of instrumental variables regression models and focus on two issues: existence conditions for posterior moments of the parameters of interest under a flat prior and the potential of Direct Monte Carlo (DMC) approaches for efficient evaluation of such possibly highly non-elliptical posteriors. We show that, for the general case of m endogenous variables under a flat prior, posterior moments of order r exist for the coefficients reflecting the endogenous regressors’ effect on the dependent variable, if the number of instruments is greater than m +r, even though there is an issue of local non-identification that causes non-elliptical shapes of the posterior. This stresses the need for efficient Monte Carlo integration methods. We introduce an extension of DMC that incorporates an acceptance-rejection sampling step within DMC. This Acceptance-Rejection within Direct Monte Carlo (ARDMC) method has the attractive property that the generated random dra...