
Showing papers by "Federal Reserve System" published in 2005


Journal ArticleDOI
TL;DR: In this article, the authors present a model embodying moderate amounts of nominal rigidities that accounts for the observed inertia in inflation and persistence in output, and the key features of their model are those that prevent a sharp rise in marginal costs after an expansionary shock to monetary policy.
Abstract: We present a model embodying moderate amounts of nominal rigidities that accounts for the observed inertia in inflation and persistence in output. The key features of our model are those that prevent a sharp rise in marginal costs after an expansionary shock to monetary policy. Of these features, the most important are staggered wage contracts that have an average duration of three quarters and variable capital utilization.

4,250 citations


Journal ArticleDOI
TL;DR: In this paper, the mean squared prediction error (MSPE) from the parsimonious model is adjusted to account for the noise in the larger model's forecasts. The authors draw on the nonstandard limiting distributions derived in Clark and McCracken (2001, 2005a) to argue that using standard normal critical values will yield actual sizes close to, but a little less than, nominal size.
Abstract: Forecast evaluation often compares a parsimonious null model to a larger model that nests the null model. Under the null that the parsimonious model generates the data, the larger model introduces noise into its forecasts by estimating parameters whose population values are zero. We observe that the mean squared prediction error (MSPE) from the parsimonious model is therefore expected to be smaller than that of the larger model. We describe how to adjust MSPEs to account for this noise. We propose applying standard methods (West (1996)) to test whether the adjusted mean squared error difference is zero. We refer to nonstandard limiting distributions derived in Clark and McCracken (2001, 2005a) to argue that use of standard normal critical values will yield actual sizes close to, but a little less than, nominal size. Simulation evidence supports our recommended procedure.

1,540 citations
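The MSPE adjustment described in the abstract above has a simple form: the squared-error differential between the two models is corrected by the squared difference of their forecasts before testing whether its mean is zero. A minimal sketch in Python (function name and interface are illustrative, not from the paper):

```python
import numpy as np

def mspe_adjusted_stat(y, f_null, f_large):
    """t-statistic for the adjusted MSPE difference between nested forecasts.

    y       : realized values
    f_null  : forecasts from the parsimonious (null) model
    f_large : forecasts from the larger model that nests the null

    The term (f_null - f_large)**2 removes the noise the larger model
    introduces by estimating parameters whose population values are zero.
    """
    y, f_null, f_large = map(np.asarray, (y, f_null, f_large))
    e_null = y - f_null
    e_large = y - f_large
    # adjusted loss differential: small-model loss minus noise-corrected
    # large-model loss
    d = e_null**2 - (e_large**2 - (f_null - f_large)**2)
    n = len(d)
    return np.sqrt(n) * d.mean() / d.std(ddof=1)
```

Under the null that the parsimonious model generates the data, the statistic is compared against standard normal critical values; the paper argues actual size will then be close to, but slightly below, nominal size.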


Journal ArticleDOI
TL;DR: This article found evidence consistent with small banks being better able to collect and act on soft information than large banks, and that large banks are less willing to lend to informationally "difficult" credits, such as firms with no financial records.

1,407 citations


Journal ArticleDOI
TL;DR: In this paper, a factor-augmented vector autoregression (FAVAR) methodology is proposed that combines standard structural VAR analysis with factor analysis for large data sets. The information the FAVAR exploits proves important for properly identifying the monetary transmission mechanism, and the results provide a comprehensive and coherent picture of the effect of monetary policy on the economy.
Abstract: Structural vector autoregressions (VARs) are widely used to trace out the effect of monetary policy innovations on the economy. However, the sparse information sets typically used in these empirical models lead to at least two potential problems with the results. First, to the extent that central banks and the private sector have information not reflected in the VAR, the measurement of policy innovations is likely to be contaminated. A second problem is that impulse responses can be observed only for the included variables, which generally constitute only a small subset of the variables that the researcher and policymaker care about. In this paper we investigate one potential solution to this limited information problem, which combines the standard structural VAR analysis with recent developments in factor analysis for large data sets. We find that the information that our factor-augmented VAR (FAVAR) methodology exploits is indeed important to properly identify the monetary transmission mechanism. Overall, our results provide a comprehensive and coherent picture of the effect of monetary policy on the economy.

1,336 citations


Journal ArticleDOI
TL;DR: In this paper, the authors propose a new framework, based on explicit micro foundations, within which macro policy can be studied and calibrate the model to standard observations and use it to measure the cost of inflation.
Abstract: Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro policy can be studied. The framework is analytically tractable and easily quantifiable. We calibrate the model to standard observations and use it to measure the cost of inflation. We find that going from 10 percent to 0 percent inflation is worth between 3 and 5 percent of consumption—much higher than previous estimates.

1,066 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explain how to assess the state of house prices, both whether there is a bubble and what underlying factors support housing demand, in a way that is grounded in economic theory.
Abstract: How does one tell when rapid growth in house prices is caused by fundamental factors of supply and demand and when it is an unsustainable bubble? In this paper, we explain how to assess the state of house prices—both whether there is a bubble and what underlying factors support housing demand—in a way that is grounded in economic theory. In doing so, we correct four common fallacies about the costliness of the housing market. For a number of reasons, conventional metrics for assessing pricing in the housing market such as price-to-rent ratios or price-to-income ratios generally fail to reflect accurately the state of housing costs. To the eyes of analysts employing such measures, housing markets can appear "exuberant" even when houses are in fact reasonably priced. We construct a measure for evaluating the cost of home owning that is standard for economists—the imputed annual rental cost of owning a home, a variant of the user cost of housing—and apply it to 25 years of history across a wide variety of housing markets. This calculation enables us to estimate the time pattern of housing costs within a market. As of the end of 2004, our analysis reveals little evidence of a housing bubble.

825 citations
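The imputed-rent calculation the authors describe can be illustrated with a stylized user-cost formula. The components and numbers below are generic textbook assumptions, not the paper's exact specification (which also handles tax deductibility and other refinements):

```python
def user_cost_of_owning(price, mortgage_rate, property_tax_rate,
                        depreciation_rate, risk_premium,
                        expected_appreciation):
    """Stylized imputed annual rental cost of owning a home.

    A simplified variant of the user cost of housing: the owner pays
    financing costs, property taxes, depreciation/maintenance, and a
    risk premium, offset by expected capital gains.
    """
    annual_cost_rate = (mortgage_rate + property_tax_rate +
                        depreciation_rate + risk_premium -
                        expected_appreciation)
    return price * annual_cost_rate

# A house "fairly priced" by this metric is one whose market rent is
# close to its user cost (illustrative inputs):
cost = user_cost_of_owning(price=250_000, mortgage_rate=0.06,
                           property_tax_rate=0.015,
                           depreciation_rate=0.025,
                           risk_premium=0.02,
                           expected_appreciation=0.04)
# 250,000 * 0.08 = 20,000 per year
```

Because low interest rates lower the user cost, a market can show high price-to-rent ratios yet still be reasonably priced by this measure — which is why the authors argue the conventional ratios mislead.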


Journal ArticleDOI
TL;DR: In this paper, the authors examine tests for jumps based on recent asymptotic results; they interpret the tests as Hausman-type tests and find that microstructure noise biases the tests against detecting jumps, and a simple lagging strategy corrects the bias.
Abstract: We examine tests for jumps based on recent asymptotic results; we interpret the tests as Hausman-type tests. Monte Carlo evidence suggests that the daily ratio z-statistic has appropriate size, good power, and good jump detection capabilities revealed by the confusion matrix comprised of jump classification probabilities. We identify a pitfall in applying the asymptotic approximation over an entire sample. Theoretical and Monte Carlo analysis indicates that microstructure noise biases the tests against detecting jumps, and that a simple lagging strategy corrects the bias. Empirical work documents evidence for jumps that account for seven percent of stock market price variance.

782 citations
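The ratio statistic in such jump tests compares realized variance, which picks up jumps, with bipower variation, which is robust to them. The sketch below uses the standard realized-variance/bipower construction from this literature; it is not necessarily the authors' exact z-statistic, which also involves the lagging adjustment for microstructure noise:

```python
import math

def jump_ratio_stat(returns):
    """Ratio-type jump statistic from intraday returns.

    RV (realized variance) includes jump contributions; BV (bipower
    variation) does not, so a large (RV - BV)/RV signals a jump. The
    studentization uses tripower quarticity with the max-adjustment
    common in this literature.
    """
    r = list(returns)
    m = len(r)
    rv = sum(x * x for x in r)
    mu1 = math.sqrt(2.0 / math.pi)
    bv = mu1 ** -2 * sum(abs(r[i]) * abs(r[i - 1]) for i in range(1, m))
    mu43 = 2 ** (2.0 / 3.0) * math.gamma(7.0 / 6.0) / math.gamma(0.5)
    tp = m * mu43 ** -3 * sum(
        (abs(r[i]) * abs(r[i - 1]) * abs(r[i - 2])) ** (4.0 / 3.0)
        for i in range(2, m))
    rj = (rv - bv) / rv  # relative jump measure
    denom = math.sqrt(((math.pi / 2) ** 2 + math.pi - 5) / m
                      * max(1.0, tp / bv ** 2))
    return rj / denom
```

A day with one large isolated return produces a big positive statistic, because the squared jump enters RV while the adjacent-return products in BV barely move.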


Posted Content
TL;DR: In this article, a multivariate model, identifying monetary policy and allowing for simultaneity and regime switching in coefficients and variances, is confronted with US data since 1959 and the best fit is with a version that allows time variation in structural disturbance variances only.
Abstract: A multivariate model, identifying monetary policy and allowing for simultaneity and regime switching in coefficients and variances, is confronted with US data since 1959. The best fit is with a version that allows time variation in structural disturbance variances only. Among versions that also allow for changes in equation coefficients, the best fit is for one that allows coefficients to change only in the monetary policy rule. That version allows switching among three main regimes and one rarely and briefly occurring regime. The three main regimes correspond roughly to periods when most observers believe that monetary policy actually differed, but the differences among regimes are not large enough to account for the rise, then decline, in inflation of the 1970s and 1980s. In versions that insist on changes in the policy rule, the estimates imply monetary targeting was central in the early 1980s, but also important sporadically in the 1970s.

615 citations


Journal ArticleDOI
TL;DR: In this article, the authors tried to explain the credit default swap (CDS) premium, using a novel approach to identify the volatility and jump risks of individual firms from high-frequency equity prices.
Abstract: This paper tries to explain the credit default swap (CDS) premium, using a novel approach to identify the volatility and jump risks of individual firms from high-frequency equity prices. Our empirical results suggest that the volatility risk alone predicts 50 percent of the variation in CDS spread levels, while the jump risk alone forecasts 19 percent. After controlling for credit ratings, macroeconomic conditions, and firms’ balance sheet information, we can explain 77 percent of the total variation. Moreover, the pricing effects of volatility and jump measures vary consistently across investment-grade and high-yield entities. The estimated nonlinear effects of volatility and jump risks on credit spreads are in line with the implications from a calibrated structural model with stochastic volatility and jumps, although the challenge of simultaneously matching credit spreads and default probabilities remains.

588 citations


Posted Content
TL;DR: In this article, the effect of social interactions among neighbors on labor market outcomes is investigated. Using Census data that characterize residential and employment locations down to the city block, the authors examine whether individuals residing on the same block are more likely to work together than those on nearby blocks.
Abstract: We use a novel dataset and research design to empirically detect the effect of social interactions among neighbors on labor market outcomes. Specifically, using Census data that characterize residential and employment locations down to the city block, we examine whether individuals residing in the same block are more likely to work together than those in nearby blocks. We find evidence of significant social interactions operating at the block level: residing on the same versus nearby blocks increases the probability of working together by over 33 percent. The results also indicate that this referral effect is stronger when individuals are similar in sociodemographic characteristics (e.g., both have children of similar ages) and when at least one individual is well attached to the labor market. These findings are robust across various specifications intended to address concerns related to sorting and reverse causation. Further, having determined the characteristics of a pair of individuals that lead to an especially strong referral effect, we provide evidence that the increased availability of neighborhood referrals has a significant impact on a wide range of labor market outcomes including employment and wages.

584 citations


Journal ArticleDOI
TL;DR: The authors find higher rates of job-hopping for college-educated men in Silicon Valley's computer industry than in computer clusters located outside the state; similar mobility rates in other California computer clusters suggest some role for features of California law that make noncompete agreements unenforceable.
Abstract: Observers of Silicon Valley's computer cluster report that employees move rapidly between competing firms, but evidence supporting this claim is scarce. Job-hopping is important in computer clusters because it facilitates the reallocation of talent and resources toward firms with superior innovations. Using new data on labor mobility, we find higher rates of job-hopping for college-educated men in Silicon Valley's computer industry than in computer clusters located out of the state. Mobility rates in other California computer clusters are similar to Silicon Valley's, suggesting some role for features of California law that make noncompete agreements unenforceable. Consistent with our model of innovation, mobility rates outside computer industries are no higher in California than elsewhere.

Journal ArticleDOI
TL;DR: In this paper, the authors examine the misallocation of credit in Japan associated with the perverse incentives faced by banks to provide additional credit to the weakest firms, and find that firms are more likely to receive additional bank credit if they are in poor financial condition, because troubled Japanese banks have an incentive to allocate credit to severely impaired borrowers to avoid the realization of losses on their own balance sheets.
Abstract: We examine the misallocation of credit in Japan associated with the perverse incentives faced by banks to provide additional credit to the weakest firms. Firms are more likely to receive additional bank credit if they are in poor financial condition, because troubled Japanese banks have an incentive to allocate credit to severely impaired borrowers in order to avoid the realization of losses on their own balance sheets. This "evergreening" behavior is more prevalent among banks that have reported capital ratios close to the required minimum, and is compounded by the incentives arising from extensive corporate affiliations.

Journal ArticleDOI
TL;DR: The authors assess some of the explanations that have been put forward for the global pattern of current account imbalances that has emerged in recent years, particularly the large U.S. current account deficit and the large surpluses of the Asian developing economies.

Journal ArticleDOI
TL;DR: This paper reviews a simple three-factor arbitrage-free term structure model estimated by Federal Reserve Board staff and reports results obtained from fitting the model to U.S. Treasury yields since 1990.
Abstract: This paper reviews a simple three-factor arbitrage-free term structure model estimated by Federal Reserve Board staff and reports results obtained from fitting this model to U.S. Treasury yields since 1990. The model ascribes a large portion of the decline in long-term yields and distant-horizon forward rates since the middle of 2004 to a fall in term premiums. A variant of the model that incorporates inflation data indicates that about two-thirds of the decline in nominal term premiums owes to a fall in real term premiums, but estimated compensation for inflation risk has diminished as well.

Journal ArticleDOI
TL;DR: In this article, the authors jointly analyzed the static, selection, and dynamic effects of domestic, foreign, and state ownership on bank performance in Argentina in the 1990s and found that state-owned banks have poor long-term performance, those undergoing privatization had particularly poor performance beforehand (selection effect), and these banks dramatically improved following privatization.
Abstract: We jointly analyze the static, selection, and dynamic effects of domestic, foreign, and state ownership on bank performance. We argue that it is important to include indicators of all the relevant governance effects in the same model. “Nonrobustness” checks (which purposely exclude some indicators) support this argument. Using data from Argentina in the 1990s, our strongest and most robust results concern state ownership. State-owned banks have poor long-term performance (static effect), those undergoing privatization had particularly poor performance beforehand (selection effect), and these banks dramatically improved following privatization (dynamic effect). However, much of the measured improvement is likely due to placing nonperforming loans into residual entities, leaving “good” privatized banks.

Journal ArticleDOI
TL;DR: In this article, the authors employ a variety of simple empirical techniques to identify links between the observed moderation in economic activity and the influence of financial innovation on consumer spending, housing investment, and business fixed investment.

Journal ArticleDOI
TL;DR: In this paper, a new multicountry open economy SDGE model named SIGMA has been developed as a quantitative tool for policy analysis, and its implications for the near-term responses of key variables are generally similar to those of FRB/Global.
Abstract: In this paper, we describe a new multicountry open economy SDGE model named "SIGMA" that we have developed as a quantitative tool for policy analysis. We compare SIGMA's implications to those of an estimated large-scale econometric policy model (the FRB/Global model) for an array of shocks that are often examined in policy simulations. We show that SIGMA's implications for the near-term responses of key variables are generally similar to those of FRB/Global. Nevertheless, some quantitative disparities between the two models remain due to certain restrictive aspects of SIGMA's optimization-based framework. We conclude by using long-term simulations to illustrate some areas of comparative advantage of our SDGE modeling framework.

Journal ArticleDOI
TL;DR: The authors argue that existing rational expectations sticky-price models fail to provide a useful empirical description of the inflation process, especially relative to traditional econometric Phillips curves of the sort commonly employed for policy analysis.
Abstract: In recent years, a broad academic consensus has arisen around the use of rational expectations sticky-price models to capture inflation dynamics. These models are seen as providing an empirically reasonable characterization of observed inflation behavior once suitable measures of the output gap are chosen; and, moreover, are perceived to be robust to the Lucas critique in a way that earlier econometric models of inflation are not. We review the principal conclusions of this literature concerning: 1) the ability of these models to fit the data; 2) the importance of rational forward-looking expectations in price setting; and 3) the appropriate measure of inflationary pressures. We argue that existing rational expectations sticky-price models fail to provide a useful empirical description of the inflation process, especially relative to traditional econometric Phillips curves of the sort commonly employed for policy analysis.

Journal ArticleDOI
TL;DR: This paper examined the relation between board structure (size and composition) and firm performance using a sample of banking firms during 1959-1999 and found that firms with larger boards do not underperform their peers in terms of Tobin's Q. They argue that M&A activity and features of the bank holding company organizational form may make a larger board more desirable for these firms and document that board size is significantly related to characteristics of their sample firms' structures.
Abstract: We examine the relation between board structure (size and composition) and firm performance using a sample of banking firms during 1959-1999. Contrary to the evidence for non-financial firms, we find that banking firms with larger boards do not underperform their peers in terms of Tobin's Q. We argue that M&A activity and features of the bank holding company organizational form may make a larger board more desirable for these firms and document that board size is significantly related to characteristics of our sample firms' structures. Even after accounting for these potential sources of endogeneity, we do not find a negative relationship between board size and Tobin's Q. Our findings suggest that constraints on board size in the banking industry may be counter-productive.

Journal ArticleDOI
TL;DR: In this paper, the authors formalize the process of updating the nowcast and forecast on output and inflation as new releases of data become available. The marginal contribution of a particular release to the value of the signal and its precision is evaluated by computing "news" on the basis of an evolving conditioning information set.
Abstract: This paper formalizes the process of updating the nowcast and forecast on output and inflation as new releases of data become available. The marginal contribution of a particular release for the value of the signal and its precision is evaluated by computing "news" on the basis of an evolving conditioning information set. The marginal contribution is then split into what is due to timeliness of information and what is due to economic content. We find that the Federal Reserve Bank of Philadelphia surveys have a large marginal impact on the nowcast of both inflation variables and real variables, and this effect is larger than that of the Employment Report. When we control for timeliness of the releases, the effect of hard data becomes sizeable. Prices and quantities affect the precision of the estimates of inflation, while GDP is affected only by real variables and interest rates.

Journal ArticleDOI
TL;DR: In this article, the authors examined the asymptotic and finite-sample properties of tests of equal forecast accuracy and encompassing applied to direct, multistep predictions from nested regression models.
Abstract: This paper examines the asymptotic and finite-sample properties of tests of equal forecast accuracy and encompassing applied to direct, multistep predictions from nested regression models. We first derive asymptotic distributions; these nonstandard distributions depend on the parameters of the data-generating process. We then use Monte Carlo simulations to examine finite-sample size and power. Our asymptotic approximation yields good size and power properties for some, but not all, of the tests; a bootstrap works reasonably well for all tests. The paper concludes with a reexamination of the predictive content of capacity utilization for inflation.

Journal ArticleDOI
TL;DR: In this article, the authors present a simple model that provides a framework for doing empirical work that integrates the heterogeneity of housing supply into urban development, showing that differences in the nature of house supply across space are not only responsible for higher housing prices, but also affect how cities respond to increases in productivity.
Abstract: Cities are physical structures, but the modern literature on urban economic development rarely acknowledges that fact. The elasticity of housing supply helps determine the extent to which increases in productivity will create bigger cities or just higher paid workers and more expensive homes. In this paper, we present a simple model that provides a framework for doing empirical work that integrates the heterogeneity of housing supply into urban development. Empirical analysis yields results consistent with the implications of the model that differences in the nature of house supply across space are not only responsible for higher housing prices, but also affect how cities respond to increases in productivity.

Journal ArticleDOI
TL;DR: The FDIC used cross-guarantees in order to close 38 subsidiaries of First RepublicBank Corporation in 1988 and 18 subsidiaries of First City Bancorporation in 1992 when lead banks from each of these Texas-based bank holding companies were declared insolvent.
Abstract: The FDIC used cross-guarantees in order to close 38 subsidiaries of First RepublicBank Corporation in 1988 and 18 subsidiaries of First City Bancorporation in 1992 when lead banks from each of these Texas-based bank holding companies were declared insolvent. I use this plausibly exogenous failure of otherwise healthy subsidiary banks as a natural experiment in order to study the impact of bank failure on local area real economic activity. The resolution of these institutions was associated with a significant decline in failed bank lending that led to a permanent reduction in real county income of about 3 percent. JEL codes: E5, G18, G33. Keywords: bank failures, cross-guarantee, uniqueness of banks.

Journal ArticleDOI
TL;DR: The theory of reduction is reviewed, the approach of general-to-specific modeling is summarized, and the econometrics of model selection are discussed, noting that general- to- specific modeling is the practical embodiment of reduction.
Abstract: This paper discusses the econometric methodology of general-to-specific modeling, in which the modeler simplifies an initially general model that adequately characterizes the empirical evidence within his or her theoretical framework. Central aspects of this approach include the theory of reduction, dynamic specification, model selection procedures, model selection criteria, model comparison, encompassing, computer automation, and empirical implementation. This paper thus reviews the theory of reduction, summarizes the approach of general-to-specific modeling, and discusses the econometrics of model selection, noting that general-to-specific modeling is the practical embodiment of reduction. This paper then summarizes fifty-seven articles key to the development of general-to-specific modeling.

Journal ArticleDOI
TL;DR: This article used survey forecasts of a short-term interest rate as additional input to the estimation of dynamic no-arbitrage term structure models to overcome the small-sample problem arising from the highly persistent nature of interest rates.
Abstract: The estimation of dynamic no-arbitrage term structure models with a flexible specification of the market price of risk is beset by a severe small-sample problem arising from the highly persistent nature of interest rates. We propose using survey forecasts of a short-term interest rate as additional input to the estimation to overcome the problem. The three-factor pure-Gaussian model thus estimated with the U.S. Treasury term structure for the 1990-2003 period generates a stable estimate of the expected path of the short rate, reproduces the well-known stylized patterns in the expectations hypothesis tests, and captures some of the short-run variations in the survey forecast of the changes in longer-term interest rates.

Posted Content
TL;DR: The authors argue that the real effects of the Volcker disinflation in the early 1980s were mainly due to imperfect credibility, evident in volatility and stubbornness of long-term interest rates.
Abstract: Using a simple modern macroeconomic model, we argue that the real effects of the Volcker disinflation in the early 1980s were mainly due to imperfect credibility, evident in volatility and stubbornness of long-term interest rates. Studying recently released transcripts of the Federal Open Market Committee, we find -- to our surprise -- that Volcker and other FOMC members also regarded long-term interest rates as key indicators of inflation expectations and of their disinflationary policy's credibility. We also consider the interplay of monetary targets, operating procedures, and credibility during the Volcker disinflation.

Posted Content
TL;DR: A paper presented at the April 2001 conference "Financial Innovation and Monetary Transmission," sponsored by the Federal Reserve Bank of New York.
Abstract: A paper presented at the April 2001 conference "Financial Innovation and Monetary Transmission," sponsored by the Federal Reserve Bank of New York.

Book ChapterDOI
TL;DR: The authors developed a theory of the evolution of international income levels over the last millennium, which augments the Hansen-Prescott theory of economic development with the Parente-Prescott concept of relative efficiencies.
Abstract: This paper develops a theory of the evolution of international income levels. In particular, it augments the Hansen-Prescott theory of economic development with the Parente-Prescott theory of relative efficiencies and shows that the unified theory accounts for the evolution of international income levels over the last millennium. The essence of this unified theory is that a country starts to experience sustained increases in its living standard when production efficiency reaches a critical point. Countries reach this critical level of efficiency at different dates not because they have access to different stocks of knowledge, but rather because they differ in the amount of society-imposed constraints on the technology choices of their citizenry.

Journal ArticleDOI
TL;DR: This paper examines the relationship between two prominent dynamic, latent factor models in this literature: the Nelson-Siegel and affine no-arbitrage term structure models.
Abstract: From a macroeconomic perspective, the short-term interest rate is a policy instrument under the direct control of the central bank. From a finance perspective, long rates are risk-adjusted averages of expected future short rates. Thus, as illustrated by much recent research, a joint macro-finance modeling strategy will provide the most comprehensive understanding of the term structure of interest rates. We discuss various questions that arise in this research, and we also present a new examination of the relationship between two prominent dynamic, latent factor models in this literature: the Nelson-Siegel and affine no-arbitrage term structure models.

Posted Content
TL;DR: In the United States, wealth is highly concentrated and very unequally distributed: the richest 1% hold one third of the total wealth in the economy. Understanding the determinants of wealth inequality is a challenge for many economic models.
Abstract: In the United States wealth is highly concentrated and very unequally distributed: the richest 1% hold one third of the total wealth in the economy. Understanding the determinants of wealth inequality is a challenge for many economic models. We summarize some key facts about the wealth distribution and what economic models have been able to explain so far.