Showing papers by "Federal Reserve System" published in 2006


Journal ArticleDOI
TL;DR: In this article, the authors propose a more complete conceptual framework for analysis of SME credit availability issues, and emphasize a causal chain from policy to financial structures, which affect the feasibility and profitability of different lending technologies.
Abstract: We propose a more complete conceptual framework for analysis of SME credit availability issues. In this framework, lending technologies are the key conduit through which government policies and national financial structures affect credit availability. We emphasize a causal chain from policy to financial structures, which affect the feasibility and profitability of different lending technologies. These technologies, in turn, have important effects on SME credit availability. Financial structures include the presence of different financial institution types and the conditions under which they operate. Lending technologies include several transactions technologies plus relationship lending. We argue that the framework implicit in most of the literature is oversimplified, neglects key elements of the chain, and often yields misleading conclusions. A common oversimplification is the treatment of transactions technologies as a homogeneous group, unsuitable for serving informationally opaque SMEs, and a frequent misleading conclusion is that large institutions are disadvantaged in lending to opaque SMEs.

1,706 citations


Journal ArticleDOI
TL;DR: In this paper, a multivariate model, identifying monetary policy and allowing for simultaneity and regime switching in coefficients and variances, is confronted with U.S. data since 1959.
Abstract (Working Paper 2004-14, June 2004): A multivariate model, identifying monetary policy and allowing for simultaneity and regime switching in coefficients and variances, is confronted with U.S. data since 1959. The best fit is with a model that allows time variation in structural disturbance variances only. Among models that also allow for changes in equation coefficients, the best fit is for a model that allows coefficients to change only in the monetary policy rule. That model allows switching among three main regimes and one rarely and briefly occurring regime. The three main regimes correspond roughly to periods when most observers believe that monetary policy actually differed, and the differences in policy behavior are substantively interesting, though statistically ill determined. The estimates imply monetary targeting was central in the early '80s but was also important sporadically in the '70s. The changes in regime were essential neither to the rise in inflation in the '70s nor to its decline in the '80s.

JEL classification: E52, E47, C53

Key words: counterfactuals, Lucas critique, policy rule, monetary targeting, simultaneity, volatility, model comparison

I. THE DEBATE OVER MONETARY POLICY CHANGE

In an influential paper, Clarida, Gali and Gertler 2000 (CGG) presented evidence that US monetary policy changed between the 1970's and the 1980's, indeed that in the 70's it was drastically worse. They found that the policy rule apparently followed in the 70's was one that, when embedded in most stochastic general equilibrium models, would imply non-uniqueness of the equilibrium and hence vulnerability of the economy to "sunspot" fluctuations of arbitrarily large size. Their estimated policy rule for the later period, on the other hand, eliminated this indeterminacy. These results are a possible explanation of the volatile and rising inflation of the 70's and of its subsequent decline. The CGG analysis has two important weaknesses. One is that it fails to account for stochastic volatility. US macroeconomic variables, and particularly the federal funds rate, have gone through periods of tranquility and of agitation, with forecast error variances varying greatly from period to period. Ignoring such variation does not lead to inconsistent estimates of model parameters when the forecasting equations themselves are constant, but it strongly biases--toward a finding of changed parameters--tests of the stability of the forecasting equations. The other weakness is that the CGG analysis rests on powerful and implausible identifying assumptions. They require that we accept that the response of the monetary authority to expected future inflation and output does not depend on the recent history of inflation, money growth, or output. It is hard to understand why this should be so, especially in the 70's, when monetarism was a prominent theme in policy debates, Congress was requiring reports from the Fed of projected time paths of monetary aggregates, and financial markets were reacting sensitively to weekly money supply numbers. The requirement for existence and uniqueness of equilibrium in dynamic models is that the monetary policy rule show a more than unit response of interest rates to the sum of the logs of all nominal variables that appear on the right-hand side of the reaction function.
If we force a particular measure of expected future inflation to proxy for all the nominal variables that actually appear independently in the reaction function, we are bound to get distorted conclusions. On the one hand, because expected future inflation will be a "noisy" measure of the full set of nominal influences on policy, we might get downward bias in our estimates from the usual errors-in-variables effect. On the other hand, to the extent that expected future inflation (like most expected future values) shows less variation than current nominal variables, we could find a mistaken scaling up of coefficients. …
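A worked illustration of the determinacy requirement referred to above, in the simplest textbook case rather than the paper's multivariate setting (the rule and its coefficients below are illustrative assumptions, not the authors' estimates):

\[
i_t = \phi_\pi \pi_t + \phi_y y_t + \varepsilon_t .
\]

Embedding such a rule in a standard New Keynesian model delivers a unique (non-sunspot) equilibrium only if the interest rate responds more than one-for-one to the nominal variables on the right-hand side; in the simplest case, with inflation as the only nominal regressor, the condition is $\phi_\pi > 1$. CGG's claim is that the estimated 1970s rule violated this condition, while the critique above is that forcing expected future inflation to proxy for all nominal influences can push the estimated response either below or above its true value.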

930 citations


Journal ArticleDOI
TL;DR: In this paper, the authors build a dynamic model for GDP growth and yields that completely characterizes expectations of GDP, and assess the model's predictive content by forecasting GDP out-of-sample.

833 citations


Journal ArticleDOI
TL;DR: In this paper, VAR analysis of a measure of bank lending standards collected by the United States Federal Reserve reveals that shocks to lending standards are significantly correlated with innovations in commercial loans at banks and in real output.
Abstract: VAR analysis on a measure of bank lending standards collected by the Federal Reserve reveals that shocks to lending standards are significantly correlated with innovations in commercial loans at banks and in real output. Credit standards strongly dominate loan rates in explaining variation in business loans and output. Standards remain significant when we include various proxies for loan demand, suggesting that part of the standards fluctuations can be identified with changes in loan supply. Standards are also significant in structural equations of some categories of inventory investment, a GDP component closely associated with bank lending. The estimated impact of a moderate tightening of standards on inventory investment is of the same order of magnitude as the decline in inventory investment over the typical recession.
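A minimal sketch of the kind of VAR exercise described above, assuming a quarterly pandas DataFrame with hypothetical column names ("standards", "loan_rate", "loans", "output"); this illustrates the technique, not the authors' exact specification or shock ordering:

    import pandas as pd
    from statsmodels.tsa.api import VAR

    # Hypothetical quarterly dataset: net tightening of lending standards, a loan
    # rate, commercial loans, and real output (file and column names are assumptions).
    df = pd.read_csv("slos_var_data.csv", index_col=0, parse_dates=True)

    model = VAR(df[["standards", "loan_rate", "loans", "output"]])
    res = model.fit(maxlags=4, ic="aic")          # lag length chosen by AIC

    irf = res.irf(12)                             # impulse response analysis out to 12 quarters
    irf.plot(orth=True, impulse="standards")      # responses to an orthogonalized standards shock
    res.fevd(12).summary()                        # share of loan/output variance due to each shock

The variance decomposition is where "credit standards strongly dominate loan rates" would show up: standards innovations would account for a large share of the forecast error variance of loans and output.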

606 citations


Journal ArticleDOI
TL;DR: In this article, a parsimonious model of occupational choice that allows for entrepreneurial entry, exit, and investment decisions in the presence of borrowing constraints is presented, which fits very well a number of empirical observations, including the observed wealth distribution for entrepreneurs and workers.
Abstract: This paper constructs and calibrates a parsimonious model of occupational choice that allows for entrepreneurial entry, exit, and investment decisions in the presence of borrowing constraints. The model fits very well a number of empirical observations, including the observed wealth distribution for entrepreneurs and workers. At the aggregate level, more restrictive borrowing constraints generate less wealth concentration and reduce average firm size, aggregate capital, and the fraction of entrepreneurs. Voluntary bequests allow some high-ability workers to establish or enlarge an entrepreneurial activity. With accidental bequests only, there would be fewer very large firms and less aggregate capital and wealth concentration.

598 citations


Journal ArticleDOI
TL;DR: In this paper, the authors make the Federal Reserve Board's Treasury yield curve estimates publicly available at a daily frequency from 1961 to the present; the estimates can be used to compute yields or forward rates for any horizon.

576 citations


Journal ArticleDOI
TL;DR: The authors analyze individuals' earnings in 31 different data sets from sixteen countries, obtaining a total of 360 wage change distributions, and find a remarkable amount of variation in wage changes across workers.
Abstract: How do the complex institutions involved in wage setting affect wage changes? The International Wage Flexibility Project provides new microeconomic evidence on how wages change for continuing workers. We analyze individuals' earnings in 31 different data sets from sixteen countries, from which we obtain a total of 360 wage change distributions. We find a remarkable amount of variation in wage changes across workers. Wage changes have a notably non-normal distribution; they are tightly clustered around the median and also have many extreme values. Furthermore, nearly all countries show asymmetry in their wage distributions below the median. Indeed, we find evidence of both downward nominal and real wage rigidities. We also find that the extent of both these rigidities varies substantially across countries. Our results suggest that variations in the extent of union presence in wage bargaining play a role in explaining differing degrees of rigidities among countries.

499 citations


Journal ArticleDOI
TL;DR: The authors showed that the marginal impact of introducing Basel II depends strongly on the extent to which market discipline leads banks to vary lending standards procyclically in the absence of binding regulation.

460 citations


Journal ArticleDOI
TL;DR: In this paper, the author shows that the response of aggregate lending to monetary policy is stronger in state banking markets where financially constrained banks have more market share, yet state output responds similarly regardless of that market share, implying that the aggregate elasticity of output to bank lending is very small, if not zero.
Abstract: The response of aggregate lending to monetary policy is stronger in state banking markets where financially constrained banks have more market share. On the other hand, there is little difference in the response of state output across the market share of financially constrained banks, implying that the aggregate elasticity of output to bank lending is very small, if not zero. I conclude that while small firms might view bank loans as special, they are not special enough for the lending channel to be an important part of how monetary policy works.

408 citations


Journal ArticleDOI
TL;DR: In this article, a new approach to modeling conditional credit loss distributions is presented, where asset value changes of firms in a credit portfolio are linked to a dynamic global macroeconometric model, allowing macroeffects to be isolated from idiosyncratic shocks from the perspective of default.
Abstract: This paper presents a new approach to modeling conditional credit loss distributions. Asset value changes of firms in a credit portfolio are linked to a dynamic global macroeconometric model, allowing macroeffects to be isolated from idiosyncratic shocks from the perspective of default (and hence loss). Default probabilities are driven primarily by how firms are tied to business cycles, both domestic and foreign, and how business cycles are linked across countries. We allow for firm-specific business cycle effects and the heterogeneity of firm default thresholds using credit ratings. The model can be used, for example, to compute the effects of a hypothetical negative equity price shock in South East Asia on the loss distribution of a credit portfolio with global exposures over one or more quarters. We show that the effects of such shocks on losses are asymmetric and nonproportional, reflecting the highly nonlinear nature of the credit risk model.
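A minimal Monte Carlo sketch of what a conditional credit loss distribution looks like, using a single common business-cycle factor rather than the paper's dynamic global macroeconometric model; the portfolio, factor loadings, thresholds, and loss-given-default below are all illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical four-firm portfolio: exposures, sensitivities to a common
    # business-cycle factor, and rating-based default thresholds (lower = safer).
    exposure = np.array([100.0, 80.0, 120.0, 60.0])
    beta = np.array([0.5, 0.3, 0.6, 0.4])
    threshold = np.array([-2.0, -2.3, -1.8, -2.5])
    lgd = 0.45                                        # assumed loss given default

    n_sims = 100_000
    z = rng.standard_normal(n_sims)                   # common (macro) factor draws
    eps = rng.standard_normal((n_sims, exposure.size))
    asset_change = beta * z[:, None] + np.sqrt(1.0 - beta**2) * eps
    default = asset_change < threshold                # default when asset value falls below threshold
    loss = (default * exposure * lgd).sum(axis=1)     # portfolio loss in each scenario

    print("expected loss:", loss.mean())
    print("99.9% loss quantile:", np.quantile(loss, 0.999))

Conditioning the factor draws on an adverse macro scenario shifts the whole loss distribution, which is the sense in which losses are "conditional" on the macro state; the paper's contribution is to replace the single factor with a linked multi-country macroeconometric model.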

402 citations


Journal ArticleDOI
TL;DR: The authors used a portfolio framework to evaluate the impact of increased noninterest income on equity market measures of return and risk of U.S. bank holding companies from 1997 to 2004 and found that the banks most reliant on activities that generate non-interest income do not earn higher average equity returns, but are much more risky as measured by return volatility (both total and idiosyncratic) and market betas.
Abstract: This paper uses a portfolio framework to evaluate the impact of increased noninterest income on equity market measures of return and risk of U.S. bank holding companies from 1997 to 2004. The results indicate that the banks most reliant on activities that generate noninterest income do not earn higher average equity returns, but are much more risky as measured by return volatility (both total and idiosyncratic) and market betas. This suggests that the pervasive shift toward noninterest income has not improved the risk/return outcomes of U.S. banks in recent years.

Journal ArticleDOI
TL;DR: In this article, the authors use micro-level data to analyze emerging markets' private sector access to international debt markets during sovereign debt crises and find that these crises are systematically accompanied by a decline in foreign credit to domestic private firms, both during debt renegotiations and for over two years after restructuring agreements are reached.

Posted Content
TL;DR: In this paper, the authors make the Federal Reserve Board's Treasury yield curve estimates publicly available at a daily frequency from 1961 to the present; the estimates can be used to compute yields or forward rates for any horizon.
Abstract: The discount function, which determines the value of all future nominal payments, is the most basic building block of finance and is usually inferred from the Treasury yield curve. It is therefore surprising that researchers and practitioners do not have available to them a long history of high-frequency yield curve estimates. This paper fills that void by making public the Treasury yield curve estimates of the Federal Reserve Board at a daily frequency from 1961 to the present. We use a well-known and simple smoothing method that is shown to fit the data very well. The resulting estimates can be used to compute yields or forward rates for any horizon. We hope that the data, which are posted on the website http://www.federalreserve.gov/pubs/feds/2006 and which will be updated periodically, will provide a benchmark yield curve that will be useful to applied economists.
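A minimal sketch of how such a fitted curve is used, assuming the Svensson (1994) extension of the Nelson-Siegel functional form as the smoothing method and hypothetical parameter values (a published dataset of this kind reports fitted parameters for each day, from which yields at any maturity follow):

    import numpy as np

    def svensson_yield(n, beta0, beta1, beta2, beta3, tau1, tau2):
        """Continuously compounded zero-coupon yield (percent) at maturity n years."""
        x1, x2 = n / tau1, n / tau2
        term1 = (1 - np.exp(-x1)) / x1
        term2 = term1 - np.exp(-x1)
        term3 = (1 - np.exp(-x2)) / x2 - np.exp(-x2)
        return beta0 + beta1 * term1 + beta2 * term2 + beta3 * term3

    # Illustrative parameters for a single day (not actual published values).
    maturities = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 30.0])
    yields = svensson_yield(maturities, beta0=5.0, beta1=-1.5, beta2=1.0,
                            beta3=0.5, tau1=1.5, tau2=10.0)
    print(dict(zip(maturities.tolist(), np.round(yields, 3).tolist())))

Because the curve is parametric, forward rates and the discount function for any horizon follow from the same handful of parameters, which is what makes such estimates convenient as a benchmark yield curve.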

Journal ArticleDOI
TL;DR: This paper used a bootstrap approach to allow for cross-correlations in city-level house-price shocks, and showed that even these more powerful tests do not reject the hypothesis of no cointegration.
Abstract: Many in the housing literature argue that house prices and income are cointegrated. I show that the data do not support this view. Standard tests using 27 years of national-level data do not find evidence of cointegration. However, standard tests for cointegration have low power, especially in small samples. I use panel-data tests for cointegration that are more powerful than their time-series counterparts to test for cointegration in a panel of 95 metro areas over 23 years. Using a bootstrap approach to allow for cross-correlations in city-level house-price shocks, I show that even these more powerful tests do not reject the hypothesis of no cointegration. Thus the error-correction specification for house prices and income commonly found in the literature may be inappropriate.
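A minimal sketch of the first step described above (a standard residual-based cointegration test on national-level data), assuming a hypothetical CSV with log real house prices and log income; the paper's more powerful bootstrapped panel tests are not reproduced here:

    import pandas as pd
    from statsmodels.tsa.stattools import coint

    # Hypothetical annual national-level series (file and column names are assumptions).
    df = pd.read_csv("house_price_income.csv", index_col=0, parse_dates=True)

    tstat, pvalue, _ = coint(df["log_house_price"], df["log_income"])
    print(f"Engle-Granger t-statistic = {tstat:.2f}, p-value = {pvalue:.3f}")
    # A large p-value means failure to reject "no cointegration," in line with the
    # paper's national-level finding.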

Journal ArticleDOI
TL;DR: In this paper, the authors build a database of home values, the cost of housing structures, and residential land values for 46 large US metropolitan areas from 1984 to 2004, finding that residential land value has appreciated over a much wider range of cities than is commonly believed, and almost all large US cities have seen significant increases in real residential land prices.

Journal ArticleDOI
TL;DR: The authors examined the effect of defaulter-friendly foreclosure laws on equilibrium loan size and found that these laws are correlated with a four percent to six percent decrease in the loan size.
Abstract: Foreclosure laws govern the rights of borrowers and lenders when borrowers default on mortgages. Many states protect borrowers by imposing restrictions on the foreclosure process; these restrictions, in turn, impose large costs on lenders. Lenders may respond to these higher costs by reducing loan supply; borrowers may respond to the protections imbedded in these laws by demanding larger mortgages. I examine empirically the effect of the laws on equilibrium loan size. I exploit the rich geographic information available in the 1994 and 1995 Home Mortgage Disclosure Act data to compare mortgage applications for properties located in census tracts that border each other, yet are located in different states. Using semiparametric estimation methods, I find that defaulter-friendly foreclosure laws are correlated with a four percent to six percent decrease in loan size. This result suggests that defaulter-friendly foreclosure laws impose costs on borrowers at the time of loan origination.

Journal ArticleDOI
TL;DR: In this article, the authors formulate and test hypotheses about the role of bank ownership types-foreign, state-owned, and private domestic banks-in banking relationships, using data from India.

ReportDOI
TL;DR: The authors analyze the quality of VAR-based procedures for estimating the response of the economy to a shock and find that structural VARs perform well regardless of whether identification is based on short-run or long-run restrictions.
Abstract: This paper analyzes the quality of VAR-based procedures for estimating the response of the economy to a shock. We focus on two key issues. First, do VAR-based confidence intervals accurately reflect the actual degree of sampling uncertainty associated with impulse response functions? Second, what is the size of bias relative to confidence intervals, and how do coverage rates of confidence intervals compare with their nominal size? We address these questions using data generated from a series of estimated dynamic, stochastic general equilibrium models. We organize most of our analysis around a particular question that has attracted a great deal of attention in the literature: How do hours worked respond to an identified shock? In all of our examples, as long as the variance in hours worked due to a given shock is above the remarkably low number of 1 percent, structural VARs perform well. This finding is true regardless of whether identification is based on short-run or long-run restrictions. Confidence intervals are wider in the case of long-run restrictions. Even so, long-run identified VARs can be useful for discriminating among competing economic models.
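A minimal numpy sketch of the long-run (Blanchard-Quah-style) identification step referred to above, given reduced-form VAR estimates; it illustrates only the mechanics, not the paper's DSGE-based Monte Carlo design:

    import numpy as np

    def long_run_impact(A_list, Sigma_u):
        """Impact matrix B with u_t = B eps_t, chosen so that the matrix of
        long-run effects of the structural shocks is lower triangular."""
        k = Sigma_u.shape[0]
        A1 = np.eye(k) - sum(A_list)                      # A(1) = I - A_1 - ... - A_p
        C1 = np.linalg.inv(A1)                            # long-run multiplier of reduced-form shocks
        theta = np.linalg.cholesky(C1 @ Sigma_u @ C1.T)   # lower-triangular long-run effects
        return A1 @ theta                                 # satisfies B @ B.T = Sigma_u

    # Illustrative 2-variable example (e.g. productivity growth and hours), one lag.
    A_lag1 = np.array([[0.5, 0.1],
                       [0.2, 0.6]])
    Sigma_u = np.array([[1.0, 0.3],
                        [0.3, 0.5]])
    B = long_run_impact([A_lag1], Sigma_u)
    print(B)
    print(np.allclose(B @ B.T, Sigma_u))                  # check the covariance restriction

Under a long-run restriction of this kind, only the first structural shock (e.g. technology) has a permanent effect on the first variable, which is the identification scheme whose sampling properties the paper evaluates.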

Journal ArticleDOI
TL;DR: As discussed in this paper, the labor force participation rate in the United States increased almost continuously for two-and-a-half decades after the mid-1960s, pausing only briefly during economic downturns.
Abstract: The labor force participation rate in the United States increased almost continuously for two-and-a-half decades after the mid-1960s, pausing only briefly during economic downturns. The pace of growth slowed considerably during the 1990s, however, and after reaching a record high of 67.3 percent in the first quarter of 2000, participation had declined by 1.5 percentage points by 2005. This paper reviews the social and demographic trends that contributed to the movements in the labor force participation rate in the second half of the twentieth century. It also examines the manner in which developments in the 2000s reflect a break from past trends and considers implications for the future.

Journal ArticleDOI
TL;DR: In this article, a two-sector general equilibrium model is proposed that matches the sectoral responses to a monetary shock derived from an empirical VAR and implies an empirically realistic degree of sectoral output volatility and comovement.

Journal ArticleDOI
TL;DR: This article examined the effects of the Riegle-Neal branching deregulation in the 1990s on banking market structure, service, and performance and found that a significant portion of the observed increase in branch networks can be traced to the deregulation, allowing consumers to enjoy larger fee-free networks locally and regionally.
Abstract: The paper examines the effects of the Riegle-Neal branching deregulation in the 1990s on banking market structure, service, and performance. While concentration at the regional level has increased, deregulation has left almost intact the structure of metropolitan markets, which have between two and three dominant banks—controlling over half of market deposits—both at the beginning and the end of the sample. A significant portion of the observed increase in branch networks can be traced to the deregulation, allowing consumers to enjoy larger fee-free networks locally and regionally. Costs, service fees, and credit risk increase, spreads fall, and profits are unaffected.

Journal ArticleDOI
TL;DR: In this paper, the authors test whether executive stock ownership affects firm payouts using the 2003 dividend tax cut to identify an exogenous change in the after-tax value of dividends, finding that executives with higher ownership were more likely to increase dividends after the tax cut in 2003.
Abstract: We test whether executive stock ownership affects firm payouts using the 2003 dividend tax cut to identify an exogenous change in the after-tax value of dividends. We find that executives with higher ownership were more likely to increase dividends after the tax cut in 2003, whereas no relation is found in periods when the dividend tax rate was higher. Relative to previous years, firms that initiated dividends in 2003 were more likely to reduce repurchases. The stock price reaction to the tax cut suggests that the substitution of dividends for repurchases may have been anticipated, consistent with agency conflicts.

Journal ArticleDOI
TL;DR: This article showed that since the late 1980s, U.S. financial markets and private sector forecasters have become better able to forecast the federal funds rate at horizons out to several months.
Abstract: Yes. This paper shows that, since the late 1980s, U.S. financial markets and private sector forecasters have become (1) better able to forecast the federal funds rate at horizons out to several months, (2) less surprised by Federal Reserve announcements, (3) more certain of their interest rate forecasts ex ante, as measured by interest rate options, and (4) less diverse in the cross-sectional variety of their interest rate forecasts. We also present evidence that strongly suggests increases in Federal Reserve transparency played a role: for example, private sector forecasts of GDP and inflation have not experienced similar improvements over the same period, indicating that the improvement in interest rate forecasts has been special.

Journal ArticleDOI
TL;DR: The authors developed an open economy DGE model featuring demand curves with variable elasticities so that a firm's pricing decision depends on its competitors' prices, and found that exporters became more responsive to the prices of their competitors, explaining a sizeable portion of the observed decline in the sensitivity of U.S. import prices to the exchange rate.

Journal ArticleDOI
TL;DR: In this article, the jump detection method based on bi-power variation was extended to identify realized jumps on financial markets and to estimate parametrically the jump intensity, mean, and variance.
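A compact statement of the standard estimators behind such jump-detection methods (the notation follows the common Barndorff-Nielsen and Shephard setup and is not necessarily the paper's exact statistic): with $m$ intraday returns $r_{t,j}$ on day $t$,

\[
RV_t = \sum_{j=1}^{m} r_{t,j}^2, \qquad
BV_t = \frac{\pi}{2}\,\frac{m}{m-1}\sum_{j=2}^{m} |r_{t,j}|\,|r_{t,j-1}| .
\]

As $m \to \infty$, $RV_t$ converges to integrated variance plus the sum of squared jumps, while $BV_t$ converges to integrated variance alone, so the truncated difference $J_t = \max\{RV_t - BV_t,\, 0\}$ estimates the daily jump contribution; a significance test on the relative difference $(RV_t - BV_t)/RV_t$ then flags the days on which realized jumps occurred, from which the jump intensity, mean, and variance can be estimated parametrically.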

Journal ArticleDOI
TL;DR: The authors document large differences in trend changes in hours worked across OECD countries over the period 1956-2004 and assess the extent to which these changes are consistent with the intratemporal first-order condition from the neoclassical growth model.
Abstract: We document large differences in trend changes in hours worked across OECD countries over the period 1956-2004. We then assess the extent to which these changes are consistent with the intratemporal first order condition from the neoclassical growth model. We find large and trending deviations from this condition, and that the model can account for virtually none of the changes in hours worked. We then extend the model to incorporate observed changes in taxes. Our findings suggest that taxes can account for much of the variation in hours worked both over time and across countries.
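A worked example of the intratemporal condition referred to above, in a standard form with log preferences over consumption and leisure and a proportional tax wedge (the paper's exact specification may differ):

\[
\frac{\alpha\, c_t}{1-h_t} = (1-\tau_t)\, w_t
\quad\Longrightarrow\quad
h_t = 1 - \frac{\alpha\, c_t}{(1-\tau_t)\, w_t},
\]

so, for a given consumption-to-wage ratio, a larger tax wedge $\tau_t$ implies lower equilibrium hours $h_t$. Deviations of observed hours from this condition are what the paper measures, and adding observed changes in $\tau_t$ is what closes much of the gap, both over time and across countries.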

Journal ArticleDOI
TL;DR: This paper theoretically and empirically examines the historical simulation method, a variant of historical simulation introduced by Boudoukh et al. (1998) (BRW), and the filtered historical simulation method of Barone-Adesi et al.
Abstract: Many large financial institutions compute the Value-at-Risk (VaR) of their trading portfolios using historical simulation based methods, but the methods’ properties are not well understood. This paper theoretically and empirically examines the historical simulation method, a variant of historical simulation introduced by Boudoukh et al. [Boudoukh, J., Richardson, M., Whitelaw, R., 1998. The best of both worlds, Risk 11(May) 64–67] (BRW), and the filtered historical simulation method (FHS) of Barone-Adesi et al. [Barone-Adesi, G., Bourgoin F., Giannopoulos, K., 1998. Don’t look back. Risk 11(August) 100–104; Barone-Adesi, G., Giannopoulos K., Vosper L., 1999. VaR without correlations for nonlinear portfolios. Journal of Futures Markets 19(April) 583–602]. The historical simulation and BRW methods are both under-responsive to changes in conditional risk; and respond to changes in risk in an asymmetric fashion: measured risk increases when the portfolio experiences large losses, but not when it earns large gains. The FHS method is promising, but its risk estimates are variable in small samples, and its assumption that correlations are constant is violated in large samples. Additional refinements are needed to account for time-varying correlations; and to choose the appropriate length of the historical sample period.
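A minimal sketch of the two simplest estimators discussed above, equally weighted historical simulation and the BRW exponentially weighted variant, assuming a vector of daily portfolio returns; the filtered historical simulation method would additionally rescale past returns by a fitted conditional volatility model:

    import numpy as np

    def hs_var(returns, alpha=0.01):
        """Plain historical-simulation VaR: the alpha-quantile of past returns, as a loss."""
        return -np.quantile(returns, alpha)

    def brw_var(returns, alpha=0.01, lam=0.99):
        """BRW-style VaR: exponentially declining weights on older observations."""
        n = len(returns)
        weights = lam ** np.arange(n - 1, -1, -1)    # most recent return gets weight lam**0
        weights /= weights.sum()
        order = np.argsort(returns)                  # sort from worst loss to best gain
        cum = np.cumsum(weights[order])
        idx = np.searchsorted(cum, alpha)            # first point where cumulative weight reaches alpha
        return -returns[order][idx]

    rng = np.random.default_rng(1)
    r = rng.normal(0.0, 0.01, size=500)              # hypothetical daily return series
    print("HS 99% VaR: ", hs_var(r))
    print("BRW 99% VaR:", brw_var(r))

As the abstract notes, both of these estimators respond sluggishly and asymmetrically to changes in conditional risk (large losses move the quantile, large gains do not), which is what motivates the filtered approach.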

Journal ArticleDOI
TL;DR: The authors used a Bayesian Markov chain Monte Carlo algorithm to estimate a model that allows temporary gaps between a true expectational Phillips curve and the monetary authority's approximating nonexpectational Phillips Curve.
Abstract (Working Paper 2004-22, September 2004): The authors use a Bayesian Markov chain Monte Carlo algorithm to estimate a model that allows temporary gaps between a true expectational Phillips curve and the monetary authority's approximating nonexpectational Phillips curve. A dynamic programming problem implies that the monetary authority's inflation target evolves as its estimated Phillips curve moves. The authors' estimates attribute the rise and fall of post-World War II inflation in the United States to an intricate interaction between the monetary authority's beliefs and economic shocks. Shocks in the 1970s altered the monetary authority's estimates and made it misperceive the tradeoff between inflation and unemployment. That misperception caused a sharp rise in inflation in the 1970s. The authors' estimates indicate that policy makers updated their beliefs continuously. By the 1980s, policy makers' beliefs about the Phillips curve had changed enough to account for Fed chairman Paul Volcker's conquest of U.S. inflation in the early 1980s.

JEL classification: E3, E5

Key words: updating beliefs, policy evaluation, self-confirming equilibrium, Nash inflation, Ramsey outcomes

I. INTRODUCTION

Today, many statesmen and macroeconomists believe that inflation can largely be determined by a government monetary authority. Then why did the Federal Reserve Board preside over high US inflation during the late 1960s and the 1970s? And why, under Paul Volcker, did it rapidly bring inflation down during the early 1980s? This paper answers these questions by estimating a model in which a procession of economic shocks induces the monetary authority to alter its model of inflation-unemployment dynamics, the Phillips curve. At each date t, the monetary authority updates its beliefs about the Phillips curve and then recomputes a first-period action recommended by a "Phelps problem", a discounted dynamic programming problem that minimizes the expected value of a discounted quadratic loss function of inflation and unemployment. The monetary authority pursues the same objectives at each date, using the same structural model, with only its estimates of that model changing over time. This model of the systematic part of inflation puts the monetary authority's beliefs about the Phillips curve front and center. We assume that the monetary authority's model of the Phillips curve deviates in two subtle but important ways from what it would be in a rational expectations model (e.g., Kydland and Prescott (1977)). The first deviation is that, while the true Phillips curve is like Kydland and Prescott's, we assume that the monetary authority omits the public's expected rate of inflation from its Phillips curve. By itself, this omission need not prevent the outcomes of our model from coinciding with those of Kydland and Prescott's, nor need it imply that the government's model is wrong in a way that could be detected from even an infinite sample. Whether the monetary authority's model is wrong in a statistically detectable way depends on how we allow the monetary authority to reestimate the parameters of its model. In particular, if the monetary authority were to believe that the coefficients of its Phillips curve are constant over time, then its estimates would converge to ones that support a self-confirming equilibrium (SCE). After convergence, its estimated Phillips curve would correctly describe occurrences along the SCE path for inflation and unemployment. Such an after-convergence version of our model has little hope of explaining the rise and fall of US inflation: that model would have inflation fluctuating randomly around a constant SCE level that coincides with Kydland and Prescott's time-consistent suboptimal (i.e., excessive) level. This outcome motivates our second subtle deviation from a rational expectations equilibrium. …
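A stylized rendering of the two Phillips curves contrasted above (the notation is illustrative, not the paper's exact specification):

\[
\text{true (expectational) curve:}\quad u_t = u^{*} - \theta\,(\pi_t - \hat{\pi}_t) + \sigma\, w_t,
\qquad
\text{authority's approximating curve:}\quad u_t = \gamma_0 + \gamma(L)\,\pi_t + \eta_t ,
\]

where $\hat{\pi}_t$ is the public's expected inflation, which the approximating model omits. Each period the authority re-estimates $(\gamma_0, \gamma(L))$ and re-solves its discounted quadratic ("Phelps") control problem given those estimates, so its perceived inflation-unemployment tradeoff, and with it the inflation it chooses to tolerate, drifts as shocks arrive.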

Posted Content
TL;DR: In this paper, the authors identify three aspects of existing empirical work which may be responsible for the absence of robust stylized facts: differences in specification of the reduced-form VAR model, differences in identification approaches, and lack of comparability of the fiscal policy experiments considered in the literature.
Abstract: In recent years VAR models have become the main econometric tool used to study the effects of fiscal policy shocks. Yet, the literature has so far failed to provide robust stylized facts and there is currently strong disagreement on even the qualitative response of key variables such as private consumption and employment to government spending shocks. We identify three aspects of existing empirical work which may be responsible for the absence of robust stylized facts: differences in specification of the reduced-form VAR model, differences in identification approaches and lack of comparability of the fiscal policy experiments considered in the literature. In order to assess the importance of each of these aspects we estimate a common reduced-form VAR model using data for the U.S. economy. Our main result is that specification issues and lack of comparability of policy experiments rather than differences in identification approaches explain the disagreement in the literature. In particular, all approaches yield the result that the response of private consumption to a spending shock follows a hump-shaped pattern and is significantly positive in the medium run. Moreover, the results suggest that a spending increase stimulates the economy in the medium run irrespective of whether it is deficit-financed or tax-financed. However, in the long run neither spending increases nor tax cuts have significant output effects.

Journal ArticleDOI
TL;DR: In this article, the authors show that one can analyze deflation as a credibility problem if three conditions are satisfied: the government's only policy instrument is increasing the money supply by open market operations in short-term bonds; the economy is subject to large negative demand shocks; and the government cannot commit to future policy.
Abstract: I model deflation, at zero nominal interest rate, in a microfounded general equilibrium model. I show that one can analyze deflation as a credibility problem if three conditions are satisfied. First: The government's only policy instrument is increasing the money supply by open market operations in short-term bonds. Second: The economy is subject to large negative demand shocks. Third: The government cannot commit to future policy. I call the credibility problem that arises under these conditions the deflation bias. I propose several policies to solve it. They all involve printing money or issuing nominal debt. In addition they require cutting taxes, buying real assets such as stocks, or purchasing foreign exchange. The government "credibly commits to being irresponsible" by pursuing these policies. It commits to higher money supply in the future so that the private sector expects inflation instead of deflation. This is optimal since it curbs deflation and increases output by lowering the real rate of return.