
Showing papers by "Federal Reserve System" published in 2000


Journal ArticleDOI
TL;DR: In this paper, the unconditional expectation of average household utility is expressed in terms of the unconditional variances of the output gap, price inflation, and wage inflation; the model exhibits a tradeoff between stabilizing the output gap and stabilizing price inflation.

1,813 citations


Journal ArticleDOI
TL;DR: The authors found that a surge in the use of information technology capital and faster efficiency gains in the production of computers account for about two-thirds of the speed-up in productivity growth between the first and second halves of the 1990s.
Abstract: The growth of U.S. labor productivity rebounded in the second half of the 1990s, after nearly a quarter century of sluggish gains. We assess the contribution of information technology to this rebound, using the same neoclassical framework as in our earlier work. We find that a surge in the use of information technology capital and faster efficiency gains in the production of computers account for about two-thirds of the speed-up in productivity growth between the first and second halves of the 1990s. Thus, to answer the question posed in the title of the paper, information technology largely is the story.
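The neoclassical growth-accounting arithmetic behind such a decomposition can be sketched with hypothetical numbers (the figures below are illustrative placeholders, not the paper's estimates): each input contributes its income share times its growth rate, and the unexplained residual is multifactor productivity.

```python
# Toy growth-accounting decomposition of labor productivity growth.
# All numbers are hypothetical, for illustration only.
labor_productivity_growth = 2.5  # percent per year (assumed)

# Share-weighted contributions of inputs, in percentage points (assumed)
contributions = {
    "IT capital deepening":    0.9,
    "other capital deepening": 0.4,
    "labor quality":           0.3,
}

# Multifactor productivity growth is what the inputs leave unexplained
mfp = labor_productivity_growth - sum(contributions.values())
print(f"MFP residual: {mfp:.1f} percentage points")  # MFP residual: 0.9 percentage points
```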

1,813 citations


Journal ArticleDOI
TL;DR: In this paper, the authors develop a framework that provides a simple, explicit economic mechanism for understanding skill-biased technological change in terms of observable variables and use the framework to evaluate the fraction of variation in the skill premium that can be accounted for by changes in observed factor quantities.
Abstract: The supply and price of skilled labor relative to unskilled labor have changed dramatically over the postwar period. The relative quantity of skilled labor has increased substantially, and the skill premium, which is the wage of skilled labor relative to that of unskilled labor, has grown significantly since 1980. Many studies have found that accounting for the increase in the skill premium on the basis of observable variables is difficult and have concluded implicitly that latent skill-biased technological change must be the main factor responsible. This paper examines that view systematically. We develop a framework that provides a simple, explicit economic mechanism for understanding skill-biased technological change in terms of observable variables, and we use the framework to evaluate the fraction of variation in the skill premium that can be accounted for by changes in observed factor quantities. We find that with capital-skill complementarity, changes in observed inputs alone can account for most of the variations in the skill premium over the last 30 years.

1,406 citations


ReportDOI
TL;DR: In this paper, the authors show that large technology shocks are needed to produce realistic business cycles; while Solow residuals are sufficiently volatile, they imply frequent technological regress, suggesting the imminent demise of real business cycles.
Abstract: The Real Business Cycle (RBC) research program has grown spectacularly over the last decade, as its concepts and methods have diffused into mainstream macroeconomics. Yet, there is increasing skepticism that technology shocks are a major source of business fluctuations. This chapter exposits the basic RBC model and shows that it requires large technology shocks to produce realistic business cycles. While Solow residuals are sufficiently volatile, these imply frequent technological regress. Productivity studies permitting unobserved factor variation find much smaller technology shocks, suggesting the imminent demise of real business cycles. However, we show that greater factor variation also dramatically amplifies shocks: a RBC model with varying capital utilization yields realistic business cycles from small, nonnegative changes in technology.

1,255 citations


Posted Content
TL;DR: In this paper, the implications of asset price volatility for the management of monetary policy are explored and it is shown that it is desirable for central banks to focus on underlying inflationary pressures.
Abstract: We explore the implications of asset price volatility for the management of monetary policy. We show that it is desirable for central banks to focus on underlying inflationary pressures. Asset prices become relevant only to the extent they may signal potential inflationary or deflationary forces. Rules that directly target asset prices appear to have undesirable side effects. We base our conclusions on (i) simulation of different policy rules in a small scale macro model and (ii) a comparative analysis of recent U.S. and Japanese monetary policy.

1,148 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed asymptotic distribution theory for GMM estimators and test statistics when some or all of the parameters are weakly identified, and used these results to inform an empirical investigation of various CCAPM specifications; the substantive conclusions reached differ from those obtained using conventional methods.
Abstract: This paper develops asymptotic distribution theory for GMM estimators and test statistics when some or all of the parameters are weakly identified. General results are obtained and are specialized to two important cases: linear instrumental variables regression and Euler equations estimation of the CCAPM. Numerical results for the CCAPM demonstrate that weak-identification asymptotics explains the breakdown of conventional GMM procedures documented in previous Monte Carlo studies. Confidence sets immune to weak identification are proposed. We use these results to inform an empirical investigation of various CCAPM specifications; the substantive conclusions reached differ from those obtained using conventional methods.

770 citations


Journal ArticleDOI
TL;DR: In this article, a comparative anatomy of two especially influential benchmarks for credit risk models, the RiskMetrics Group's CreditMetrics and Credit Suisse Financial Products' CreditRisk+, is presented.
Abstract: Within the past two years, important advances have been made in modeling credit risk at the portfolio level. Practitioners and policy makers have invested in implementing and exploring a variety of new models individually. Less progress has been made, however, with comparative analyses. Direct comparison often is not straightforward, because the different models may be presented within rather different mathematical frameworks. This paper offers a comparative anatomy of two especially influential benchmarks for credit risk models, the RiskMetrics Group’s CreditMetrics and Credit Suisse Financial Products’ CreditRisk+. We show that, despite differences on the surface, the underlying mathematical structures are similar. The structural parallels provide intuition for the relationship between the two models and allow us to describe quite precisely where the models differ in functional form, distributional assumptions, and reliance on approximation formulae. We then design simulation exercises which evaluate the effect of each of these differences individually.

639 citations


ReportDOI
TL;DR: This paper developed an endogenous growth model that incorporates parametrically four important distortions to R&D: the surplus appropriability problem, knowledge spillovers, creative destruction, and duplication externalities.
Abstract: Research and development is a key determinant of long-run productivity and welfare. A central issue is whether a decentralized economy undertakes too little or too much R&D. We develop an endogenous growth model that incorporates parametrically four important distortions to R&D: the surplus appropriability problem, knowledge spillovers, creative destruction, and duplication externalities. Calibrating the model, we find that the decentralized economy typically underinvests in R&D relative to what is socially optimal. The only exceptions to this conclusion occur when the duplication externality is strong and the equilibrium real interest rate is simultaneously high.

596 citations


Journal ArticleDOI
TL;DR: In this paper, the principal techniques used to undertake capital arbitrage and the challenges faced by bank supervisors in attempting to deal with these activities under the current capital framework are discussed.
Abstract: In recent years, securitization and other financial innovations have provided unprecedented opportunities for banks to reduce substantially their regulatory capital requirements with little or no corresponding reduction in their overall economic risks, a process termed "regulatory capital arbitrage". These methods are used routinely to lower the effective risk-based capital requirements against certain portfolios to levels well below the Basel Capital Accord’s nominal 8% total risk-based capital standard. This paper discusses the principal techniques used to undertake capital arbitrage and the difficulties faced by bank supervisors in attempting to deal with these activities under the current capital framework.

516 citations


Journal ArticleDOI
TL;DR: In this article, a simple model of habit formation implies a condition relating the strength of habits to the evolution of consumption over time, and the condition is estimated with food consumption data from the Panel Study on Income Dynamics.
Abstract: This paper tests for the presence of habit formation using household data. A simple model of habit formation implies a condition relating the strength of habits to the evolution of consumption over time. When the condition is estimated with food consumption data from the Panel Study on Income Dynamics, the results yield no evidence of habit formation at the annual frequency. This finding is robust to a number of changes in the specification. It also holds for several proxies for nondurables and services consumption created by combining PSID variables with weights estimated from Consumer Expenditure Survey data.

473 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed using variance-ratio tests based on the ranks and signs of a time series to test the null that the series is a martingale difference sequence.
Abstract: This article proposes using variance-ratio tests based on the ranks and signs of a time series to test the null that the series is a martingale difference sequence. Unlike conventional variance-ratio tests, these tests can be exact. In Monte Carlo simulations, I find that they can also be more powerful than conventional variance-ratio tests. I apply the proposed tests to five exchange-rate series and find that they are capable of detecting violations of the martingale hypothesis for all five series, whereas conventional variance-ratio tests yield ambiguous results.
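The mechanics of a rank-based variance ratio can be sketched in a few lines. This is a simplified illustration, not the article's exact statistic: the particular rank standardization and the horizon k = 5 are assumptions made here for the demo.

```python
import numpy as np

def rank_variance_ratio(x, k):
    """Variance ratio at horizon k for the standardized ranks of x.

    Under the martingale-difference null the ratio should be near 1;
    a persistent series pushes it well above 1. Illustrative only.
    """
    n = len(x)
    ranks = np.argsort(np.argsort(x)) + 1.0                # ranks 1..n
    r = (ranks - (n + 1) / 2) / np.sqrt((n * n - 1) / 12)  # mean 0, unit variance
    sums = np.convolve(r, np.ones(k), mode="valid")        # overlapping k-period sums
    return sums.var() / (k * r.var())

rng = np.random.default_rng(0)
iid = rng.standard_normal(2000)              # martingale-difference series
walk = np.cumsum(rng.standard_normal(2000))  # persistent random walk
print(rank_variance_ratio(iid, 5))   # close to 1
print(rank_variance_ratio(walk, 5))  # well above 1
```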

Book
01 Sep 2000
TL;DR: This article reported the results of a project to estimate and test the stability properties of conventional equations relating real imports and exports of goods and services for the G-7 countries to their incomes and relative prices.
Abstract: This paper reports the results of a project to estimate and test the stability properties of conventional equations relating real imports and exports of goods and services for the G-7 countries to their incomes and relative prices. We begin by estimating cointegration vectors and the error-correction formulations. We then test the stability of these equations using Chow and Kalman-Filter tests. The evidence suggests three findings. First, conventional trade equations and elasticities are stable enough, in most cases, to perform adequately in forecasting and policy simulations. Equations for German trade, as well as equations for French and Italian exports, are an exception. Second, income elasticities of U.S. trade have not been shifting in a direction that will tend to ease the trend toward deterioration in the U.S. trade position. The income-elasticity gap for Japan found in earlier studies was not confirmed in this analysis. Finally, the price channel is weak, if not wholly ineffective, in the case of continental European countries.

Journal ArticleDOI
TL;DR: This article investigated the composition of households' assets and liabilities in the United States using aggregate and survey data and found that despite the broad array of financial products available, the portfolio of the typical household remains fairly simple and safe, consisting of a checking account, savings account, and tax-deferred retirement account.
Abstract: This paper investigates the composition of households' assets and liabilities in the United States. Using aggregate and survey data, we document major trends in household portfolios in the past 15 years. We show that, despite the broad array of financial products available, the portfolio of the typical household remains fairly simple and safe, consisting of a checking account, savings account, and tax-deferred retirement account; in 1998, less than half of all households owned some form of stock. We use pooled data from the Survey of Consumer Finances to investigate determinants of portfolio choice, finding significant effects of age, wealth, income risk, and entry/information costs.

Journal ArticleDOI
TL;DR: In this article, the authors present empirical results from several recent papers that offer three explanations of interest-rate smoothing: forward-looking behavior by market participants, measurement error associated with key macroeconomic variables, and uncertainty regarding relevant structural parameters.

Journal ArticleDOI
TL;DR: In this paper, the authors describe the internal credit risk rating systems presently in use at the 50 largest US banking organizations and use the diversity of current practice to illuminate the relationships between uses of ratings, different options for rating system design, and the effectiveness of internal rating systems.
Abstract: Internal credit risk rating systems are becoming an increasingly important element of large commercial banks’ measurement and management of the credit risk of both individual exposures and portfolios. This article describes the internal rating systems presently in use at the 50 largest US banking organizations. We use the diversity of current practice to illuminate the relationships between uses of ratings, different options for rating system design, and the effectiveness of internal rating systems. Growing stresses on rating systems make an understanding of such relationships important for both banks and regulators.

Journal ArticleDOI
TL;DR: In this paper, the authors investigate whether the tendency for changes in the federal funds rate to be implemented gradually can be explained by the dynamic structure of the economy and the uncertainty surrounding that structure, without recourse to including an ad hoc interest-rate smoothing argument in the Fed's objective function.

Posted Content
TL;DR: In this article, the authors argue that the IT revolution was behind the large swings in the stock market capitalization/GDP ratio and, moreover, that the capitalization/GDP ratio is likely to decline and then rise after any major technological shift.
Abstract: Since 1968, the ratio of stock market capitalization to GDP has varied by a factor of 5. In 1972, the ratio stood at above unity, but by 1974, it had fallen to 0.45 where it stayed for the next decade. It then began a steady climb, and today it stands above 2. We argue that the IT revolution was behind this and, moreover, that the capitalization/GDP ratio is likely to decline and then rise after any major technological shift. The three assumptions that deliver the result are: 1. The IT revolution was anticipated by early 1973, 2. IT was resisted by incumbents, which led their value to fall, and 3. Takeovers are an imperfect policing device that allowed many firms to remain inefficient until the mid-1980's. We lay out some facts that the IT hypothesis explains, but that some alternative hypotheses -- oil-price shocks, increased market volatility, and bubbles -- do not.

Journal ArticleDOI
TL;DR: In this article, the authors investigate how banking market competition, informational opacity, and sensitivity to shocks have changed over the last three decades by examining the persistence of firm-level rents.
Abstract: We investigate how banking market competition, informational opacity, and sensitivity to shocks have changed over the last three decades by examining the persistence of firm-level rents. We develop propagation mechanisms with testable implications to isolate the sources of persistence. Our analysis suggests that different processes underlie persistence at the high and low ends of the performance distribution. Our tests suggest that impediments to competition and informational opacity continue to be strong determinants of persistence; that the reduction in geographic regulatory restrictions had little effect on competitiveness; and that persistence remains sensitive to regional/macroeconomic shocks. The findings also suggest reasons for the recent record profitability of the industry.

Journal ArticleDOI
TL;DR: In this article, the effects of inflation variability on economic growth are analyzed in a model where money is introduced via a cash-in-advance constraint; it is shown that inflation adversely affects long-run growth, even when the cash-in-advance constraint applies only to consumption.

Journal ArticleDOI
TL;DR: In this article, the authors disentangle the possible factors underlying the correlation between real exchange rate depreciation and output, determine whether a positive long-run effect of real depreciation on output is in the data, and conclude that even after sources of spurious correlation and reverse causation are controlled for, real devaluation has led to high inflation and economic contraction.

Journal ArticleDOI
TL;DR: This paper examined the predictability of revisions to GDP announcements using standard statistical tests of whether the preliminary announcement is a rational forecast of the subsequently revised data and found that the degree of predictability varies throughout the G-7.
Abstract: Revisions to GDP announcements are known to be quite large in all G-7 countries; quarterly growth rate revisions are regularly more than a full percentage point at an annualized rate. We examine the predictability of these revisions using standard statistical tests of whether the preliminary announcement is a rational forecast of the subsequently revised data. Previous work suggests that U.S. GDP revisions are largely unpredictable, as would be the case if the revisions reflect news not available at the time that the preliminary number is produced. We find that the degree of predictability varies throughout the G-7. Although we find little predictability in U.S. revisions, the data revisions for several foreign countries are highly predictable. We also perform a simple real-time forecasting exercise showing that for several countries, the predictability of data revisions could be used to generate improved preliminary data.

Posted Content
TL;DR: The New IS-LM model, as discussed by the authors, is increasingly used to analyze the determination of macroeconomic activity and the design of monetary policy rules, and it leads to strong conclusions about monetary policy in four important areas.
Abstract: Recent years have witnessed the development of a New IS-LM model that is increasingly being used to discuss the determination of macroeconomic activity and the design of monetary policy rules. It is sometimes called an “optimizing IS-LM model” because it can be built up from microfoundations. It is alternatively called an “expectational IS-LM model” because the traditional model’s behavioral equations are modified to include expectational terms suggested by these microfoundations and because the new framework is analyzed using rational expectations. The purpose of this article is to provide a simple exposition of the New IS-LM model and to discuss how it leads to strong conclusions about monetary policy in four important areas.
• Desirability of price level or inflation targeting: The new model suggests that a monetary policy that targets inflation at a low level will keep economic activity near capacity. If there are no exogenous “inflation shocks,” then full stabilization of the price level will also maintain output at its capacity level. More generally, the new model indicates that time-varying inflation targets should not respond to many economic disturbances, including shocks to productivity, aggregate demand, and the demand for money.
• Interest rate behavior under inflation targeting: The new model incorporates the twin principles of interest rate determination, originally developed by Irving Fisher, which are an essential component of modern macroeconomics. The real interest rate is a key intertemporal relative price.

Journal ArticleDOI
TL;DR: In this article, the authors propose three options for overcoming the zero bound on interest rate policy: a carry tax on money, open market operations in long bonds, and monetary transfers, which can be used to stimulate the economy by creating liquidity broadly defined.
Abstract: The paper proposes three options for overcoming the zero bound on interest rate policy: a carry tax on money, open market operations in long bonds, and monetary transfers. A variable carry tax on electronic bank reserves could enable a central bank to target negative nominal interest rates. A carry tax could be imposed on currency to create more leeway to make interest rates negative. Quantitative policy--monetary transfers and open market purchases of long bonds--could stimulate the economy by creating liquidity broadly defined. A central bank needs more fiscal support than usual from the Treasury to pursue quantitative policy at the interest rate floor.

Journal ArticleDOI
TL;DR: In this paper, the authors show that the result of Boyer, Gibson and Loretan (1999), that increases in the volatility of returns are generally accompanied by an increase in sampling correlations even when the true correlations are constant, can explain much of the movement in correlations over time.
Abstract: Financial market observers have noted that during periods of high market volatility, correlations between asset prices can differ substantially from those seen in quieter markets. For example, correlations among yield spreads were substantially higher during the fall of 1998 than in earlier or later periods. Such changes in correlations could reflect changes in the underlying distribution of returns or "contagion" across markets that is present only during periods of market turbulence. However, as noted by Boyer, Gibson and Loretan (1999), increases in the volatility of returns are generally accompanied by an increase in sampling correlations even when the true correlations are constant. We show that this result is not just of theoretical interest: When we consider quarterly measures of volatility and correlation for three pairs of asset returns, we find that the theoretical relationship can explain much of the movement in correlations over time. We then examine the implications of this link between measures of volatility and correlation for risk management, bank supervision, and monetary policy making.
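The Boyer-Gibson-Loretan effect is easy to reproduce by simulation. The sketch below uses assumed parameters (true correlation 0.5, a "turbulent" cutoff of 1.5 standard deviations) and is not the authors' empirical exercise: conditioning on large moves raises the sample correlation even though the true correlation never changes.

```python
import numpy as np

# Bivariate normal with constant true correlation rho (assumed parameters).
rng = np.random.default_rng(42)
rho, n = 0.5, 200_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

full = np.corrcoef(x, y)[0, 1]  # close to the true rho = 0.5
turbulent = np.abs(x) > 1.5     # high-volatility subsample: large moves in x
cond = np.corrcoef(x[turbulent], y[turbulent])[0, 1]
print(full, cond)               # cond is well above full, despite constant rho
```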

Journal ArticleDOI
TL;DR: In an environment of low inflation, the Federal Reserve faces the possibility that it may not have provided enough monetary stimulus even though it had pushed the short-term nominal interest rate to its lower bound of zero as mentioned in this paper.
Abstract: In an environment of low inflation, the Federal Reserve faces the possibility that it may not have provided enough monetary stimulus even though it had pushed the short-term nominal interest rate to its lower bound of zero. Assuming the nominal Treasury-bill rate had been lowered to zero, this paper considers whether further open market purchases of Treasury bills could spur aggregate demand through increases in the monetary base. Such action may be stimulative by increasing liquidity for financial intermediaries and households; by affecting expectations of the future paths of short-term interest rates, inflation, and asset prices; through distributional effects; or by stimulating bank lending through the credit channel. This paper also examines the alternative policy tools that are available to the Federal Reserve in theory, and notes the practical limitations imposed by the Federal Reserve Act. The tools the Federal Reserve has at its disposal include open market purchases of Treasury bonds and certain types of private-sector credit instruments; unsterilized and sterilized intervention in foreign exchange; lending through the discount window; and, in some circumstances, may include the use of options.

Journal ArticleDOI
TL;DR: In this article, the authors exploit historical revisions to real-time estimates of the output gap to examine the implications of measurement error for the design of monetary policy, using the Federal Reserve's model of the U.S. economy, FRB/US.

Journal ArticleDOI
TL;DR: This paper examined the interaction between preferential trade agreements (PTAs) and multilateral tariff reduction in a model of imperfect competition and found that tariff reduction enhances the incentives to form a PTA and increases the likelihood that it is self-enforcing.

Posted Content
TL;DR: The distinction between the timeless perspective and discretionary modes of monetary policymaking, the former representing rule-based policy as recently formalized by Woodford (1999b), was discussed in this paper.
Abstract: This paper reviews the distinction between the timeless perspective and discretionary modes of monetary policymaking, the former representing rule-based policy as recently formalized by Woodford (1999b). In models with forward-looking expectations, this distinction is greater than in the models that have been typical in the rules-vs.-discretion literature; typically there is a second inefficiency from discretionary policymaking, distinct from the familiar inflationary bias. The paper presents calculations of the quantitative magnitude of this second inefficiency, using calibrated models of two types prominent in the current literature. In addition, it examines the distinction between instrument rules and targeting rules; the results indicate that targeting-rule outcomes can be very closely approximated by instrument rules. Also included is a brief investigation of operationality issues, involving the unobservability of current output and the possibility that an incorrect concept of the natural-rate level of output, essential in measuring the output gap, is used by the policymaker. In all of the cases examined, the unconditional average performance of timeless perspective policymaking is at least as good as that provided by optimal discretionary behavior.

Journal ArticleDOI
TL;DR: In this paper, the authors study optimal monetary policy design in a simple model that deviates from the linear-quadratic paradigm and provide a rationale for the practice of inflation zone targeting.

Journal ArticleDOI
TL;DR: The authors compare free trade reached through expanding regional trading blocs to free trade accomplished by multilateral negotiation, and find that the outcomes are different with sunk costs, and that world welfare during free trade is greater when it is achieved by the regional path.
Abstract: We compare free trade reached through expanding regional trading blocs to free trade accomplished by multilateral negotiation. With sunk costs, the outcomes are different. Trade in an imperfectly competitive good flows disproportionately more between the original members of a regional agreement even after free trade is reached. They secure a higher welfare level from regionalism than from free trade achieved multilaterally; nonmembers, however, reach a lower welfare level. A surprising result is that world welfare during free trade is greater when it is achieved by the regional path. We conclude with some empirical evidence from the European Union that is consistent with the model.