
Showing papers in "Journal of Economic Perspectives in 2010"


Journal ArticleDOI
TL;DR: Mobile telephony has brought new possibilities to sub-Saharan Africa, as discussed by the authors: 60 percent of the population has mobile phone coverage even though the region has some of the lowest levels of infrastructure investment in the world.
Abstract: Sub-Saharan Africa has some of the lowest levels of infrastructure investment in the world. Merely 29 percent of roads are paved, barely a quarter of the population has access to electricity, and there are fewer than three landlines available per 100 people (ITU, 2009; World Bank, 2009a). Yet access to and use of mobile telephony in sub-Saharan Africa has increased dramatically over the past decade. There are ten times as many mobile phones as landlines in sub-Saharan Africa (ITU, 2009), and 60 percent of the population has mobile phone coverage. Mobile phone subscriptions increased by 49 percent annually between 2002 and 2007, as compared with 17 percent per year in Europe (ITU, 2008). Mobile telephony has brought new possibilities to the continent. Across urban–rural and rich–poor divides, mobile phones connect individuals to individuals, information, markets, and services. In Mali, residents of Timbuktu can call relatives living in the capital city of Bamako—or relatives in France. In Ghana, farmers in Tamale are able to send a text message to learn corn and tomato prices in Accra, over 400 kilometers away. In Niger, day laborers are able to call acquaintances in Benin to find out about job opportunities without making the US$40 trip. In Malawi, those affected by HIV and AIDS can receive text messages daily, reminding them to take their medicines on schedule. Citizens in countries as diverse as Kenya, Nigeria, and Mozambique are able to report violent confrontations via text message to a centralized server that is viewable, in real time, by the entire world.

1,170 citations


Journal ArticleDOI
TL;DR: This paper reports that only half of the difference in labor productivity across plants could be explained by differential inputs, such as capital intensity, and that while some productivity differences across firms and plants are temporary, they largely persist over time.
Abstract: Economists have long puzzled over the astounding differences in productivity between firms and countries. For example, looking at disaggregated data on U.S. manufacturing industries, Syverson (2004a) found that plants at the 90th percentile produced four times as much as plants at the 10th percentile on a per-employee basis. Only half of this difference in labor productivity could be accounted for by differential inputs, such as capital intensity. Syverson looked at industries defined at the four-digit level in the Standard Industrial Classification (SIC) system (now the North American Industry Classification System or NAICS) like 'Bakeries and Tortilla Manufacturing' or 'Plastics Product Manufacturing.' Foster, Haltiwanger, and Syverson (2008) show large differences in total factor productivity even within very homogeneous goods industries such as boxes and block ice. Some of these productivity differences across firms and plants are temporary, but in large part they persist over time. At the country level, Hall and Jones (1999) and Jones and Romer (2009) show how the stark differences in productivity across countries account for a substantial fraction of the differences in average per capita income. Both at the plant level and at the national level, differences in productivity are typically calculated as a residual—that is, productivity is inferred as the gap between output and inputs that cannot be accounted for by conventionally measured inputs.
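The residual calculation described above has a standard compact form. As a sketch, assuming a Cobb-Douglas technology with capital share alpha (the abstract itself does not commit to a functional form):

```latex
% Productivity as a residual (the Solow residual), assuming
% Cobb-Douglas output Y = A K^alpha L^(1-alpha):
\log A_{it} \;=\; \log Y_{it} \;-\; \alpha \log K_{it} \;-\; (1-\alpha)\,\log L_{it}
```

Whatever output variation the measured inputs K and L cannot explain is attributed to the productivity term A.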

1,169 citations


Journal ArticleDOI
TL;DR: The authors argue that the primary engine driving improvement has been a focus on the quality of empirical research designs, and that the advantages of a good research design are perhaps most easily apparent in research using random assignment.
Abstract: Since Edward Leamer's memorable 1983 paper, "Let's Take the Con out of Econometrics," empirical microeconomics has experienced a credibility revolution. While Leamer's suggested remedy, sensitivity analysis, has played a role in this, we argue that the primary engine driving improvement has been a focus on the quality of empirical research designs. The advantages of a good research design are perhaps most easily apparent in research using random assignment. We begin with an overview of Leamer's 1983 critique and his proposed remedies. We then turn to the key factors we see contributing to improved empirical work, including the availability of more and better data, along with advances in theoretical econometric understanding, but especially the fact that research design has moved front and center in much of empirical micro. We offer a brief digression into macroeconomics and industrial organization, where progress -- by our lights -- is less dramatic, although there is work in both fields that we find encouraging. Finally, we discuss the view that the design pendulum has swung too far. Critics of design-driven studies argue that in pursuit of clean and credible research designs, researchers seek good answers instead of good questions. We briefly respond to this concern, which worries us little.

876 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine descriptive and empirical evidence on the role of fair-value accounting for U.S. banks in the 2008 financial crisis and ask whether reporting losses under fair-value accounting created additional problems beyond the losses themselves.
Abstract: In its pure form, fair-value accounting involves reporting assets and liabilities on the balance sheet at fair value and recognizing changes in fair value as gains and losses in the income statement. When market prices are used to determine fair value, fair-value accounting is also called mark-to-market accounting. Some critics argue that fair-value accounting exacerbated the severity of the 2008 financial crisis. The main allegations are that fair-value accounting contributes to excessive leverage in boom periods and leads to excessive write-downs in busts. The write-downs due to falling market prices deplete bank capital and set off a downward spiral, as banks are forced to sell assets at "fire sale" prices, which in turn can lead to contagion as prices from asset fire sales of one bank become relevant for other banks. These arguments are often taken at face value, but evidence on problems created by fair-value accounting is rarely provided. We discuss these arguments and examine descriptive and empirical evidence that sheds light on the role of fair-value accounting for U.S. banks in the crisis. While large losses can clearly cause problems for banks and other financial institutions, the relevant question for our article is whether reporting these losses under fair-value accounting created additional problems. Similarly, it is clear that determining fair values for illiquid assets in a crisis is very difficult, but did reporting fair values of illiquid assets make matters worse? Would the market have reacted differently if banks had not reported their losses or used a different set of accounting rules?

545 citations


Journal ArticleDOI
TL;DR: In this article, the authors explore how the financial regulatory structure propelled three credit rating agencies (Moody's, Standard & Poor's (S&P), and Fitch) to the center of the U.S. bond markets, and how these ingredients combined to contribute to the subprime mortgage debacle and associated financial crisis.
Abstract: This paper will explore how the financial regulatory structure propelled three credit rating agencies -- Moody's, Standard & Poor's (S&P), and Fitch -- to the center of the U.S. bond markets -- and thereby virtually guaranteed that when these rating agencies did make mistakes, these mistakes would have serious consequences for the financial sector. We begin by looking at some relevant history of the industry, including the series of events that led financial regulators to outsource their judgments to the credit rating agencies (by requiring financial institutions to use the specific bond creditworthiness information that was provided by the major rating agencies) and when the credit rating agencies shifted their business model from "investor pays" to "issuer pays." We then look at how the credit rating industry evolved and how its interaction with regulatory authorities served as a barrier to entry. We then show how these ingredients combined to contribute to the subprime mortgage debacle and associated financial crisis. Finally, we consider two possible routes for public policy with respect to the credit rating industry: One route would tighten the regulation of the rating agencies, while the other route would reduce the required centrality of the rating agencies and thereby open up the bond information process in a way that has not been possible since the 1930s.

489 citations


Journal ArticleDOI
TL;DR: This paper finds that the response to competition differs for men and women and that, in the examined environment, the gender difference in competitive performance does not reflect the gender difference in noncompetitive performance; the authors argue that the competitive pressures associated with test taking may produce performances that do not reflect those of less-competitive settings.
Abstract: The mean and standard deviation in performance on math test scores are only slightly larger for males than for females. Despite minor differences in mean performance, many more boys than girls perform at the right tail of the distribution. This gender gap has been documented for a series of math tests including the AP calculus test, the mathematics SAT, and the quantitative portion of the Graduate Record Exam (GRE). The objective of this paper is not to discuss whether the mathematical skills of males and females differ, be it a result of nurture or nature. Rather we argue that the reported test scores do not necessarily match the gender differences in math skills. We will present results that suggest that the evidence of a large gender gap in mathematics performance at high percentiles in part may be explained by the differential manner in which men and women respond to competitive test-taking environments. The effects in mixed-sex settings range from women failing to perform well in competitions, to women shying away from environments in which they have to compete. We find that the response to competition differs for men and women, and in the examined environment, gender difference in competitive performance does not reflect the difference in noncompetitive performance. We argue that the competitive pressures associated with test taking may result in performances that do not reflect those of less-competitive settings. Of particular concern is that the distortion is likely to vary by gender and that it may cause gender differences in performance to be particularly large in mathematics and for the right tail of the performance distribution. Thus the gender gap in math test scores may exaggerate the math advantage of males over females.

473 citations


Journal ArticleDOI
TL;DR: In this article, the authors examine the mechanisms of credit default swaps in their straightforward use, providing insurance against the defaults of individual companies, before turning to how they were used to take positions on subprime mortgages.
Abstract: The focus of the paper is on how credit default swaps may have contributed to the credit crisis. The author reviews the mechanisms of credit default swaps in their straightforward use — providing insurance against the defaults of individual companies — before turning to how they were used to take positions on subprime mortgages. He examines the size and growth of the credit default swap market and then turns to arguments as to how credit default swaps may have contributed to the crisis.
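To make the "straightforward use" concrete, here is a minimal sketch of the cash flows of a single-name credit default swap (an illustration under simplifying assumptions: annual premiums, no accrued premium at default, no discounting; not code from the paper):

```python
def cds_cash_flows(notional, spread_bp, years, default_year=None, recovery=0.4):
    """Cash flows to a CDS protection BUYER (positive = buyer receives).

    The buyer pays an annual premium (notional * spread) until maturity or
    default; on a credit event the seller pays notional * (1 - recovery).
    """
    flows = []
    premium = notional * spread_bp / 10_000
    for year in range(1, years + 1):
        if year == default_year:
            flows.append(notional * (1 - recovery))  # payout on credit event
            break
        flows.append(-premium)                       # periodic premium leg
    return flows

# Buy 5 years of protection on $10 million at 150 basis points;
# the reference company defaults in year 3 with 40 percent recovery.
print(cds_cash_flows(10_000_000, 150, 5, default_year=3))
# -> [-150000.0, -150000.0, 6000000.0]
```

The same structure explains how the instrument was used to take positions on subprime mortgages: buying protection without owning the underlying bond is simply a bet that the reference credit will deteriorate.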

327 citations


Journal ArticleDOI
TL;DR: For instance, the authors found that teachers display considerable heterogeneity in their effects on student achievement gains and that the standard deviation across teachers in their impact on student academic achievement gains is on the order of 0.1 to 0.2 student-level standard deviations.
Abstract: Teaching may be the most-scrutinized occupation in the economy. Over the past four decades, empirical researchers—many of them economists—have accumulated an impressive amount of evidence on teachers: the heterogeneity in teacher productivity, the rise in productivity associated with teaching credentials and on-the-job experience, rates of turnover, the costs of recruitment, the relationship between supply and quality, the effect of class size, and the monetary value of academic achievement gains over a student's lifetime. Since the passage of the No Child Left Behind Act, along with a number of state-level educational initiatives, the data needed to estimate individual teacher performance based on student achievement gains have become more widely available. However, there have been relatively few efforts to examine the implications of this voluminous literature on teacher performance. In this paper, we ask what the existing evidence implies for how school leaders might recruit, evaluate, and retain teachers. We begin by summarizing the evidence on five key points, referring to existing work and to evidence we have accumulated from our research with the nation's two largest school districts: Los Angeles and New York City. First, teachers display considerable heterogeneity in their effects on student achievement gains. The standard deviation across teachers in their impact on student achievement gains is on the order of 0.1 to 0.2 student-level standard deviations, which would improve the

325 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss the three areas that are critical to all debt markets decisions: risk capital and risk aversion, financing repo and the risk premium, as well as counterparty risks.
Abstract: In this article the author discusses three areas that are critical to all debt market decisions: risk capital and risk aversion, repo financing and the risk premium, and counterparty risk. In each of these areas feedback effects can arise, so that lower liquidity and a higher cost of financing can reinforce each other in a contagious spiral. In conclusion, he briefly analyzes four steps that the Federal Reserve System took to ease the crisis and how each was geared to a specific systemic fault that arose during the crisis.

318 citations


Journal ArticleDOI
TL;DR: The authors argue that the complexity of macroeconomic interactions limits the knowledge we can ever attain, and that we need to place this fact at the center of our analysis, and seek analytical tools and macroeconomic policies that are robust to the enormous uncertainty to which we are confined.
Abstract: The recent financial crisis has damaged the reputation of macroeconomics, largely for its inability to predict the impending financial and economic crisis. To be honest, this inability to predict does not concern me much. It is almost tautological that severe crises are essentially unpredictable, for otherwise they would not cause such a high degree of distress. What does concern me about my discipline is that its current core—by which I mainly mean the so-called dynamic stochastic general equilibrium approach—has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one. This is dangerous for both methodological and policy reasons. To be fair to our field, an enormous amount of work at the intersection of macroeconomics and corporate finance has been chasing many of the issues that played a central role during the current crisis, including liquidity evaporation, collateral shortages, bubbles, crises, panics, fire sales, risk-shifting, contagion, and the like. However, much of this literature belongs to the periphery of macroeconomics rather than to its core. I will discuss the distinction between the core and the periphery of macroeconomics as well as the futile nature of the integrationist movement—that is, the process of gradually bringing the insights of the periphery into the dynamic stochastic general equilibrium structure. I argue that the complexity of macroeconomic interactions limits the knowledge we can ever attain, and that we need to place this fact at the center of our analysis. We should consider what this complexity does to the actions and reactions of the economic agent, and seek analytical tools and macroeconomic policies that are robust to the enormous uncertainty to which we are confined.

293 citations


Journal ArticleDOI
TL;DR: In this article, the author discusses why neither standard macroeconomic models that abstract from financial intermediation nor traditional models of the "bank lending channel" are adequate as a basis for understanding the recent crisis.
Abstract: a macroeconomic framework in which financial intermediation matters for the allocation of resources. In this paper, I first discuss why neither standard macroeconomic models that abstract from financial intermediation nor traditional models of the "bank lending channel" are adequate as a basis for understanding the recent crisis. I argue that instead we need models in which intermediation plays a crucial role, but in which intermediation is modeled in a way that better conforms to current institutional realities. In particular, we need models that recognize that a market-based financial system—one in which intermediaries fund themselves by selling securities in competitive markets, rather than collecting deposits subject to reserve requirements—is not the same as a frictionless system. I then sketch the basic elements of an approach that allows financial intermediation and credit frictions to be integrated into macroeconomic analysis in a straightforward way. I show how the model can be used to analyze the macroeconomic consequences of the recent financial crisis and conclude with a discussion of some implications of the model for the conduct of monetary policy.

Journal ArticleDOI
TL;DR: The National Flood Insurance Program (NFIP), as discussed by the authors, is one of the longest-standing government-run disaster insurance programs in the world and has evolved to cover $1.23 trillion in assets.
Abstract: Hurricane Betsy, which hit Louisiana September 9, 1965, was one of the most intense, deadly, and costly storms ever to make landfall in the United States: it killed 76 people in Louisiana and caused $1.5 billion in damage—equal to nearly $10 billion in 2010 dollars. In 1965, no flood insurance was available, so victims had to rely on friends and family, charities, or federal relief. After that catastrophe, the U.S. government established a new program in 1968—the National Flood Insurance Program (NFIP)—to make flood insurance widely available. Now, after more than 40 years of operation, the NFIP is today one of the longest standing government-run disaster insurance programs in the world. In this paper, I present an overview of the 40 years of operation of the National Flood Insurance Program, starting with how and why it was created and how it has evolved to now cover $1.23 trillion in assets. I analyze the financial balance of the NFIP between 1969 and 2008. Excluding the 2005 hurricane season (which included Hurricane Katrina) as an outlier, policyholders have paid nearly $11 billion more in premiums than they have received in claim reimbursements over that period. However, the program has spent an average of 40 percent of all collected premiums on administrative expenses, more than three quarters of which were paid to private insurance intermediaries who sell and manage flood insurance policies on behalf of the federal government but do not bear any risk. I present challenges the NFIP faces today and propose ways those challenges might be overcome through innovative modifications.

Journal ArticleDOI
TL;DR: A parsimonious quasi-rational model, "natural expectations," is presented that falls between rational expectations and (naive) intuitive expectations; it predicts that agents form beliefs that don't sufficiently account for the fact that good times (or bad times) won't last forever.
Abstract: In recent decades, research in economics and finance has largely focused on the rational actor model, in which economic agents process all available information perfectly. In contrast to an older perspective represented by Keynes and Pigou, the rational model rules out unjustified optimism or pessimism as an amplifying force for aggregate fluctuations. Consequently, the rational model struggles to explain some of the most prominent facts we observe in macroeconomics, such as large swings in asset prices, in other words “bubbles”, as well as credit cycles, investment cycles, and other mechanisms that contribute to the length and severity of economic contractions. Relaxing the assumption of perfect rationality is a potential way of explaining aggregate volatility. Unfortunately, economists lack a consensus on how to do this. Economists often point out that psychological concepts like Keynesian “animal spirits” (Keynes, 1936) are vague and potentially even untestable (for instance, Fama, 1998). If a sample of macroeconomists were forced to write down a formal model of animal spirits, most wouldn’t know where to start and the rest would produce models that had little in common. In contrast, the rational actor model is conceptually elegant, disciplined, and parsimonious. However, even the assumption of perfect rationality is not sufficient for modeling discipline. Creative assumptions about technology, preferences, information, and market frictions can offset the parsimony purchased with rational beliefs. If the methodological goal is modeling discipline, formal quasi-rational models with a small number of free parameters should also be serious contenders. The methodological litmus tests should be parsimony, portability, and explanatory power (Gabaix and Laibson, 2008 propose a list of seven properties of good models). Rational models are only one potential means to these ends. In this paper, we make the case that quasi-rational models deserve greater attention. We begin by discussing a large body of empirical evidence which suggests that beliefs systematically deviate from perfect rationality. Much of the evidence implies that economic agents tend to form forecasts that are excessively influenced by recent changes – in other words, some form of “extrapolation bias.” We then present a parsimonious quasi-rational model that we call natural expectations, which falls between rational expectations and (naive) intuitive expectations. Intuitive expectations are formed by running growth regressions with a limited number of right-hand-side variables. As we will see, this leads to excessively extrapolative beliefs in certain classes of environments. Next, we show empirically that many U.S. macroeconomic time series have hump-shaped dynamics -- in other words, they exhibit momentum in the short run and (partial) mean reversion in the long run. Natural expectations turn out to be sophisticated enough to capture the short-run momentum but fail to fully reflect the more subtle long-run mean reversion. Hence, when the true dynamics are hump-shaped, natural expectations overstate the long-run persistence of economic shocks. In other words, agents with natural expectations turn out to form beliefs that don’t sufficiently account for the fact that good times (or bad times) won’t last forever. Finally, we embed natural expectations in a simple dynamic macroeconomic model and compare the simulated properties of the model to the available empirical evidence. 
The model’s predictions match many patterns observed in macroeconomic and financial time series, such as high volatility of asset prices, predictable up-and-down cycles in equity returns, and a negative relationship between current consumption growth and future equity returns. We also discuss the model’s shortcomings, how these can be alleviated, and other potential directions for research in this area.
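The mechanism is easy to see in simulation. The sketch below (my illustration of the idea, not the authors' code) generates a series whose true impulse response is hump-shaped (momentum, then mean reversion) and fits the kind of one-lag growth regression the abstract calls intuitive expectations; the fitted forecaster treats shocks as far more persistent than they really are:

```python
import numpy as np

rng = np.random.default_rng(0)

# True process: stationary AR(2) in the LEVEL y_t, with coefficients chosen
# so the impulse response is hump-shaped: momentum first, reversion later.
phi1, phi2 = 1.4, -0.45          # autoregressive roots 0.9 and 0.5
T = 200_000
y = np.zeros(T)
eps = rng.standard_normal(T)
for t in range(2, T):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + eps[t]

# "Intuitive" expectations: a one-lag growth regression dy_t = rho * dy_{t-1},
# estimated by OLS on the simulated data.
dy = np.diff(y)
rho = (dy[1:] @ dy[:-1]) / (dy[:-1] @ dy[:-1])

# After a unit growth shock, this forecaster expects the level to settle
# 1/(1-rho) above baseline -- a permanent (even amplified) effect.
print(f"fitted growth persistence rho  = {rho:.2f}")   # roughly 0.4
print(f"believed long-run level effect = {1 / (1 - rho):.2f}")

# The truth: the level impulse response peaks and then reverts toward zero.
irf = [1.0, phi1]
for _ in range(50):
    irf.append(phi1 * irf[-1] + phi2 * irf[-2])
print(f"true IRF: peak = {max(irf):.2f}, after 50 periods = {irf[-1]:.2f}")
```

The one-lag model captures the short-run momentum (rho > 0) but misses the long-run mean reversion, so, exactly as the abstract says, beliefs overstate how long good or bad times last.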

Journal ArticleDOI
TL;DR: The authors provides an overview of the long-term impacts of the Columbian Exchange, i.e., the exchange of diseases, ideas, food crops, technologies, populations, and cultures between the New World and the Old World after Christopher Columbus' voyage to the Americas in 1492.
Abstract: This paper provides an overview of the long-term impacts of the Columbian Exchange -- that is, the exchange of diseases, ideas, food crops, technologies, populations, and cultures between the New World and the Old World after Christopher Columbus' voyage to the Americas in 1492. We focus on the aspects of the exchange that have been most neglected by economic studies; namely the transfer of diseases, food crops, and knowledge between the two Worlds. We pay particular attention to the effects of the exchange on the Old World.

Journal ArticleDOI
TL;DR: In this article, the authors make a stunningly good case for relying on either purposefully randomized or accidentally randomized experiments to relieve the doubts that afflict inferences from nonexperimental data.
Abstract: My first reaction to "The Credibility Revolution in Empirical Economics," authored by Joshua D. Angrist and Jorn-Steffen Pischke, was: Wow! This paper makes a stunningly good case for relying on purposefully randomized or accidentally randomized experiments to relieve the doubts that afflict inferences from nonexperimental data. On further reflection, I realized that I may have been overcome with irrational exuberance. Moreover, with this great honor bestowed on my "con" article, I couldn't easily throw this child of mine overboard. As Angrist and Pischke persuasively argue, either purposefully randomized experiments or accidentally randomized "natural" experiments can be extremely helpful, but Angrist and Pischke seem to me to overstate the potential benefits of the approach. I begin with some thoughts about the inevitable limits of randomization, and the need for sensitivity analysis in this area, as in all areas of applied empirical work. I argue that the recent financial catastrophe is a powerful illustration of the fact that extrapolating from natural experiments will inevitably be hazardous. I discuss how the difficulties of applied econometric work cannot be evaded with econometric innovations, offering as examples some under-recognized difficulties with instrumental variables and robust standard errors. I conclude with comments about the shortcomings of an experimentalist paradigm as applied to macroeconomics, and some warnings about the willingness of applied economists to apply push-button methodologies without sufficient hard thought regarding their applicability and shortcomings.

Journal ArticleDOI
Dani Rodrik
TL;DR: In the last 50 years, few branches of economics have wielded as much influence on the world of policy as development economics as mentioned in this paper, and many major development strategies are associated with some pioneering research that provided its intellectual underpinnings.
Abstract: Few branches of economics have wielded as much influence on the world of policy as development economics. Virtually every major development strategy of the last 50 years is associated with some pioneering research that provided its intellectual underpinnings. Consider some of the key milestones. The dominant import substitution policies of the 1950s and 1960s were the practical realization of the ideas of Prebisch (1959) and Singer (1964) and were based on the famous Prebisch-Singer thesis on the declining terms of trade for primary products and the dynamic benefits of manufacturing. The emphasis on development planning in those same decades was greatly influenced by Rosenstein-Rodan's (1943) "Big Push" framework, with its stress on increasing returns to scale and the need to kick-start growth through large-scale investments, and the planning model of Mahalanobis (1955), which argued that economic development could be accelerated by government encouragement of heavy industry. When such models were discarded in the 1980s in favor of more outward- and market-oriented strategies, it was in no small measure because of the research published during the 1970s by Balassa (1971), Bhagwati (1978), Krueger (1978), and Little, Scitovsky, and Scott (1970). The "Washington Consensus" of the 1990s, despite its appellation, represented the common views of a group of Latin American technocrats and policymakers, many of whom had trained at top economics departments in the United States. The influential "Human Development Reports" of the

Journal ArticleDOI
TL;DR: The authors used a new data source, American Mathematics Competitions, to examine the gender gap among high school students at very high achievement levels and found that there is a large gender gap that widens dramatically at percentiles above those that can be examined using standard data sources.
Abstract: This paper uses a new data source, American Mathematics Competitions, to examine the gender gap among high school students at very high achievement levels. The data bring out several new facts. There is a large gender gap that widens dramatically at percentiles above those that can be examined using standard data sources. An analysis of unobserved heterogeneity indicates that there is only moderate variation in the gender gap across schools. The highest achieving girls in the U.S. are concentrated in a very small set of elite schools, suggesting that almost all girls with the ability to reach high math achievement levels are not doing so.

Journal ArticleDOI
Derek Neal
TL;DR: In this article, a method for deriving context-specific measures of school performance is discussed, where a percentile performance index tells public officials how often the students in a particular school or classroom perform better than students in other schools who began the year in similar circumstances with respect to their prior achievements, the compositions of their classmates, and their family backgrounds.
Abstract: The No Child Left Behind law is flawed for many reasons, but the most important is that it is built around proficiency targets. Proficiency rates are not useful metrics of school performance because universal proficiency is not a socially efficient goal for principals and teachers. Further, the variation in proficiency rates among schools reflects, in large part, interschool differences in student background characteristics. The designers of accountability systems must move away from systems designed around a one-size-fits-all standard and begin designing systems that organize and promote competition among schools. Well-organized competition among schools is the best vehicle for making sure that schools use public funds efficiently. If education officials pursue this paradigm, they must develop relative performance measures that assess the outcomes of these contests while making reasonable allowance for differences in student populations served by public schools. I will discuss a method for deriving context-specific measures of school performance. A percentile performance index tells public officials how often the students in a particular school or classroom perform better than students in other schools who began the year in similar circumstances with respect to their prior achievements, the compositions of their classmates, and their family backgrounds. This index of relative performance provides the information policymakers need to make preliminary judgments concerning when to reorganize a given school and give a new staff the opportunity to prove they can do better.
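A stripped-down version of the index described above can be computed in a few lines. The sketch below is an illustration under simplifying assumptions (it conditions only on prior achievement, not on peer composition or family background, and pools all schools into the comparison set), not Neal's exact estimator:

```python
import numpy as np
import pandas as pd

def percentile_performance_index(df: pd.DataFrame, n_bins: int = 10) -> pd.Series:
    """Estimate, for each school, how often its students outperform students
    elsewhere who began the year with similar prior achievement.

    Expects columns: school, prior_score, end_score.
    """
    df = df.copy()
    # Bin students by prior achievement so comparisons are like-for-like.
    df["bin"] = pd.qcut(df["prior_score"], n_bins, labels=False)
    # A student's percentile rank within the bin approximates the chance of
    # beating a randomly drawn comparison student with a similar start.
    df["win_rate"] = df.groupby("bin")["end_score"].rank(pct=True)
    return df.groupby("school")["win_rate"].mean()

# Synthetic check: school B adds more value at every prior-achievement level.
rng = np.random.default_rng(1)
n = 4_000
prior = rng.normal(size=n)
school = rng.choice(["A", "B"], size=n)
end = 0.8 * prior + np.where(school == "B", 0.3, 0.0) + rng.normal(0, 0.5, n)
data = pd.DataFrame({"school": school, "prior_score": prior, "end_score": end})
print(percentile_performance_index(data))  # B should come out above 0.5
```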

Journal ArticleDOI
TL;DR: In this paper, an analysis of the commercial paper market during the financial crisis is presented, and the most important developments during the crisis of 2007-2009 are discussed. And three explanations of the decline in the Commercial Paper market are discussed: substitution to alternative sources of financing by commercial paper issuers, adverse selection, and institutional constraints among money market funds.
Abstract: Commercial paper is a short-term debt instrument issued by large corporations. The commercial paper market has long been viewed as a bastion of high liquidity and low risk. But twice during the financial crisis of 2007-2009, the commercial paper market nearly dried up and ceased being perceived as a safe haven. Major interventions by the Federal Reserve, including large outright purchases of commercial paper, were eventually used to support both issuers of and investors in commercial paper. We will offer an analysis of the commercial paper market during the financial crisis. First, we describe the institutional background of the commercial paper market. Second, we analyze the supply and demand sides of the market. Third, we examine the most important developments during the crisis of 2007-2009. Last, we discuss three explanations of the decline in the commercial paper market: substitution to alternative sources of financing by commercial paper issuers, adverse selection, and institutional constraints among money market funds.

Journal ArticleDOI
TL;DR: Hall presents a simple macro model of the effect of a financial crisis on output and employment, showing that realistic increases in financial frictions of the kind that occurred in late 2008 generate declines in real GDP and employment of the magnitude that occurred.
Abstract: The worst financial crisis in the history of the United States and many other countries started in 1929. The Great Depression followed. The second-worst struck in the fall of 2008 and the Great Recession followed. Commentators have dwelt endlessly on the causes of these and other deep financial collapses. Less conspicuous has been the macroeconomists' concern about why output and employment collapse after a financial crisis and remain at low levels for several or many years after the crisis. This article pursues modern answers to that question. It focuses on events in the United States since 2008. Existing macroeconomic models account successfully for the immediate effects of a financial crisis on output and employment. I will lay out a simple macro model that captures the most important features of modern models and show that realistic increases in financial frictions that occurred in the crisis of late 2008 will generate declines in real GDP and employment of the magnitude that occurred. But this model cannot explain why GDP and employment failed to recover once the financial crisis subsided—the model implies a recovery as soon as financial frictions return to normal. At the end of the article I will mention the ideas that are in play to explain the persistent adverse effects of temporary crises, but these ideas have not made their way into the mainstream model. This article cites only a few of the many important contributions to the mainstream model. My paper Hall (2009) contains many citations and the forthcoming new volume of the Handbook of Monetary Economics discusses the literature fully.

Journal ArticleDOI
TL;DR: In this paper, the authors address the criticism of structural analysis and its use in industrial organization, and consider why empirical analysis in industrial organizations differs in such striking ways from that in fields such as labor, which have recently emphasized the methods favored by Angrist and Pischke.
Abstract: Without a doubt, there has been a "credibility revolution" in applied econometrics. One contributing development has been in the improvement and increased use in data analysis of "structural methods"; that is, the use of models based in economic theory. Structural modeling attempts to use data to identify the parameters of an underlying economic model, based on models of individual choice or aggregate relations derived from them. Structural estimation has a long tradition in economics, but better and larger data sets, more powerful computers, improved modeling methods, faster computational techniques, and new econometric methods such as those mentioned above have allowed researchers to make significant improvements. While Angrist and Pischke extol the successes of empirical work that estimates "treatment effects" based on actual or quasi-experiments, they are much less sanguine about structural analysis and hold industrial organization up as an example where "progress is less dramatic." Indeed, reading their article one comes away with the impression that there is only a single way to conduct credible empirical analysis. This seems to us a very narrow and dogmatic approach to empirical work; credible analysis can come in many guises, both structural and nonstructural, and for some questions structural analysis offers important advantages. In this comment, we address the criticism of structural analysis and its use in industrial organization, and consider why empirical analysis in industrial organization differs in such striking ways from that in fields such as labor, which have recently emphasized the methods favored by Angrist and Pischke.

Journal ArticleDOI
TL;DR: For example, the authors found that male-female ratios of students scoring in the high ranges of standardized tests vary significantly across the United States, and that states where males are highly overrepresented in the top math and science scores also tend to be states where women are highly underrepresented in reading scores.
Abstract: The causes and consequences of gender disparities in standardized test scores -- especially in the high tails of achievement -- have been a topic of heated debate. The existing evidence on standardized test scores largely confirms the prevailing stereotypes that more men than women excel in math and science while more women than men excel in tests of language and reading. We provide a new perspective on this gender gap in test scores by analyzing the variation in these disparities across geographic areas. We illustrate that male-female ratios of students scoring in the high ranges of standardized tests vary significantly across the United States. This variation is systematic in several important ways. In particular, states where males are highly overrepresented in the top math and science scores also tend to be states where women are highly overrepresented in the top reading scores. This pattern suggests that states vary in their adherence to stereotypical gender performance, rather than favoring one sex over the other across all subjects. Furthermore, since the genetic distinction and the hormonal differences between sexes that might affect early cognitive development (that is, innate abilities) are likely the same regardless of the state in which a person happens to be born, the variation we find speaks to the nature-versus-nurture debates surrounding test scores and suggests environments significantly impact gender disparities in test scores.
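The headline measurement is simple to compute from student-level microdata. A minimal sketch of the measurement (my illustration, with hypothetical column names, not the authors' code):

```python
import numpy as np
import pandas as pd

def top_tail_mf_ratio(scores: pd.DataFrame, cutoff: float = 0.95) -> pd.Series:
    """Male-female ratio among students above the `cutoff` quantile of the
    score distribution, computed separately by state.

    Expects columns: state, sex ('M' or 'F'), score.
    """
    threshold = scores["score"].quantile(cutoff)
    top = scores[scores["score"] >= threshold]
    counts = top.groupby(["state", "sex"]).size().unstack(fill_value=0)
    return counts["M"] / counts["F"].replace(0, np.nan)  # NaN if no females

# Synthetic check: equal means but fatter male tails produce M/F ratios
# above 1 at the 95th percentile in every state.
rng = np.random.default_rng(7)
n = 60_000
df = pd.DataFrame({
    "state": rng.choice(["MN", "NC", "WI"], size=n),
    "sex": rng.choice(["M", "F"], size=n),
    "score": rng.normal(size=n),
})
df.loc[df["sex"] == "M", "score"] *= 1.1  # greater male variance
print(top_tail_mf_ratio(df))
```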

Journal ArticleDOI
TL;DR: The author assesses the 2007-2009 U.S. recession using neoclassical business cycle theory and finds that lower labor input accounts for virtually all of the decline in income and output in the United States.
Abstract: This paper assesses the 2007–2009 recession using neoclassical business cycle theory. I find that the 2007–2009 U.S. recession differs substantially from other postwar U.S. recessions, and also from the 2008 recession in other countries, in that lower labor input accounts for virtually all of the decline in income and output in the United States, while lower productivity accounts for much of other U.S. recessions and the 2007–2009 recession in other countries. I also find that existing classes of models, including financial market imperfections models, do not explain the U.S. recession. This is because the 2007–2009 recession is almost exclusively related to what appear to be labor market distortions that drive a wedge between the marginal product of labor and the marginal rate of substitution between consumption and leisure, a topic about which current classes of financial imperfection models are largely silent. I discuss future avenues for developing this class of models, and I consider alterna...
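The wedge the author describes has a standard formal definition. As a sketch in conventional notation (my notation, not necessarily the paper's), with period utility u(c, l) over consumption c and leisure l = 1 - n, and production y = F(k, n), the labor wedge is the implicit tax that reconciles household and firm optimality:

```latex
% Labor wedge: the gap between the marginal rate of substitution between
% consumption and leisure (MRS) and the marginal product of labor (MPL).
(1 - \tau_t)\, \underbrace{F_n(k_t, n_t)}_{\text{MPL}}
  \;=\; \underbrace{\frac{u_l(c_t, l_t)}{u_c(c_t, l_t)}}_{\text{MRS}},
\qquad \tau_t = 0 \ \text{in a frictionless economy}.
```

A large measured wedge in 2007-2009, with little movement in productivity, is what the abstract means by a recession driven by labor market distortions.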

Journal ArticleDOI
TL;DR: The authors argue that forces such as changes in the structure of employer-provided pensions and Social Security are likely to propel future increases in labor force participation at older ages, adding to the gains to date driven by the shift in the skill composition of the workforce and technological change.
Abstract: Population aging is not a looming crisis of the future—it is already here. Populations age when life expectancy rises and fertility declines. Economic challenges arise when the increase in people surviving to old age and the decline in the number of young people alive to support them cause the growth in society’s consumption needs to outpace growth in its productive capacity. The ultimate impact of population aging on our standard of living in the future depends a great deal on how long people choose to work before they retire from the labor force. Here, there is reason for optimism. The end of the twentieth century witnessed a profound change in retirement behavior. For over a century, the labor force participation rate of men over age 65 declined—falling steadily from 75 percent in the late 1800s to just 16 percent in 1990 (Moen, 1994; Costa, 1998). At the end of the twentieth century, however, the labor force participation rate of older men began to rise (Quinn, 2002). The labor force participation rate of older women rose as well, following a remarkable increase in labor force participation among younger women over many decades. A constellation of forces, some just now gaining momentum, has raised labor force participation at older ages at just the time it is needed. Age-related health declines and the reluctance of employers to hire and retain older workers present challenges, but the outlook for future gains in labor force participation at older ages is promising. The labor market is accommodating older workers to some degree, and older men and women are themselves adapting on a number of fronts, which could substantially lessen the economic impact of population aging. The paper begins by documenting the striking shift in the population age distribution well under way, the slowdown in labor force growth, and the corresponding rise in the economic dependency ratio, which can be viewed as a measure of society’s consumption needs relative to its productive capacity. We document the historic turnaround in labor force participation and show how future increases in participation could significantly dampen the rise in the economic dependency ratio. In the second section, we turn to the most important factors behind the increase in labor force participation realized to date: the shift in the skill composition of the workforce, and technological change. In the third section, we argue that forces such as changes in the structure of employer-provided pensions and Social Security are likely to propel future increases in labor force participation at older ages. The fourth section illustrates the diversity of adaptations already at play in the labor market as older men and women seek to extend their working lives. Finally, we discuss the relatively less dramatic population aging in the United States compared to other high-income countries and how the United States is better poised than many countries to attain further gains in labor force participation at older ages.
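For concreteness, one simple version of the economic dependency ratio mentioned above (an unweighted illustration; the paper's construction may weight consumers and workers by age-specific consumption needs and productivity):

```latex
% Economic dependency ratio: consumption needs relative to productive
% capacity, proxied here by non-workers per member of the labor force.
\text{EDR}_t \;=\; \frac{\text{total population}_t - \text{labor force}_t}{\text{labor force}_t}
```

Rising labor force participation at older ages lowers the numerator and raises the denominator at the same time, which is why it can significantly dampen the ratio's rise.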

Journal ArticleDOI
TL;DR: The best known single result from the article is "Engel's law" as mentioned in this paper, which states that the poorer a family is, the larger the budget share it spends on nourishment.
Abstract: Engel curves describe how household expenditure on particular goods or services depends on household income. German statistician Ernst Engel (1821-1896) was the first to investigate this relationship systematically in an article published about 150 years ago. The best-known single result from the article is "Engel's law," which states that the poorer a family is, the larger the budget share it spends on nourishment. We revisit Engel's article, including its context and the mechanics of the argument. Because the article was completed a few decades before linear regression techniques were established and income effects were incorporated into standard consumer theory, Engel was forced to develop his own approach to analyzing household expenditure patterns. We find his work contains some interesting features in juxtaposition to both the modern and classical literature. For example, Engel's way of estimating the expenditure-income relationship resembles a data-fitting technique called the "regressogram" that is nonparametric -- in that no functional form is specified before the estimation. Moreover, Engel introduced a way of categorizing household expenditures in which expenditures on commodities that served the same purpose by satisfying the same underlying "want" were grouped together. This procedure enabled Engel to discuss the welfare implications of his results in terms of the Smithian notion that individual welfare is related to the satisfaction of wants. At the same time, he avoided making a priori assumptions about which specific goods were necessities, assumptions which were made by many classical economists like Adam Smith. Finally, we offer a few thoughts about some modern literature that builds on Engel's research.
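The regressogram mentioned above is easy to reproduce. A minimal sketch on synthetic data (an illustration of the technique, not Engel's actual figures):

```python
import numpy as np

def regressogram(x, y, n_bins=8):
    """Nonparametric fit: partition x into quantile bins and return the mean
    of y within each bin -- no functional form is specified in advance."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    which = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    means = np.array([y[which == b].mean() for b in range(n_bins)])
    return centers, means

# Synthetic households obeying Engel's law: the food budget share falls
# as household income rises.
rng = np.random.default_rng(42)
income = rng.lognormal(mean=10, sigma=0.5, size=5_000)
food_share = 1.0 - 0.08 * np.log(income) + rng.normal(0, 0.03, size=5_000)

for inc, share in zip(*regressogram(income, food_share)):
    print(f"income ~ {inc:9.0f}: mean food budget share = {share:.2f}")
```

Binning and averaging, rather than fitting a line, is essentially what Engel did decades before linear regression techniques were established.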

Journal ArticleDOI
TL;DR: This article reviews the recent evolution of thinking and evidence regarding the effectiveness of activist fiscal policy, covering the debate about traditional types of fiscal policy interventions, such as broad-based tax cuts and spending increases, as well as more targeted policies.
Abstract: During and after the "Great Recession" that began in December 2007, the U.S. federal government enacted several rounds of activist fiscal policy. In this paper, we review the recent evolution of thinking and evidence regarding the effectiveness of activist fiscal policy. Although fiscal interventions aimed at stimulating and stabilizing the economy have returned to common use, their efficacy remains controversial. We review the debate about the traditional types of fiscal policy interventions, such as broad-based tax cuts and spending increases, as well as more targeted policies. While there have been improvements in estimates of the effects of broad-based policies, much of what has been learned recently concerns how such multipliers might vary with respect to economic conditions, such as the credit market disruptions and very low interest rates that were central features of the Great Recession. The eclectic and innovative interventions by the Federal Reserve and other central banks during this period highlight the imprecise divisions between monetary and fiscal policy and the many channels through which fiscal policies can be implemented.

Journal ArticleDOI
TL;DR: The standard policy tools for treating the social costs of bank failures, as discussed by the authors, include regulatory supervision and risk-based capital requirements to reduce the chance of a solvency-threatening loss of capital; deposit insurance to blunt depositors' incentive to race each other to withdraw; and regulatory resolution mechanisms, which give authorities the power to efficiently restructure or liquidate a bank.
Abstract: A bank is conventionally viewed as an intermediary between depositors, who desire short-term liquidity, and borrowers, who seek project financing. Occasionally, perhaps from an unexpected surge in the cash withdrawals of depositors or from a shock to the ability of borrowers to repay their loans, depositors may become concerned over the bank's solvency. Depositors may then "run," accelerating or worsening the bank's failure. The standard policy tools for treating the social costs of bank failures include regulatory supervision and risk-based capital requirements to reduce the chance of a solvency-threatening loss of capital; deposit insurance to reduce the incentives of individual depositors to trigger cash insolvency by racing each other to withdraw their deposits; and regulatory resolution mechanisms, which give authorities the power to efficiently restructure or liquidate a bank. During the recent financial crisis, major dealer banks—that is, banks that

Journal ArticleDOI
TL;DR: The author argues that economics is not an experimental science and cannot be, and that the essay by Angrist and Pischke, in its enthusiasm for some real accomplishments in certain subfields of economics, makes overbroad claims for its favored methodologies.
Abstract: The fact is, economics is not an experimental science and cannot be. "Natural" experiments and "quasi" experiments are not in fact experiments. They are rhetorical devices that are often invoked to avoid having to confront real econometric difficulties. Natural, quasi-, and computational experiments, as well as regression discontinuity design, can all, when well applied, be useful, but none are panaceas. The essay by Angrist and Pischke, in its enthusiasm for some real accomplishments in certain subfields of economics, makes overbroad claims for its favored methodologies. What the essay says about macroeconomics is mainly nonsense. Consequently, I devote the central part of my comment to describing the main developments that have helped take some of the con out of macroeconomics. Recent enthusiasm for single-equation, linear, instrumental variables approaches in applied microeconomics has led many in these fields to avoid undertaking research that would require them to think formally and carefully about the central issues of nonexperimental inference -- what I see and many see as the core of econometrics. Providing empirically grounded policy advice necessarily involves confronting these difficult central issues.

Journal ArticleDOI
TL;DR: This article discusses the role of economic theory in empirical work in development economics, with special emphasis on general equilibrium and political economy considerations; the author argues that economic theory plays a central role in formulating models whose estimates can be used for counterfactual and policy analysis.
Abstract: I discuss the role of economic theory in empirical work in development economics with special emphasis on general equilibrium and political economy considerations. I argue that economic theory plays (should play) a central role in formulating models, estimates of which can be used for counterfactual and policy analysis. I discuss why counterfactual analysis based on microdata that ignores general equilibrium and political economy issues may lead to misleading conclusions. I illustrate the main arguments using examples from recent work in development economics and political economy.

Journal ArticleDOI
TL;DR: The role of the American Economic Association (AEA) Ad Hoc Committee on the Job Market in the market for new Ph.D. economists has been discussed in this paper.
Abstract: This paper, written by the members of the American Economic Association (AEA) Ad Hoc Committee on the Job Market, provides an overview of the market for new Ph.D. economists. It describes the role of the AEA in the market and focuses in particular on two mechanisms adopted in recent years at the suggestion of our Committee. First, job market applicants now have a signaling service to send an expression of special interest to up to two employers prior to interviews at the January Allied Social Science Associations (ASSA) meetings. Second, the AEA now invites candidates who are still on the market, and employers whose positions are still vacant, to participate in a web-based "scramble" to reduce search costs and thicken the late part of the job market. We present statistics on the activity in these market mechanisms and present survey evidence that both mechanisms have facilitated matches. The paper concludes by discussing the emergence of platforms for transmitting job market information and other design issues that may arise in the market for new economists.