
Showing papers by "National Bureau of Economic Research published in 2017"


Journal ArticleDOI
30 Jun 2017-Science
TL;DR: This article developed a flexible architecture for computing damages that integrates climate science, econometric analyses, and process models, and used this approach to construct spatially explicit, probabilistic, and empirically derived estimates of economic damage in the United States from climate change.
Abstract: Estimates of climate change damage are central to the design of climate policies. Here, we develop a flexible architecture for computing damages that integrates climate science, econometric analyses, and process models. We use this approach to construct spatially explicit, probabilistic, and empirically derived estimates of economic damage in the United States from climate change. The combined value of market and nonmarket damage across analyzed sectors-agriculture, crime, coastal storms, energy, human mortality, and labor-increases quadratically in global mean temperature, costing roughly 1.2% of gross domestic product per +1°C on average. Importantly, risk is distributed unequally across locations, generating a large transfer of value northward and westward that increases economic inequality. By the late 21st century, the poorest third of counties are projected to experience damages between 2 and 20% of county income (90% chance) under business-as-usual emissions (Representative Concentration Pathway 8.5).
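The quadratic damage schedule summarized above can be sketched with a toy calculation. The coefficient below is a hypothetical calibration (not a number from the paper), chosen so that the average marginal damage over a 0-4°C warming range equals the quoted 1.2% of GDP per +1°C:

```python
def damages_pct_gdp(delta_t, c=0.3):
    """Illustrative quadratic damage function D(T) = c * T^2, in percent
    of GDP lost at delta_t degrees C of global mean warming.

    c = 0.3 is a hypothetical calibration: averaged over 0-4 degC of
    warming, marginal damage is then c * 4 = 1.2% of GDP per +1 degC,
    matching the figure quoted in the abstract."""
    return c * delta_t ** 2

# Average marginal damage over a 0-4 degC warming range:
avg_marginal = (damages_pct_gdp(4.0) - damages_pct_gdp(0.0)) / 4.0
```

Because the function is convex, damages per degree grow as warming accumulates, which is what "increases quadratically in global mean temperature" implies.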

621 citations


Posted Content
TL;DR: In this paper, the authors analyzed micro panel data from the U.S. Economic Census since 1982 and international sources and documented empirical patterns to assess a new interpretation of the fall in the labor share based on the rise of "superstar firms."
Abstract: The fall of labor's share of GDP in the United States and many other countries in recent decades is well documented but its causes remain uncertain. Existing empirical assessments of trends in labor's share typically have relied on industry or macro data, obscuring heterogeneity among firms. In this paper, we analyze micro panel data from the U.S. Economic Census since 1982 and international sources and document empirical patterns to assess a new interpretation of the fall in the labor share based on the rise of "superstar firms." If globalization or technological changes advantage the most productive firms in each industry, product market concentration will rise as industries become increasingly dominated by superstar firms with high profits and a low share of labor in firm value-added and sales. As the importance of superstar firms increases, the aggregate labor share will tend to fall. Our hypothesis offers several testable predictions: industry sales will increasingly concentrate in a small number of firms; industries where concentration rises most will have the largest declines in the labor share; the fall in the labor share will be driven largely by between-firm reallocation rather than (primarily) a fall in the unweighted mean labor share within firms; the between-firm reallocation component of the fall in the labor share will be greatest in the sectors with the largest increases in market concentration; and finally, such patterns will be observed not only in U.S. firms, but also internationally. We find support for all of these predictions.
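The between-firm reallocation mechanism can be illustrated with a standard Olley-Pakes-style decomposition of the aggregate labor share. The three-firm numbers below are invented for illustration, and the paper's own decomposition differs in details:

```python
def op_decompose(weights, shares):
    """Olley-Pakes-style decomposition: with value-added weights summing
    to one, the aggregate (weighted) labor share equals the unweighted
    mean firm share plus a between-firm covariance (reallocation) term."""
    n = len(shares)
    s_bar = sum(shares) / n
    w_bar = sum(weights) / n
    cov = sum((w - w_bar) * (s - s_bar) for w, s in zip(weights, shares))
    return s_bar, cov

# Hypothetical industry: firm-level labor shares never change, but the
# low-labor-share "superstar" firm gains value-added weight.
shares   = [0.70, 0.65, 0.30]
w_before = [0.40, 0.40, 0.20]
w_after  = [0.25, 0.25, 0.50]

agg_before = sum(w * s for w, s in zip(w_before, shares))
agg_after  = sum(w * s for w, s in zip(w_after, shares))
```

In this toy case the aggregate labor share falls from 0.60 to about 0.49 purely through the covariance term, with the unweighted mean share flat, mirroring the paper's prediction that reallocation rather than within-firm declines drives the fall.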

587 citations


Journal ArticleDOI
22 Dec 2017-Science
TL;DR: Although parts of many jobs may be “suitable for ML” (SML), other tasks within these same jobs do not fit the criteria for ML well; hence, effects on employment are more complex than the simple replacement and substitution story emphasized by some.
Abstract: Digital computers have transformed work in almost every sector of the economy over the past several decades ( 1 ). We are now at the beginning of an even larger and more rapid transformation due to recent advances in machine learning (ML), which is capable of accelerating the pace of automation itself. However, although it is clear that ML is a “general purpose technology,” like the steam engine and electricity, which spawns a plethora of additional innovations and capabilities ( 2 ), there is no widely shared agreement on the tasks where ML systems excel, and thus little agreement on the specific expected impacts on the workforce and on the economy more broadly. We discuss what we see to be key implications for the workforce, drawing on our rubric of what the current generation of ML systems can and cannot do [see the supplementary materials (SM)]. Although parts of many jobs may be “suitable for ML” (SML), other tasks within these same jobs do not fit the criteria for ML well; hence, effects on employment are more complex than the simple replacement and substitution story emphasized by some. Although economic effects of ML are relatively limited today, and we are not facing the imminent “end of work” as is sometimes proclaimed, the implications for the economy and the workforce going forward are profound.

445 citations


Journal ArticleDOI
TL;DR: In this paper, a task-based framework is proposed to characterize the equilibrium in a dynamic setting where tasks previously performed by labor can be automated and more complex versions of existing tasks, in which labor has a comparative advantage, can be created.
Abstract: The advent of automation and the simultaneous decline in the labor share and employment among advanced economies raise concerns that labor will be marginalized and made redundant by new technologies. We examine this proposition using a task-based framework in which tasks previously performed by labor can be automated and more complex versions of existing tasks, in which labor has a comparative advantage, can be created. We characterize the equilibrium in this model and establish how the available technologies and the choices of firms between producing with capital or labor determine factor prices and the allocation of factors to tasks. In a static version of our model where capital is fixed and technology is exogenous, automation reduces employment and the share of labor in national income and may even reduce wages, while the creation of more complex tasks has the opposite effects. Our full model endogenizes capital accumulation and the direction of research towards automation and the creation of new complex tasks. Under reasonable conditions, there exists a stable balanced growth path in which the two types of innovations go hand-in-hand. An increase in automation reduces the cost of producing using labor, and thus discourages further automation and encourages the faster creation of new complex tasks. The endogenous response of technology restores the labor share and employment back to their initial level. Although the economy contains powerful self-correcting forces, the equilibrium generates too much automation. Finally, we extend the model to include workers of different skills. We find that inequality increases during transitions, but the self-correcting forces in our model also limit the increase in inequality over the long run.

443 citations


Posted Content
TL;DR: The authors summarize the major themes and contributions driving the empirical literature since their 2011 reviews, and try to interpret the literature in light of an overarching conceptual framework about how human capital is produced early in life.
Abstract: That prenatal events can have life-long consequences is now well established. Nevertheless, research on the Fetal Origins Hypothesis is flourishing and has expanded to include the early childhood (postnatal) environment. Why does this literature have a "second act?" We summarize the major themes and contributions driving the empirical literature since our 2011 reviews, and try to interpret the literature in light of an overarching conceptual framework about how human capital is produced early in life. One major finding is that relatively mild shocks in early life can have substantial negative impacts, but that the effects are often heterogeneous, reflecting differences in child endowments, budget constraints, and production technologies. Moreover, shocks, investments, and interventions can interact in complex ways that are only beginning to be understood. Many advances in our knowledge are due to increasing accessibility of comprehensive administrative data that allow events in early life to be linked to long-term outcomes. Yet, we still know relatively little about the interval in between, and thus about whether it would be feasible to identify and intervene with affected individuals at some point between early life and adulthood. We do know enough, however, to be able to identify some interventions that hold promise for improving child outcomes in early life and throughout the life course. Institutional subscribers to the NBER working paper series, and residents of developing countries may download this paper without additional charge at www.nber.org.

427 citations


Posted Content
TL;DR: In this article, the authors document the evolution of markups based on firm-level data for the US economy since 1950 and evaluate the macroeconomic implications of an increase in average market power, which can account for a number of secular trends.
Abstract: We document the evolution of markups based on firm-level data for the US economy since 1950. Initially, markups are stable, even slightly decreasing. In 1980, average markups start to rise from 18% above marginal cost to 67% now. There is no strong pattern across industries, though markups tend to be higher, across all sectors of the economy, in smaller firms, and most of the increase is due to an increase within industry. We do see a notable change in the distribution of markups, with the increase exclusively due to a sharp increase in high-markup firms. We then evaluate the macroeconomic implications of an increase in average market power, which can account for a number of secular trends in the last three decades: (1) a decrease in the labor share; (2) an increase in the capital share; (3) a decrease in low-skill wages; (4) a decrease in labor force participation; (5) a decrease in labor flows; (6) a decrease in migration rates; and (7) a slowdown in aggregate output.
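As an arithmetic illustration of how a fattening upper tail can move the average markup while the typical firm is unchanged, the five hypothetical markups below (invented numbers, not the paper's data) reproduce the quoted aggregate movement from 18% to 67% above marginal cost:

```python
def avg_markup(markups):
    """Unweighted average markup, where each markup is price divided
    by marginal cost (1.18 means 18% above marginal cost)."""
    return sum(markups) / len(markups)

# Hypothetical five-firm economy: the median firm's markup is flat at
# 1.18, but the upper tail rises sharply, pulling the average from
# 1.18 to 1.67 - the pattern of a sharp increase in high-markup firms.
m_1980 = [1.08, 1.12, 1.18, 1.22, 1.30]
m_now  = [1.08, 1.12, 1.18, 1.45, 3.52]
```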

379 citations


Journal ArticleDOI
TL;DR: The authors examined the role of uncertainty shocks in a one-sector, representative-agent dynamic stochastic general-equilibrium model and found that increased uncertainty about the future may indeed have played a significant role in worsening the Great Recession.
Abstract: This paper examines the role of uncertainty shocks in a one-sector, representative-agent dynamic stochastic general-equilibrium model. When prices are flexible, uncertainty shocks are not capable of producing business-cycle comovements among key macro variables. With countercyclical markups through sticky prices, however, uncertainty shocks can generate fluctuations that are consistent with business cycles. Monetary policy usually plays a key role in offsetting the negative impact of uncertainty shocks. If the central bank is constrained by the zero lower bound, then monetary policy can no longer perform its usual stabilizing function and higher uncertainty has even more negative effects on the economy. Calibrating the size of uncertainty shocks using fluctuations in the VIX, we find that increased uncertainty about the future may indeed have played a significant role in worsening the Great Recession, which is consistent with statements by policymakers, economists, and the financial press.

379 citations


Journal ArticleDOI
TL;DR: The authors showed that when the Fed funds rate increases, banks widen the interest spreads they charge on deposits and deposits flow out of the banking system, with both effects stronger in areas with less deposit competition.
Abstract: We propose and test a new channel for the transmission of monetary policy. We show that when the Fed funds rate increases, banks widen the interest spreads they charge on deposits, and deposits flow out of the banking system. We present a model in which imperfect competition among banks gives rise to these relationships. An increase in the nominal interest rate increases banks' effective market power, inducing them to increase deposit spreads. Households respond by substituting away from deposits into less liquid but higher-yielding assets. Using branch-level data on all U.S. banks, we show that following an increase in the Fed funds rate, deposit spreads increase by more, and deposit supply falls by more, in areas with less deposit competition. We control for changes in banks' lending opportunities by comparing branches of the same bank. We control for changes in macroeconomic conditions by showing that deposit spreads widen immediately after a rate change, even if it is fully expected. Our results imply that monetary policy has a significant impact on how the financial system is funded, on the quantity of safe and liquid assets it produces, and on its lending to the real economy.

357 citations


Journal ArticleDOI
TL;DR: This article found that an increase in the household debt to GDP ratio predicts lower subsequent GDP growth and higher unemployment in an unbalanced panel of 30 countries from 1960 to 2012, and uncovered a global household debt cycle that partly predicts the severity of the global growth slowdown after 2007.
Abstract: An increase in the household debt to GDP ratio predicts lower subsequent GDP growth and higher unemployment in an unbalanced panel of 30 countries from 1960 to 2012. Low mortgage spreads are associated with an increase in the household debt to GDP ratio and a decline in subsequent GDP growth, highlighting the importance of credit supply shocks. Economic forecasters systematically over-predict GDP growth at the end of household debt booms, suggesting an important role of flawed expectations formation. The negative relation between the change in household debt to GDP and subsequent output growth is stronger for countries with less flexible exchange rate regimes and those closer to the zero lower bound on nominal interest rates. We also uncover a global household debt cycle that partly predicts the severity of the global growth slowdown after 2007. Countries with a household debt cycle more correlated with the global household debt cycle experience a sharper decline in growth after an increase in domestic household debt.

355 citations


Journal ArticleDOI
TL;DR: Medicaid expansion was associated with increased insurance coverage and access to care during the second year of implementation, but it was also associated with longer wait times for appointments, which suggests that challenges in access to care persist.
Abstract: Background: By September 2015, a total of 29 states and Washington, D.C., were participating in Medicaid expansions under the Affordable Care Act. We examined whether Medicaid expansions were associated with changes in insurance coverage, health care use, and health among low-income adults. Methods: We compared changes in outcomes during the 2 years after implementation of the Medicaid expansion (2014 and 2015) relative to the 4 years before expansion (2010 through 2013) in states with and without expansions, using data from the National Health Interview Survey. The sample consisted of 60,766 U.S. citizens who were 19 to 64 years of age and had incomes below 138% of the federal poverty level. Outcomes included insurance coverage, access to and use of medical care in the past 12 months, and health status as reported by the respondents. Results: A total of 29 states and Washington, D.C., expanded Medicaid by September 1, 2015. In year 2 after implementation, uninsurance rates were reduced in expansion states rela...

351 citations


Journal ArticleDOI
TL;DR: It is found that coverage was moderately responsive to price subsidies, with larger gains in state-based insurance exchanges than in the federal exchange and among previously eligible populations even in non-expansion states.

Book ChapterDOI
TL;DR: In this paper, the authors present econometric and statistical methods for analyzing randomized experiments, stress the general efficiency gains from stratification, and contrast intention-to-treat analyses with instrumental variables analyses allowing for general treatment effect heterogeneity.
Abstract: In this chapter, we present econometric and statistical methods for analyzing randomized experiments. For basic experiments, we stress randomization-based inference as opposed to sampling-based inference. In randomization-based inference, uncertainty in estimates arises naturally from the random assignment of the treatments, rather than from hypothesized sampling from a large population. We show how this perspective relates to regression analyses for randomized experiments. We discuss the analyses of stratified, paired, and clustered randomized experiments, and we stress the general efficiency gains from stratification. We also discuss complications in randomized experiments such as noncompliance. In the presence of noncompliance, we contrast intention-to-treat analyses with instrumental variables analyses allowing for general treatment effect heterogeneity. We consider, in detail, estimation and inference for heterogeneous treatment effects in settings with (possibly many) covariates. These methods allow researchers to explore heterogeneity by identifying subpopulations with different treatment effects while maintaining the ability to construct valid confidence intervals. We also discuss optimal assignment to treatment based on covariates in such settings. Finally, we discuss estimation and inference in experiments in settings with interactions between units, both in general network settings and in settings where the population is partitioned into groups with all interactions contained within these groups.
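The randomization-based inference the chapter advocates can be sketched with a minimal Fisher-style permutation test, in which uncertainty comes from re-drawing the random assignment rather than from hypothetical sampling. The `fisher_p_value` helper and its toy data are illustrative, not the chapter's own code:

```python
import random

def diff_in_means(outcomes, treat):
    """Difference between treated and control mean outcomes."""
    t = [y for y, d in zip(outcomes, treat) if d]
    c = [y for y, d in zip(outcomes, treat) if not d]
    return sum(t) / len(t) - sum(c) / len(c)

def fisher_p_value(outcomes, treat, n_draws=2000, seed=0):
    """Randomization-based p-value for the sharp null of no treatment
    effect: re-draw the assignment (holding the number treated fixed)
    and count how often |diff in means| is at least the observed value."""
    rng = random.Random(seed)
    observed = abs(diff_in_means(outcomes, treat))
    assignment = list(treat)
    hits = 0
    for _ in range(n_draws):
        rng.shuffle(assignment)
        if abs(diff_in_means(outcomes, assignment)) >= observed - 1e-12:
            hits += 1
    return hits / n_draws

# Toy experiment with a large treatment effect:
y = [5.0, 6.0, 5.5, 6.2, 1.0, 0.8, 1.1, 0.9]
d = [1, 1, 1, 1, 0, 0, 0, 0]
p = fisher_p_value(y, d)
```

With 4 treated among 8 units there are only 70 possible assignments, so the exact p-value here is 2/70; the Monte Carlo draws approximate it.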

Journal ArticleDOI
TL;DR: In this article, the authors argue that maximization of shareholder welfare is not the same as maximizing market value, and propose that company and asset managers should pursue policies consistent with the preferences of their investors.
Abstract: What is the appropriate objective function for a firm? We analyze this question for the case where shareholders are prosocial and externalities are not perfectly separable from production decisions. We argue that maximization of shareholder welfare is not the same as maximization of market value. We propose that company and asset managers should pursue policies consistent with the preferences of their investors. Voting by shareholders on corporate policy is one way to achieve this.

Journal ArticleDOI
TL;DR: High-resolution satellite imagery can be used to make predictions of smallholder agricultural productivity that are roughly as accurate as the survey-based measures traditionally used in research and policy applications, and they indicate a substantial near-term potential to quickly generate useful datasets on productivity in smallholder systems, even with minimal or no field training data.
Abstract: The emergence of satellite sensors that can routinely observe millions of individual smallholder farms raises possibilities for monitoring and understanding agricultural productivity in many regions of the world. Here we demonstrate the potential to track smallholder maize yield variation in western Kenya, using a combination of 1-m Terra Bella imagery and intensive field sampling on thousands of fields over 2 y. We find that agreement between satellite-based and traditional field survey-based yield estimates depends significantly on the quality of the field-based measures, with agreement highest (R² up to 0.4) when using precise field measures of plot area and when using larger fields for which rounding errors are smaller. We further show that satellite-based measures are able to detect positive yield responses to fertilizer and hybrid seed inputs and that the inferred responses are statistically indistinguishable from estimates based on survey-based yields. These results suggest that high-resolution satellite imagery can be used to make predictions of smallholder agricultural productivity that are roughly as accurate as the survey-based measures traditionally used in research and policy applications, and they indicate a substantial near-term potential to quickly generate useful datasets on productivity in smallholder systems, even with minimal or no field training data. Such datasets could rapidly accelerate learning about which interventions in smallholder systems have the most positive impact, thus enabling more rapid transformation of rural livelihoods.

Journal ArticleDOI
TL;DR: This work combines eight previously proposed measures to construct an index of political polarization among US adults and finds that polarization has increased the most among the demographic groups least likely to use the Internet and social media.
Abstract: We combine eight previously proposed measures to construct an index of political polarization among US adults. We find that polarization has increased the most among the demographic groups least likely to use the Internet and social media. Our overall index and all but one of the individual measures show greater increases for those older than 65 than for those aged 18–39. A linear model estimated at the age-group level implies that the Internet explains a small share of the recent growth in polarization.
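A generic way to combine several measures into one index, sketched below, is to z-score each series over time and average them; this is a common recipe and is only an assumption here, as the paper's exact aggregation may differ:

```python
def zscores(series):
    """Standardize a series to mean 0 and (population) SD 1."""
    n = len(series)
    mu = sum(series) / n
    sd = (sum((x - mu) ** 2 for x in series) / n) ** 0.5
    return [(x - mu) / sd for x in series]

def composite_index(measures):
    """Combine several polarization measures observed over the same
    years: z-score each series, then average across measures year by
    year. (A common composite recipe; the paper's aggregation of its
    eight measures may differ in details.)"""
    z = [zscores(m) for m in measures]
    years = len(measures[0])
    return [sum(series[t] for series in z) / len(z) for t in range(years)]

# Two toy measures that both trend upward over three periods:
idx = composite_index([[1.0, 2.0, 3.0], [10.0, 20.0, 30.0]])
```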

Journal ArticleDOI
TL;DR: It is shown that incorporating impacts on the frequency and intensity of peak load consumption during hot days implies sizable required investments in peak generating capacity (or major advances in storage technology or the structure of electricity prices), which results in substantially larger impacts than those from just changes in overall consumption.
Abstract: It has been suggested that climate change impacts on the electric sector will account for the majority of global economic damages by the end of the current century and beyond [Rose S, et al. (2014) Understanding the Social Cost of Carbon: A Technical Assessment]. The empirical literature has shown significant increases in climate-driven impacts on overall consumption, yet has not focused on the cost implications of the increased intensity and frequency of extreme events driving peak demand, which is the highest load observed in a period. We use comprehensive, high-frequency data at the level of load balancing authorities to parameterize the relationship between average or peak electricity demand and temperature for a major economy. Using statistical models, we analyze multiyear data from 166 load balancing authorities in the United States. We couple the estimated temperature response functions for total daily consumption and daily peak load with 18 downscaled global climate models (GCMs) to simulate climate change-driven impacts on both outcomes. We show moderate and heterogeneous changes in consumption, with an average increase of 2.8% by end of century. The results of our peak load simulations, however, suggest significant increases in the intensity and frequency of peak events throughout the United States, assuming today's technology and electricity market fundamentals. As the electricity grid is built to endure maximum load, our findings have significant implications for the construction of costly peak generating capacity, suggesting additional peak capacity costs of up to 180 billion dollars by the end of the century under business-as-usual.
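The basic exercise, estimating a temperature-response function for load and re-evaluating it at warmer temperatures, can be sketched as follows. The polynomial fit and the synthetic U-shaped load data are drastic simplifications of the paper's statistical models, invented for illustration:

```python
import numpy as np

def fit_response(temps, loads, deg=2):
    """Fit a polynomial temperature-response function for daily load -
    a simplified stand-in for the paper's estimated response functions."""
    return np.polyfit(temps, loads, deg)

def peak_load(coeffs, temps, warming=0.0):
    """Highest projected daily load after shifting each day's temperature
    up by `warming` degrees C, holding the response function fixed."""
    return max(np.polyval(coeffs, t + warming) for t in temps)

# Synthetic U-shaped load data (made up): load is lowest near 18 degC
# and rises toward both temperature extremes.
temps = list(range(0, 36))
loads = [100 + 0.5 * (t - 18) ** 2 for t in temps]
coeffs = fit_response(temps, loads)

base_peak = peak_load(coeffs, temps)
warm_peak = peak_load(coeffs, temps, warming=2.0)
```

Shifting the whole temperature distribution up raises the hot-side peak by more than the average day, which is the sense in which peak capacity needs can grow faster than total consumption.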

Posted Content
TL;DR: The authors show that most US food, drugstore, and mass merchandise chains charge nearly-uniform prices across stores, despite wide variation in consumer demographics and competition, and that uniform pricing may significantly increase the prices paid by poorer households relative to the rich, dampen the response of prices to local economic shocks, alter the analysis of mergers in antitrust, and shift the incidence of intra-national trade costs.
Abstract: We show that most US food, drugstore, and mass merchandise chains charge nearly-uniform prices across stores, despite wide variation in consumer demographics and competition. Demand estimates reveal substantial within-chain variation in price elasticities and suggest that the median chain sacrifices $16m of annual profit relative to a benchmark of optimal prices. In contrast, differences in average prices between chains are broadly consistent with the optimal benchmark. We discuss a range of explanations for nearly-uniform pricing, highlighting managerial inertia and brand-image concerns as mechanisms frequently mentioned by industry participants. Relative to our optimal benchmark, uniform pricing may significantly increase the prices paid by poorer households relative to the rich, dampen the response of prices to local economic shocks, alter the analysis of mergers in antitrust, and shift the incidence of intra-national trade costs.

Journal ArticleDOI
TL;DR: The authors showed that homes exposed to sea level rise (SLR) sell for approximately 7% less than observably equivalent unexposed properties equidistant from the beach and that this discount has grown over time and is driven by sophisticated buyers and communities worried about global warming.
Abstract: Homes exposed to sea level rise (SLR) sell for approximately 7% less than observably equivalent unexposed properties equidistant from the beach. This discount has grown over time and is driven by sophisticated buyers and communities worried about global warming. Consistent with causal identification of long horizon SLR costs, we find no relation between SLR exposure and rental rates and a 4% discount among properties not projected to be flooded for almost a century. Our findings contribute to the literature on the pricing of long-run risky cash flows and provide insights for optimal climate change policy.

ReportDOI
TL;DR: This article provided a comprehensive history of anchor or reference currencies, exchange rate arrangements, and a new measure of foreign exchange restrictions for 194 countries and territories over 1946-2016, and extended their chronologies as far back as possible, even though they only classify regimes from 1946 onwards.
Abstract: Detailed country-by-country chronologies are an informative companion piece to our paper "Exchange Arrangements Entering the 21st Century: Which Anchor Will Hold?," which provides a comprehensive history of anchor or reference currencies, exchange rate arrangements, and a new measure of foreign exchange restrictions for 194 countries and territories over 1946-2016. The individual country chronologies are also a central component of our approach to classifying regimes. These country histories date dual or multiple exchange rate episodes, as well as differentiate pre-announced pegs, crawling pegs, and bands from their de facto counterparts. We think it is important to distinguish between, say, de facto pegs or bands and announced pegs or bands, because their properties are potentially different. The chronologies also flag the dates of important turning points, such as when the exchange rate first floated, or when the anchor currency was changed. We extend our chronologies as far back as possible, even though we only classify regimes from 1946 onwards.

ReportDOI
TL;DR: In this paper, the aggregate real rate of return in the economy was analyzed for all major asset classes, including housing, and the annual data on total returns for equity, housing, bonds, and bills cover 16 advanced economies from 1870 to 2015.
Abstract: What is the aggregate real rate of return in the economy? Is it higher than the growth rate of the economy and, if so, by how much? Is there a tendency for returns to fall in the long run? Which particular assets have the highest long-run returns? We answer these questions on the basis of a new and comprehensive data set for all major asset classes, including housing. The annual data on total returns for equity, housing, bonds, and bills cover 16 advanced economies from 1870 to 2015, and our new evidence reveals many new findings and puzzles.

Journal ArticleDOI
TL;DR: A computer vision method is developed that connects changes in the physical appearance of five US cities with economic and demographic data and finds three factors that predict neighborhood improvement that are compatible with the economic literature linking human capital and local success.
Abstract: Which neighborhoods experience physical improvements? In this paper, we introduce a computer vision method to measure changes in the physical appearances of neighborhoods from time-series street-level imagery. We connect changes in the physical appearance of five US cities with economic and demographic data and find three factors that predict neighborhood improvement. First, neighborhoods that are densely populated by college-educated adults are more likely to experience physical improvements-an observation that is compatible with the economic literature linking human capital and local success. Second, neighborhoods with better initial appearances experience, on average, larger positive improvements-an observation that is consistent with "tipping" theories of urban change. Third, neighborhood improvement correlates positively with physical proximity to the central business district and to other physically attractive neighborhoods-an observation that is consistent with the "invasion" theories of urban sociology. Together, our results provide support for three classical theories of urban change and illustrate the value of using computer vision methods and street-level imagery to understand the physical dynamics of cities.

Posted Content
TL;DR: In this article, the authors characterize the factors that determine who becomes an inventor in America by using de-identified data on 1.2 million inventors from patent records linked to tax records.
Abstract: We characterize the factors that determine who becomes an inventor in America by using de-identified data on 1.2 million inventors from patent records linked to tax records. We establish three sets of results. First, children from high-income (top 1%) families are ten times as likely to become inventors as those from below-median income families. There are similarly large gaps by race and gender. Differences in innate ability, as measured by test scores in early childhood, explain relatively little of these gaps. Second, exposure to innovation during childhood has significant causal effects on children's propensities to become inventors. Growing up in a neighborhood or family with a high innovation rate in a specific technology class leads to a higher probability of patenting in exactly the same technology class. These exposure effects are gender-specific: girls are more likely to become inventors in a particular technology class if they grow up in an area with more female inventors in that technology class. Third, the financial returns to inventions are extremely skewed and highly correlated with their scientific impact, as measured by citations. Consistent with the importance of exposure effects and contrary to standard models of career selection, women and disadvantaged youth are as under-represented among high-impact inventors as they are among inventors as a whole. We develop a simple model of inventors' careers that matches these empirical results. The model implies that increasing exposure to innovation in childhood may have larger impacts on innovation than increasing the financial incentives to innovate, for instance by reducing tax rates. In particular, there are many "lost Einsteins" - individuals who would have had highly impactful inventions had they been exposed to innovation.

Journal ArticleDOI
TL;DR: The authors showed that stocks owned by ETFs exhibit significantly higher intraday and daily volatility and that the mean-reverting component of stock prices is inflated by ETF ownership, which suggests that ETFs attract a new layer of demand shocks to the stock market due to their high liquidity.
Abstract: An ongoing debate in finance centers on the impact of derivatives on the efficiency of prices of the underlying securities. The paper contributes to this literature by studying whether exchange traded funds (ETFs) — an asset of increasing importance — affect the non-fundamental volatility of the stocks in their baskets. Using identification strategies based on the mechanical variation in ETF ownership, including regression discontinuity, we show that stocks owned by ETFs exhibit significantly higher intraday and daily volatility. Variance-ratio tests, as well as price reversals, suggest that the mean-reverting component of stock prices is inflated by ETF ownership. We estimate that an increase of one standard deviation in ETF ownership is associated with an increase of 19% in intraday stock volatility. Our evidence suggests that ETFs attract a new layer of demand shocks to the stock market due to their high liquidity.
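The variance-ratio tests mentioned above detect a mean-reverting (non-fundamental) component in returns: under a random walk, the variance of q-period returns is q times the one-period variance, so a ratio below one signals mean reversion. A minimal sketch of the Lo-MacKinlay statistic (the data and MA(1) example are illustrative, not from the paper):

```python
import numpy as np

def variance_ratio(returns, q):
    """Variance of overlapping q-period returns divided by q times the
    one-period variance. VR < 1 suggests a mean-reverting component."""
    returns = np.asarray(returns, dtype=float)
    n = len(returns)
    q_returns = np.array([returns[i:i + q].sum() for i in range(n - q + 1)])
    return q_returns.var(ddof=1) / (q * returns.var(ddof=1))

# Illustrative series with negative autocorrelation (MA(1)), which
# mimics mean reversion and drives the ratio well below one.
rng = np.random.default_rng(0)
noise = rng.normal(size=1001)
mean_reverting = noise[1:] - 0.5 * noise[:-1]
print(variance_ratio(mean_reverting, q=5))
```

For i.i.d. returns the same statistic hovers near 1, which is the null the tests compare against.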

Journal ArticleDOI
TL;DR: The authors found that highly novel papers are more likely to be a top 1% highly cited paper in the long run, to inspire follow-on highly cited research, and to be cited in a broader set of disciplines.
Abstract: Research which explores uncharted waters has a high potential for major impact but also carries a higher uncertainty of having impact. Such explorative research is often described as taking a novel approach. This study examines the complex relationship between pursuing a novel approach and impact. Viewing scientific research as a combinatorial process, we measure novelty in science by examining whether a published paper makes first-time-ever combinations of referenced journals, taking into account the difficulty of making such combinations. We apply this newly developed measure of novelty to all Web of Science research articles published in 2001 across all scientific disciplines. We find that highly novel papers, defined to be those that make more (distant) new combinations, deliver high gains to science: they are more likely to be a top 1% highly cited paper in the long run, to inspire follow-on highly cited research, and to be cited in a broader set of disciplines. At the same time, novel research is also more risky, reflected by a higher variance in its citation performance. In addition, we find that novel research is significantly more highly cited in "foreign" fields but not in its "home" field. We also find strong evidence of delayed recognition of novel papers and that novel papers are less likely to be top cited when using a short time window. Finally, novel papers typically are published in journals with a lower than expected Impact Factor. These findings suggest that science policy, in particular funding decisions which rely on traditional bibliometric indicators based on short-term direct citation counts and Journal Impact Factors, may be biased against "high risk/high gain" novel research. The findings also caution against a mono-disciplinary approach in peer review to assess the true value of novel research.
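The core of the novelty measure is counting referenced-journal pairs that have never co-appeared in any earlier paper's reference list. A simplified sketch (the journal names and prior-corpus set are hypothetical, and the paper's additional weighting for how difficult each combination is to make is omitted):

```python
from itertools import combinations

def novel_pairs(paper_refs, prior_pairs):
    """Count journal pairs in this paper's reference list that never
    co-occurred in the reference list of any earlier paper."""
    pairs = {frozenset(p) for p in combinations(sorted(set(paper_refs)), 2)}
    return len(pairs - prior_pairs)

# Hypothetical prior corpus: journal pairs already seen in the literature.
prior = {frozenset({"J.Finance", "J.Econometrics"}),
         frozenset({"J.Finance", "Science"})}

# This paper cites three journals; two of its three pairs are new.
print(novel_pairs(["J.Finance", "Science", "Nature"], prior))  # 2
```

In the study, "more (distant) new combinations" then maps into a paper-level novelty score rather than a raw count.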

Journal ArticleDOI
TL;DR: This paper studied how hedge fund activism reshapes corporate innovation and found that firms targeted by hedge fund activists experience an improvement in innovation efficiency during the five-year period following the intervention, despite a tightening in R&D expenditures.
Abstract: This paper studies how hedge fund activism reshapes corporate innovation. Firms targeted by hedge fund activists experience an improvement in innovation efficiency during the five-year period following the intervention. Despite a tightening in R&D expenditures, target firms experience increases in innovation output, measured by both patent counts and citations, with stronger effects seen among firms with more diversified innovation portfolios. We also find that the reallocation of innovative resources and the redeployment of human capital contribute to the refocusing of the scope of innovation. Finally, additional tests refute alternative explanations attributing the improvement to mean reversion, sample attrition, management's voluntary reforms, or activists' stock-picking abilities.

Journal ArticleDOI
TL;DR: In this article, the NOx Budget Program (NBP), an important cap-and-trade market for nitrogen oxides (NOx) emissions, a key ingredient in ozone air pollution, is studied.
Abstract: The demand for air quality depends on health impacts and defensive investments that improve health, but little research assesses the empirical importance of defenses. We study the NOx Budget Program (NBP), an important cap-and-trade market for nitrogen oxides (NOx) emissions, a key ingredient in ozone air pollution. A rich quasi-experiment suggests that the NBP decreased NOx emissions, ambient ozone concentrations, pharmaceutical expenditures, and mortality rates. Reductions in pharmaceutical purchases and mortality are valued at about $800 million and $1.5 billion annually, respectively, in a region covering 19 Eastern and Midwestern U.S. states; these findings suggest that defensive investments account for more than one-third of the willingness-to-pay for reductions in NOx emissions. Further, the NBP's estimated benefits easily exceed its costs, and instrumental variable estimates indicate that the benefits of NOx reductions are substantial.

Journal ArticleDOI
TL;DR: Using a large administrative change in reimbursements for surgical versus medical care, the authors find that private prices follow Medicare's lead, and that these payment spillovers amplify Medicare's impact on specialty choice and other welfare-relevant aspects of physician practices.
Abstract: We analyze Medicare's influence on private insurers' payments for physicians' services. Using a large administrative change in reimbursements for surgical versus medical care, we find that private prices follow Medicare's lead. A $1.00 increase in Medicare's fees increases corresponding private prices by $1.16. A second set of Medicare fee changes, which generates area-specific payment shocks, has a similar effect on private reimbursements. Medicare's influence is strongest in areas with concentrated insurers and competitive physician markets, consistent with insurer-doctor bargaining. By echoing Medicare's pricing changes, these payment spillovers amplify Medicare's impact on specialty choice and other welfare-relevant aspects of physician practices.

Journal ArticleDOI
TL;DR: The q-factor model has the lowest average magnitude of (and the fewest significant) high-minus-low alphas among the models considered, and it outperforms a competing five-factor model in explaining momentum and profitability anomalies.
Abstract: This paper compiles an extensive data library with 437 anomaly variables. Controlling for microcaps leads to 161 significant anomalies with NYSE breakpoints and value-weighted returns and 216 with all-but-micro breakpoints and equal-weighted returns. Liquidity is largely insignificant. The q-factor model has the lowest average magnitude of (and the lowest number of significant) high-minus-low alphas among all the models. The q-factor model outperforms a competing five-factor model in explaining momentum and profitability anomalies. Fundamentals, including investment and profitability, not liquidity, are the key driving forces of the broad cross section of average stock returns.
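A high-minus-low alpha is the intercept from regressing an anomaly's long-short portfolio returns on a factor model: an alpha near zero means the factors explain the anomaly. A minimal OLS sketch (the two-factor toy data stand in for factors such as investment and profitability; numbers are illustrative, not from the paper):

```python
import numpy as np

def hml_alpha(portfolio_returns, factors):
    """Intercept (alpha) from an OLS regression of high-minus-low
    portfolio returns on a (T x k) matrix of factor returns."""
    T = len(portfolio_returns)
    X = np.column_stack([np.ones(T), factors])  # prepend constant
    beta, *_ = np.linalg.lstsq(X, portfolio_returns, rcond=None)
    return beta[0]  # the part of average returns the factors miss

# Toy check: returns built from two factors plus a 0.3 alpha, no noise,
# so the regression recovers the alpha exactly.
rng = np.random.default_rng(1)
F = rng.normal(size=(240, 2))
r = 0.3 + F @ np.array([0.8, -0.5])
print(round(hml_alpha(r, F), 6))  # 0.3
```

Comparing models then amounts to comparing the magnitudes and significance of these intercepts across the 161-216 significant anomalies.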

Journal ArticleDOI
TL;DR: In this article, a discrete instrument is used to identify the marginal treatment effects under a functional structure that allows for treatment heterogeneity among individuals with the same observed characteristics and self-selection based on the unobserved gain from treatment.
Abstract: We show how a discrete instrument can be used to identify the marginal treatment effects under a functional structure that allows for treatment heterogeneity among individuals with the same observed characteristics and self-selection based on the unobserved gain from treatment. Guided by this identification result, we perform a marginal treatment effect analysis of the interaction between the quantity and quality of children. Our estimates reveal that the family size effects vary in magnitude and even sign and that families act as if they possess some knowledge of the idiosyncratic effects in the fertility decision.