
Showing papers in "Research Papers in Economics in 2010"


Posted Content
TL;DR: In this article, the results of the simulation experiments are summarized by means of response surface regressions in which critical values depend on the sample size; asymptotic critical values can be read off directly, and critical values for any finite sample size can easily be computed with a hand calculator.
Abstract: [The original version of this paper appeared as a University of California San Diego working paper in 1990 but has since disappeared from the web. This version includes a new appendix.] This paper provides tables of critical values for some popular tests of cointegration and unit roots. Although these tables are necessarily based on computer simulations, they are much more accurate than those previously available. The results of the simulation experiments are summarized by means of response surface regressions in which critical values depend on the sample size. From these regressions, asymptotic critical values can be read off directly, and critical values for any finite sample size can easily be computed with a hand calculator. Added in 2010 version: A new appendix contains additional results that are more accurate and cover more cases than the ones in the original paper.

3,211 citations
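
The response-surface approach the abstract describes can be sketched in a few lines: a critical value is fitted as a polynomial in 1/T, so the asymptotic value is the intercept and finite-sample values follow by plugging in T. The coefficients below are hypothetical stand-ins, not the paper's published values.

```python
# Illustrative sketch of the response-surface idea (hypothetical
# coefficients, NOT the paper's published tables): a critical value
# is a polynomial in 1/T, where T is the sample size.

def critical_value(T, b_inf, b1, b2):
    """Finite-sample critical value from a response-surface regression.

    b_inf is the asymptotic critical value (the value as T -> infinity);
    b1, b2 are the fitted 1/T and 1/T^2 coefficients.
    """
    return b_inf + b1 / T + b2 / T**2

# Hypothetical coefficients for a 5% one-sided unit-root test.
b_inf, b1, b2 = -2.86, -2.74, -8.36

cv_100 = critical_value(100, b_inf, b1, b2)    # finite-sample (T = 100)
cv_inf = critical_value(10**9, b_inf, b1, b2)  # effectively asymptotic
```

This is exactly the "hand calculator" computation the abstract mentions: two divisions, two additions.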


Posted Content
TL;DR: In this article, the authors developed a model of strategic communication in which a better-informed Sender (S) sends a possibly noisy signal to a Receiver (R), who then takes an action that determines the welfare of both.
Abstract: This paper develops a model of strategic communication, in which a better-informed Sender (S) sends a possibly noisy signal to a Receiver (R), who then takes an action that determines the welfare of both. We characterize the set of Bayesian Nash equilibria under standard assumptions, and show that equilibrium signaling always takes a strikingly simple form, in which S partitions the support of the (scalar) variable that represents his private information and introduces noise into his signal by reporting, in effect, only which element of the partition his observation actually lies in. We show under further assumptions that before S observes his private information, the equilibrium whose partition has the greatest number of elements is Pareto-superior to all other equilibria, and that if agents coordinate on this equilibrium, R's equilibrium expected utility rises when agents' preferences become more similar. Since R bases his choice of action on rational expectations, this establishes a sense in which equilibrium signaling is more informative when agents' preferences are more similar.

3,126 citations
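
The partition equilibrium the abstract describes has a simple closed form in the standard uniform-quadratic special case (state uniform on [0,1], quadratic losses, sender bias b > 0). The sketch below uses that textbook special case, not the paper's general model.

```python
# Uniform-quadratic cheap-talk example: the state is uniform on [0,1],
# both players have quadratic losses, and the sender's bias is b > 0.
# Partition boundaries satisfy a_{i+1} = 2*a_i - a_{i-1} + 4*b, with
# closed form a_i = i/N + 2*b*i*(i - N).

def max_partition_size(b):
    """Largest N with 2*N*(N-1)*b < 1 (the most informative equilibrium)."""
    N = 1
    while 2 * (N + 1) * N * b < 1:
        N += 1
    return N

def partition(b, N):
    """Boundaries a_0 < ... < a_N; the sender reports only which
    interval [a_i, a_{i+1}] his observation lies in."""
    return [i / N + 2 * b * i * (i - N) for i in range(N + 1)]

b = 0.05
N = max_partition_size(b)   # at b = 0.05 the finest equilibrium has 3 steps
bounds = partition(b, N)    # interval widths grow by 4*b at each step
```

As the abstract notes, communication gets coarser as preferences diverge: larger b forces a smaller N.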


Posted Content
TL;DR: The paper extends arguments used by Maynard Smith & Price (1973) showing that ritualized behaviour can evolve by individual selection, and the concept of an evolutionarily stable strategy, or ESS, is defined.
Abstract: The evolution of behaviour patterns used in animal conflicts is discussed, using models based on the theory of games. The paper extends arguments used by Maynard Smith & Price (1973) showing that ritualized behaviour can evolve by individual selection. The concept of an evolutionarily stable strategy, or ESS, is defined. Two types of ritualized contests are distinguished, “tournaments” and “displays”; the latter, defined as contests without physical contact in which victory goes to the contestant which continues longer, are analyzed in detail. Three main conclusions are drawn. The degree of persistence should be very variable, either between individuals or for the same individual at different times; a negative exponential distribution of persistence times is predicted. Individuals should display with constant intensity, independent of how much longer they will in fact continue. An initial asymmetry in the conditions of a contest can be used to settle it, even if it is irrelevant to the outcome of a more protracted conflict if one were to take place.

1,985 citations


Posted Content
TL;DR: In this article, the authors present methods that allow researchers to test causal claims in situations where randomization is not possible or when causal interpretation could be confounded; these methods include fixed-effects panel, sample selection, instrumental variable, regression discontinuity, and difference-in-differences models.
Abstract: Social scientists often estimate models from correlational data, where the independent variable has not been exogenously manipulated; they also make implicit or explicit causal claims based on these models. When can these claims be made? We answer this question by first discussing design and estimation conditions under which model estimates can be interpreted, using the randomized experiment as the gold standard. We show how endogeneity – which includes omitted variables, omitted selection, simultaneity, common-method variance, and measurement error – renders estimates causally uninterpretable. Second, we present methods that allow researchers to test causal claims in situations where randomization is not possible or when causal interpretation could be confounded; these methods include fixed-effects panel, sample selection, instrumental variable, regression discontinuity, and difference-in-differences models. Third, we take stock of the methodological rigor with which causal claims are being made in a social sciences discipline by reviewing a representative sample of 110 articles on leadership published in the previous 10 years in top-tier journals. Our key finding is that researchers fail to address at least 66% and up to 90% of design and estimation conditions that make causal claims invalid. We conclude by offering 10 suggestions on how to improve non-experimental research.

1,537 citations
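
As a minimal illustration of one of the designs listed above, a difference-in-differences estimate can be computed from four group means; the numbers here are made up.

```python
# Minimal difference-in-differences sketch with made-up group means:
# the estimate is the treated group's pre/post change minus the
# control group's pre/post change, which nets out the common trend.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate of the treatment effect from four group means."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical outcome means: the treated group rises by 4, the
# control group by 2, so the estimated effect is 2.
effect = diff_in_diff(treat_pre=10.0, treat_post=14.0,
                      ctrl_pre=9.0, ctrl_post=11.0)
```

The identifying assumption, parallel trends absent treatment, is exactly the kind of design condition the authors argue researchers must defend.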



BookDOI
TL;DR: The Handbook now includes updated chapters on the best known metaheuristics, including simulated annealing, tabu search, variable neighborhood search, scatter search and path relinking, genetic algorithms, memetic algorithms, genetic programming, ant colony optimization, and multi-start methods.
Abstract: The first edition of the Handbook of Metaheuristics was published in 2003 under the editorship of Fred Glover and Gary A. Kochenberger. Given the numerous developments observed in the field of metaheuristics in recent years, it appeared that the time was ripe for a second edition of the Handbook. When Glover and Kochenberger were unable to prepare this second edition, they suggested that Michel Gendreau and Jean-Yves Potvin should take over the editorship, and so this important new edition is now available. Through its 21 chapters, this second edition is designed to provide a broad coverage of the concepts, implementations and applications in this important field of optimization. Original contributors either revised or updated their work, or provided entirely new chapters. The Handbook now includes updated chapters on the best known metaheuristics, including simulated annealing, tabu search, variable neighborhood search, scatter search and path relinking, genetic algorithms, memetic algorithms, genetic programming, ant colony optimization, multi-start methods, greedy randomized adaptive search procedure, guided local search, hyper-heuristics and parallel metaheuristics. It also contains three new chapters on large neighborhood search, artificial immune systems and hybrid metaheuristics. The last four chapters are devoted to more general issues related to the field of metaheuristics, namely reactive search, stochastic search, fitness landscape analysis and performance comparison.

1,208 citations


Posted Content
TL;DR: In this paper, the authors use the RCOV framework to consider business model evolution, looking particularly at the dynamics created by interactions between a business model's components, and illustrate their framework with the case of the English football club Arsenal FC over the last decade.
Abstract: The business model concept generally refers to the articulation between different areas of a firm's activity designed to produce a proposition of value to customers. Two different uses of the term can be noted. The first is the static approach, in which the business model serves as a blueprint for the coherence between core business model components. The second refers to a more transformational approach, using the concept as a tool to address change and innovation in the organization, or in the model itself. We build on the RCOV framework, itself inspired by a Penrosian view of the firm, to reconcile these two approaches and consider business model evolution, looking particularly at the dynamics created by interactions between a business model's components. We illustrate our framework with the case of the English football club Arsenal FC over the last decade. We view business model evolution as a fine-tuning process involving voluntary and emergent changes in and between permanently linked core components, and find that firm sustainability depends on anticipating and reacting to sequences of voluntary and emergent change. We give the label 'dynamic consistency' to this capability of a firm to build and sustain its performance while changing its business model.

1,192 citations


Posted Content
TL;DR: The authors present their views on the potential role that online experiments can play within the social sciences, and recommend software development priorities and best practices.
Abstract: Online labor markets have great potential as platforms for conducting experiments, as they provide immediate access to a large and diverse subject pool and allow researchers to conduct randomized controlled trials. We argue that online experiments can be just as valid--both internally and externally--as laboratory and field experiments, while requiring far less money and time to design and to conduct. In this paper, we first describe the benefits of conducting experiments in online labor markets; we then use one such market to replicate three classic experiments and confirm their results. We confirm that subjects (1) reverse decisions in response to how a decision-problem is framed, (2) have pro-social preferences (value payoffs to others positively), and (3) respond to priming by altering their choices. We also conduct a labor supply field experiment in which we confirm that workers have upward sloping labor supply curves. In addition to reporting these results, we discuss the unique threats to validity in an online setting and propose methods for coping with these threats. We also discuss the external validity of results from online domains and explain why online results can have external validity equal to or even better than that of traditional methods, depending on the research question. We conclude with our views on the potential role that online experiments can play within the social sciences, and then recommend software development priorities and best practices.

1,186 citations


Posted Content
TL;DR: The authors found that futures prices of different commodities in the US became increasingly correlated with each other, and that this trend was significantly more pronounced for commodities in the two popular GSCI and DJ-UBS commodity indices.
Abstract: This paper finds that, concurrent with the rapidly growing index investment in commodities markets since the early 2000s, futures prices of different commodities in the US became increasingly correlated with each other, and that this trend was significantly more pronounced for commodities in the two popular GSCI and DJ-UBS commodity indices. This finding reflects a financialization process of commodities markets and helps explain the synchronized price boom and bust of a broad set of seemingly unrelated commodities in the US in 2006-2008. In contrast, such commodity price comovements were absent in China, which refutes growing commodity demand from emerging economies as the driver.

990 citations


Posted Content
TL;DR: This article reviewed progress in empirical economics since Leamer's critique and pointed out that the credibility revolution in empirical work can be traced to the rise of a design-based approach that emphasizes the identification of causal effects.
Abstract: This essay reviews progress in empirical economics since Leamer's (1983) critique. Leamer highlighted the benefits of sensitivity analysis, a procedure in which researchers show how their results change with changes in specification or functional form. Sensitivity analysis has had a salutary but not a revolutionary effect on econometric practice. As we see it, the credibility revolution in empirical work can be traced to the rise of a design-based approach that emphasizes the identification of causal effects. Design-based studies typically feature either real or natural experiments and are distinguished by their prima facie credibility and by the attention investigators devote to making the case for a causal interpretation of the findings their designs generate. Design-based studies are most often found in the microeconomic fields of Development, Education, Environment, Labor, Health, and Public Finance, but are still rare in Industrial Organization and Macroeconomics. We explain why IO and Macro would do well to embrace a design-based approach. Finally, we respond to the charge that the design-based revolution has overreached.

913 citations


Posted Content
TL;DR: This article found that managers who believe that their firm is undervalued view external financing as overpriced, especially equity, and use less external finance and, conditional on accessing risky capital, issue less equity than their peers.
Abstract: We show that measurable managerial characteristics have significant explanatory power for corporate financing decisions beyond traditional capital-structure determinants. First, managers who believe that their firm is undervalued view external financing as overpriced, especially equity. Such overconfident managers use less external finance and, conditional on accessing risky capital, issue less equity than their peers. Second, CEOs with Depression experience are averse to debt and lean excessively on internal finance. Third, CEOs with military experience pursue more aggressive policies, including heightened leverage. Complementary measures of CEO traits based on press portrayals confirm the results.

ReportDOI
TL;DR: In this article, the authors surveyed evidence on the "funding gap" for investment in innovation and concluded that while small and new innovative firms experience high costs of capital that are only partly mitigated by the presence of venture capital, the evidence for high costs of R&D capital for large firms is mixed.
Abstract: Evidence on the “funding gap” for investment in innovation is surveyed. The focus is on financial market reasons for underinvestment that exist even when externality-induced underinvestment is absent. We conclude that while small and new innovative firms experience high costs of capital that are only partly mitigated by the presence of venture capital, the evidence for high costs of R&D capital for large firms is mixed. Nevertheless, large established firms do appear to prefer internal funds for financing such investments, and they manage their cash flow to ensure this. Evidence shows that there are limits to venture capital as a solution to the funding gap, especially in countries where public equity markets for VC exit are not highly developed. We conclude by suggesting areas for further research.

Posted Content
TL;DR: In this article, it was shown that although backward induction cannot be applied, and perfect psychological equilibria may not exist, subgame perfect and sequential equilibrium always do exist, and that the payoff to each player depends not only on what every player does but also on what he thinks every player believes, and on what they think they believe others believe.
Abstract: In psychological games the payoff to each player depends not only on what every player does but also on what he thinks every player believes, and on what he thinks they believe others believe, and so on. In equilibrium, beliefs are assumed to correspond to reality. Yet psychological games and psychological equilibria allow one to model belief-dependent emotions such as anger and surprise that are problematic for conventional game theory. We are particularly interested in issues of sequential rationality for psychological games. We show that although backward induction cannot be applied, and “perfect” psychological equilibria may not exist, subgame perfect and sequential equilibria always do exist.

Posted Content
TL;DR: In this paper, the authors propose a conceptual framework for better understanding the geographical dimensions of sustainability transitions, begin to highlight some of the boundaries of transition networks, and explicitly acknowledge and investigate a variety of transition pathways.
Abstract: In the past decade, the literature on transitions towards sustainable socio-technical systems has made a considerable contribution in understanding the complex and multi-dimensional shifts considered necessary to adapt societies and economies to sustainable modes of production and consumption. However, transition analyses have often neglected where transitions take place, and the geographical configurations and dynamics of the networks within which transitions evolve. An explicit analysis of the geography of transitions contributes to the extant transitions literature in a variety of ways. Firstly, it provides a contextualization of, and reflection on, the limited territorial sensitivity of existing transitions analysis; the majority of empirical studies have been conducted in a small number of countries, primarily the Netherlands, the UK or Scandinavia, with an increasing interest in Asian countries. Secondly, it explicitly acknowledges and investigates a variety of transition pathways. Thirdly, it encompasses not only greater emphasis on, but also better conceptual and theoretical devices for, understanding the international, trans-local nature of transition dynamics. Drawing on recent insights from economic geography, this paper seeks to improve existing transition theory by (1) creating a conceptual framework for better understanding the geographical dimensions of sustainability transitions, and (2) beginning to highlight some of the boundaries of transition networks.

Posted Content
TL;DR: In this paper, a comprehensive overview of local food systems explores alternative definitions of local foods, estimates market size and reach, describes the characteristics of local consumers and producers, and examines early indications of the economic and health impacts of Local Food Systems.
Abstract: This comprehensive overview of local food systems explores alternative definitions of local food, estimates market size and reach, describes the characteristics of local consumers and producers, and examines early indications of the economic and health impacts of local food systems. There is no consensus on a definition of “local” or “local food systems” in terms of the geographic distance between production and consumption. But defining “local” based on marketing arrangements, such as farmers selling directly to consumers at regional farmers’ markets or to schools, is well recognized. Statistics suggest that local food markets account for a small, but growing, share of U.S. agricultural production. For smaller farms, direct marketing to consumers accounts for a higher percentage of their sales than for larger farms. Findings are mixed on the impact of local food systems on local economic development and better nutrition levels among consumers, and sparse literature is so far inconclusive about whether localization reduces energy use or greenhouse gas emissions.

Book ChapterDOI
Wesley M. Cohen1
TL;DR: The authors review the empirical literature on the determinants of firms' and industries' innovative activity and performance, highlighting the questions addressed, the approaches adopted, impediments to progress in the field, and research opportunities.
Abstract: This chapter reviews the empirical literature on the determination of firms’ and industries’ innovative activity and performance, highlighting the questions addressed, the approaches adopted, impediments to progress in the field, and research opportunities. We review the “neo-Schumpeterian” empirical literature that examines the effects of firm size and market concentration upon innovation, focusing on robust findings, questions of interpretation, and the identification of major gaps. We also consider the more modest literature that considers the effect on innovation of firm characteristics other than size. Finally, we review the literature that considers three classes of factors that affect interindustry variation in innovative activity and performance: demand, appropriability, and technological opportunity conditions.

Posted Content
TL;DR: In this article, the authors provide a model of competition among Credit Ratings Agencies (CRAs) in which there are three possible sources of conflicts: 1) the CRA conflict of interest of understating credit risk to attract more business; 2) the ability of issuers to purchase only the most favorable ratings; and 3) the trusting nature of some investor clienteles who may take ratings at face value.
Abstract: The collapse of so many AAA-rated structured finance products in 2007-2008 has brought renewed attention to the causes of ratings failures and the conflicts of interest in the Credit Ratings Industry. We provide a model of competition among Credit Ratings Agencies (CRAs) in which there are three possible sources of conflicts: 1) the CRA conflict of interest of understating credit risk to attract more business; 2) the ability of issuers to purchase only the most favorable ratings; and 3) the trusting nature of some investor clienteles who may take ratings at face value. We show that when combined, these give rise to three fundamental equilibrium distortions. First, competition among CRAs can reduce market efficiency, as competition facilitates ratings shopping by issuers. Second, CRAs are more prone to inflate ratings in boom times, when there are more trusting investors, and when the risks of failure which could damage CRA reputation are lower. Third, the industry practice of tranching of structured products distorts market efficiency as its role is to deceive trusting investors. We argue that regulatory intervention requiring: i) upfront payments for rating services (before CRAs propose a rating to the issuer), ii) mandatory disclosure of any rating produced by CRAs, and iii) oversight of ratings methodology can substantially mitigate ratings inflation and promote efficiency.

Posted Content
TL;DR: In this paper, the authors show that vector autoregression with Bayesian shrinkage is an appropriate tool for large dynamic models, and that large VARs with shrinkage produce credible impulse responses and are suitable for structural analysis.
Abstract: This paper shows that vector autoregression (VAR) with Bayesian shrinkage is an appropriate tool for large dynamic models. We build on the results of De Mol and co-workers (2008) and show that, when the degree of shrinkage is set in relation to the cross-sectional dimension, the forecasting performance of small monetary VARs can be improved by adding additional macroeconomic variables and sectoral information. In addition, we show that large VARs with shrinkage produce credible impulse responses and are suitable for structural analysis.
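
As an illustrative sketch of shrinkage estimation for a VAR, the snippet below uses a ridge penalty toward zero as a simple stand-in for the Bayesian prior; the paper's actual prior and its rule tying the degree of shrinkage to the cross-sectional dimension are more elaborate.

```python
# Ridge-shrunk VAR(1) as a stand-in for Bayesian shrinkage:
# y_t = B @ y_{t-1} + e_t, with B pulled toward zero by the penalty.
import numpy as np

def shrunk_var1(Y, lam):
    """Ridge-shrunk VAR(1) coefficient matrix.

    Y: (T, n) data matrix; lam: shrinkage penalty, to be increased
    with the cross-sectional dimension n (the key idea in the paper).
    """
    X, Z = Y[:-1], Y[1:]                       # regressors and targets
    n = Y.shape[1]
    B = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ Z)
    return B.T                                 # (n, n): row i maps y_{t-1} to y_t[i]

rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 5))              # synthetic data, 5 variables
B_ols = shrunk_var1(Y, 0.0)                    # lam = 0 recovers OLS
B_shrunk = shrunk_var1(Y, 50.0)                # heavier shrinkage toward zero
```

With lam = 0 this is plain OLS; increasing lam shrinks the coefficient matrix toward zero, which is what keeps large systems from overfitting.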

Book ChapterDOI
TL;DR: This chapter reviews developments in ACO and gives an overview of recent research trends, including the development of high-performing algorithmic variants and theoretical understanding of properties of ACO algorithms.
Abstract: Ant Colony Optimization (ACO) is a metaheuristic that is inspired by the pheromone trail laying and following behavior of some ant species. Artificial ants in ACO are stochastic solution construction procedures that build candidate solutions for the problem instance under concern by exploiting (artificial) pheromone information that is adapted based on the ants’ search experience and possibly available heuristic information. Since the proposal of the Ant System, the first ACO algorithm, many significant research results have been obtained. These contributions focused on the development of high-performing algorithmic variants, the development of a generic algorithmic framework for ACO algorithms, successful applications of ACO algorithms to a wide range of computationally hard problems, and the theoretical understanding of properties of ACO algorithms. This chapter reviews these developments and gives an overview of recent research trends in ACO.
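
A compact sketch of the core ACO loop on a toy symmetric TSP (simplified rules only; practical ACO variants add heuristic information, tuned evaporation schedules, and elitist updates):

```python
# Toy Ant Colony Optimization for a symmetric TSP: ants build tours by
# sampling the next city in proportion to pheromone, then pheromone
# evaporates and is deposited inversely to tour length.
import random

def aco_tsp(dist, n_ants=20, n_iters=50, evap=0.5, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]        # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [0]
            unvisited = set(range(1, n))
            while unvisited:
                i = tour[-1]
                cands = list(unvisited)
                w = [tau[i][j] for j in cands]  # next city by pheromone weight
                tour.append(rng.choices(cands, weights=w)[0])
                unvisited.discard(tour[-1])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        # evaporate, then deposit pheromone on the edges of each tour
        tau = [[t * evap for t in row] for row in tau]
        for length, tour in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best_tour, best_len

# Four cities on a unit square; the optimal tour has length 4.
s = 2 ** 0.5
dist = [[0, 1, s, 1],
        [1, 0, 1, s],
        [s, 1, 0, 1],
        [1, s, 1, 0]]
tour, length = aco_tsp(dist)
```

Short tours receive more pheromone per edge, so their edges become more likely in later iterations, which is the positive-feedback mechanism the chapter describes.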

Posted Content
TL;DR: The authors study how firms differ from their competitors using new time-varying measures of product differentiation based on text-based analysis of product descriptions from 50,673 firm 10-K statements filed yearly with the Securities and Exchange Commission.
Abstract: We study how firms differ from their competitors using new time-varying measures of product differentiation based on text-based analysis of product descriptions from 50,673 firm 10-K statements filed yearly with the Securities and Exchange Commission. This year-by-year set of product differentiation measures allows us to generate a new set of industries and corresponding new measures of industry competition where firms can have their own distinct set of competitors. Our new sets of industry competitors better explain specific discussion of high competition by management, rivals identified by managers as peer firms, and firm characteristics such as profitability and leverage than do existing classifications. We also find evidence that firm R&D and advertising are associated with subsequent differentiation from competitors, consistent with theories of endogenous product differentiation.
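
The idea behind text-based pairwise similarity can be sketched by representing each product description as a word-count vector and comparing descriptions by cosine similarity; the paper's actual construction (vocabulary filtering, yearly 10-K parsing, similarity thresholds for industry assignment) is considerably more involved, and the descriptions below are invented examples.

```python
# Bag-of-words cosine similarity between product descriptions: firms
# whose descriptions share vocabulary score as closer competitors.
import math
from collections import Counter

def cosine_sim(text_a, text_b):
    """Cosine similarity of two word-count vectors."""
    va = Counter(text_a.lower().split())
    vb = Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# Invented product descriptions for three hypothetical firms.
a = "wireless network routers and enterprise switches"
b = "enterprise wireless switches and network security"
c = "breakfast cereal and snack foods"
# The two networking firms score higher with each other than with
# the food firm, so they would be grouped as competitors.
```

Per-firm similarity rankings like this are what let every firm have its own distinct set of competitors, rather than a fixed industry code.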

Posted Content
TL;DR: This entry provides all the data and code necessary to replicate the results of the article, including the data for its figures.
Abstract: All data and codes necessary to replicate results of the article. Also includes the data for figures.

Posted Content
TL;DR: This article examined adverse liquidity shocks on main developed-country banking systems and their relationships to emerging markets across Europe, Asia, and Latin America, isolating loan supply from loan demand effects, and found that loan supply in emerging markets was affected significantly through three separate channels: 1) a contraction in direct, cross-border lending by foreign banks; 2) a contraction in local lending by foreign banks' affiliates in emerging markets; and 3) a contraction in loan supply by domestic banks, resulting from the funding shock to their balance sheets induced by the decline in interbank, cross-border lending.
Abstract: Global banks played a significant role in transmitting the 2007-09 financial crisis to emerging-market economies. We examine adverse liquidity shocks on main developed-country banking systems and their relationships to emerging markets across Europe, Asia, and Latin America, isolating loan supply from loan demand effects. Loan supply in emerging markets across Europe, Asia, and Latin America was affected significantly through three separate channels: 1) a contraction in direct, cross-border lending by foreign banks; 2) a contraction in local lending by foreign banks' affiliates in emerging markets; and 3) a contraction in loan supply by domestic banks, resulting from the funding shock to their balance sheets induced by the decline in interbank, cross-border lending. Policy interventions, such as the Vienna Initiative introduced in Europe, influenced the lending-channel effects on emerging markets of shocks to head-office balance sheets.

Posted Content
TL;DR: In this article, the authors presented estimations of the shadow economies for 162 countries, including developing, Eastern European, Central Asian, and high-income countries over the period 1999 to 2006/2007.
Abstract: This paper presents estimations of the shadow economies for 162 countries, including developing, Eastern European, Central Asian, and high-income countries over the period 1999 to 2006/2007. According to the estimations, the weighted average size of the shadow economy (as a percentage of "official" gross domestic product) in Sub-Saharan Africa is 38.4 percent; in Europe and Central Asia (mostly transition countries), it is 36.5 percent; and in high-income OECD countries, it is 13.5 percent. The authors find a clear negative trend in the size of the shadow economy: the unweighted average across the 162 countries was 34.0 percent in 1999 and 31.0 percent in 2007, a reduction of 3 percentage points. The driving forces of the shadow economy are an increased burden of taxation (both direct and indirect), combined with labor market regulations, the quality of public goods and services, and the state of the "official" economy.

Posted Content
TL;DR: In this paper, the authors extend activity analysis into consumption theory and assume that goods possess, or give rise to, multiple characteristics in fixed proportions and that it is these characteristics, not goods themselves, on which the consumer's preferences are exercised.
Abstract: Activity analysis is extended into consumption theory. It is assumed that goods possess, or give rise to, multiple characteristics in fixed proportions and that it is these characteristics, not goods themselves, on which the consumer’s preferences are exercised.

Posted Content
TL;DR: Global and Hedonic WB measures appear to index different aspects of WB over the lifespan, and the post-midlife increase in WB, especially in Hedonic WB, deserves continued exploration.
Abstract: Psychological well-being (WB) includes a person's overall appraisal of his or her life (Global WB) and affective state (Hedonic WB), and it is considered a key aspect of the health of individuals and groups. Several cross-sectional studies have documented a relation between Global WB and age. Little is known, however, about the age distribution of Hedonic WB. It may yield a different view of aging because it is less influenced by the cognitive reconstruction inherent in Global WB measures and because it includes both positive and negative components of WB. In this study we report on both Global and Hedonic WB assessed in a 2008 telephone survey of 340,847 people in the United States. Consistent with prior studies, Global WB and positive Hedonic WB generally had U-shaped age profiles showing increased WB after the age of 50 years. However, negative Hedonic WB variables showed distinctly different and stronger patterns: Stress and Anger steeply declined from the early 20s, Worry was elevated through middle age and then declined, and Sadness was essentially flat. Unlike a prior study, men and women had very similar age profiles of WB. Several measures that could plausibly covary with the age-WB association (e.g., having children at home) did not alter the age-WB patterns. Global and Hedonic WB measures appear to index different aspects of WB over the lifespan, and the post-midlife increase in WB, especially in Hedonic WB, deserves continued exploration.

Report SeriesDOI
TL;DR: In this paper, the authors define a global middle class as all those living in households with daily per capita incomes of between USD10 and USD100 in PPP terms and show that Asia accounts for less than one-quarter of today's middle class.
Abstract: The shift in global goods production towards Asia is well documented. But global consumer demand has so far been concentrated in the rich economies of the OECD. Will that also shift towards Asia as these countries get richer? This paper defines a global middle class as all those living in households with daily per capita incomes of between USD10 and USD100 in PPP terms. By combining household survey data with growth projections for 145 countries, it shows that Asia accounts for less than one-quarter of today’s middle class. By 2020, that share could double. More than half the world’s middle class could be in Asia and Asian consumers could account for over 40 per cent of global middle class consumption. This is because a large mass of Asian households have incomes today that position them just below the global middle class threshold and so increasingly large numbers of Asians are expected to become middle class in the next ten years. The paper explores how this can help sustain global growth in the medium term, driven by product differentiation, branding and marketing in the new growth markets of Asia.

Posted Content
TL;DR: Gorton's "The Panic of 2007" garnered enormous attention and is considered by many to be the most convincing take on the recent economic meltdown. This book argues that the securitized banking system, like any banking system, was vulnerable to a panic, and that the events starting in August 2007 can best be understood as a wholesale panic, rather than a retail panic, involving financial firms "running" on other financial firms, resulting in the system becoming insolvent.
Abstract: Originally written for a conference of the Federal Reserve, Gary Gorton's "The Panic of 2007" garnered enormous attention and is considered by many to be the most convincing take on the recent economic meltdown. Now, in Slapped by the Invisible Hand, Gorton builds upon this seminal work, explaining how the securitized banking system, the nexus of financial markets and instruments unknown to most people, stands at the heart of the financial crisis. The securitized banking system is, in fact, a real banking system, allowing institutional investors and firms to make large, short-term deposits. But, like any banking system, it was vulnerable to a panic. Indeed, the events starting in August 2007 can best be understood as a panic (a wholesale panic, rather than a retail panic) involving financial firms "running" on other financial firms, resulting in the system becoming insolvent. As the financial crisis unfolded, Gorton was working inside an institution that played a central role in the collapse; thus this book presents the unparalleled perspective of a top scholar who was also a central insider.

Posted Content
TL;DR: The Multidimensional Poverty Index (MPI) as discussed by the authors is the first multidimensional poverty index for 104 developing countries and is composed of ten indicators corresponding to the same three dimensions as the Human Development Index: education, health and standard of living.
Abstract: This paper presents a new Multidimensional Poverty Index (MPI) for 104 developing countries. It is the first time multidimensional poverty is estimated using micro datasets (household surveys) for such a large number of countries, which cover about 78 percent of the world's population. The MPI has the mathematical structure of one of the Alkire and Foster multidimensional poverty measures, and it is composed of ten indicators corresponding to the same three dimensions as the Human Development Index: Education, Health and Standard of Living. Our results indicate that 1,700 million people in the world live in acute poverty, a figure that is between the $1.25/day and $2/day poverty rates. Yet it is not a $1.5/day measure. The MPI captures direct failures in functionings that Amartya Sen argues should form the focal space for describing and reducing poverty. It constitutes a tool with an extraordinary potential to target the poorest, track the Millennium Development Goals, and design policies that directly address the interlocking deprivations poor people experience. This paper presents the methodology and components in the MPI, describes main results, and shares basic robustness tests.
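The Alkire-Foster measure underlying the MPI is the adjusted headcount ratio M0 = H × A, where H is the share of people identified as poor and A is their average weighted deprivation score. A hedged sketch of this structure, using the MPI's nested weighting (three equally weighted dimensions, indicators equally weighted within each) but an entirely hypothetical deprivation matrix:

```python
import numpy as np

# MPI weights: each of the 3 dimensions carries 1/3, split equally among
# its indicators (2 education, 2 health, 6 living-standard indicators).
weights = np.array([1/6, 1/6,                    # education
                    1/6, 1/6,                    # health
                    1/30, 1/30, 1/30, 1/30, 1/30, 1/30])  # living standards

# Hypothetical deprivation matrix: rows are households, columns are the
# ten indicators (1 = deprived, 0 = not deprived). Not real survey data.
deprivations = np.array([
    [1, 0, 1, 0, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 1, 1, 1, 0, 0],
])

def mpi(dep, w, k=1/3):
    """Alkire-Foster adjusted headcount ratio M0 = H * A."""
    scores = dep @ w                  # weighted deprivation score per household
    poor = scores >= k                # MPI-poor if weighted score >= cutoff k
    H = poor.mean()                   # incidence: share of people who are poor
    A = scores[poor].mean() if poor.any() else 0.0  # intensity among the poor
    return H * A

print(round(mpi(deprivations, weights), 3))  # 0.411 for this toy matrix
```

The k = 1/3 cutoff mirrors the MPI convention that a person is poor if deprived in at least a third of the weighted indicators; in the toy matrix the first and third households qualify, the second does not.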

Posted Content
TL;DR: This introductory book on the new science of networks takes an interdisciplinary approach, using economics, sociology, computing, information science and applied mathematics to address fundamental questions about the links that connect us, and the ways that individual decisions can have consequences for others.
Abstract: Are all film stars linked to Kevin Bacon? Why do the stock markets rise and fall sharply on the strength of a vague rumour? How does gossip spread so quickly? Are we all related through six degrees of separation? There is a growing awareness of the complex networks that pervade modern society. We see them in the rapid growth of the internet, the ease of global communication, the swift spread of news and information, and in the way epidemics and financial crises develop with startling speed and intensity. This introductory book on the new science of networks takes an interdisciplinary approach, using economics, sociology, computing, information science and applied mathematics to address fundamental questions about the links that connect us, and the ways that our decisions can have consequences for others.

Posted Content
TL;DR: This article showed that agriculture is significantly more effective than non-agriculture in reducing poverty among the poorest of the poor (as reflected in the $1-day squared poverty gap).
Abstract: The role of agriculture in development remains much debated. This paper takes an empirical perspective and focuses on poverty, as opposed to growth alone. The contribution of a sector to poverty reduction is shown to depend on its own growth performance, its indirect impact on growth in other sectors, the extent to which poor people participate in the sector, and the size of the sector in the overall economy. Bringing together these different effects using cross-country econometric evidence indicates that agriculture is significantly more effective than non-agriculture in reducing poverty among the poorest of the poor (as reflected in the $1-day squared poverty gap). It is also up to 3.2 times better at reducing $1-day headcount poverty in low-income and resource-rich countries (including those in sub-Saharan Africa), at least when societies are not fundamentally unequal. However, when it comes to the better-off poor (reflected in the $2-day measure), non-agriculture has the edge. These results are driven by the much larger participation of poorer households in growth from agriculture and the lower poverty-reducing effect of non-agriculture in the presence of extractive industries.