
Showing papers in "Research Papers in Economics in 2017"


Posted Content
TL;DR: In this paper, the authors analyzed the effect of the increase in industrial robot usage between 1990 and 2007 on US local labor markets, and showed that robots may reduce employment and wages, and that the local labor market effects of robots can be estimated by regressing the change in employment and wages on the exposure to robots in each local labor market, defined from the national penetration of robots into each industry and the local distribution of employment across industries.
Abstract: As robots and other computer-assisted technologies take over tasks previously performed by labor, there is increasing concern about the future of jobs and wages. We analyze the effect of the increase in industrial robot usage between 1990 and 2007 on US local labor markets. Using a model in which robots compete against human labor in the production of different tasks, we show that robots may reduce employment and wages, and that the local labor market effects of robots can be estimated by regressing the change in employment and wages on the exposure to robots in each local labor market—defined from the national penetration of robots into each industry and the local distribution of employment across industries. Using this approach, we estimate large and robust negative effects of robots on employment and wages across commuting zones. We bolster this evidence by showing that the commuting zones most exposed to robots in the post-1990 era do not exhibit any differential trends before 1990. The impact of robots is distinct from the impact of imports from China and Mexico, the decline of routine jobs, offshoring, other types of IT capital, and the total capital stock (in fact, exposure to robots is only weakly correlated with these other variables). According to our estimates, one more robot per thousand workers reduces the employment to population ratio by about 0.18-0.34 percentage points and wages by 0.25-0.5 percent.

979 citations
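
The exposure measure described in the abstract is a shift-share construction: a commuting zone's exposure is the employment-share-weighted sum of national industry-level changes in robot penetration. A minimal sketch of the mechanics with made-up numbers (illustrative only; the names and data are ours, not the authors'):

```python
import pandas as pd

# Hypothetical inputs: national robot penetration by industry, and each
# commuting zone's employment shares across those industries.
industry = pd.DataFrame({
    "industry": ["autos", "electronics", "food"],
    "robots_per_1k_1990": [2.0, 0.5, 0.1],
    "robots_per_1k_2007": [8.0, 2.5, 0.4],
})
industry["penetration"] = (
    industry["robots_per_1k_2007"] - industry["robots_per_1k_1990"]
)

local = pd.DataFrame({
    "commuting_zone": ["CZ1"] * 3 + ["CZ2"] * 3,
    "industry": ["autos", "electronics", "food"] * 2,
    "emp_share": [0.5, 0.3, 0.2, 0.1, 0.2, 0.7],
})

# Exposure = sum over industries of (local employment share) x
# (national change in robots per thousand workers in that industry).
merged = local.merge(industry[["industry", "penetration"]], on="industry")
merged["weighted"] = merged["emp_share"] * merged["penetration"]
exposure = merged.groupby("commuting_zone")["weighted"].sum()
print(exposure)  # CZ1 (manufacturing-heavy) is far more exposed than CZ2
```

The paper then regresses changes in employment and wages on this exposure; the sketch stops at the measure itself.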



Posted Content
TL;DR: In this paper, the authors systematically review empirical evidence on the impact of entrepreneurship education (EE) in higher education on a range of learning outcomes, analysing 159 published articles from 2004-2016.
Abstract: Using a teaching model framework, we systematically review empirical evidence on the impact of entrepreneurship education (EE) in higher education on a range of learning outcomes, analysing 159 published articles from 2004-2016. The teaching model framework allows us for the first time to start rigorously examining relationships between pedagogical methods and specific outcomes. Re-confirming past reviews and meta-analyses, we find that EE impact research still predominantly focuses on short-term and subjective outcome measures and tends to severely under-describe the actual pedagogies being tested. Moreover, we use our review to provide an up-to-date and empirically rooted call for less obvious, yet greatly promising, new or underemphasised directions for future research on the impact of university-based entrepreneurship education. These include, for example, the use of novel impact indicators related to emotion and mindset, a focus on indicators related to the intention-to-behaviour transition, and exploration of the reasons for some of the contradictory findings in impact studies, including person-, context- and pedagogical model-specific moderators.

642 citations


Posted Content
TL;DR: In this paper, the authors analyzed micro panel data from the U.S. Economic Census since 1982 and international sources, and documented empirical patterns to assess a new interpretation of the fall in the labor share based on the rise of "superstar firms."
Abstract: The fall of labor's share of GDP in the United States and many other countries in recent decades is well documented but its causes remain uncertain. Existing empirical assessments of trends in labor's share typically have relied on industry or macro data, obscuring heterogeneity among firms. In this paper, we analyze micro panel data from the U.S. Economic Census since 1982 and international sources and document empirical patterns to assess a new interpretation of the fall in the labor share based on the rise of "superstar firms." If globalization or technological changes advantage the most productive firms in each industry, product market concentration will rise as industries become increasingly dominated by superstar firms with high profits and a low share of labor in firm value-added and sales. As the importance of superstar firms increases, the aggregate labor share will tend to fall. Our hypothesis offers several testable predictions: industry sales will increasingly concentrate in a small number of firms; industries where concentration rises most will have the largest declines in the labor share; the fall in the labor share will be driven largely by between-firm reallocation rather than (primarily) a fall in the unweighted mean labor share within firms; the between-firm reallocation component of the fall in the labor share will be greatest in the sectors with the largest increases in market concentration; and finally, such patterns will be observed not only in U.S. firms, but also internationally. We find support for all of these predictions.

587 citations
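
The between-firm reallocation channel can be written compactly. With $\omega_i$ firm $i$'s share of aggregate value added and $s_i$ its labor share, an Olley-Pakes-style decomposition (our notation; a standard device in this literature rather than a quotation of the paper) gives

$$ S = \sum_i \omega_i s_i = \bar{s} + \sum_i (\omega_i - \bar{\omega})(s_i - \bar{s}), $$

where $\bar{s}$ is the unweighted mean labor share and $\bar{\omega} = 1/N$. The superstar hypothesis predicts that the fall in the aggregate share $S$ comes mainly from the covariance term turning more negative, as low-$s_i$ firms gain value-added share, rather than from a fall in $\bar{s}$ within firms.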


Posted Content
TL;DR: A common European Framework for the Digital Competence of Educators (DigCompEdu) is presented in this paper: a scientifically sound background framework that helps to guide policy and can be directly adapted to implement regional and national tools and training programmes.
Abstract: As the teaching professions face rapidly changing demands, educators require an increasingly broad and more sophisticated set of competences than before. In particular, the ubiquity of digital devices and the duty to help students become digitally competent requires educators to develop their own digital competence. At international and national levels, a number of frameworks, self-assessment tools and training programmes have been developed to describe the facets of digital competence for educators and to help them assess their competence, identify their training needs and offer targeted training. Analysing and clustering these instruments, this report presents a common European Framework for the Digital Competence of Educators (DigCompEdu). DigCompEdu is a scientifically sound background framework which helps to guide policy and can be directly adapted to implement regional and national tools and training programmes. In addition, it provides a common language and approach that will help the dialogue and exchange of best practices across borders. The DigCompEdu framework is directed towards educators at all levels of education, from early childhood to higher and adult education, including general and vocational training, special needs education, and non-formal learning contexts. It aims to provide a general reference frame for developers of Digital Competence models, i.e. Member States, regional governments, relevant national and regional agencies, educational organisations themselves, and public or private professional training providers.

521 citations


Posted Content
TL;DR: In this paper, Clewlow and Mishra present findings from a comprehensive travel and residential survey deployed in seven major U.S. cities, in two phases from 2014 to 2016, with a targeted, representative sample of their urban and suburban populations.
Abstract: The rapid adoption of ride-hailing poses significant challenges for transportation researchers, policymakers, and planners, as there is limited information and data about how these services affect transportation decisions and travel patterns. Given the long-range business, policy, and planning decisions that are required to support transportation infrastructure (including public transit, roads, bike lanes, and sidewalks), there is an urgent need to collect data on the adoption of these new services, and in particular their potential impacts on travel choices. This paper presents findings from a comprehensive travel and residential survey deployed in seven major U.S. cities, in two phases from 2014 to 2016, with a targeted, representative sample of their urban and suburban populations. The purpose of this report is to provide early insight on the adoption of, use, and travel behavior impacts of ride-hailing.

421 citations


Report
TL;DR: In this paper, the authors characterize intergenerational income mobility at each college in the United States using data for over 30 million college students from 1999-2013, and find that access to colleges varies greatly by parent income.
Abstract: We characterize intergenerational income mobility at each college in the United States using data for over 30 million college students from 1999-2013. We document four results. First, access to colleges varies greatly by parent income. For example, children whose parents are in the top 1% of the income distribution are 77 times more likely to attend an Ivy League college than those whose parents are in the bottom income quintile. Second, children from low- and high-income families have similar earnings outcomes conditional on the college they attend, indicating that low-income students are not mismatched at selective colleges. Third, rates of upward mobility – the fraction of students who come from families in the bottom income quintile and reach the top quintile – differ substantially across colleges because low-income access varies significantly across colleges with similar earnings outcomes. Rates of bottom-to-top quintile mobility are highest at certain mid-tier public universities, such as the City University of New York and California State colleges. Rates of upper-tail (bottom quintile to top 1%) mobility are highest at elite colleges, such as Ivy League universities. Fourth, the fraction of students from low-income families did not change substantially between 2000-2011 at elite private colleges, but fell sharply at colleges with the highest rates of bottom-to-top-quintile mobility. Although our descriptive analysis does not identify colleges’ causal effects on students’ outcomes, the publicly available statistics constructed here highlight colleges that deserve further study as potential engines of upward mobility.

416 citations
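
The mobility rate discussed in the abstract factors into an access term and a success term; restating the abstract's definition in symbols,

$$ \mathrm{MR}_c = \underbrace{\Pr(\text{parent in bottom quintile} \mid c)}_{\text{access}} \times \underbrace{\Pr(\text{child reaches top quintile} \mid \text{parent in bottom quintile},\, c)}_{\text{success}}, $$

which is why a mid-tier public university with broad low-income access can deliver more bottom-to-top mobility than an elite college whose success rate is high but whose access is near zero.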


Posted Content
TL;DR: In this article, the authors document the evolution of markups based on firm-level data for the US economy since 1950 and evaluate the macroeconomic implications of an increase in average market power, which can account for a number of secular trends.
Abstract: We document the evolution of markups based on firm-level data for the US economy since 1950. Initially, markups are stable, even slightly decreasing. In 1980, average markups start to rise from 18% above marginal cost to 67% now. There is no strong pattern across industries, though markups tend to be higher, across all sectors of the economy, in smaller firms, and most of the increase is due to an increase within industry. We do see a notable change in the distribution of markups, with the increase due exclusively to a sharp increase in high-markup firms. We then evaluate the macroeconomic implications of an increase in average market power, which can account for a number of secular trends in the last three decades: 1. decrease in labor share; 2. increase in capital share; 3. decrease in low-skill wages; 4. decrease in labor force participation; 5. decrease in labor flows; 6. decrease in migration rates; 7. slowdown in aggregate output.

379 citations
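
Markups in this line of work are typically recovered with the production approach; in standard notation (ours, describing the approach rather than quoting the paper),

$$ \mu_{it} = \theta^{v}_{it} \, \frac{P_{it} Q_{it}}{P^{v}_{it} V_{it}}, $$

where $\theta^{v}_{it}$ is the output elasticity of a variable input and the ratio is revenue over expenditure on that input. On this scale, the abstract's 18% and 67% figures correspond to average markups rising from about 1.18 to 1.67.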


Report
TL;DR: In this paper, the authors argue that lags have likely been the biggest contributor to the paradox of the mismatch between expectations and statistics in Artificial Intelligence, arguing that the most impressive capabilities of AI, particularly those based on machine learning, have not yet diffused widely and that their full effects won't be realized until waves of complementary innovations are developed and implemented.
Abstract: We live in an age of paradox. Systems using artificial intelligence match or surpass human level performance in more and more domains, leveraging rapid advances in other technologies and driving soaring stock prices. Yet measured productivity growth has declined by half over the past decade, and real income has stagnated since the late 1990s for a majority of Americans. We describe four potential explanations for this clash of expectations and statistics: false hopes, mismeasurement, redistribution, and implementation lags. While a case can be made for each, we argue that lags have likely been the biggest contributor to the paradox. The most impressive capabilities of AI, particularly those based on machine learning, have not yet diffused widely. More importantly, like other general purpose technologies, their full effects won’t be realized until waves of complementary innovations are developed and implemented. The required adjustment costs, organizational changes, and new skills can be modeled as a kind of intangible capital. A portion of the value of this intangible capital is already reflected in the market value of firms. However, going forward, national statistics could fail to measure the full benefits of the new technologies and some may even have the wrong sign.

370 citations


Posted Content
TL;DR: In this article, the authors investigated the asymmetric relationship between energy consumption and economic growth by incorporating financial development, capital and labour into a production function covering the Indian economy from 1960Q1-2015Q4.
Abstract: This paper investigates the asymmetric relationship between energy consumption and economic growth by incorporating financial development, capital and labour into a production function covering the Indian economy from 1960Q1–2015Q4. The nonlinear autoregressive distributed lag bounds testing approach is applied to examine the asymmetric cointegration between the variables. An asymmetric causality test is also employed to examine the causal association between the considered variables. The results indicate cointegration between the variables in the presence of asymmetries. The asymmetric causality results show that only negative shocks in energy consumption have impacts on economic growth. In the same vein, only negative shocks in financial development have impacts on economic growth. By contrast, symmetrically, capital formation causes economic growth. Finally, over the study period, a neutral effect exists between the labour force and economic growth in India. The implications of these results for growth policies in India are also discussed.

354 citations
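
The asymmetry in the nonlinear ARDL approach comes from splitting a regressor into partial sums of its positive and negative changes, which then enter the model separately. A minimal sketch of that decomposition (illustrative only, not the authors' code):

```python
import numpy as np

def partial_sums(x):
    """Split a series into cumulative positive and negative changes,
    the building blocks of the nonlinear (asymmetric) ARDL model."""
    dx = np.diff(x, prepend=x[0])          # first differences (first = 0)
    pos = np.cumsum(np.maximum(dx, 0.0))   # x_t^+ : cumulated increases
    neg = np.cumsum(np.minimum(dx, 0.0))   # x_t^- : cumulated decreases
    return pos, neg

# Toy energy-consumption series that rises, then falls
energy = np.array([10.0, 11.0, 12.5, 12.0, 11.0, 11.5])
energy_pos, energy_neg = partial_sums(energy)
# In the NARDL regression, energy_pos and energy_neg enter as separate
# regressors, so positive and negative energy shocks can carry different
# long-run coefficients -- the asymmetry the paper tests.
```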


Posted Content
TL;DR: This paper finds that exposure to the EU in terms of immigration and trade provides relatively little explanatory power for the referendum vote; instead, fundamental characteristics of the voting population were key drivers of the Vote Leave share, in particular their education profiles, their historical dependence on manufacturing employment, and low income and high unemployment.
Abstract: On 23 June 2016, the British electorate voted to leave the European Union. We analyze vote and turnout shares across 380 local authority areas in the United Kingdom. We find that exposure to the EU in terms of immigration and trade provides relatively little explanatory power for the referendum vote. Instead, we find that fundamental characteristics of the voting population were key drivers of the Vote Leave share, in particular their education profiles, their historical dependence on manufacturing employment as well as low income and high unemployment. At the much finer level of wards within cities, we find that areas with deprivation in terms of education, income and employment were more likely to vote Leave. Our results indicate that a higher turnout of younger voters, who were more likely to vote Remain, would not have overturned the referendum result.

Posted Content
TL;DR: This paper argued that clustering is in essence a design problem, either a sampling design or an experimental design issue, and that the clustering adjustment is justified by the fact that there are clusters in the population that we do not see in the sample.
Abstract: In empirical work in economics it is common to report standard errors that account for clustering of units. Typically, the motivation given for the clustering adjustments is that unobserved components in outcomes for units within clusters are correlated. However, because correlation may occur across more than one dimension, this motivation makes it difficult to justify why researchers use clustering in some dimensions, such as geographic, but not others, such as age cohorts or gender. It also makes it difficult to explain why one should not cluster with data from a randomized experiment. In this paper, we argue that clustering is in essence a design problem, either a sampling design or an experimental design issue. It is a sampling design issue if sampling follows a two stage process where in the first stage, a subset of clusters were sampled randomly from a population of clusters, while in the second stage, units were sampled randomly from the sampled clusters. In this case the clustering adjustment is justified by the fact that there are clusters in the population that we do not see in the sample. Clustering is an experimental design issue if the assignment is correlated within the clusters. We take the view that this second perspective best fits the typical setting in economics where clustering adjustments are used. This perspective allows us to shed new light on three questions: (i) when should one adjust the standard errors for clustering, (ii) when is the conventional adjustment for clustering appropriate, and (iii) when does the conventional adjustment of the standard errors matter.
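
The paper's experimental-design perspective (cluster when treatment assignment is correlated within clusters) is easy to see in a simulation. A sketch with statsmodels, our illustration rather than the authors' code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_clusters, n_per = 50, 20
cluster = np.repeat(np.arange(n_clusters), n_per)

# Treatment assigned at the cluster level, hence perfectly correlated
# within clusters: exactly the case where the adjustment matters.
treat = rng.binomial(1, 0.5, n_clusters)[cluster].astype(float)
cluster_shock = rng.normal(size=n_clusters)[cluster]
y = 1.0 + 0.5 * treat + cluster_shock + rng.normal(size=cluster.size)
df = pd.DataFrame({"y": y, "treat": treat, "cluster": cluster})

conventional = smf.ols("y ~ treat", data=df).fit()
clustered = smf.ols("y ~ treat", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["cluster"]}
)
# The conventional SE badly understates the uncertainty here.
print(conventional.bse["treat"], clustered.bse["treat"])
```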

Posted Content
TL;DR: In this article, the authors estimate the environmental Kuznets curve (EKC) for CO2 emissions in India over the period 1971-2015, using a unit root test with multiple structural breaks and the autoregressive distributed lag (ARDL) approach to cointegration.
Abstract: The existing literature on the environmental Kuznets curve (EKC) is mainly focused on finding the optimal sustainable path for any economy. Looking at the present renewable energy generation scenario in India, this study estimates the EKC for CO2 emissions in India for the period 1971-2015. Using a unit root test with multiple structural breaks and the autoregressive distributed lag (ARDL) approach to cointegration, the study finds evidence of an inverted U-shaped EKC for India, with the turnaround point at USD 2937.77. Renewable energy is found to have a significant negative impact on CO2 emissions, whereas for overall energy consumption the long-run elasticity is found to be higher than the short-run elasticity. Moreover, trade is negatively linked with carbon emissions. Based on these results, the study concludes with suitable policy prescriptions.
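
The inverted U comes from a quadratic income term in the long-run relationship. In the usual EKC specification (our notation for the standard form this literature estimates),

$$ \ln \mathrm{CO2}_t = \alpha + \beta_1 \ln y_t + \beta_2 (\ln y_t)^2 + \gamma' z_t + \varepsilon_t, \qquad y^{*} = \exp\!\left(-\frac{\beta_1}{2\beta_2}\right), $$

with $\beta_1 > 0$ and $\beta_2 < 0$ producing the inverted U; the reported turnaround point of USD 2937.77 is the income level $y^{*}$ implied by the estimated coefficients.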

Report
TL;DR: In this article, the authors consider how a central bank digital currency can transform all aspects of the monetary system and facilitate the systematic and transparent conduct of monetary policy and find a compelling rationale for establishing a CBDC that serves as a stable unit of account, a practically costless medium of exchange, and a secure store of value.
Abstract: We consider how a central bank digital currency (CBDC) can transform all aspects of the monetary system and facilitate the systematic and transparent conduct of monetary policy. Drawing on a very long strand of literature in monetary economics, we find a compelling rationale for establishing a CBDC that serves as a stable unit of account, a practically costless medium of exchange, and a secure store of value. In particular, the CBDC should be universally accessible and interest-bearing, and the central bank should adjust its interest rate to foster true price stability.

Posted Content
TL;DR: The Digital Competence Framework for Citizens (DigComp 2.1) presented in this paper is a further development of DigComp 2.0 and adds eight proficiency levels and examples of use applied to the learning and employment fields.
Abstract: DigComp 2.1 is a further development of the Digital Competence Framework for Citizens. Building on the reference conceptual model published in DigComp 2.0, we now present eight proficiency levels and examples of use applied to the learning and employment fields.

Posted Content
TL;DR: In this paper, the authors present a wide range of evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply, and they find that ideas are getting harder and harder to find.
Abstract: In many growth models, economic growth arises from people creating ideas, and the long-run growth rate is the product of two terms: the effective number of researchers and their research productivity. We present a wide range of evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply. A good example is Moore's Law. The number of researchers required today to achieve the famous doubling every two years of the density of computer chips is more than 18 times larger than the number required in the early 1970s. Across a broad range of case studies at various levels of (dis)aggregation, we find that ideas — and in particular the exponential growth they imply — are getting harder and harder to find. Exponential growth results from the large increases in research effort that offset its declining productivity.
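
The accounting behind the Moore's Law example is a one-line identity. If growth is the product of research productivity and the effective number of researchers,

$$ g_t = \alpha_t \times R_t, $$

then holding $g_t$ fixed (chip density has kept doubling roughly every two years) while $R_t$ rises more than 18-fold forces $\alpha_t$, ideas produced per researcher, to fall by that same factor of 18.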

Posted Content
TL;DR: Based on the one-pillar model of sustainable development, this paper presents the first systematic review of the literature on ecologically sustainable entrepreneurship, revealing a strong focus on the drivers of engagement in sustainable entrepreneurship and the strategic actions taken by ecologically sustainable enterprises.
Abstract: In line with an intensified call for conducting business in a greener and more sustainable way, sustainability-related entrepreneurship has become an important subfield of entrepreneurship research. The variety of terms, such as "sustainable entrepreneurship", "ecopreneurship", "environmental entrepreneurship/enviropreneurship", and "green entrepreneurship", reflects the fragmented and inconsistent findings of this research field. Based on the one-pillar model of sustainable development, i.e., ecological sustainability, we present the first systematic review of the literature on ecologically sustainable entrepreneurship. This analysis of 114 scientific articles reveals a strong focus on the drivers of engagement in ecologically sustainable entrepreneurship, the drivers of conducting business in an ecologically sustainable way, the strategic actions taken by ecologically sustainable enterprises, and the outcomes, enabling factors and challenges of ecologically sustainable entrepreneurship. Based on this thematic clustering, we develop an integrative framework for ecologically sustainable entrepreneurship and a coherent agenda for future research. This work may help researchers to take stock of the existing literature and advance this research field.

Posted Content
TL;DR: This paper reviews the reliability and validity of measurement instruments used in research, and the threats to them.
Abstract: Reliability and validity are the two most important and fundamental features in the evaluation of any measurement instrument or tool for good research. The purpose of this research is to discuss the validity and reliability of measurement instruments that are used in research. Validity concerns what an instrument measures, and how well it does so. Reliability concerns the faith that one can have in the data obtained from the use of an instrument, that is, the degree to which any measuring tool controls for random error. An attempt has been made here to review reliability and validity, and the threats to them, in some detail.

Posted Content
TL;DR: This paper surveyed parents of 6- to 14-year-olds in eight European countries (N=6,400) and found that enabling mediation is associated with increased online opportunities but also risks.
Abstract: As internet use becomes widespread at home, parents are trying to maximize their children’s online opportunities while also minimizing online risks. We surveyed parents of 6- to 14-year-olds in eight European countries (N=6,400). A factor analysis revealed two strategies. Enabling mediation is associated with increased online opportunities but also risks. This strategy incorporates safety efforts, responds to child agency and is employed when parent or child is relatively digitally skilled, so may not support harm. Restrictive mediation is associated with fewer online risks but at the cost of opportunities, reflecting policy advice that regards media use as primarily problematic. It is favoured when parent or child digital skills are lower, potentially keeping vulnerable children safe yet undermining their digital inclusion.

Posted Content
TL;DR: This paper presents the first study to systematically investigate key cryptocurrency industry sectors by collecting empirical, non-public data; the authors gathered survey data from nearly 150 cryptocurrency companies and individuals.
Abstract: This is the first study to systematically investigate key cryptocurrency industry sectors by collecting empirical, non-public data. The study gathered survey data from nearly 150 cryptocurrency companies and individuals, and it covers 38 countries from five world regions. The study details the key industry sectors that have emerged and the different entities that inhabit them.

Posted Content
TL;DR: In this paper, the authors examine how machine learning can be used to improve and understand human decision-making, focusing on a decision with important policy consequences: millions of times each year, judges must decide where defendants will await trial, and this decision hinges on the judge's prediction of what the defendant would do if released.
Abstract: We examine how machine learning can be used to improve and understand human decision-making. In particular, we focus on a decision that has important policy consequences. Millions of times each year, judges must decide where defendants will await trial—at home or in jail. By law, this decision hinges on the judge's prediction of what the defendant would do if released. This is a promising machine learning application because it is a concrete prediction task for which there is a large volume of data available. Yet comparing the algorithm to the judge proves complicated. First, the data are themselves generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those the judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the single variable that the algorithm focuses on; for instance, judges may care about racial inequities or about specific crimes (such as violent crimes) rather than just overall crime risk. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: a policy simulation shows crime can be reduced by up to 24.8% with no change in jailing rates, or jail populations can be reduced by 42.0% with no increase in crime rates. Moreover, we see reductions in all categories of crime, including violent ones. Importantly, such gains can be had while also significantly reducing the percentage of African-Americans and Hispanics in jail. We find similar results in a national dataset as well. In addition, by focusing the algorithm on predicting judges' decisions, rather than defendant behavior, we gain some insight into decision-making: a key problem appears to be that judges respond to 'noise' as if it were signal. These results suggest that while machine learning can be valuable, realizing this value requires integrating these tools into an economic framework: being clear about the link between predictions and decisions; specifying the scope of payoff functions; and constructing unbiased decision counterfactuals.
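
The selective labels problem described in the abstract (crime outcomes are observed only for defendants whom judges released) can be made concrete in a small simulation. A rough sketch with made-up data, not the paper's pipeline:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=(n, 5))             # defendant features (illustrative)
risk = 1 / (1 + np.exp(-x[:, 0]))       # true re-offense probability
released = rng.random(n) > risk * 0.5   # judges release safer defendants
crime = rng.random(n) < risk            # observed only if released

# Selective labels: the model can only be trained on released defendants,
# whose risk distribution differs from the full population.
model = GradientBoostingClassifier().fit(x[released], crime[released])
predicted_risk = model.predict_proba(x)[:, 1]

# Toy policy rule: detain the riskiest 20% by predicted risk. Evaluating
# it honestly requires counterfactual outcomes for the detained, which is
# what the paper's quasi-random assignment of cases to judges provides.
detain = predicted_risk >= np.quantile(predicted_risk, 0.80)
```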

Posted Content
TL;DR: In this paper, the authors identify, synthesize, and organize three streams of micro-CSR studies focusing on individual drivers of CSR engagement, individual processes, and individual reactions to CSR initiatives into a coherent behavioral framework.
Abstract: This article aims to consolidate the psychological microfoundations of corporate social responsibility (CSR) by taking stock and evaluating the recent surge of person-focused CSR research. With a systematic review, the authors identify, synthesize, and organize three streams of micro-CSR studies—focused on (i) individual drivers of CSR engagement, (ii) individual processes of CSR evaluations, and (iii) individual reactions to CSR initiatives—into a coherent behavioral framework. This review highlights significant gaps, methodological issues, and imbalances in the treatment of the three components in prior micro-CSR research. It uncovers the need to conceptualize how multiple drivers of CSR interact and how the plurality of mechanisms and boundary conditions that can explain individual reactions to CSR might be integrated theoretically. By organizing micro-CSR studies into a coherent framework, this review also reveals the lack of connections within and between substreams of micro-CSR research; to tackle them, this article proposes an agenda for further research, focused on six key challenges.

Posted Content
TL;DR: The authors show that most US food, drugstore, and mass merchandise chains charge nearly-uniform prices across stores, despite wide variation in consumer demographics and competition, and that uniform pricing may significantly increase the prices paid by poorer households relative to the rich, dampen the response of prices to local economic shocks, alter the analysis of mergers in antitrust, and shift the incidence of intra-national trade costs.
Abstract: We show that most US food, drugstore, and mass merchandise chains charge nearly-uniform prices across stores, despite wide variation in consumer demographics and competition. Demand estimates reveal substantial within-chain variation in price elasticities and suggest that the median chain sacrifices $16m of annual profit relative to a benchmark of optimal prices. In contrast, differences in average prices between chains are broadly consistent with the optimal benchmark. We discuss a range of explanations for nearly-uniform pricing, highlighting managerial inertia and brand-image concerns as mechanisms frequently mentioned by industry participants. Relative to our optimal benchmark, uniform pricing may significantly increase the prices paid by poorer households relative to the rich, dampen the response of prices to local economic shocks, alter the analysis of mergers in antitrust, and shift the incidence of intra-national trade costs.
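
The optimal benchmark the authors compare against is standard store-by-store monopoly pricing. With $\varepsilon < -1$ the store-level price elasticity and $c$ marginal cost, the textbook first-order condition (our notation) is

$$ \frac{p^{*} - c}{p^{*}} = -\frac{1}{\varepsilon} \quad\Longleftrightarrow\quad p^{*} = \frac{\varepsilon}{1 + \varepsilon}\, c, $$

so the substantial within-chain variation in estimated elasticities implies within-chain variation in $p^{*}$: the profit that nearly-uniform pricing leaves on the table.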

Posted Content
TL;DR: This article developed a model in which rising housing prices in high-income areas deter low-skill migration and slow income convergence using a new panel measure of housing supply regulations, demonstrating the importance of this channel in the data.
Abstract: The past thirty years have seen a dramatic decline in the rate of income convergence across states and in population flows to high-income places. These changes coincide with a disproportionate increase in housing prices in high-income places, a divergence in the skill-specific returns to moving to high-income places, and a redirection of low-skill migration away from high-income places. We develop a model in which rising housing prices in high-income areas deter low-skill migration and slow income convergence. Using a new panel measure of housing supply regulations, we demonstrate the importance of this channel in the data.

Posted Content
TL;DR: This article finds that deviations from the covered interest rate parity condition imply large, persistent, and systematic arbitrage opportunities in one of the largest asset markets in the world. Contrary to the common view, these deviations for major currencies are not explained away by credit risk or transaction costs, and they are particularly strong for forward contracts that appear on banks' balance sheets at the end of the quarter.
Abstract: We find that deviations from the covered interest rate parity condition (CIP) imply large, persistent, and systematic arbitrage opportunities in one of the largest asset markets in the world. Contrary to the common view, these deviations for major currencies are not explained away by credit risk or transaction costs. They are particularly strong for forward contracts that appear on the banks' balance sheets at the end of the quarter, pointing to a causal effect of banking regulation on asset prices. The CIP deviations also appear significantly correlated with other fixed-income spreads and with nominal interest rates.
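
For reference, the parity condition at issue: with $S$ and $F$ the spot and forward dollar prices of one unit of foreign currency, a covered round trip must earn the dollar interest rate (textbook statement, not quoted from the paper),

$$ 1 + i_{\$} = \frac{F}{S}\,(1 + i_{fc}), \qquad \text{basis} \equiv i_{\$} - i_{fc} - (f - s) \ \text{in logs}, $$

so a nonzero basis is precisely the measured CIP deviation, and a persistent one is the arbitrage opportunity the abstract documents.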

Report
TL;DR: In this article, the authors provide a taxonomy of the economic issues raised by AI and other worker-replacing technological progress, and present several simple economic models describing how policy can counter its effects on inequality, even in the case of a "singularity" where machines come to dominate human labor.
Abstract: Inequality is one of the main challenges posed by the proliferation of artificial intelligence (AI) and other forms of worker-replacing technological progress. This paper provides a taxonomy of the associated economic issues: First, we discuss the general conditions under which new technologies such as AI may lead to a Pareto improvement. Second, we delineate the two main channels through which inequality is affected – the surplus arising to innovators and redistributions arising from factor price changes. Third, we provide several simple economic models to describe how policy can counter these effects, even in the case of a “singularity” where machines come to dominate human labor. Under plausible conditions, non-distortionary taxation can be levied to compensate those who otherwise might lose. Fourth, we describe the two main channels through which technological progress may lead to technological unemployment – via efficiency wage effects and as a transitional phenomenon. Lastly, we speculate on how technologies to create super-human levels of intelligence may affect inequality and on how to save humanity from the Malthusian destiny that may ensue.

Posted Content
TL;DR: The authors proposed a nonparametric method to test which characteristics provide independent information for the cross-section of expected returns, and used the adaptive group LASSO to select characteristics and to estimate how they affect expected returns nonparametrically.
Abstract: We propose a nonparametric method to test which characteristics provide independent information for the cross section of expected returns. We use the adaptive group LASSO to select characteristics and to estimate how they affect expected returns nonparametrically. Our method can handle a large number of characteristics, allows for a flexible functional form, and is insensitive to outliers. Many of the previously identified return predictors do not provide incremental information for expected returns, and nonlinearities are important. Our proposed method has higher out-of-sample explanatory power compared to linear panel regressions, and increases Sharpe ratios by 50%.
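
A rough sketch of the adaptive-LASSO selection mechanics in scikit-learn. This is deliberately simplified: it runs a plain adaptive LASSO with one column per characteristic, whereas the paper uses the adaptive group LASSO over nonparametric (spline) expansions of each characteristic:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

# Simulated cross-section: returns depend on 2 of 20 characteristics
rng = np.random.default_rng(0)
n, p = 2_000, 20
X = rng.normal(size=(n, p))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)

# Stage 1: pilot estimates give adaptive weights w_j = 1 / |beta_hat_j|
pilot = LinearRegression().fit(X, y).coef_
scale = np.abs(pilot) + 1e-8

# Stage 2: rescaling column j by |beta_hat_j| and running a plain LASSO
# is equivalent to penalizing each coefficient by w_j.
lasso = Lasso(alpha=0.05).fit(X * scale, y)
selected = np.nonzero(lasso.coef_)[0]
coefs = lasso.coef_ * scale              # map back to the original scale
print(selected)                          # recovers characteristics 0 and 1
```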

Posted Content
TL;DR: This guide for practitioners discusses features of the "regression discontinuity in time" framework that differ from the more standard cross-sectional RD framework, and offers suggestions for the empirical researcher wishing to use the design.
Abstract: Recent empirical work in several economic fields, particularly environmental and energy economics, has adapted the regression discontinuity framework to applications where time is the running variable and treatment occurs at the moment of the discontinuity. In this guide for practitioners, we discuss several features of this "Regression Discontinuity in Time" framework that differ from the more standard cross-sectional RD. First, many applications (particularly in environmental economics) lack cross-sectional variation and are estimated using observations far from the cut-off. This is in stark contrast to a cross-sectional RD, which is conceptualized for an estimation bandwidth going to zero even as the sample size increases. Second, estimates may be biased if the time-series properties of the data are ignored, for instance in the presence of an autoregressive process. Finally, tests for sorting or bunching near the discontinuity are often irrelevant, making the methodology closer to an event study than a regression discontinuity design. Based on these features and motivated by hypothetical examples using air quality data, we offer suggestions for the empirical researcher wishing to use the RD in time design.
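
In the spirit of the hypothetical air-quality examples the abstract mentions, a minimal RD-in-time sketch with serially correlated errors and HAC (Newey-West) standard errors (our simulation, not the authors' code):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Daily series with a policy switching on at t = 0 and AR(1) errors,
# the kind of serial dependence the guide warns about.
rng = np.random.default_rng(0)
t = np.arange(-365, 365)
treated = (t >= 0).astype(float)
noise = np.zeros(t.size)
for i in range(1, t.size):
    noise[i] = 0.7 * noise[i - 1] + rng.normal()
y = 50.0 - 3.0 * treated + 0.002 * t + noise   # true effect: -3
df = pd.DataFrame({"y": y, "t": t, "treated": treated})

# RD-in-time regression: discontinuity plus trends on both sides,
# with HAC standard errors to respect the time-series structure.
m = smf.ols("y ~ treated + t + treated:t", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 14}
)
print(m.params["treated"], m.bse["treated"])
```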

Report Series
TL;DR: The paper examines how patient harm can be minimised effectively and efficiently, informed by a snapshot survey of a panel of eminent academic and policy experts in patient safety.
Abstract: About one in ten patients are harmed during health care. This paper estimates the health, financial and economic costs of this harm. Results indicate that patient harm exerts a considerable global health burden. The financial cost on health systems is also considerable and if the flow-on economic consequences such as lost productivity and income are included the costs of harm run into trillions of dollars annually. Because many of the incidents that cause harm can be prevented, these failures represent a considerable waste of healthcare resources, and the cost of failure dwarfs the investment required to implement effective prevention. The paper then examines how patient harm can be minimised effectively and efficiently. This is informed by a snapshot survey of a panel of eminent academic and policy experts in patient safety. System- and organisational-level initiatives were seen as vital to provide a foundation for the more local interventions targeting specific types of harm. The overarching requirement was a culture conducive to safety.

Posted Content
TL;DR: This report introduces the fundamental principles of the blockchain and explains how this technology may both disrupt institutional norms and empower learners in an education context.
Abstract: Luxembourg: Publications Office of the European Union, 2017, 132 pp. (JRC Science for Policy Report). Pedagogical subdiscipline: other; educational organisation, planning and law. Available as electronic full text.