Journal ArticleDOI

The future of employment: How susceptible are jobs to computerisation?

01 Jan 2017-Technological Forecasting and Social Change (North-Holland)-Vol. 114, Iss: 114, pp 254-280
TL;DR: In this paper, a Gaussian process classifier was used to estimate the probability of computerisation for 702 detailed occupations, and the expected impacts of future computerisation on US labour market outcomes, with the primary objective of analysing the number of jobs at risk and the relationship between an occupation's probability of computerisation, wages and educational attainment.
About: This article is published in Technological Forecasting and Social Change. The article was published on 2017-01-01 and is currently open access. It has received 4853 citations to date. The article focuses on the topic: educational attainment.
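The method named in the TL;DR can be sketched in a few lines. This is a minimal, hypothetical illustration using scikit-learn, not the authors' actual pipeline: the two features and all training labels below are made up, standing in for the occupation-level variables the paper uses.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Toy training set: each row is an occupation described by two illustrative
# features (e.g. routine-task content, social-intelligence demand),
# labelled 1 = automatable, 0 = not automatable. Values are invented.
X_train = np.array([[0.9, 0.1], [0.8, 0.2], [0.7, 0.3],
                    [0.2, 0.9], [0.1, 0.8], [0.3, 0.7]])
y_train = np.array([1, 1, 1, 0, 0, 0])

# Gaussian process classifier with an RBF kernel.
clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0)
clf.fit(X_train, y_train)

# Predicted probability of computerisation for a new occupation profile
# that resembles the "automatable" cluster.
probs = clf.predict_proba(np.array([[0.85, 0.15]]))
print(probs[0, 1])  # probability of the "automatable" class
```

The key output is a calibrated probability per occupation rather than a hard label, which is what allows ranking 702 occupations by susceptibility.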

Summary (1 min read)

Introduction

  • This study aims to characterize the phenological behaviour, thermal requirements and the physical and physicochemical features at harvest of the seedless grape varieties ‘BRS Morena’, ‘BRS Clara’ and ‘BRS Linda’ under the conditions of the Submiddle São Francisco Valley.
  • For this purpose, phenological periods were characterized, from pruning to the early stages of budding, flowering, fruiting, ripening and harvesting, together with the thermal requirements for each phase, expressed in degree-days (DG).
  • At the harvest point, the bunch weight, the diameter of the berries, the pulp firmness, the soluble solids (SS), titratable acidity (TA) and the pH were determined.
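The degree-day (DG) index mentioned above can be accumulated from daily temperature records. A minimal sketch follows; the base temperature of 10 °C is a common value for grapevine, assumed here for illustration and not necessarily the one used in the study, and the temperatures in the example are invented.

```python
def degree_days(tmax, tmin, t_base=10.0):
    """Accumulate degree-days from daily max/min air temperatures (°C).

    Each day contributes max(0, mean temperature - base temperature).
    """
    total = 0.0
    for hi, lo in zip(tmax, tmin):
        t_mean = (hi + lo) / 2.0
        total += max(0.0, t_mean - t_base)
    return total

# Example: three days of hypothetical temperatures.
dg = degree_days([32.0, 31.0, 33.0], [20.0, 19.0, 21.0])
print(dg)  # 48.0
```

Summing this quantity between two phenological stages (e.g. pruning to budding) gives the thermal requirement of that phase in degree-days.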

DG for ‘BRS Linda’. In general, the cultivars ‘BRS Morena’ and ‘BRS Clara’ had the physical and physicochemical potential for domestic and international markets, provided that management practices are used to improve some physical characteristics of these cultivars.

  • Key words: phenology, degree-days, seedless grapes, Vitis.
  • Viticulture is a promising activity, mainly owing to the growing consumption of grape juices and wines, in addition to fresh (“in natura”) consumption (Sato et al., 2009).
  • With an estimated annual production of 1,300,000 tonnes on a cultivated area of approximately 81 thousand hectares, Brazil ranked fifteenth in world table-grape production in 2010, Italy and China being the two largest producers (FAO, 2011).
  • As a result of this research, Embrapa Uva e Vinho released the first Brazilian seedless grapes in 2003, registered as 'BRS Morena', 'BRS Clara' and 'BRS Linda'.
  • The thermal requirement of a crop is commonly determined by means of biometeorological indices.

Material and Methods

  • The experiments were conducted in a commercial production area in the municipality of Petrolina, PE (latitude 9º15’ S; longitude 40º25’ W), at a mean altitude of 366 metres.
  • The climate of the microregion is classified as hot semi-arid, BSh’W (Köppen classification), with a mean annual air temperature of 26 ºC.
  • Monthly means of temperature, rainfall and relative humidity for 2010 in Petrolina-PE were recorded. The grapevine cultivars 'BRS Morena', 'BRS Clara' and 'BRS Linda' (Vitis vinifera L.) were used; these are fine, seedless table grapes released by Embrapa Uva e Vinho in 2003.
  • The experimental design for the evaluation of the phenological phases was completely randomised, with 20 replicates, each experimental unit consisting of one plant with two production branches.
  • The values, obtained in kilogram-force (kgf), were converted to newtons (N).
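The unit conversion mentioned in the last bullet is a fixed scaling by standard gravity. A minimal sketch (the firmness reading in the example is hypothetical):

```python
# 1 kgf = 9.80665 N, by the standard gravitational constant.
KGF_TO_N = 9.80665

def kgf_to_newton(value_kgf):
    """Convert a pulp-firmness reading from kilogram-force to newtons."""
    return value_kgf * KGF_TO_N

print(round(kgf_to_newton(2.5), 3))  # 24.517
```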


Citations
Posted Content
TL;DR: A list of five practical research problems related to accident risk, categorized according to whether the problem originates from having the wrong objective function, an objective function that is too expensive to evaluate frequently, or undesirable behavior during the learning process, are presented.
Abstract: Rapid progress in machine learning and artificial intelligence (AI) has brought increasing attention to the potential impacts of AI technologies on society. In this paper we discuss one such potential impact: the problem of accidents in machine learning systems, defined as unintended and harmful behavior that may emerge from poor design of real-world AI systems. We present a list of five practical research problems related to accident risk, categorized according to whether the problem originates from having the wrong objective function ("avoiding side effects" and "avoiding reward hacking"), an objective function that is too expensive to evaluate frequently ("scalable supervision"), or undesirable behavior during the learning process ("safe exploration" and "distributional shift"). We review previous work in these areas as well as suggesting research directions with a focus on relevance to cutting-edge AI systems. Finally, we consider the high-level question of how to think most productively about the safety of forward-looking applications of AI.

1,569 citations

Journal ArticleDOI
TL;DR: The authors argue that the interplay between machine and human comparative advantage allows computers to substitute for workers in performing routine, codifiable tasks while amplifying the comparative advantage of workers in supplying problem-solving skills, adaptability, and creativity.
Abstract: In this essay, I begin by identifying the reasons that automation has not wiped out a majority of jobs over the decades and centuries. Automation does indeed substitute for labor, as it is typically intended to do. However, automation also complements labor, raises output in ways that lead to higher demand for labor, and interacts with adjustments in labor supply. Journalists and even expert commentators tend to overstate the extent of machine substitution for human labor and ignore the strong complementarities between automation and labor that increase productivity, raise earnings, and augment demand for labor. Changes in technology do alter the types of jobs available and what those jobs pay. In the last few decades, one noticeable change has been a "polarization" of the labor market, in which wage gains went disproportionately to those at the top and at the bottom of the income and skill distribution, not to those in the middle; however, I also argue, this polarization is unlikely to continue very far into the future. The final section of this paper reflects on how recent and future advances in artificial intelligence and robotics should shape our thinking about the likely trajectory of occupational change and employment growth. I argue that the interplay between machine and human comparative advantage allows computers to substitute for workers in performing routine, codifiable tasks while amplifying the comparative advantage of workers in supplying problem-solving skills, adaptability, and creativity.

1,325 citations

Journal ArticleDOI
TL;DR: In this article, the authors present a state-of-the-art review of Industry 4.0 based on recent developments in research and practice, and provide an overview of different opportunities for sustainable manufacturing in Industry 4.0.

1,276 citations

Posted Content
TL;DR: In this paper, the authors analyzed the effect of the increase in industrial robot usage between 1990 and 2007 on US local labor markets, and showed that robots may reduce employment and wages, and that the local labor market effects of robots can be estimated by regressing the change in employment and wages on the exposure to robots in each local labor market, defined from the national penetration of robots into each industry and the local distribution of employment across industries.
Abstract: As robots and other computer-assisted technologies take over tasks previously performed by labor, there is increasing concern about the future of jobs and wages. We analyze the effect of the increase in industrial robot usage between 1990 and 2007 on US local labor markets. Using a model in which robots compete against human labor in the production of different tasks, we show that robots may reduce employment and wages, and that the local labor market effects of robots can be estimated by regressing the change in employment and wages on the exposure to robots in each local labor market—defined from the national penetration of robots into each industry and the local distribution of employment across industries. Using this approach, we estimate large and robust negative effects of robots on employment and wages across commuting zones. We bolster this evidence by showing that the commuting zones most exposed to robots in the post-1990 era do not exhibit any differential trends before 1990. The impact of robots is distinct from the impact of imports from China and Mexico, the decline of routine jobs, offshoring, other types of IT capital, and the total capital stock (in fact, exposure to robots is only weakly correlated with these other variables). According to our estimates, one more robot per thousand workers reduces the employment to population ratio by about 0.18-0.34 percentage points and wages by 0.25-0.5 percent.
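The exposure-to-robots measure described in this abstract is a weighted sum: each local labour market's exposure combines national industry-level robot penetration with that market's baseline employment shares across industries. A minimal sketch follows; this is not the authors' code, and all industry names and numbers below are invented for illustration.

```python
def robot_exposure(emp_shares, robot_penetration):
    """Exposure to robots for one local labour market.

    emp_shares: {industry: local employment share} (shares sum to 1)
    robot_penetration: {industry: national robots per thousand workers}
    Returns the share-weighted average penetration.
    """
    return sum(share * robot_penetration[ind]
               for ind, share in emp_shares.items())

# Hypothetical commuting zone heavy in automotive employment.
zone = {"autos": 0.30, "electronics": 0.20, "services": 0.50}
penetration = {"autos": 7.5, "electronics": 2.0, "services": 0.3}

exposure = robot_exposure(zone, penetration)
print(round(exposure, 3))  # 2.8
```

Regressing the change in local employment or wages on this exposure measure is the estimation strategy the abstract describes.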

979 citations


Cites methods from "The future of employment: How susce..."

  • ...Based on the tasks that workers perform, Frey and Osborne (2013), for instance, classify 702 occupations by how susceptible they are to automation....

    [...]

Journal ArticleDOI
TL;DR: In this paper, a conceptual approach that is rooted in the service, robotics and AI literature is used to explore the potential role service robots will play in the future and to advance a research agenda for service researchers.
Abstract: Purpose The service sector is at an inflection point with regard to productivity gains and service industrialization similar to the industrial revolution in manufacturing that started in the eighteenth century. Robotics in combination with rapidly improving technologies like artificial intelligence (AI), mobile, cloud, big data and biometrics will bring opportunities for a wide range of innovations that have the potential to dramatically change service industries. The purpose of this paper is to explore the potential role service robots will play in the future and to advance a research agenda for service researchers. Design/methodology/approach This paper uses a conceptual approach that is rooted in the service, robotics and AI literature. Findings The contribution of this paper is threefold. First, it provides a definition of service robots, describes their key attributes, contrasts their features and capabilities with those of frontline employees, and provides an understanding for which types of service tasks robots will dominate and where humans will dominate. Second, this paper examines consumer perceptions, beliefs and behaviors as related to service robots, and advances the service robot acceptance model. Third, it provides an overview of the ethical questions surrounding robot-delivered services at the individual, market and societal level. Practical implications This paper helps service organizations and their management, service robot innovators, programmers and developers, and policymakers better understand the implications of a ubiquitous deployment of service robots. Originality/value This is the first conceptual paper that systematically examines key dimensions of robot-delivered frontline service and explores how these will differ in the future.

871 citations

References
Book
01 Jan 1942
TL;DR: In this book, the author analyses the Marxian doctrine, asks whether capitalism can survive and whether socialism can work, introduces the process of creative destruction, and examines the relationship between socialism and democracy, closing with a historical sketch of socialist parties.
Abstract: Introduction. Part I: The Marxian Doctrine. Prologue. I. Marx the Prophet. II. Marx the Sociologist. III. Marx the Economist. IV Marx the Teacher. Part II: Can Capitalism Survive? Prologue. V. The Rate of Increase of Total Output. VI. Plausible Capitalism. VII. The Process of Creative Destruction. VIII. Monopolistics Practices. IX. Closed Season. X. The Vanishing of Investment Opportunity. XI. The Civilization of Capitalism. XII. Crumbling Walls. XIII. Growing Hostility. XIV. Decomposition. Part III: Can Socialism Work? XV. Clearing Decks. XVI. The Socialist Blueprint. XVII. Comparison of Blueprints. XVIII. The Human Element. XIX. Transition. Part IV: Socialism and Democracy. XX. The Setting of the Problem. XXI. The Classical Doctrine of Democracy. XXII. Another Theory of Democracy. XXIII. The Inference. Part V: A Historical Sketch of Socialist Parties. Prologue. XXIV. The Nonage. XXV. The Situation that Marx Faced. XXVI. From 1875 to 1914. XXVII. From the First to the Second World War. XXVIII. The Consequences of the Second World War. Preface to the First Edition, 1942. Preface to the Second Edition, 1946. Preface to the Third Edition, 1949. The March Into Socialism. Index.

16,667 citations

Book
23 Nov 2005
TL;DR: The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Abstract: A comprehensive and self-contained introduction to Gaussian processes, which provide a principled, practical, probabilistic approach to learning in kernel machines. Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.

11,357 citations


"The future of employment: How susce..." refers methods in this paper

  • ...Richer models are provided by Gaussian process classifiers (Rasmussen and Williams, 2006)....

    [...]

Book
24 Aug 2012
TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Abstract: Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package--PMTK (probabilistic modeling toolkit)--that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

8,059 citations

Book
01 Jan 1817
TL;DR: Nothing but praise can be accorded to the editors of this monumental undertaking, whose work is as near to perfection as anything human can be.
Abstract: Nothing but praise can be accorded to the editors of this monumental undertaking. As near perfection as anything human can be.

5,995 citations

Journal ArticleDOI
TL;DR: In this article, the author argues that capitalism and socialism are conceptually compatible rather than irreconcilable opposites: capitalism is a feature of society's economic organization while socialism is an aspect of its political system, so they are not necessarily in competition or conflict with one another, at least not conceptually (whether they could in practice coexist is a different and empirical question).
Abstract: Capitalism and socialism are generally taken to be irreconcilable opposites, and the conflict between their adherents has seemed so intense as to threaten the survival of the human species. In practice, no doubt, all sorts of compromises, accommodations and mixtures of the two are possible, but conceptually, considered as blueprints for the organization of society, capitalist and socialist ownership of the means of production appear mutually exclusive. I shall argue that this is by no means the case: that capitalism and socialism are, in fact, conceptually quite compatible; that a society be at the same time capitalist and socialist (by that I do not refer to a 'mixed economy') involves no contradiction. For it turns out, on closer examination than the matter usually receives, that capitalism and socialism are features of different parts of the social structure, and are therefore not necessarily in competition or conflict with one another, at least not conceptually (whether they could in practice coexist with one another is a different and empirical question, which is raised by, for example, 'functionalist' theories of social structure). In brief, while capitalism is a feature of society's economic organization, socialism is rather an aspect of its political system. In fact, as we shall see, socialism is a part of political democracy, and any democratic political system is therefore necessarily socialist.

5,034 citations


"The future of employment: How susce..." refers background in this paper


  • ...As stressed by Schumpeter (1962), it was not the lack of inventive ideas that set the boundaries for economic development, but rather powerful social and economic interests promoting the technological status quo....

    [...]

Frequently Asked Questions (15)
Q1. What are the contributions in "The future of employment" ?

The authors examine how susceptible jobs are to computerisation. Based on these estimates, the authors examine expected impacts of future computerisation on US labour market outcomes, with the primary objective of analysing the number of jobs at risk and the relationship between an occupation's probability of computerisation, wages and educational attainment. The authors further provide evidence that wages and educational attainment exhibit a strong negative relationship with an occupation's probability of computerisation. The authors are indebted to Stuart Armstrong, Nick Bostrom, Eris Chinellato, Mark Cummins, Daniel Dewey, David Dorn, Alex Flint, Claudia Goldin, John Muellbauer, Vincent Mueller, Paul Newman, Seán Ó hÉigeartaigh, Anders Sandberg, Murray Shanahan, and Keith Woolcock for their excellent suggestions.

Prefabrication, in which the construction object is partially assembled in a factory before being transported to the construction site, provides a way of largely removing the requirement for adaptability.

While basic geometric identification is reasonably mature, enabled by the rapid development of sophisticated sensors and lasers, significant challenges remain for more complex perception tasks, such as identifying objects and their properties in a cluttered field of view. 

In short, while factory assembly lines, with their extreme division of labour, had required vast quantities of human operatives, electrification allowed many stages of the production process to be automated, which in turn increased the demand for relatively skilled blue-collar production workers to operate the machinery. 

While the computer substitution for both cognitive and manual routine tasks is evident, non-routine tasks involve everything from legal writing, truck driving and medical diagnoses, to persuading and selling. 

According to their estimate, 47 percent of total US employment is in the high risk category, meaning that associated occupations are potentially automatable over some unspecified number of years, perhaps a decade or two.

Computer capital can now equally substitute for a wide range of tasks commonly defined as nonroutine (Brynjolfsson and McAfee, 2011), meaning that the task model will not hold in predicting the impact of computerisation on the task content of employment in the twenty-first century. 

The states of California and Nevada are, for example, currently in the process of making legislative changes to allow for driverless cars.

Because the diffusion of various manufacturing technologies did not impose a risk to the value of their assets, and some property owners stood to benefit from the export of manufactured goods, the artisans simply did not have the political power to repress them. 

With Parliamentary supremacy established over the Crown, legislation was passed in 1769 making the destruction of machinery punishable by death (Mokyr, 1990, p. 257). 

By contrast, airship technology is widely recognised as having been popularly abandoned as a consequence of the reporting of the Hindenburg disaster. 

The continued technological development of robotic hardware is having a notable impact upon employment: over the past decades, industrial robots have taken on the routine tasks of most operatives in manufacturing.

More specifically, between 1980 and 2005, the share of US labour hours in service occupations grew by 30 percent after having been flat or declining in the three prior decades.

A related challenge is failure recovery – i.e. identifying and rectifying the mistakes of the robot when it has, for example, dropped an object. 

In short, generalist occupations requiring knowledge of human heuristics, and specialist occupations involving the development of novel ideas and artifacts, are the least susceptible to computerisation.

Trending Questions (2)
The future of employment: how susceptible are jobs to computerisation?

The paper examines the susceptibility of jobs to computerization and estimates the probability of computerization for different occupations. It analyzes the expected impact of computerization on the US labor market, including the number of jobs at risk and the relationship between computerization probability, wages, and educational attainment.

The future of employment: How susceptible are jobs to computerisation?

The paper examines the probability of computerisation for different occupations and its impact on the US labor market.