Proceedings ArticleDOI

The Dynamics of Micro-Task Crowdsourcing: The Case of Amazon MTurk

TLDR
This paper uses the main findings of a five-year log analysis to propose features for a predictive model that estimates the expected performance of any batch at a specific point in time, and shows that the number of tasks left in a batch and how recently the batch was posted are two key features of the prediction.
Abstract
Micro-task crowdsourcing is rapidly gaining popularity among research communities and businesses as a means to leverage Human Computation in their daily operations. Unlike any other service, a crowdsourcing platform is in fact a marketplace subject to human factors that affect its performance, both in terms of speed and quality. Indeed, such factors shape the dynamics of the crowdsourcing market. For example, a known behavior of such markets is that increasing the reward of a set of tasks leads to faster results. However, it is still unclear how different dimensions interact with each other: reward, task type, market competition, requester reputation, etc. In this paper, we adopt a data-driven approach to (A) perform a long-term analysis of a popular micro-task crowdsourcing platform and understand the evolution of its main actors (workers, requesters, tasks, and platform). (B) We leverage the main findings of our five-year log analysis to propose features used in a predictive model aiming at determining the expected performance of any batch at a specific point in time. We show that the number of tasks left in a batch and how recent the batch is are two key features of the prediction. (C) Finally, we conduct an analysis of the demand (new tasks posted by the requesters) and supply (number of tasks completed by the workforce) and show how they affect task prices on the marketplace.
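The paper's predictive model could, in principle, be sketched as a regression over the two key features it identifies: the number of tasks left in a batch and the batch's recency. The sketch below is an illustration, not the authors' actual pipeline; the feature names, the synthetic data-generating rule, and the choice of a random forest regressor (which does appear in the paper's references) are all assumptions.

```python
# Hypothetical sketch: predicting a batch's expected throughput (tasks/hour)
# from the two features the paper reports as most predictive.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500

# Synthetic training data (the generating rule below is an assumption):
tasks_left = rng.integers(1, 1000, size=n)    # tasks remaining in the batch
batch_age_h = rng.uniform(0, 72, size=n)      # hours since the batch was posted

# Assumed relationship: larger, fresher batches attract more work per hour.
throughput = 0.05 * tasks_left * np.exp(-batch_age_h / 24) + rng.normal(0, 1, n)

X = np.column_stack([tasks_left, batch_age_h])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, throughput)

# Predict expected tasks/hour for a fresh 500-task batch vs. a 2-day-old one.
fresh, stale = model.predict([[500, 1], [500, 48]])
```

Under the assumed decay, the model predicts a noticeably higher throughput for the fresh batch than for the stale one, matching the paper's finding that batch recency matters.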



Citations
Proceedings ArticleDOI

Demographics and Dynamics of Mechanical Turk Workers

TL;DR: An analysis of the population dynamics and demographics of Amazon Mechanical Turk workers based on the results of the survey, with more than 85K responses from 40K unique participants, indicates that there are more than 100K workers available in Amazon's crowdsourcing platform, the participation of the workers in the platform follows a heavy-tailed distribution, and at any given time there are over 2K active workers.
Proceedings ArticleDOI

A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk

TL;DR: In this paper, a task-level analysis revealed that workers earn a median hourly wage of only ~$2/h, and only 4% earned more than $7.25/h.
Journal ArticleDOI

Online Labour Index: Measuring the Online Gig Economy for Policy and Research

TL;DR: The Online Labour Index (OLI) as discussed by the authors measures the online gig economy for policy and research, with the data underlying the index made available in an open data repository.
Journal ArticleDOI

Conducting interactive experiments online

TL;DR: It is concluded that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.
Posted Content

A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk

TL;DR: The characteristics of tasks and working patterns that yield higher hourly wages are explored, and platform design and worker tools are informed to create a more positive future for crowd work.
References
Journal ArticleDOI

Random Forests

TL;DR: Internal estimates monitor error, strength, and correlation and these are used to show the response to increasing the number of features used in the forest, and are also applicable to regression.
Journal ArticleDOI

Designing games with a purpose

TL;DR: Data generated as a side effect of game play also solves computational problems and trains AI algorithms.
Proceedings ArticleDOI

The future of crowd work

TL;DR: This paper outlines a framework that will enable crowd work that is complex, collaborative, and sustainable, and lays out research challenges in twelve major areas: workflow, task assignment, hierarchy, real-time response, synchronous collaboration, quality control, crowds guiding AIs, AIs guiding crowds, platforms, job design, reputation, and motivation.
Posted Content

Analyzing the Amazon Mechanical Turk Marketplace

TL;DR: An associate professor at New York University's Stern School of Business uncovers answers about who the employers in paid crowdsourcing are, what tasks they post, and how much they pay.
Posted Content

The Future of Crowd Work

TL;DR: In this paper, the authors outline a framework that will enable crowd work that is complex, collaborative, and sustainable, and lay out research challenges in twelve major areas: workflow, task assignment, hierarchy, real-time response, synchronous collaboration, quality control, crowds guiding AIs, AIs guiding crowds, platforms, job design, reputation, and motivation.