Topic
Crowdsourcing
About: Crowdsourcing is a research topic. Over its lifetime, 12,889 publications have been published on this topic, receiving 230,638 citations.
Papers published on a yearly basis
Papers
01 Dec 2011
TL;DR: An evaluation framework for spoken dialogue systems using AMT users is described, and the obtained results suggest that the use of crowdsourcing technology is feasible and can provide reliable results.
Abstract: This paper describes a framework for evaluation of spoken dialogue systems. Typically, evaluation of dialogue systems is performed in a controlled test environment with carefully selected and instructed users. However, this approach is very demanding. An alternative is to recruit a large group of users who evaluate the dialogue systems in a remote setting under virtually no supervision. Crowdsourcing technology, for example Amazon Mechanical Turk (AMT), provides an efficient way of recruiting subjects. This paper describes an evaluation framework for spoken dialogue systems using AMT users and compares the obtained results with a recent trial in which the systems were tested by locally recruited users. The results suggest that the use of crowdsourcing technology is feasible and it can provide reliable results. Copyright © 2011 ISCA.
94 citations
AT&T
TL;DR: In this paper, a system performing word pronunciation crowdsourcing identifies spoken words, or word pronunciations in a dictionary of words, for review by a turker, who provides feedback on the correctness/incorrectness of the machine made pronunciation.
Abstract: Disclosed herein are systems, methods, and computer-readable storage media for crowdsourcing verification of word pronunciations. A system performing word pronunciation crowdsourcing identifies spoken words, or word pronunciations in a dictionary of words, for review by a turker. The identified words are assigned to one or more turkers for review. Assigned turkers listen to the word pronunciations, providing feedback on the correctness/incorrectness of the machine made pronunciation. The feedback can then be used to modify the lexicon, or can be stored for use in configuring future lexicons.
93 citations
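The verification workflow in the patent entry above can be sketched as a simple majority-vote aggregation over turker feedback. The function name, vote format, and thresholds below are illustrative assumptions, not details taken from the patent.

```python
def aggregate_pronunciation_feedback(votes, min_votes=3, threshold=0.5):
    """Aggregate turker correct/incorrect judgments per word.

    votes: dict mapping each word to a list of bool judgments
    (True = pronunciation judged correct). Returns the set of words
    whose machine-made pronunciation a majority judged incorrect,
    i.e. candidates for lexicon revision.
    """
    flagged = set()
    for word, judgments in votes.items():
        if len(judgments) < min_votes:
            continue  # too few reviews to trust an aggregate yet
        incorrect = sum(1 for ok in judgments if not ok)
        if incorrect / len(judgments) > threshold:
            flagged.add(word)
    return flagged

feedback = {
    "tomato": [True, True, False],
    "nuclear": [False, False, True],
    "cache": [True, True, True],
}
print(aggregate_pronunciation_feedback(feedback))  # {'nuclear'}
```

The flagged set would feed the lexicon-modification step the abstract mentions; the unflagged feedback could still be stored for configuring future lexicons.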
21 May 2011
TL;DR: StakeSource2.0 is described, a web-based tool that uses social networks and collaborative filtering, a "crowdsourcing" approach, to identify and prioritise stakeholders and their requirements.
Abstract: Software projects typically rely on system analysts to conduct requirements elicitation, an approach potentially costly for large projects with many stakeholders and requirements. This paper describes StakeSource2.0, a web-based tool that uses social networks and collaborative filtering, a "crowdsourcing" approach, to identify and prioritise stakeholders and their requirements.
93 citations
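One way to read the crowdsourced prioritisation idea above: stakeholders nominate one another, and nomination counts yield a ranking. The sketch below is an illustrative stand-in under that assumption, not StakeSource2.0's actual algorithm.

```python
from collections import Counter

def prioritise_stakeholders(recommendations):
    """recommendations: (recommender, recommended) pairs gathered from
    the crowd. Ranks stakeholders by how often others nominate them
    (their in-degree in the recommendation network), highest first."""
    counts = Counter(recommended for _, recommended in recommendations)
    return [stakeholder for stakeholder, _ in counts.most_common()]

recs = [("ann", "bob"), ("carol", "bob"), ("bob", "carol"), ("dave", "ann")]
print(prioritise_stakeholders(recs))  # ['bob', 'carol', 'ann']
```

This replaces the analyst-driven elicitation step with aggregation over many cheap, independent recommendations, which is the cost argument the abstract makes.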
05 Oct 2014
TL;DR: An approach is developed for decomposing programming work into microtasks, which might increase participation in open source software development by lowering the barriers to contribution and dramatically decrease time to market by increasing the parallelism in development work.
Abstract: Microtask crowdsourcing organizes complex work into workflows, decomposing large tasks into small, relatively independent microtasks. Applied to software development, this model might increase participation in open source software development by lowering the barriers to contribution and dramatically decrease time to market by increasing the parallelism in development work. To explore this idea, we have developed an approach to decomposing programming work into microtasks. Work is coordinated through tracking changes to a graph of artifacts, generating appropriate microtasks and propagating change notifications to artifacts with dependencies. We have implemented our approach in CrowdCode, a cloud IDE for crowd development. To evaluate the feasibility of microtask programming, we performed a small study and found that a small crowd of 12 workers was able to successfully write 480 lines of code and 61 unit tests in 14.25 person-hours of time.
93 citations
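The coordination mechanism the abstract above describes, a graph of artifacts whose changes spawn microtasks for dependent artifacts, can be sketched roughly as follows. The class and artifact names are hypothetical illustrations, not taken from CrowdCode.

```python
class ArtifactGraph:
    """Track dependencies between artifacts; a change to one artifact
    generates review microtasks for everything that (transitively)
    depends on it."""

    def __init__(self):
        self.dependents = {}  # artifact -> set of artifacts that depend on it

    def add_dependency(self, artifact, depends_on):
        self.dependents.setdefault(depends_on, set()).add(artifact)

    def on_change(self, changed):
        # Walk the dependency graph outward from the changed artifact,
        # emitting one microtask per affected dependent.
        tasks, seen, stack = [], set(), [changed]
        while stack:
            current = stack.pop()
            for dep in self.dependents.get(current, ()):
                if dep not in seen:  # visit each dependent only once
                    seen.add(dep)
                    tasks.append(f"review {dep} after change to {changed}")
                    stack.append(dep)
        return tasks

# Hypothetical artifacts: a function, its unit test, and a caller.
g = ArtifactGraph()
g.add_dependency("test_parse", "parse")
g.add_dependency("run_pipeline", "parse")
print(sorted(g.on_change("parse")))
```

Because each generated microtask touches one artifact, many workers can pick them up in parallel, which is the source of the parallelism claim in the abstract.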
TL;DR: This article strives to gain a better understanding of what crowdsourcing systems are and what typical design aspects are considered in the development of such systems.
Abstract: Many organizations are now starting to introduce crowdsourcing as a new business model to outsource tasks, which are traditionally performed by a small group of people, to an undefined, large workforce. While the utilization of crowdsourcing offers many advantages, the development of the required system carries some risks, which are reduced by establishing a profound theoretical foundation. Thus, this article strives to gain a better understanding of what crowdsourcing systems are and what typical design aspects are considered in the development of such systems. In this paper, the author conducted a systematic literature review in the domain of crowdsourcing systems. As a result, 17 definitions of crowdsourcing systems were found and categorized into four perspectives: the organizational, the technical, the functional, and the human-centric. In the second part of the results, the author derived and presented components and functions that are implemented in a crowdsourcing system.
93 citations