
Gianluca Demartini

Researcher at University of Queensland

Publications -  188
Citations -  3883

Gianluca Demartini is an academic researcher from the University of Queensland. The author has contributed to research on topics including Crowdsourcing and Computer Science. The author has an h-index of 27 and has co-authored 156 publications receiving 3169 citations. Previous affiliations of Gianluca Demartini include the University of California, Berkeley and Leibniz University of Hanover.

Papers
Proceedings Article

Platform-Related Factors in Repeatability and Reproducibility of Crowdsourcing Tasks

TL;DR: This paper identifies key task design variables that cause variations and proposes an experimentally validated set of actions to counteract these effects, thus achieving reliable and repeatable crowdsourced data collection experiments.
Proceedings ArticleDOI

Understanding Engagement through Search Behaviour

TL;DR: This paper investigates the potential to predict how users perceive engagement with search by modelling behavioural signals from log files using supervised learning methods, and shows that time- and query-related features are best suited for predicting user-perceived engagement.
Proceedings ArticleDOI

Crowdsourced Fact-Checking at Twitter: How Does the Crowd Compare With Experts?

TL;DR: This work studies the first large-scale effort of crowdsourced fact-checking deployed in practice, started by Twitter with the Birdwatch program, and shows that crowdsourcing may be an effective fact-checking strategy in some settings, but does not lead to consistent, actionable results in others.
Proceedings ArticleDOI

Health Cards for Consumer Health Search

TL;DR: This is the first study to thoroughly investigate the effectiveness of health cards in supporting consumer health search, revealing how and when health cards are beneficial to users in completing consumer health search tasks.
Journal ArticleDOI

On the State of Reporting in Crowdsourcing Experiments and a Checklist to Aid Current Practices

TL;DR: In this article, the authors examine the current state of reporting of crowdsourcing experiments and offer guidance to address associated reporting issues, such as sources of variability that can affect an experiment's reproducibility and prevent a fair assessment of research outcomes.