Towards Fair and Equitable Incentives to Motivate Paid and Unpaid Crowd Contributions

Abstract

Researchers commonly rely on contributions from either unpaid volunteers or paid crowdworkers. Rarely are the motivations of these workers and the accuracy of their contributions studied simultaneously in the wild over time. We maintain a public system where anyone can edit an evolving tabular dataset of Computer Science faculty profiles useful for the field of CS. In this work, we analyze both the accuracy of contributions and the motivations of paid crowdworkers and unpaid contributors, combining data from real-world edit histories with a discrete choice experiment. The accuracy of edits made by unpaid contributors was 1.9 times higher than that of paid crowdworkers for difficult-to-find data and 1.5 times higher for data requiring domain-specific expertise. Our discrete choice experiment reveals that while both groups are motivated by common attributes of a contribution task (pay level, estimated completion time, interest, and the ability to help others), they make different trade-offs between these attributes when choosing crowd contribution tasks. We provide recommendations for building hybrid data systems that mix extrinsic and intrinsic motivators to attract highly accurate contributors, whether paid or unpaid.

Authors
Shaun Wallace
University of Rhode Island, Kingston, Rhode Island, United States
Talie Massachi
Brown University, Providence, Rhode Island, United States
Jiaqi Su
Brown University, Providence, Rhode Island, United States
Dave B. Miller
Tufts University, Medford, Massachusetts, United States
Jeff Huang
Brown University, Providence, Rhode Island, United States
DOI

10.1145/3706598.3714195

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3714195

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: Crowdsourcing and Tech in the Wild

Annex Hall F204
7 presentations
2025-04-29 23:10:00 – 2025-04-30 00:40:00