Researchers commonly rely on contributions from either unpaid contributors or paid crowdworkers. Rarely are the motivations of these workers and the accuracy of their contributions studied simultaneously, in the wild, and over time. We maintain a public system where anyone can edit an evolving tabular dataset of Computer Science faculty profiles that is useful to the CS community. In this work, we analyze both the accuracy of contributions and the motivations of paid crowdworkers and unpaid contributors, combining data from real-world edit histories with a discrete choice experiment. The accuracy of edits made by unpaid contributors was 1.9 times higher than that of paid crowdworkers for difficult-to-find data and 1.5 times higher for data requiring domain-specific expertise. Our discrete choice experiment reveals that while both groups are motivated by common attributes of a contribution task (pay level, estimated completion time, interest, and the ability to help others), they make different trade-offs among these attributes when choosing crowd contribution tasks. We provide recommendations for building hybrid data systems that mix extrinsic and intrinsic motivators to attract highly accurate contributors, whether paid or unpaid.
https://dl.acm.org/doi/10.1145/3706598.3714195
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)