X's Community Notes is a crowdsourced fact-checking system. To improve its scalability, X introduced the ``Request Community Note'' feature, which lets users solicit fact-checks from contributors on specific posts. Yet its implications for the system---what gets checked, by whom, and with what quality---remain unclear. Using 98,685 requested posts and their associated notes, we evaluate how requests shape the Community Notes system. We find that requested posts with higher GPT-estimated misleadingness, and those from authors with greater misinformation exposure, are more likely to receive notes. Conversely, requested political posts are less likely to receive notes than non-political ones. We also observe partisan asymmetries: posts from Republicans are more likely to receive notes than posts from Democrats. Although only 12% of requested posts receive request-fostered notes from top contributors, these notes are rated as more helpful and less polarized than others, partly reflecting top contributors' selective fact-checking of misleading posts. Our findings highlight both the limitations and the promise of requests for scaling high-quality community-based fact-checking.
ACM CHI Conference on Human Factors in Computing Systems