The Effect of Gender De-biased Recommendations – A User Study on Gender-specific Preferences

Abstract

Recommender systems inherently treat users differently. Sometimes, however, personalization turns into discrimination. Gender bias occurs when a system treats users differently based on gender. While most research discusses measures of and countermeasures against gender bias, one recent study explored whether users enjoy gender de-biased recommendations. However, its methodology has significant shortcomings: it fails to validate its de-biasing method appropriately, and it compares biased and unbiased models that differ in key properties. We reproduce the study in a 2x2 between-subjects design with n=800 participants. Moreover, we examine the authors' hypothesis that educating users about gender bias improves their attitude towards de-biasing. We find that the genders perceive de-biasing differently. Female users, the majority group, rate biased recommendations significantly higher, while male users, the minority group, indicate no preference. Educating users about gender bias increased acceptance, though not significantly. We consider our contribution vital towards understanding how gender de-biasing affects different user groups.

Authors
Thorsten Krause
German Research Center for Artificial Intelligence, Osnabrück, Niedersachsen, Germany
Lorena Göritz
German Research Center for Artificial Intelligence, Osnabrück, Germany
Robin Gratz
German Research Center for Artificial Intelligence, Osnabrück, Germany
DOI

10.1145/3706598.3713155

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713155

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: Stereotypes and Gender

Room: G402
7 presentations
2025-04-29 01:20:00 – 2025-04-29 02:50:00