Writing with AI Can Reduce Gender Bias in Hiring Evaluations

Abstract

Women remain underrepresented in the workplace, partly due to stereotypes associating competence traits with men rather than women. Efforts to change such stereotypes often yield mixed results. As language models become integrated into daily life, AI writing assistants offer an opportunity to reshape gendered associations. In a preregistered experiment (N=672), participants evaluated résumés for a female ("Jennifer") and a male ("John") candidate applying to a financial analyst role. They wrote evaluations using AI-generated suggestions in one of three conditions: suggestions for Jennifer integrated stereotypically male, female, or neutral traits. Suggestions for John remained neutral. Participants exposed to male-trait suggestions evaluated Jennifer as more competent, selected her as the leader, and offered her higher salaries. However, we also observed signs of backlash: participants were less willing to work with Jennifer when she was portrayed as competent. We discuss implications for designing AI writing assistants to mitigate gender bias in hiring contexts.

Award
Best Paper
Authors
Alicia T.H. Liu
University of Chicago, Chicago, Illinois, United States
Mina Lee
University of Chicago, Chicago, Illinois, United States
Xuechunzi Bai
University of Chicago, Chicago, Illinois, United States

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Gendered Experiences

P1 - Room 114
7 presentations
2026-04-13, 20:15–21:45