Companies across a wide range of industries are integrating new hiring technologies, including AI-powered and other automated employment decision systems, into various stages of the hiring process. Although proponents argue that these technologies can help identify suitable candidates and reduce bias, researchers and advocates have identified ethical and legal risks, including discriminatory impacts on members of marginalized groups. This work examines the impacts of "digitized assessments," commonly used by employers, on disabled workers in the U.S. We used a qualitative, human-centered design approach to examine the experiences of disabled workers who were asked to complete simulated digitized assessments. Participants indicated that the assessments (1) were discriminatory and perpetuated biases throughout; (2) presented accessibility barriers; (3) caused emotionally taxing experiences; and (4) contributed to exclusion. The findings aim to inform employers, policymakers, advocates, and researchers and to suggest steps toward more effective and accessible digitized assessments.
ACM CHI Conference on Human Factors in Computing Systems