Facial expression interactions play a crucial role in fostering social bonds and expressing emotions. However, in the dynamic, fast-paced, and noisy environments of parties, various factors hinder blind and low-vision (BLV) individuals from engaging fully in facial expression interactions. While previous research has explored how BLV users can convey emotions through non-verbal visual cues, it has largely overlooked the challenges they face in engaging with facial expressions after perceiving these cues. To address this gap, we conducted a formative study with 10 BLV users to identify their challenges and expectations regarding facial expression interactions at parties. Guided by these insights, we developed EmojiFan, an AI-powered smart fan that offers a personalized representation of facial expressions through dynamic, expressive emojis. Finally, we carried out a field study with 6 BLV participants and 8 sighted social partners to examine the effectiveness of EmojiFan in enhancing facial expression interactions during parties. Overall, our goal is to empower BLV individuals to participate actively in social interactions through digital facial expressions, thereby contributing new insights to the accessibility community on designing expressive, socially responsive assistive technologies.
ACM CHI Conference on Human Factors in Computing Systems