E-commerce platforms increasingly deploy explainability features to address concerns about algorithmic opacity. However, most XAI research has focused on younger, tech-savvy users, leaving open questions about how older adults engage with these features in everyday shopping. To address this gap, we conducted a qualitative study with 20 older adults aged 60+ who regularly use NAVER Shopping, one of South Korea's largest e-commerce platforms, examining their engagement with global (system-level) explanations, local (item-level) explanations, and a user-model dashboard. Our findings reveal that explainability does not operate uniformly across this population. Many participants did not notice the explanation features during routine use or mistook them for advertisements. After guided interaction, global explanations elicited polarized responses: some participants deferred uncritically to algorithmic authority, whereas others dismissed the explanations as sophisticated marketing rhetoric. In contrast, local explanations grounded in users' own behavior helped recalibrate skepticism, while the user-model dashboard exposed tensions between empowerment and surveillance. Based on these findings, we propose actionable design strategies for building inclusive and adaptive XAI systems for older adults.
ACM CHI Conference on Human Factors in Computing Systems