Assistive technologies (ATs) have the potential to empower blind and low vision (BLV) people. Yet, they often remain underutilised because they are not mobile and apply only to a narrow range of scenarios. This paper presents LifeInsight, an AI-powered assistive wearable for BLV people that combines a wearable camera, a microphone, and a single-click interface for goal-oriented visual querying. To inform the design of LifeInsight, we first collected a corpus of BLV people’s daily experiences using video probes and interviews: ten BLV people recorded their daily experiences over one week using GoPro cameras, providing empirical insights. Based on these insights, we report on LifeInsight and its evaluation with 13 BLV people across six scenarios. LifeInsight responded effectively to visual queries, such as distinguishing between jars or identifying whether a candle was lit. Drawing on our work, we conclude with key lessons and practical recommendations to guide future research and advance the development and evaluation of AI-powered assistive wearables.
https://dl.acm.org/doi/10.1145/3706598.3713486
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)