Dark environments make it difficult for low-vision (LV) individuals to engage in running by following a sighted guide (Caller-style guided running): insufficient illumination prevents them from using their residual vision to follow the guide and stay aware of their surroundings. We design, develop, and evaluate RunSight, an augmented reality (AR)-based assistive tool that supports LV individuals in running at night. RunSight combines a see-through head-mounted display (HMD) with image processing to enhance the wearer's visual awareness of the surrounding environment (e.g., potential hazards) and to visualize the guide's position through AR-based overlays. To demonstrate RunSight's efficacy, we conducted a user study with 8 LV runners. The results showed that all participants could run at least 1 km (mean = 3.44 km) using RunSight, while none could engage in Caller-style guided running without it. Participants ran safely because they effectively synthesized the cues provided by RunSight with information gained from runner-guide communication.
https://dl.acm.org/doi/10.1145/3706598.3714284
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)