Navigation systems for runners commonly provide turn-by-turn directions via voice and/or map-based visualizations. Voice directions demand constant attention, while map-based guidance requires regular consultation; both disrupt the running activity. To address this, we designed RunAhead, a navigation system that uses head scanning to query for navigation feedback, and we explored its suitability for runners in an outdoor experiment. In our design, the runner receives simple, intuitive navigation feedback on the path they are looking at through three different feedback modes: haptic, music, and audio cues. In our experiment, we compare the resulting three versions of RunAhead with a baseline voice-based navigation system. We find that demand and error are equivalent across all four conditions. However, the head-scanning-based haptic and music conditions are preferred over the baseline, and these preferences are influenced by runners' habits. With this study, we contribute insights for designing navigation support for runners.
https://doi.org/10.1145/3313831.3376828
ACM CHI Conference on Human Factors in Computing Systems (CHI 2020): https://chi2020.acm.org/