RunAhead: Exploring Head Scanning based Navigation for Runners

Abstract

Navigation systems for runners commonly provide turn-by-turn directions via voice and/or map-based visualizations. Voice directions demand constant attention, while map-based guidance requires regular consultation; both disrupt the running activity. To address this, we designed RunAhead, a navigation system that uses head scanning to query for navigation feedback, and we explored its suitability for runners in an outdoor experiment. In our design, the runner receives simple and intuitive navigation feedback on the path they are looking at through three different feedback modes: haptic, music, and audio cues. In our experiment, we compare the resulting three versions of RunAhead with a baseline voice-based navigation system. We find that demand and error are equivalent across all four conditions. However, the head-scanning-based haptic and music conditions are preferred over the baseline, and these preferences are influenced by runners' habits. With this study we contribute insights for designing navigation support for runners.
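As a rough illustration of the head-scanning interaction described in the abstract (not the authors' implementation), the sketch below shows how a head-orientation reading could be matched against the bearing of the correct path to decide whether to trigger a positive or negative cue. All names and thresholds here (e.g. `angular_difference`, `TOLERANCE_DEG`) are hypothetical assumptions for illustration only.

```python
# Hypothetical sketch: map the path the runner is looking at to a feedback cue.
# Assumes head yaw and path bearings are compass degrees in [0, 360).

TOLERANCE_DEG = 25  # assumed angular tolerance for "looking at" a path


def angular_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two compass bearings."""
    diff = abs(a - b) % 360
    return min(diff, 360 - diff)


def feedback_for_gaze(head_yaw: float, correct_path_bearing: float) -> str:
    """Return a feedback cue for the direction the runner is currently scanning."""
    if angular_difference(head_yaw, correct_path_bearing) <= TOLERANCE_DEG:
        return "positive"   # e.g. a confirming vibration or undisturbed music
    return "negative"       # e.g. a distinct vibration pattern or attenuated music

# Example: the correct path lies at 90 degrees (east); the runner scans three directions.
for yaw in (85, 180, 270):
    print(yaw, feedback_for_gaze(yaw, 90.0))
```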

Keywords
Navigation for Running
Head Scanning
Audio Feedback
Haptic Feedback
Authors
Danilo Gallo
Naver Labs Europe, Grenoble, France
Shreepriya Shreepriya
Naver Labs Europe, Grenoble, France
Jutta Willamowski
Naver Labs Europe, Grenoble, France
DOI

10.1145/3313831.3376828

Paper URL

https://doi.org/10.1145/3313831.3376828

Conference: CHI 2020

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2020.acm.org/)

Session: Use your head & run

Paper session
314 LANA'I
5 presentations
2020-04-30 01:00:00
2020-04-30 02:15:00