The Effectiveness of Visual and Audio Wayfinding Guidance on Smartglasses for People with Low Vision

Abstract

Wayfinding is a critical but challenging task for people who have low vision, a visual impairment that falls short of blindness. Prior wayfinding systems for people with visual impairments focused on blind people, providing only audio and tactile feedback. Since people with low vision use their remaining vision, we sought to determine how audio feedback compares to visual feedback in a wayfinding task. We developed visual and audio wayfinding guidance on smartglasses based on de facto standard approaches for blind and sighted people, and conducted a study with 16 low vision participants. We found that participants made fewer mistakes and experienced lower cognitive load with visual feedback. Moreover, participants with a full field of view completed the wayfinding tasks faster when using visual feedback. However, many participants preferred audio feedback because of its shorter learning curve. We propose design guidelines for wayfinding systems for people with low vision.

Keywords
Accessibility
augmented reality
low vision
visual feedback
audio feedback
wayfinding
Authors
Yuhang Zhao
Cornell University, New York, NY, USA
Elizabeth Kupferstein
Cornell University, New York, NY, USA
Hathaitorn Rojnirun
Cornell University, New York, NY, USA
Leah Findlater
University of Washington, Seattle, WA, USA
Shiri Azenkot
Cornell University, New York, NY, USA
DOI

10.1145/3313831.3376516

Paper URL

https://doi.org/10.1145/3313831.3376516

Conference: CHI 2020

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2020.acm.org/)

Session: Interactive descriptions & wayfinding

Paper session
316B MAUI
5 presentations
2020-04-29 23:00:00 – 2020-04-30 00:15:00