Mixed Reality (MR) interfaces have traditionally relied on visual modalities, but this poses challenges in high-stakes or cognitively demanding contexts where continuous visual attention is impractical. This exploratory study investigates audio-centric interaction in MR, focusing specifically on 3D spatial navigation. Unlike prior work limited to 2D navigation, we developed a custom augmented reality (AR) navigation system to compare \textit{audio AR} (AAR) with (1) visual AR, (2) combined audio-visual AR, and (3) traditional navigation without aids. Results show that AAR enables users to maintain environmental awareness comparable to that of users navigating without AR assistance, while showing no meaningful difference in navigation performance relative to visual AR in complex 3D environments. Qualitative reports and awareness tests suggest that AAR reduces visual tunneling. These exploratory findings point to potential applications of AAR in adjacent domains, such as accessibility for blind or visually impaired individuals and training resources for first responders, and offer insights into audio interface design for safety-critical applications.
ACM CHI Conference on Human Factors in Computing Systems