Contemporary research in Virtual Reality (VR) for users who are visually impaired often employs navigation and interaction modalities that are non-conventional, constrained by physical space, or both. We designed and evaluated a hapto-acoustic VR system that addresses these limitations by enabling non-visual exploration of large virtual environments through white cane simulation and walk-in-place locomotion. The system features a complex urban cityscape and incorporates a physical cane prototype coupled with a virtual cane for rendering surface textures, as well as an omnidirectional slide mill for navigation. In addition, spatialized audio is rendered based on the propagation of sound through the geometry surrounding the user. A study with twenty sighted participants, blindfolded to simulate total blindness, evaluated the system through three formative tasks. Nineteen of the twenty participants successfully completed all tasks while navigating the environment effectively. This work highlights the potential for accessible non-visual VR experiences that require minimal training and little prior VR exposure.
https://dl.acm.org/doi/10.1145/3706598.3713400
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)