In this paper, we introduce RayHand navigation, a novel Virtual Reality (VR) navigation method that combines a gaze ray and the hand. The user quickly indicates an initial travel direction with gaze and then adjusts navigation speed and direction through dexterous hand movements, based on the relative position between the gaze ray and the hand. We conducted a user study comparing our approach with head-hand and torso-leaning navigation methods, and also evaluated their learning effects. The results showed that RayHand and head-hand navigation were less physically demanding than torso-leaning navigation, and that RayHand provided a rich navigation experience with high hedonic quality while avoiding the issue of users unintentionally stepping out of the designated interaction area. In addition, our approach improved significantly over time, showing a learning effect.
https://doi.org/10.1145/3613904.3642147
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)
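The abstract describes mapping the hand's position relative to the gaze ray to navigation speed and direction. Below is a minimal, hypothetical sketch of one such mapping, not the authors' exact formulation: it assumes the gaze ray is given as an origin and unit direction, the hand is tracked in the same world frame, and the along-ray and lateral components of the hand offset drive speed and steering, respectively. The function name, gains, and dead zone are illustrative.

```python
import numpy as np

def rayhand_velocity(gaze_origin, gaze_dir, hand_pos,
                     speed_gain=2.0, steer_gain=1.0, dead_zone=0.05):
    """Map the hand's position relative to the gaze ray to a velocity vector.

    gaze_origin, hand_pos: 3D points in the world frame; gaze_dir: 3D direction.
    The hand offset's component along the ray sets the speed; its component
    perpendicular to the ray bends the travel direction.
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    offset = hand_pos - gaze_origin

    # Distance of the hand along the gaze ray -> forward speed.
    along = float(np.dot(offset, gaze_dir))

    # Offset perpendicular to the ray -> steering adjustment.
    lateral = offset - along * gaze_dir

    # Small dead zone so the user can stand still.
    if abs(along) < dead_zone:
        return np.zeros(3)

    direction = gaze_dir + steer_gain * lateral
    direction = direction / np.linalg.norm(direction)
    return speed_gain * along * direction


# Example: gaze straight ahead, hand slightly forward and to the right,
# yielding forward motion with a rightward steering component.
v = rayhand_velocity(np.array([0.0, 1.6, 0.0]),
                     np.array([0.0, 0.0, 1.0]),
                     np.array([0.2, 1.4, 0.4]))
print(v)
```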