Head-mounted displays let users explore virtual environments through a viewport that is coupled with head movement. In this work, we investigate gaze as an alternative modality for viewport control, enabling exploration of virtual worlds with less head movement. We designed three techniques that leverage gaze through different eye movements: Dwell Snap for viewport rotation in discrete steps, Gaze Gain for amplified viewport rotation based on gaze angle, and Gaze Pursuit for central viewport alignment of gaze targets. All three techniques enable 360-degree viewport control through naturally coordinated eye and head movement. We evaluated the techniques against controller snap and head amplification baselines, for both coarse and precise viewport control, and found them to be comparably fast and accurate. We observed high variance in performance, which may be attributable to the different degrees to which individuals support gaze shifts with head movement.
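To illustrate the amplification idea behind Gaze Gain, the sketch below maps the gaze's angular offset from the head's forward direction onto additional viewport rotation. The function name, the linear gain, and the clamping range are assumptions for this example, not the paper's implementation.

```python
def gaze_gain_viewport_yaw(head_yaw_deg: float,
                           gaze_yaw_deg: float,
                           gain: float = 2.0,
                           max_offset_deg: float = 30.0) -> float:
    """Minimal Gaze Gain sketch (parameter names and values assumed).

    The viewport follows the head, and the gaze's offset from the head's
    forward direction is amplified by a constant gain, so smaller head
    movements suffice to rotate the viewport across 360 degrees.
    """
    # Gaze eccentricity relative to the current head orientation.
    offset = gaze_yaw_deg - head_yaw_deg
    # Clamp to a plausible eye-in-head range before amplifying.
    offset = max(-max_offset_deg, min(max_offset_deg, offset))
    # Amplified viewport rotation.
    return head_yaw_deg + gain * offset


# Example: head at 10 degrees, gaze 15 degrees to the right of the head,
# gives a viewport yaw of 40 degrees with a gain of 2.
print(gaze_gain_viewport_yaw(10.0, 25.0))  # 40.0
```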
https://doi.org/10.1145/3613904.3642838
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)