A Fitts' Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces

Abstract

Gaze-Hand Alignment has recently been proposed for multimodal selection in 3D. The technique takes advantage of gaze for target pre-selection, as gaze naturally precedes manual input. Selection is then completed when manual input aligns with gaze on the target, without the need for an additional click method. In this work we evaluate two alignment techniques, Gaze&Finger and Gaze&Handray, which combine gaze with image plane pointing and raycasting respectively, against hands-only baselines and Gaze&Pinch as an established multimodal technique. We used a Fitts' Law study design with targets presented at different depths in the visual scene to assess the effect of parallax on performance. The alignment techniques outperformed their respective hands-only baselines. Gaze&Finger is efficient when targets are close to the image plane, but its performance degrades with increasing target depth due to parallax.
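
For reference, Fitts' Law studies of this kind typically model selection performance with the Shannon formulation, which relates movement time MT to target distance D and width W (the exact model fitted in the paper may differ):

MT = a + b \log_2\!\left(\frac{D}{W} + 1\right), \qquad ID = \log_2\!\left(\frac{D}{W} + 1\right)

where a and b are empirically fitted constants and ID is the index of difficulty in bits; throughput is commonly reported as (effective) ID divided by movement time.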

Authors
Uta Wagner
Aarhus University, Aarhus N, Denmark
Mathias N. Lystbæk
Aarhus University, Aarhus, Denmark
Pavel Manakhov
Aarhus University, Aarhus, Denmark
Jens Emil Sloth Grønbæk
Aarhus University, Aarhus, Denmark
Ken Pfeuffer
Aarhus University, Aarhus, Denmark
Hans Gellersen
Lancaster University, Lancaster, United Kingdom
Paper URL

https://doi.org/10.1145/3544548.3581423

Conference: CHI 2023

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)

Session: Eye Gaze and New Body

Hall E
6 presentations
2023-04-25 23:30:00 to 2023-04-26 00:55:00