Virtual hand selection techniques in AR/VR face a persistent challenge due to the inherent speed–accuracy trade-off. Although target prediction offers a promising direction, its practical adoption is limited by the inevitable errors of predictive models. We present Motion-Touch, a selection technique that integrates a Kinematics-Based Adaptive Switch (KBAS) with deep-learning-based target prediction. KBAS adaptively switches, under distinct kinematic conditions, between the two phases of the pointing process: an untriggerable ballistic phase and a corrective phase in which only the AI-predicted target can be triggered through touch. We collected a hand kinematics dataset from 20 participants to support model training and mechanism calibration. Compared to baseline techniques, Motion-Touch achieves selection times statistically comparable to those of the fastest reliable controller-based technique, while offering controller-free, error-free selection with minimal trigger effort. Our findings demonstrate that Motion-Touch achieves a near-optimal compromise in the speed–accuracy trade-off for virtual hand selection.
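To make the phase-switching idea concrete, the following is a minimal sketch of a KBAS-style gate. The velocity thresholds, hysteresis values, and function names here are illustrative assumptions, not the paper's calibrated kinematic conditions or its deep-learned predictor:

```python
# Hypothetical sketch of a kinematics-based adaptive switch (KBAS-style logic).
# Thresholds below are illustrative assumptions, not calibrated parameters.

def classify_phase(speed: float, prev_phase: str,
                   enter_corrective: float = 0.15,
                   exit_corrective: float = 0.25) -> str:
    """Return 'ballistic' or 'corrective' from instantaneous hand speed (m/s).

    Two thresholds (hysteresis) prevent rapid flip-flopping when the
    speed hovers near the phase boundary.
    """
    if prev_phase == "ballistic":
        return "corrective" if speed < enter_corrective else "ballistic"
    return "ballistic" if speed > exit_corrective else "corrective"


def can_trigger(phase: str, touched_target: str, predicted_target: str) -> bool:
    """A touch fires only in the corrective phase, and only on the
    target that the (assumed external) predictor has selected."""
    return phase == "corrective" and touched_target == predicted_target
```

In this sketch, the ballistic phase is untriggerable by construction, so high-speed transit over distractors cannot cause accidental selections, while the corrective phase restricts triggering to the predicted target, mirroring the paper's error-free selection property.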
ACM CHI Conference on Human Factors in Computing Systems