When users reach their arms to different locations in physical space, they often adapt how they move (i.e., the kinematic properties of their reaches) depending on: (1) the direction in which they move, (2) the hand they use, and (3) the side of the body on which the movement occurs. However, it is not yet clear whether and how these three properties of reaching tasks interact to influence users’ behavior when they reach toward objects in VR. To address this question, we had users perform virtual hand reaches in five different directions, on both sides of their bodies, using both their dominant and non-dominant hands. The results revealed that users adapted their virtual hand reaching movements in response to changes in all three properties. These findings provide practitioners with insights into how to measure and interpret users’ movements, with applications in emerging contexts such as detecting VR usability issues and using VR for stroke rehabilitation.
https://doi.org/10.1145/3544548.3581191
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)