We present a Model Predictive Control (MPC) framework to simulate movement in interaction with computers, focusing on mid-air pointing as an example. Viewing interaction from an Optimal Feedback Control (OFC) perspective, we assume that users aim to minimize an internalized cost function, subject to the constraints imposed by the human body and the interactive system. Unlike previous approaches used in HCI, MPC can compute optimal controls for nonlinear systems. This allows the use of state-of-the-art biomechanical models and the handling of nonlinearities that occur in almost any interactive system. Instead of torque actuation, our model employs second-order muscles acting directly at the joints. We compare three different cost functions and evaluate the simulation against user movements in a pointing study. Our results show that a combination of distance, control, and joint-acceleration costs matches individual users' movements best, and predicts movements with an accuracy within the between-user variance. To aid HCI researchers and designers in applying our approach to different users, interaction techniques, or tasks, we make our SimMPC framework publicly available, including CFAT, a tool for identifying maximum voluntary torques in joint-actuated models, together with step-by-step instructions.
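To illustrate the receding-horizon principle underlying MPC, the following is a minimal, hypothetical sketch: a 1-D point-mass "pointing" task whose cost combines distance-to-target, control effort, and an acceleration-rate term, loosely mirroring the cost structure named in the abstract. The dynamics, weights, and horizon length are illustrative assumptions, not the paper's biomechanical model or SimMPC implementation.

```python
import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.05, 10                     # step size [s], horizon [steps] (assumed)
W_DIST, W_CTRL, W_ACC = 1.0, 1e-3, 1e-4    # cost weights (illustrative values)

def rollout(x0, v0, controls):
    """Simulate point-mass dynamics x'' = u over the planning horizon."""
    xs, x, v = [], x0, v0
    for u in controls:
        v += u * DT
        x += v * DT
        xs.append(x)
    return np.array(xs)

def cost(controls, x0, v0, target):
    """Distance + control-effort + control-rate cost over the horizon."""
    xs = rollout(x0, v0, controls)
    dist = np.sum((xs - target) ** 2)       # distance-to-target cost
    ctrl = np.sum(controls ** 2)            # control (effort) cost
    rate = np.sum(np.diff(controls) ** 2)   # penalizes abrupt acceleration changes
    return W_DIST * dist + W_CTRL * ctrl + W_ACC * rate

def mpc_step(x0, v0, target):
    """Solve the finite-horizon problem; apply only the first control."""
    res = minimize(cost, np.zeros(HORIZON), args=(x0, v0, target))
    return res.x[0]

# Closed loop: re-plan at every step from the current (simulated) state.
x, v, target = 0.0, 0.0, 1.0
for _ in range(40):
    u = mpc_step(x, v, target)
    v += u * DT
    x += v * DT
print(round(x, 2))  # cursor position after 2 s of simulated movement
```

Re-solving the optimization at every step and applying only the first control is what lets MPC handle nonlinear dynamics, in contrast to closed-form LQG-style OFC solutions.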
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)