We propose a novel interface concept in which interactive systems directly manipulate the user's head orientation. We implement this using electrical muscle stimulation (EMS) of the neck muscles, which turns the head around its yaw (left/right) and pitch (up/down) axes. As the first exploration of EMS for head actuation, we characterized which neck muscles can be robustly actuated. Second, we evaluated the accuracy of our system in actuating participants' head orientation toward static targets and along trajectories. Third, we demonstrated how it enables interactions that were not previously possible by building a range of applications, such as (1) synchronizing the head orientations of two users, which lets one user communicate head nods to another while listening to music, and (2) directly changing the user's head orientation to locate objects in AR. Finally, in our second study, participants felt that our head actuation contributed positively to their experience across four distinct applications.
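To make the control idea concrete, here is a minimal sketch of how a system might map the error between the current and target head orientation onto EMS channels for the yaw and pitch axes. This is not the paper's implementation: the proportional gain, deadband, channel names (neck_left, neck_right, neck_up, neck_down), and the EMSDriver interface are all hypothetical placeholders for a calibrated stimulator.

```python
"""Hypothetical closed-loop head-actuation sketch (not the authors' code)."""

from dataclasses import dataclass


@dataclass
class Orientation:
    yaw: float    # degrees, positive = turned right
    pitch: float  # degrees, positive = tilted up


class EMSDriver:
    """Stand-in for a real EMS stimulator interface (assumed API)."""

    def pulse(self, channel: str, intensity: float) -> None:
        # A real driver would deliver a calibrated stimulation pulse here.
        print(f"stimulate {channel} at intensity {intensity:.2f}")


def actuation_step(current: Orientation, target: Orientation,
                   ems: EMSDriver, gain: float = 0.05,
                   deadband: float = 2.0, max_intensity: float = 1.0) -> None:
    """One proportional-control step toward the target orientation.

    Errors inside the deadband are ignored so the head settles rather than
    oscillating; intensities are clamped to a calibrated safe maximum.
    """
    yaw_err = target.yaw - current.yaw
    pitch_err = target.pitch - current.pitch

    if abs(yaw_err) > deadband:
        channel = "neck_right" if yaw_err > 0 else "neck_left"
        ems.pulse(channel, min(max_intensity, gain * abs(yaw_err)))

    if abs(pitch_err) > deadband:
        channel = "neck_up" if pitch_err > 0 else "neck_down"
        ems.pulse(channel, min(max_intensity, gain * abs(pitch_err)))


if __name__ == "__main__":
    # Example: head currently 30 degrees left of an AR target.
    actuation_step(Orientation(yaw=-30.0, pitch=5.0),
                   Orientation(yaw=0.0, pitch=0.0),
                   EMSDriver())
```

The same loop could drive either application style from the abstract: the target orientation would come from another user's head tracker (synchronizing head orientations) or from the bearing of an object in AR.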
https://dl.acm.org/doi/abs/10.1145/3491102.3501910
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2022.acm.org/)