Wearable robotic arms (WRAs) open up a unique interaction space that closely integrates the user's body with an embodied robotic collaborator. This space affords diverse interaction styles, including body movement, hand gestures, and gaze. Yet, which commands are desirable from a user perspective remains unexplored. Contributing findings from an elicitation study (N=14), we provide a comprehensive set of interactions for basic robot control, navigation, object manipulation, and emergency situations, performed when hands are free or occupied. Our study provides insights into preferred body parts, input modalities, and the users' underlying sources of inspiration. Comparing interaction styles between WRAs and off-body robots, we highlight how WRAs enable a range of interactions specific to on-body robots and how users employ WRAs both as tools and as collaborators. We conclude by providing guidance on the design of ad-hoc interaction with WRAs, informed by user behavior.
https://doi.org/10.1145/3544548.3581184
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)