Olfactory experiences are increasingly in demand due to their immersive benefits. However, most interaction implementations are passive and rely on conventions established for other modalities. In this work, we investigated proactive olfactory interactions, where users actively engage with scents, focusing on mid-air gestures as an input modality that mimics real-world object and scent manipulation, e.g., fanning away an odor. In a first study, participants developed a user-defined gesture set for interacting with scents in Virtual Reality (VR), covering various object types (solid, liquid, gas) and interaction modes (out-of-reach, not graspable, graspable). In a second study, participants compared interacting with scents in VR using traditional controllers versus proactive gestures, revealing that proactive gestures enhanced user experience, presence, and task performance. Finally, an exploratory study showed strong participant preferences for personalization, enhanced interaction capabilities, and multi-sensory integration. Based on these findings, we propose design guidelines and applications for proactive interactions with scents.
https://dl.acm.org/doi/10.1145/3706598.3713964
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)