This study explores how nonverbal sensory cues, specifically olfactory and wind stimuli, can serve as subtle channels for behavioral guidance in mobile human-robot interaction. Although multimodal interaction is increasingly integral to HRI, implicit communication remains underexplored, particularly through non-visual and non-auditory modalities. To address this gap, we conducted a Wizard-of-Oz study in which 35 participants experienced three types of stimuli (strong wind, weak wind, and olfactory cues) across six contextual scenarios. Our findings show that such sensory cues can induce affective interpretations ranging from support to surveillance, depending on context. Olfactory cues generally evoked more positive impressions and a greater sense of care, whereas wind cues were perceived as more directive and intrusive. These results suggest that scent and wind hold promise as ambient, affective, and non-intrusive notification channels for future human-robot interaction systems.
ACM CHI Conference on Human Factors in Computing Systems