When a Robot Communicates Through Air: Contextual Interpretations of Wind and Olfactory Cues

Abstract

This study explores how nonverbal sensory cues—olfactory and wind—can serve as subtle channels for behavioral guidance in mobile human-robot interaction. Although multimodal interaction is increasingly integral to HRI, implicit communication remains underexplored, particularly through non-visual and non-auditory modalities. To address this gap, we conducted a Wizard-of-Oz study with 35 participants who experienced three types of stimuli—strong wind, weak wind, and olfactory cues—across six contextual scenarios. Our findings show that such sensory cues can induce affective interpretations ranging from support to surveillance, depending on the context. Olfactory cues generally evoked more positive impressions and a greater sense of care, while wind cues were perceived as comparatively more directive and intrusive. These results suggest that scent and wind hold promise as ambient, affective, and non-intrusive notification channels for future human-robot interaction systems.

Authors
Chaeeun Noh
Chungnam National University, Daejeon, Korea, Republic of
Jaejeung Kim
Chungnam National University, Daejeon, Korea, Republic of

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Human-Robot Interaction & Embodied Sensing

P1 - Room 134
7 presentations
2026-04-15, 18:00–19:30