In this paper, we explore how expressive auditory gestures added to the speech of a pedagogical agent influence the human-agent relationship and learning outcomes. In a between-subjects experiment, 41 participants assumed the role of a tutor to teach a voice-based agent. Throughout the interaction, the agent used one of the following: expressive interjections (e.g., "yay", "hmm", "oh"), brief expressive musical performances, or no auditory gestures at all (control condition). Overall, the results indicate that both types of gestures can positively affect the interaction; in particular, interjections can significantly increase feelings of emotional rapport with the agent and enhance learners' motivation. We discuss the implications of our findings, which add to the understanding of conversational agent design and can be useful for education as well as other domains in which dialogue systems are used.
https://dl.acm.org/doi/abs/10.1145/3491102.3517599
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2022.acm.org/)