We propose an algorithm that generates a vibration, an impact, or a vibration+impact haptic effect by processing a sound signal in real time. Our algorithm is selective in that it matches the most appropriate type of haptic effect to the sound using a machine-learning classifier (random forest) trained on expert-labeled datasets. Our algorithm is tailored to enhance user experiences in video game play, and we present two examples for the RPG (role-playing game) and FPS (first-person shooter) genres. We demonstrate the effectiveness of our algorithm through a user study comparing it with other state-of-the-art (SOTA) methods for the same crossmodal conversion. Our system elicits better multisensory user experiences than the SOTA algorithms for both game genres.
https://doi.org/10.1145/3544548.3580787
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)
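To illustrate the overall pipeline described in the abstract, the following is a minimal sketch (not the authors' implementation): it classifies short audio frames into haptic-effect types with a scikit-learn random forest. The feature set (RMS energy, spectral centroid, spectral flux, crest factor) and the label set are assumptions for illustration only; the paper's actual features, labels, and training data are not reproduced here.

```python
# Minimal sketch, assuming hand-crafted spectral features and a hypothetical
# label set {none, vibration, impact, vibration+impact}; not the paper's code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

LABELS = ["none", "vibration", "impact", "vibration+impact"]  # assumed label set


def frame_features(frame: np.ndarray, sr: int = 44100) -> np.ndarray:
    """Compute a few illustrative features for one audio frame."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    rms = np.sqrt(np.mean(frame ** 2))                                # overall energy
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)  # brightness
    flux = np.mean(np.abs(np.diff(spectrum)))                         # spectral change
    crest = np.max(np.abs(frame)) / (rms + 1e-12)                     # impulsiveness
    return np.array([rms, centroid, flux, crest])


def train(X: np.ndarray, y: np.ndarray) -> RandomForestClassifier:
    """Fit a random forest on an expert-labeled dataset (features X, labels y)."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf


def classify_frame(clf: RandomForestClassifier, frame: np.ndarray) -> str:
    """Pick a haptic-effect type for one incoming frame (body of a real-time loop)."""
    label_idx = clf.predict(frame_features(frame).reshape(1, -1))[0]
    return LABELS[int(label_idx)]


if __name__ == "__main__":
    # Tiny synthetic demo: random data stands in for labeled game-audio frames.
    rng = np.random.default_rng(0)
    X_train = rng.random((200, 4))
    y_train = rng.integers(0, len(LABELS), 200)
    model = train(X_train, y_train)
    test_frame = rng.standard_normal(1024) * 0.1
    print(classify_frame(model, test_frame))
```

In an actual game-audio pipeline, `classify_frame` would run per audio block and its output would select which haptic rendering path (vibration, impact, or both) to drive on the device.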