Real-time Semantic Full-Body Haptic Feedback Converted from Sound for Virtual Reality Gameplay

Abstract

We present a multisensory virtual reality (VR) system that enables users to experience concurrent visual, auditory, and haptic feedback, featuring semantic classification of events from sound, sound-to-haptic conversion, and full-body haptic effects. This concept is applied to enhance the user experience of VR gameplay. The system uses a Long Short-Term Memory (LSTM) model to classify game sounds and detect key events such as gunfire, explosions, and hits. These events are translated into full-body haptic patterns rendered through a haptic suit, providing users with realistic and immersive haptic experiences. The system operates with low latency, ensuring seamless synchrony between sound and haptic feedback. User studies demonstrate significant improvements in user experience over traditional sound-to-haptic methods, underscoring the importance of accurate sound classification and well-designed haptic effects.
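The pipeline the abstract describes (sound frames → LSTM event classification → haptic pattern lookup) can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the feature dimensions, the single untrained LSTM cell in NumPy, and the `EVENT_TO_HAPTIC` mapping are all assumptions introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step: input/forget/output gates and candidate state."""
    z = W @ x + U @ h + b                # stacked pre-activations, shape (4H,)
    H = h.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[0:H]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))      # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*H:3*H]))    # output gate
    g = np.tanh(z[3*H:4*H])                  # candidate cell update
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def classify_frames(frames, n_classes=3, hidden=8):
    """Run a toy LSTM over audio feature frames; softmax over the last h."""
    d = frames.shape[1]
    W = rng.standard_normal((4 * hidden, d)) * 0.1       # input weights
    U = rng.standard_normal((4 * hidden, hidden)) * 0.1  # recurrent weights
    b = np.zeros(4 * hidden)
    Wo = rng.standard_normal((n_classes, hidden)) * 0.1  # output head
    h, c = np.zeros(hidden), np.zeros(hidden)
    for x in frames:                     # consume the frame sequence
        h, c = lstm_step(x, h, c, W, U, b)
    logits = Wo @ h
    e = np.exp(logits - logits.max())    # numerically stable softmax
    return e / e.sum()

# Hypothetical mapping from detected event class to a full-body haptic pattern.
EVENT_TO_HAPTIC = {0: "gunfire: sharp chest pulse",
                   1: "explosion: full-body rumble",
                   2: "hit: localized tap"}

frames = rng.standard_normal((20, 16))   # 20 frames of 16-dim features (placeholder)
probs = classify_frames(frames)
event = int(np.argmax(probs))
print(EVENT_TO_HAPTIC[event])
```

In the actual system the classifier is trained on game audio and must run frame-by-frame to keep sound-to-haptic latency low; here the weights are random purely to show the data flow.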

Authors
Gyeore Yun
Kyungpook National University, Daegu, Korea, Republic of
Seungmoon Choi
Pohang University of Science and Technology (POSTECH), Pohang, Gyeongbuk, Korea, Republic of
DOI

10.1145/3706598.3713355

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713355

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: Haptic Interactions

Room G402
7 presentations
2025-04-28 23:10:00 – 2025-04-29 00:40:00