Automatic Tuning of Haptic Motion Effects to Evoke Specific Feelings in Multisensory Content

Abstract

Automating the authoring of haptic motion effects while enabling designers to carefully consider user feelings to provide high-quality user experiences is crucial for effective multisensory content. We present a motion effect-tuning method that elicits desired perceptual or affective attributes from users watching a video. To this end, we test three modulation methods: (1) altering the extent of low-frequency motion fluctuations, (2) changing the motion amplitude in a high-frequency band, and (3) sampling and interpolating significant motion peaks. Our tuning method transforms an input draft waveform using the modulation techniques to obtain an output motion effect that elicits the goal adjective scores. This method requires two regression models accounting for the effects of motion modulation and audiovisual stimuli, respectively, and we obtain them by conducting perceptual experiments. Lastly, we confirm the method's effectiveness through another user study and explore potential users' feedback and suggestions for future applications through open-ended survey questions.
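For context, the sketch below illustrates, in Python with NumPy and SciPy, the three kinds of waveform modulation named in the abstract, applied to a 1-D motion signal. It is a minimal, hypothetical sketch and not the authors' implementation: the cutoff frequency, gain values, peak-prominence threshold, and function names are illustrative assumptions rather than values from the paper.

```python
# A minimal sketch (assumed, not the authors' implementation) of the three
# modulation methods named in the abstract, applied to a 1-D motion waveform.
# The cutoff frequency, gains, and peak-prominence threshold are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks


def split_bands(x, fs, cutoff_hz=1.0):
    """Split a motion waveform into low- and high-frequency components."""
    b, a = butter(2, cutoff_hz, btype="low", fs=fs)
    low = filtfilt(b, a, x)          # zero-phase low-pass: slow fluctuations
    return low, x - low              # residual is the high-frequency band


def modulate(x, fs, low_gain=1.0, high_gain=1.0):
    """(1) scale low-frequency fluctuations, (2) scale the high-frequency band."""
    low, high = split_bands(x, fs)
    return low_gain * low + high_gain * high


def resample_peaks(x, prominence=0.05):
    """(3) keep only significant motion peaks and interpolate between them."""
    peaks, _ = find_peaks(np.abs(x), prominence=prominence)
    if peaks.size < 2:
        return x.copy()
    return np.interp(np.arange(x.size), peaks, x[peaks])


if __name__ == "__main__":
    fs = 100.0                                    # sampling rate [Hz]
    t = np.arange(0, 5, 1 / fs)
    draft = np.sin(2 * np.pi * 0.4 * t) + 0.2 * np.random.randn(t.size)
    # Example: emphasize slow motion, attenuate high-frequency vibration.
    tuned = modulate(resample_peaks(draft), fs, low_gain=1.3, high_gain=0.7)
```

In the paper's tuning method, such modulation parameters are not hand-picked as in this example; according to the abstract, they are determined with the help of two regression models so that the output effect elicits the goal adjective scores for the accompanying audiovisual stimulus.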

Authors
Jiwan Lee
Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea
Dawoon Jeong
Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea
Sung H. Han
Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea
Seungmoon Choi
Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea
DOI

10.1145/3706598.3713908

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713908

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: Design for Physical Interactions

G318+G319
7 presentations
2025-04-29 23:10:00
2025-04-30 00:40:00