FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments

Abstract

This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional and movable haptic cues in the form of wind, warmth, moving and single-point touch events, and water spray to dedicated parts of the face not covered by the head-mounted display. The easily extensible system, however, can in principle mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of cues and can judge wind direction well, especially when they move their head and wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.
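The dynamic compensation for head rotations mentioned above can be pictured as re-expressing a world-fixed wind direction in the HMD's local frame, which is the frame in which the arm-mounted nozzle is aimed. The sketch below is only an illustration under that assumption; the function, the parameter names, and the use of SciPy are hypothetical and not taken from the authors' implementation.

# Hypothetical sketch: keep a world-fixed wind direction perceptually stable
# under head rotation by re-expressing it in the HMD (head) frame, the frame
# in which the arm-mounted nozzle would be positioned.
import numpy as np
from scipy.spatial.transform import Rotation as R

def wind_direction_in_head_frame(wind_dir_world, head_quat_xyzw):
    """Return the unit wind direction expressed in HMD-local coordinates.

    wind_dir_world : (3,) array, wind direction in world coordinates.
    head_quat_xyzw : (4,) array, HMD orientation as a quaternion (x, y, z, w).
    """
    head_rot = R.from_quat(head_quat_xyzw)            # maps head frame -> world frame
    local_dir = head_rot.inv().apply(wind_dir_world)  # world direction seen from the head
    norm = np.linalg.norm(local_dir)
    return local_dir / norm if norm > 0 else local_dir

# Example: wind blowing along world +x; after a 90-degree yaw of the head,
# the nozzle target rotates so the wind still appears to come from a fixed
# world direction.
wind_world = np.array([1.0, 0.0, 0.0])
head_yaw_90 = R.from_euler("y", 90, degrees=True).as_quat()
print(wind_direction_in_head_frame(wind_world, head_yaw_90))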

Keywords
Haptics
robot arm
immersive environments
virtual reality
user study
perception
presence
emotion
Authors
Alexander Wilberz
Hochschule Bonn-Rhein-Sieg, Sankt Augustin, Germany
Dominik Leschtschow
Hochschule Bonn-Rhein-Sieg, Sankt Augustin, Germany
Christina Trepkowski
Hochschule Bonn-Rhein-Sieg, Sankt Augustin, Germany
Jens Maiero
Hochschule Bonn-Rhein-Sieg, Sankt Augustin, Germany
Ernst Kruijff
Hochschule Bonn-Rhein-Sieg, Sankt Augustin, Germany
Bernhard Riecke
Simon Fraser University, Vancouver, BC, Canada
DOI

10.1145/3313831.3376481

Paper URL

https://doi.org/10.1145/3313831.3376481

Conference: CHI 2020

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2020.acm.org/)

Session: On-body interaction

Paper session
311 KAUA'I
5 presentations
2020-04-27 23:00:00 – 2020-04-28 00:15:00