A User Study on Mixed Reality Remote Collaboration with Eye Gaze and Hand Gesture Sharing

Abstract

Supporting natural communication cues is critical for people working together, whether remotely or face-to-face. In this paper we present a Mixed Reality (MR) remote collaboration system that enables a local worker to share a live 3D panorama of their surroundings with a remote expert. The remote expert can also share task instructions back to the local worker using visual cues in addition to verbal communication. We conducted a user study to investigate how sharing augmented gaze and gesture cues from the remote expert to the local worker affects overall collaboration performance and user experience. We found that by combining gaze and gesture cues, our remote collaboration system provided a significantly stronger sense of co-presence for both the local and remote users than using the gaze cue alone. The combined cues were also rated significantly higher than gaze alone in terms of ease of conveying spatial actions.

Keywords
Mixed Reality
Augmented Reality
Virtual Reality
remote collaboration
3D panorama
scene reconstruction
eye gaze
hand gesture
Authors
Huidong Bai
University of Auckland, Auckland, New Zealand
Prasanth Sasikumar
University of Auckland, Auckland, New Zealand
Jing Yang
ETH Zürich, Zürich, Switzerland
Mark Billinghurst
University of Auckland, Auckland, New Zealand
DOI

10.1145/3313831.3376550

Paper URL

https://doi.org/10.1145/3313831.3376550

Conference: CHI 2020

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2020.acm.org/)

Session: Collaboration & learning in new realities

Paper session
Room 306AB
5 presentations
2020-04-29 23:00:00 – 2020-04-30 00:15:00