Interactions in AR/VR

Paper session

Conference Name
CHI 2020
Touché: Data-Driven Interactive Sword Fighting in Virtual Reality
Abstract

VR games offer players new freedom to interact naturally using motion. However, this freedom makes it harder to design games that react to player motions convincingly. We present a framework for VR sword fighting experiences against a virtual character that simplifies the technical work necessary to achieve a convincing simulation. The framework facilitates VR design by abstracting away from difficult details on the lower "physical" level of interaction, using data-driven models to automate both the identification of user actions and the synthesis of character animations. Designers specify the character's behaviour on a higher "semantic" level using parameterised building blocks, which allow for control over the experience while minimising manual development work. We conducted a technical evaluation, a questionnaire study and an interactive user study. Our results suggest that the framework produces more realistic and engaging interactions than simple hand-crafted interaction logic, while supporting a controllable and understandable behaviour design.
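
The paper presents no code here; purely as an illustrative sketch (class, parameter, and action names are hypothetical, not taken from the framework), a semantic-level building block might expose designer-tunable parameters like this, while action recognition and animation synthesis remain data-driven:

    import random
    from dataclasses import dataclass

    # Hypothetical illustration only: a "semantic-level" building block a designer
    # could tune, while recognition and animation stay data-driven elsewhere.
    @dataclass
    class ParryBlock:
        parry_probability: float = 0.7    # how often the character defends
        reaction_delay_s: float = 0.25    # delay before the response starts
        counter_probability: float = 0.3  # chance of a follow-up attack

        def respond(self, recognised_action: str) -> list[str]:
            """Map a recognised player action to a sequence of character actions."""
            if recognised_action != "attack":
                return ["idle"]
            actions = []
            if random.random() < self.parry_probability:
                actions.append(f"parry(after={self.reaction_delay_s})")
                if random.random() < self.counter_probability:
                    actions.append("counter_attack")
            else:
                actions.append("take_hit")
            return actions

    # Example: the recognised action would come from a data-driven gesture model.
    behaviour = ParryBlock(parry_probability=0.8)
    print(behaviour.respond("attack"))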

Keywords
virtual reality
sword fighting
machine learning
animation
gesture recognition
Authors
Javier Dehesa
University of Bath, Bath, United Kingdom
Andrew Vidler
Ninja Theory Ltd., Cambridge, United Kingdom
Christof Lutteroth
University of Bath, Bath, United Kingdom
Julian Padget
University of Bath, Bath, United Kingdom
DOI

10.1145/3313831.3376714

Paper URL

https://doi.org/10.1145/3313831.3376714

Video
WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback
Abstract

Virtual Reality (VR) sickness is common, with symptoms such as headaches, nausea, and disorientation, and is a major barrier to using VR. We propose WalkingVibe, which applies unobtrusive vibrotactile feedback to VR walking experiences to reduce VR sickness and discomfort while improving realism. Feedback is delivered through two small vibration motors behind the ears at a frequency that balances inducing a vestibular response against minimizing annoyance. We conducted a 240-person study to explore how visual, audio, and various tactile feedback designs affect the locomotion experience of users walking passively in VR while seated statically in reality. Results showed that the timing and location of tactile feedback have significant effects on VR sickness and realism. With WalkingVibe, the 2-sided step-synchronized design significantly reduces VR sickness and discomfort while significantly improving realism. Furthermore, its unobtrusiveness and ease of integration make WalkingVibe a practical approach for improving VR experiences with new and existing VR headsets.
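
As a hedged illustration of the 2-sided, step-synchronized design described above (the motor-driver call is a placeholder, not the authors' hardware interface), the triggering logic might look roughly like this:

    import time

    # Illustrative sketch only: pulse the vibration motor behind the ear on the
    # same side as each virtual footstep of the avatar.
    def drive_motor(side: str, duration_s: float) -> None:
        """Placeholder for pulsing one of the two ear-mounted vibration motors."""
        print(f"vibrate {side} motor for {duration_s * 1000:.0f} ms")

    def on_virtual_footstep(foot: str, pulse_s: float = 0.1) -> None:
        """Step-synchronized feedback: left footstep drives the left motor, etc."""
        side = "left" if foot == "left" else "right"
        drive_motor(side, pulse_s)

    # Example: footstep events generated by the avatar's walking animation.
    for foot in ["left", "right", "left", "right"]:
        on_virtual_footstep(foot)
        time.sleep(0.5)  # roughly one step every half second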

Keywords
Virtual reality sickness
Discomfort
Realism
Vestibular system
Vibrotactile feedback
Authors
Yi-Hao Peng
National Taiwan University, Taipei, Taiwan ROC
Carolyn Yu
National Taiwan University of Science and Technology, Taipei, Taiwan ROC
Shi-Hong Liu
National Taiwan University, Taipei City, Taiwan ROC
Chung-Wei Wang
National Chengchi University, Taipei, Taiwan ROC
Paul Taele
Texas A&M University, College Station, TX, USA
Neng-Hao Yu
National Taiwan University of Science and Technology, Taipei, Taiwan ROC
Mike Y. Chen
National Taiwan University, Taipei, Taiwan ROC
DOI

10.1145/3313831.3376847

Paper URL

https://doi.org/10.1145/3313831.3376847

Video
ARchitect: Building Interactive Virtual Experiences from Physical Affordances by Bringing Human-in-the-Loop
Abstract

Automatic generation of Virtual Reality (VR) worlds that adapt to physical environments has been proposed to enable safe walking in VR. However, such techniques mainly focus on avoiding physical objects as obstacles and overlook their interaction affordances as passive haptics. Current VR experiences involving interaction with physical objects in the surroundings still require verbal instruction from an assisting partner. We present ARchitect, a proof-of-concept prototype that allows flexible customization of a VR experience with a human in the loop. ARchitect brings in an assistant who maps physical objects to virtual proxies with matching affordances using Augmented Reality (AR). In a within-subjects study (9 user pairs) comparing ARchitect to a baseline condition, both assistants and players experienced decreased workload, and players showed increased VR presence and trust in the assistant. Finally, we define design guidelines for ARchitect to aid future designers and implement three demonstrative experiences.
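
To make the affordance-matching idea concrete, the following is a minimal, hypothetical sketch (object, affordance, and proxy names are invented, not from the prototype) of mapping assistant-tagged physical objects to virtual proxies with matching affordances:

    # Illustrative sketch only: the assistant tags physical objects in AR, and each
    # tag is matched to a virtual proxy offering the same interaction affordance.
    physical_objects = {
        "office_chair": "sittable",
        "desk_edge":    "leanable",
        "water_bottle": "graspable",
    }

    virtual_proxies = {
        "sittable":  "tavern_bench",
        "leanable":  "castle_parapet",
        "graspable": "potion_flask",
    }

    def build_mapping(tags: dict[str, str], proxies: dict[str, str]) -> dict[str, str]:
        """Map each tagged physical object to a virtual proxy with a matching affordance."""
        return {obj: proxies[aff] for obj, aff in tags.items() if aff in proxies}

    print(build_mapping(physical_objects, virtual_proxies))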

Keywords
ARchitect
virtual reality
affordance
passive haptics
asymmetric
Authors
Chuan-en Lin
Hong Kong University of Science and Technology, Hong Kong, China
Ta Ying Cheng
Hong Kong University of Science and Technology, Hong Kong, China
Xiaojuan Ma
Hong Kong University of Science and Technology, Hong Kong, China
DOI

10.1145/3313831.3376614

Paper URL

https://doi.org/10.1145/3313831.3376614

Video
Levitation Simulator: Prototyping Ultrasonic Levitation Interfaces in Virtual Reality
Abstract

We present the Levitation Simulator, a system that enables researchers and designers to iteratively develop and prototype levitation interface ideas in Virtual Reality, including user tests and formal experiments. We derive a model of the movement of a levitating particle in such an interface. Based on this, we develop an interactive simulation of the levitation interface in VR that exhibits the dynamical properties of the real interface. The results of a Fitts' Law pointing study show that the Levitation Simulator enables performance comparable to that of the real prototype. We developed the first two interactive games dedicated to levitation interfaces, LeviShooter and BeadBounce, in the Levitation Simulator and then implemented them on the real interface. Our results indicate that participants experienced similar levels of engagement when playing the games in the two environments. We share our Levitation Simulator as open source, thereby democratizing levitation research without the need for a levitation apparatus.
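
The paper derives its own particle model; as a loose illustration only, a levitating bead is often approximated as a damped harmonic oscillator pulled toward the acoustic trap, which can be simulated as below (constants and function names are placeholders, not the model derived in the paper):

    # Minimal sketch, not the paper's derived model: the bead is approximated as a
    # damped harmonic oscillator pulled toward the acoustic trap position.
    def simulate_bead(trap_positions, k=5.0, c=0.05, m=0.001, dt=0.001):
        """Integrate 1D bead motion with semi-implicit Euler steps."""
        x, v = trap_positions[0], 0.0
        trajectory = []
        for trap in trap_positions:
            a = (k * (trap - x) - c * v) / m   # spring toward trap, viscous damping
            v += a * dt
            x += v * dt
            trajectory.append(x)
        return trajectory

    # Example: the trap jumps 5 mm and the bead oscillates, then settles toward it.
    traj = simulate_bead([0.0] * 100 + [0.005] * 900)
    print(f"final bead position: {traj[-1] * 1000:.2f} mm")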

Award
Honorable Mention
Keywords
Modeling
Simulation
Virtual Prototyping
Ultrasonic Levitation
VR
Authors
Viktorija Paneva
University of Bayreuth, Bayreuth, Germany
Myroslav Bachynskyi
University of Bayreuth, Bayreuth, Germany
Jörg Müller
University of Bayreuth, Bayreuth, Germany
DOI

10.1145/3313831.3376409

Paper URL

https://doi.org/10.1145/3313831.3376409

Video
Understanding Viewport- and World-based Pointing with Everyday Smart Devices in Immersive Augmented Reality
Abstract

Personal smart devices have demonstrated a variety of efficient techniques for pointing and selecting on physical displays. However, when migrating these input techniques to augmented reality, it is unclear both what the relative performance of different techniques will be, given the immersive nature of the environment, and how viewport-based versus world-based pointing methods will impact performance. To better understand the impact of device and viewing perspective on pointing in augmented reality, we present the results of two controlled experiments that compare pointing conditions leveraging various smartphone- and smartwatch-based external display pointing techniques and examine viewport-based versus world-based target acquisition paradigms. Our results demonstrate that viewport-based techniques offer faster selection and that both smartwatch- and smartphone-based pointing techniques are high-performance options for distant target acquisition tasks in augmented reality.
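
To illustrate the viewport-based versus world-based distinction (a simplified sketch, not the study's implementation), the same device input can be interpreted either in the head's viewport frame, where it follows head rotation, or directly in world coordinates:

    import numpy as np

    # Illustrative sketch only: rotating the head shifts the viewport-based ray
    # but leaves the world-based ray unchanged.
    def yaw_matrix(deg: float) -> np.ndarray:
        """Rotation about the vertical (y) axis."""
        r = np.radians(deg)
        return np.array([[np.cos(r), 0, np.sin(r)],
                         [0,         1, 0        ],
                         [-np.sin(r), 0, np.cos(r)]])

    def viewport_ray(head_rotation: np.ndarray, cursor_dir: np.ndarray) -> np.ndarray:
        """Cursor direction expressed in the head frame, so it follows the head."""
        return head_rotation @ cursor_dir

    def world_ray(cursor_dir: np.ndarray) -> np.ndarray:
        """Cursor direction expressed directly in world coordinates."""
        return cursor_dir

    cursor = np.array([0.0, 0.0, 1.0])   # pointing "straight ahead"
    head = yaw_matrix(30.0)              # the user turns their head 30 degrees
    print("viewport-based ray:", np.round(viewport_ray(head, cursor), 3))
    print("world-based ray:   ", np.round(world_ray(cursor), 3))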

Keywords
3D Pointing
Augmented Reality
Virtual Reality
Mobile Devices
Input Devices
Authors
Yuan Chen
University of Waterloo, Waterloo, ON, Canada
Keiko Katsuragawa
University of Waterloo & National Research Council of Canada, Waterloo, ON, Canada
Edward Lank
University of Waterloo & Inria & University of Lille, Waterloo, ON, Canada
DOI

10.1145/3313831.3376592

Paper URL

https://doi.org/10.1145/3313831.3376592