VR games offer players new freedom to interact naturally through motion, but this freedom makes it harder to design games that respond convincingly to player actions. We present a framework for VR sword-fighting experiences against a virtual character that reduces the technical work needed to achieve a convincing simulation. The framework facilitates VR design by abstracting away difficult details at the lower "physical" level of interaction, using data-driven models to automate both the recognition of user actions and the synthesis of character animations. Designers specify the character's behaviour at a higher "semantic" level using parameterised building blocks, which provide control over the experience while minimising manual development work. We conducted a technical evaluation, a questionnaire study, and an interactive user study. Our results suggest that the framework produces more realistic and engaging interactions than simple hand-crafted interaction logic while supporting a controllable and understandable behaviour design.
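To make the "parameterised building blocks" idea concrete, here is a minimal Python sketch of what such blocks might look like; all class names and parameters are hypothetical illustrations, not the framework's actual API:

    from dataclasses import dataclass
    from enum import Enum, auto


    class AttackDirection(Enum):
        """Hypothetical set of attack directions a designer can pick from."""
        LEFT = auto()
        RIGHT = auto()
        OVERHEAD = auto()


    @dataclass
    class AttackBlock:
        """One parameterised building block: the character attacks the player.

        All fields are illustrative; the paper's actual parameter set is not
        given in the abstract.
        """
        direction: AttackDirection
        speed: float           # normalised 0..1, scaled onto an animation clip
        telegraph_time: float  # seconds of wind-up, so the player can react


    @dataclass
    class ParryBlock:
        """Building block: the character parries a recognised player attack."""
        reaction_delay: float  # seconds between action recognition and response


    # A designer composes behaviour at the "semantic" level by sequencing
    # blocks; the framework's data-driven models would handle action
    # recognition and animation synthesis underneath.
    behaviour = [
        AttackBlock(direction=AttackDirection.OVERHEAD, speed=0.6,
                    telegraph_time=0.4),
        ParryBlock(reaction_delay=0.15),
    ]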
Virtual Reality (VR) sickness is common, with symptoms such as headaches, nausea, and disorientation, and is a major barrier to using VR. We propose WalkingVibe, which applies unobtrusive vibrotactile feedback to VR walking experiences, reducing VR sickness and discomfort while improving realism. Feedback is delivered through two small vibration motors placed behind the ears, at a frequency chosen to balance inducing a vestibular response against minimizing annoyance. We conducted a 240-person study exploring how visual, audio, and various tactile feedback designs affect the locomotion experience of users walking passively in VR while seated statically in reality. Results showed that the timing and location of tactile feedback have significant effects on VR sickness and realism. With WalkingVibe, the 2-sided step-synchronized design significantly reduces VR sickness and discomfort while significantly improving realism. Furthermore, its unobtrusiveness and ease of integration make WalkingVibe a practical approach for improving VR experiences on new and existing VR headsets.
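As a rough illustration of the 2-sided step-synchronized design, the sketch below alternates left/right pulses in time with virtual footsteps; the cadence, pulse length, and motor interface are placeholder assumptions, not the paper's calibrated values:

    import time

    STEP_PERIOD_S = 0.6   # placeholder walking cadence, not the paper's value
    PULSE_S = 0.1         # placeholder pulse length


    def pulse(side: str, duration_s: float) -> None:
        """Stand-in for driving one of the two motors behind the ears."""
        print(f"vibrate {side} motor for {duration_s:.2f}s")
        time.sleep(duration_s)


    def walk_feedback(num_steps: int) -> None:
        """2-sided, step-synchronized: alternate sides, one pulse per step."""
        for step in range(num_steps):
            side = "left" if step % 2 == 0 else "right"
            pulse(side, PULSE_S)
            time.sleep(STEP_PERIOD_S - PULSE_S)


    walk_feedback(4)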
Automatic generation of Virtual Reality (VR) worlds that adapt to physical environments has been proposed to enable safe walking in VR. However, such techniques mainly focus on avoiding physical objects as obstacles and overlook their interaction affordances as passive haptics. Current VR experiences involving interaction with physical objects in the surroundings still require verbal instruction from an assisting partner. We present ARchitect, a proof-of-concept prototype that allows flexible customization of a VR experience with a human in the loop. ARchitect brings in an assistant who uses Augmented Reality (AR) to map physical objects to virtual proxies with matching affordances. In a within-subjects study (9 user pairs) comparing ARchitect to a baseline condition, both assistants and players experienced decreased workload, and players reported increased VR presence and trust in the assistant. Finally, we distilled design guidelines from ARchitect for future designers and implemented three demonstrative experiences.
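A minimal sketch of how affordance-based proxy matching could work, assuming a simple set-based affordance vocabulary and a greedy one-to-one assignment (both are assumptions for illustration, not ARchitect's actual design):

    physical_objects = {
        "office_chair": {"sit", "lean"},
        "desk": {"place", "lean"},
        "wall": {"lean"},
    }

    virtual_proxies = {
        "throne": {"sit"},
        "altar": {"place"},
        "castle_wall": {"lean"},
    }


    def match(proxies, objects):
        """Greedily assign each virtual proxy a distinct physical object
        whose affordances cover the proxy's needs."""
        remaining = dict(objects)
        assignment = {}
        for proxy, needed in proxies.items():
            for obj, offered in list(remaining.items()):
                if needed <= offered:  # object affords all the proxy needs
                    assignment[proxy] = obj
                    del remaining[obj]
                    break
        return assignment


    print(match(virtual_proxies, physical_objects))
    # {'throne': 'office_chair', 'altar': 'desk', 'castle_wall': 'wall'}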
We present the Levitation Simulator, a system that enables researchers and designers to iteratively develop and prototype levitation interface ideas in Virtual Reality, including user tests and formal experiments. We derive a model of the movement of a levitating particle in such an interface and, based on it, develop an interactive VR simulation of the levitation interface that exhibits the dynamical properties of the real interface. The results of a Fitts' Law pointing study show that the Levitation Simulator enables performance comparable to the real prototype. We developed the first two interactive games dedicated to levitation interfaces, LeviShooter and BeadBounce, in the Levitation Simulator and then implemented them on the real interface. Our results indicate that participants experienced similar levels of engagement when playing the games in the two environments. We share the Levitation Simulator as open source, thereby democratizing levitation research by removing the need for a levitation apparatus.
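For reference, the Shannon formulation of Fitts' Law commonly used in HCI pointing studies (the abstract does not state which formulation the authors fitted):

    MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)

where MT is the movement time, D the distance to the target, W the target width, and a, b empirically fitted constants; the logarithmic term is the index of difficulty in bits. Comparable a and b fits across the simulator and the real prototype would support the claim of comparable pointing performance.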
Personal smart devices have demonstrated a variety of efficient techniques for pointing and selecting on physical displays. However, when migrating these input techniques to augmented reality, it is unclear both what the relative performance of different techniques will be, given the immersive nature of the environment, and how viewport-based versus world-based pointing methods will affect performance. To better understand the impact of device and viewing perspective on pointing in augmented reality, we present the results of two controlled experiments that compare pointing conditions leveraging various smartphone- and smartwatch-based external display pointing techniques and examine viewport-based versus world-based target acquisition paradigms. Our results demonstrate that viewport-based techniques offer faster selection and that both smartwatch- and smartphone-based pointing techniques are high-performance options for distant target acquisition tasks in augmented reality.
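To make the viewport-based versus world-based distinction concrete, a schematic sketch follows: world-based acquisition ray-casts against targets fixed in world coordinates, while viewport-based acquisition operates on targets projected into the 2D space of the user's current view. The geometry and thresholds are illustrative assumptions, not the paper's implementation:

    import numpy as np

    def select_world_based(ray_origin, ray_dir, targets, radius=0.1):
        """World-based: cast a ray in world coordinates and return the
        first target whose centre lies within `radius` of the ray."""
        o = np.asarray(ray_origin, float)
        d = np.asarray(ray_dir, float)
        d = d / np.linalg.norm(d)
        for name, pos in targets.items():
            v = np.asarray(pos, float) - o
            dist = np.linalg.norm(v - np.dot(v, d) * d)  # point-to-ray distance
            if dist < radius:
                return name
        return None

    def select_viewport_based(cursor_xy, targets_screen, radius=0.05):
        """Viewport-based: return the target whose projected 2D screen
        position is nearest the cursor, within `radius`."""
        best, best_dist = None, radius
        c = np.asarray(cursor_xy, float)
        for name, xy in targets_screen.items():
            dist = np.linalg.norm(np.asarray(xy, float) - c)
            if dist < best_dist:
                best, best_dist = name, dist
        return best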