XR Toolkits

Conference Name
UIST 2022
AUIT – the Adaptive User Interfaces Toolkit for Designing XR Applications
Abstract

Adaptive user interfaces can improve experiences in Extended Reality (XR) applications by adapting interface elements according to the user's context. Although extensive work explores different adaptation policies, XR creators often struggle with their implementation, which involves laborious manual scripting. The few available tools are underdeveloped for realistic XR settings where it is often necessary to consider conflicting aspects that affect an adaptation. We fill this gap by presenting AUIT, a toolkit that facilitates the design of optimization-based adaptation policies. AUIT allows creators to flexibly combine policies that address common objectives in XR applications, such as element reachability, visibility, and consistency. Instead of using rules or scripts, specifying adaptation policies via adaptation objectives simplifies the design process and enables creative exploration of adaptations. After creators decide which adaptation objectives to use, a multi-objective solver finds appropriate adaptations in real-time. A study showed that AUIT allowed creators of XR applications to quickly and easily create high-quality adaptations.
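
The abstract does not describe AUIT's API, so the following Python sketch only illustrates the general idea of optimization-based adaptation it outlines: candidate placements for a UI element are scored against reachability, visibility, and consistency objectives, and a solver picks the lowest-cost placement. All function names, the head-relative coordinate convention, and the weighted-sum scalarization are assumptions made for illustration, not the toolkit's actual solver or interface.

```python
# Illustrative sketch only; names, units, and the weighted-sum scheme are assumptions.
from dataclasses import dataclass
import math

@dataclass
class Candidate:
    # Candidate UI-element position, assumed head-relative, in metres (+z is forward).
    x: float
    y: float
    z: float

def reachability(c: Candidate, arm_length: float = 0.6) -> float:
    """Cost: how far the element sits beyond a comfortable reach distance."""
    dist = math.sqrt(c.x**2 + c.y**2 + c.z**2)
    return max(0.0, dist - arm_length)

def visibility(c: Candidate, fov_half_angle: float = math.radians(30)) -> float:
    """Cost: angular deviation from the centre of the field of view."""
    angle = math.atan2(math.hypot(c.x, c.y), c.z)
    return max(0.0, angle - fov_half_angle)

def consistency(c: Candidate, previous: Candidate) -> float:
    """Cost: distance from the element's previous placement (avoid large jumps)."""
    return math.dist((c.x, c.y, c.z), (previous.x, previous.y, previous.z))

def solve(candidates, previous, weights=(1.0, 1.0, 0.5)) -> Candidate:
    """Pick the candidate minimising a weighted sum of the objective costs."""
    w_reach, w_vis, w_cons = weights
    return min(
        candidates,
        key=lambda c: w_reach * reachability(c)
                      + w_vis * visibility(c)
                      + w_cons * consistency(c, previous),
    )

if __name__ == "__main__":
    previous = Candidate(0.1, -0.1, 0.5)
    grid = [Candidate(x / 10, y / 10, 0.5) for x in range(-5, 6) for y in range(-5, 6)]
    print(solve(grid, previous))
```

In this toy version the objectives are collapsed into one scalar cost per candidate; the point is only that creators combine declarative objectives and let a solver search, rather than scripting adaptation rules by hand.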

Authors
João Marcelo Evangelista Belo
Aarhus University, Aarhus, Denmark
Mathias N. Lystbæk
Aarhus University, Aarhus, Denmark
Anna Maria Feit
Saarland University, Saarland Informatics Campus, Saarbrücken, Germany
Ken Pfeuffer
Aarhus University, Aarhus, Denmark
Peter Kán
TU Wien, Vienna, Austria
Antti Oulasvirta
Aalto University, Helsinki, Finland
Kaj Grønbæk
Aarhus University, Aarhus, Denmark
Paper URL

https://doi.org/10.1145/3526113.3545651

RealityLens: A User Interface for Blending Customized Physical World View into Virtual Reality
Abstract

Research has enabled virtual reality (VR) users to interact with the physical world by blending the physical world view into the virtual environment. However, current solutions are designed for specific use cases and hence are not capable of covering users' varying needs for accessing information about the physical world. This work presents RealityLens, a user interface that allows users to peep into the physical world in VR with the reality lenses they deployed for their needs. For this purpose, we first conducted a preliminary study with experienced VR users to identify users' needs for interacting with the physical world, which led to a set of features for customizing the scale, placement, and activation method of a reality lens. We evaluated the design in a user study (n=12) and collected the feedback of participants engaged in two VR applications while encountering a range of interventions from the physical world. The results show that users' VR presence tends to be better preserved when interacting with the physical world with the support of the RealityLens interface.
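
As a rough illustration of the customization features named above (scale, placement, and activation method), the Python sketch below models a single lens configuration and a simple activation check. The class, field, and activation-method names are hypothetical and are not taken from the RealityLens implementation.

```python
# Hypothetical sketch of a "reality lens" configuration, based only on the
# customisation features named in the abstract; not RealityLens's actual API.
from dataclasses import dataclass
from enum import Enum, auto

class Activation(Enum):
    ALWAYS_ON = auto()         # lens is permanently visible
    GAZE = auto()              # shown when the user looks toward its placement
    CONTROLLER_TOUCH = auto()  # shown while a controller button is held

@dataclass
class RealityLensConfig:
    scale: float                            # lens size as a fraction of the view
    placement: tuple[float, float, float]   # position in the virtual scene
    activation: Activation

def should_show(lens: RealityLensConfig, gazing_at_lens: bool, button_held: bool) -> bool:
    """Decide whether the pass-through view inside the lens should be rendered."""
    if lens.activation is Activation.ALWAYS_ON:
        return True
    if lens.activation is Activation.GAZE:
        return gazing_at_lens
    return button_held

# Example: a lens over the physical keyboard area, activated by gaze.
keyboard_lens = RealityLensConfig(scale=0.25, placement=(0.0, -0.3, 0.4),
                                  activation=Activation.GAZE)
print(should_show(keyboard_lens, gazing_at_lens=True, button_held=False))
```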

Authors
Chiu-Hsuan Wang
National Yang Ming Chiao Tung University, Hsinchu, Taiwan
Bing-Yu Chen
National Taiwan University, Taipei, Taiwan
Liwei Chan
National Chiao Tung University, Hsinchu, Taiwan
Paper URL

https://doi.org/10.1145/3526113.3545686

MechARspace: An Authoring System Enabling Bidirectional Binding of AR with Toys in Real-time
Abstract

Augmented Reality (AR), which blends the physical and virtual worlds, presents the possibility of enhancing traditional toy design. By leveraging bidirectional virtual-physical interactions between humans and the designed artifact, such AR-enhanced toys can provide more playful and interactive experiences than traditional toys. However, designers are constrained by the complexity and technical difficulties of current AR content creation processes. We propose MechARspace, an immersive authoring system that supports users in creating toy-AR interactions through direct manipulation and visual programming. Based on an elicitation study, we propose a bidirectional interaction model that maps both ways: from toy inputs to reactions of the AR content, and from the AR content to toy reactions. This model guides the design of our system, which includes a plug-and-play hardware toolkit and an in-situ authoring interface. We present multiple use cases enabled by MechARspace to validate this interaction model. Finally, we evaluate our system with a two-session user study in which users first recreated a set of predefined toy-AR interactions and then implemented their own AR-enhanced toy designs.
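
The bidirectional interaction model can be pictured as two binding tables: one mapping toy inputs to AR reactions, and one mapping AR events back to toy reactions. The Python sketch below only illustrates that idea; the registry, event, and callback names are assumptions and do not reflect MechARspace's actual authoring interface or hardware toolkit.

```python
# Sketch of the bidirectional binding idea described in the abstract; all names
# are illustrative assumptions, not MechARspace's API.
from collections import defaultdict
from typing import Callable

class BidirectionalBindings:
    def __init__(self) -> None:
        self._toy_to_ar: dict[str, list[Callable[[], None]]] = defaultdict(list)
        self._ar_to_toy: dict[str, list[Callable[[], None]]] = defaultdict(list)

    def bind_toy_input(self, toy_event: str, ar_reaction: Callable[[], None]) -> None:
        """e.g. pressing a button on the toy plays an AR animation."""
        self._toy_to_ar[toy_event].append(ar_reaction)

    def bind_ar_event(self, ar_event: str, toy_reaction: Callable[[], None]) -> None:
        """e.g. an AR character landing spins the toy's motor."""
        self._ar_to_toy[ar_event].append(toy_reaction)

    def on_toy_event(self, toy_event: str) -> None:
        for reaction in self._toy_to_ar[toy_event]:
            reaction()

    def on_ar_event(self, ar_event: str) -> None:
        for reaction in self._ar_to_toy[ar_event]:
            reaction()

bindings = BidirectionalBindings()
bindings.bind_toy_input("button_pressed", lambda: print("AR: play takeoff animation"))
bindings.bind_ar_event("character_landed", lambda: print("Toy: spin propeller motor"))
bindings.on_toy_event("button_pressed")   # physical input drives virtual content
bindings.on_ar_event("character_landed")  # virtual event drives physical actuation
```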

Authors
Zhengzhe Zhu
Purdue University, West Lafayette, Indiana, United States
Ziyi Liu
Purdue University, West Lafayette, Indiana, United States
Tianyi Wang
Purdue University, West Lafayette, Indiana, United States
Youyou Zhang
Purdue University, West Lafayette, Indiana, United States
Xun Qian
Purdue University, West Lafayette, Indiana, United States
Ana M. Villanueva
Purdue University, West Lafayette, Indiana, United States
Karthik Ramani
Purdue University, West Lafayette, Indiana, United States
Pashin Farsak Raja
Purdue University, West Lafayette, Indiana, United States
Paper URL

https://doi.org/10.1145/3526113.3545668

RemoteLab: Virtual Reality Remote Study Tool Kit
Abstract

User studies play a critical role in human-subjects research, including human-computer interaction. Virtual reality (VR) researchers tend to conduct user studies in person at their laboratory, where participants experiment with novel equipment to complete tasks in a simulated environment that is often new to many of them. However, due to social distancing requirements in recent years, VR research has been disrupted because participants could not attend in-person laboratory studies. On the other hand, affordable head-mounted displays are becoming common, enabling access to VR experiences and interactions outside traditional research settings. Recent research has shown that unsupervised remote user studies can yield reliable results; however, setting up experiment software for remote studies can be technically complex and convoluted. We present a novel open-source Unity toolkit, RemoteLab, designed to facilitate the preparation of remote experiments by providing a set of tools that synchronize experiment state across multiple computers, record and collect data from various multimedia sources, and replay the accumulated data for analysis. The toolkit helps VR researchers conduct remote experiments when in-person experiments are not feasible, increase the sampling variety of a target population, and reach participants who otherwise would not be able to attend in person.
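
The Python sketch below illustrates two of the ideas above: serializing timestamped experiment state so it can be shared across machines, and keeping an append-only log that can be replayed for analysis. RemoteLab itself is a Unity (C#) toolkit, so this is only a conceptual analogue; the class, method, and message-format choices are assumptions.

```python
# Conceptual sketch of state synchronisation and replay; not RemoteLab's API.
import json
import time

class ExperimentStateLog:
    """Append-only log of timestamped state messages, replayable offline."""

    def __init__(self) -> None:
        self._records: list[dict] = []

    def record(self, source: str, state: dict) -> str:
        """Log a local state update and return it serialised for sending to peers."""
        message = {"t": time.time(), "source": source, "state": state}
        self._records.append(message)
        return json.dumps(message)

    def apply_remote(self, payload: str) -> dict:
        """Apply and log a state update received from another computer."""
        message = json.loads(payload)
        self._records.append(message)
        return message["state"]

    def replay(self):
        """Yield recorded states in chronological order for later analysis."""
        yield from sorted(self._records, key=lambda m: m["t"])

# Example: a participant's machine records a trial state, the experimenter's
# machine applies the same payload, and the log is replayed afterwards.
hmd_log = ExperimentStateLog()
experimenter_log = ExperimentStateLog()
payload = hmd_log.record("participant_hmd", {"trial": 3, "head_y": 1.62})
experimenter_log.apply_remote(payload)
for entry in experimenter_log.replay():
    print(entry)
```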

Authors
Jaewook Lee
University of Illinois at Urbana-Champaign, Urbana, Illinois, United States
Raahul Natarrajan
University of Illinois at Urbana-Champaign, Urbana, Illinois, United States
Payod Panda
Microsoft Corp., Cambridge, United Kingdom
Sebastian S. Rodriguez
University of Illinois at Urbana-Champaign, Urbana, Illinois, United States
Eyal Ofek
Microsoft Research, Redmond, Washington, United States
Paper URL

https://doi.org/10.1145/3526113.3545679