We introduce iGripper, a handheld haptic controller designed to render stiffness feedback for gripping and clamping both rigid and elastic objects in virtual reality. iGripper adjusts physical stiffness directly: a small linear actuator moves the spring's attachment point along a lever arm, and the feedback force is generated by the spring's reaction to the user's input. This lets iGripper render any stiffness from zero up to a maximum set by the spring's inherent stiffness. A blocking mechanism additionally provides fully rigid feedback, extending the rendering range. Compared to active controllers, iGripper offers a broad range of force and stiffness feedback without requiring high-power actuators. Unlike many passive controllers, which provide only braking force, iGripper is a semi-active controller that delivers controllable elastic force feedback. We present iGripper's design, performance evaluation, and user studies comparing its realism with a commercial impedance-type grip device.
https://dl.acm.org/doi/10.1145/3706598.3714291
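The abstract does not give the lever geometry, but a minimal Python sketch of how such a mechanism could map actuator position to rendered stiffness follows, assuming standard small-deflection lever mechanics in which the stiffness reflected at the grip scales with the square of the spring's distance from the pivot; the geometry, numbers, and function names are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch, not the paper's design: under small-deflection
    # lever mechanics, a spring of stiffness k_spring attached at distance
    # d from the pivot of a lever of length L is felt at the grip as
    # k_spring * (d / L)^2, so sliding the spring covers zero up to k_spring.

    def effective_stiffness(k_spring: float, d: float, lever_length: float) -> float:
        """Stiffness (N/m) felt at the grip for spring position d (m)."""
        assert 0.0 <= d <= lever_length
        return k_spring * (d / lever_length) ** 2

    def actuator_position_for(k_target: float, k_spring: float, lever_length: float) -> float:
        """Inverse mapping: where the linear actuator should place the spring."""
        if k_target > k_spring:
            # Past the spring's own stiffness, the paper's blocking
            # mechanism would take over to give fully rigid feedback.
            raise ValueError("target exceeds spring stiffness")
        return lever_length * (k_target / k_spring) ** 0.5

    # Example: a 2000 N/m spring on a 60 mm lever rendering 500 N/m.
    print(actuator_position_for(500.0, 2000.0, 0.06))  # 0.03 m, mid-lever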
A core use case for Virtual Reality (VR) applications is recreating real-life scenarios for training or entertainment. Evoking physiological responses in VR users that match those of real-life spectators can maximize engagement and foster greater co-presence. Current research focuses on visualizing and measuring physiological data to ensure the accuracy of the experience. However, placebo effects are known to influence performance and self-perception in HCI studies, creating a need to investigate how visualizing different types of data (real, unmatched, and fake) affects user perception during event recreation in VR. We investigated these conditions through a balanced between-groups study (n=44) with uninformed and informed participants. The informed group was told that the data visualizations represented previously recorded human physiological data. Our findings reveal a placebo effect: the informed group demonstrated enhanced engagement and co-presence. Additionally, the fake data condition in the informed group evoked a positive emotional response.
https://dl.acm.org/doi/10.1145/3706598.3713594
Contemporary research in Virtual Reality (VR) for users who are visually impaired often employs navigation and interaction modalities that are non-conventional, constrained by physical spaces, or both. We designed and examined a hapto-acoustic VR system that mitigates this by enabling non-visual exploration of large virtual environments through white cane simulation and walk-in-place locomotion. The system features a complex urban cityscape, a physical cane prototype coupled with a virtual cane for rendering surface textures, and an omnidirectional slide mill for navigation. In addition, spatialized audio is rendered based on the propagation of sound through the geometry around the user. In a study, twenty sighted participants, blindfolded to simulate total blindness, evaluated the system through three formative tasks. 19 of 20 participants completed all tasks while effectively navigating the environment. This work highlights the potential for accessible non-visual VR experiences that require minimal training and little prior VR exposure.
https://dl.acm.org/doi/10.1145/3706598.3713400
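The abstract does not detail the audio renderer; as one hedged illustration of geometry-aware rendering, the Python sketch below attenuates a source by distance and by how many wall segments occlude the straight path to the listener. It is a 2D line-of-sight stand-in, not the authors' method, and every value in it is an assumption.

    import math

    def segments_intersect(p1, p2, q1, q2) -> bool:
        """True if segment p1-p2 properly crosses segment q1-q2 (2D)."""
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
        d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
        return d1 * d2 < 0 and d3 * d4 < 0

    def audible_gain(source, listener, walls, occlusion_loss=0.5):
        """Gain in (0, 1]: inverse-distance falloff times a per-wall penalty."""
        dist = math.hypot(listener[0] - source[0], listener[1] - source[1])
        gain = 1.0 / max(dist, 1.0)  # simple distance attenuation
        hits = sum(segments_intersect(source, listener, a, b) for a, b in walls)
        return gain * occlusion_loss ** hits  # muffle per occluding wall

    # One wall between a traffic sound and the user halves its gain here:
    walls = [((0.0, -1.0), (0.0, 1.0))]
    print(audible_gain((-2.0, 0.0), (2.0, 0.0), walls))  # 0.125 = (1/4) * 0.5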
Research has mostly treated Augmented Reality (AR) and Virtual Reality (VR) in isolation. Yet, when used together in practical settings, AR and VR each offer unique strengths, necessitating multiple transitions to harness their advantages. This paper investigates potential challenges in Cross-Reality (CR) transitions to inform future application design. We implemented a CR system featuring a 3D modeling task that requires users to switch between PC, AR, and VR. Through a talk-aloud study (n=12) and thematic analysis, we found that friction primarily arose when transitions conflicted with users' Spatial Mental Model (SMM). Furthermore, we identified five transition archetypes that users employed to enhance productivity once an SMM was established. Our findings indicate that transitions must focus on establishing and upholding users' SMM across realities by communicating the differences between them.
https://dl.acm.org/doi/10.1145/3706598.3713921
Research on rotational gain has largely been conducted under active self-motion, where users control their own movement. In many XR scenarios, however, the user is under passive self-motion: their body is moved by a training simulator, a motorised gaming chair, or a vehicle-based XR application. Users may be less sensitive to manipulation under passive motion, especially when engaged in a secondary task, meaning motion experiences could be expanded through high gains and even opposed virtual-physical motion. For the first time, we identified both the perceptible and the maximum comfortable thresholds of rotational gain for users passively turned in a motorised chair, with and without a task. We then applied those thresholds to an 'unbounded' in-car VR game in which the user experiences an entirely different route from their physical movement. We provide the first guidelines for creating enhanced passive motion experiences and open the design space to new applications not restricted by physical movement.
https://dl.acm.org/doi/10.1145/3706598.3713455
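Rotational gain itself has a standard form in the redirected-motion literature: the virtual yaw advances by the physical yaw change scaled by a gain factor, so a gain of 2 turns a 180° physical rotation into a 360° virtual one. The Python sketch below applies this per frame to chair rotation; the encoder readings and the gain value are placeholders, not the thresholds the paper measured.

    def apply_rotational_gain(virtual_yaw_deg: float,
                              physical_delta_deg: float,
                              gain: float) -> float:
        """Advance the virtual camera yaw by a scaled physical rotation."""
        return (virtual_yaw_deg + gain * physical_delta_deg) % 360.0

    # Per-frame loop with stand-in encoder readings from the motorised
    # chair (degrees per frame); a real system would read live values.
    yaw = 0.0
    for chair_delta in [1.0, 1.5, 0.5]:
        yaw = apply_rotational_gain(yaw, chair_delta, gain=2.0)
    print(yaw)  # 6.0 degrees virtual from 3.0 degrees physical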
The use of haptic and visual stimuli to create body illusions and enhance body ownership of virtual avatars in virtual reality (VR) has been extensively studied in psychology and Human-Computer Interaction (HCI). However, previous studies have relied on mechanical devices or matching physical proxies to provide haptic feedback. In this paper, we apply haptic retargeting to induce body illusions: by redirecting users' hand movements, we alter their perception of the shape of body parts when touched. Our technique enables more precise and complex perceived deformations. We implemented a remapping of the ear's contour, creating illusions of different ear shapes such as elf ears and dog ears. To determine the usable scope of retargeting, we conducted a user study to identify the maximum tolerable deviation angle for virtual ears. We then explored the impact of haptic retargeting on body ownership of virtual avatars.
https://dl.acm.org/doi/10.1145/3706598.3714110
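For readers unfamiliar with the technique, the Python sketch below shows the commonly used body-warping form of haptic retargeting, in which the rendered hand is offset from the tracked hand by a share of the physical-to-virtual target offset that grows with reach progress. The ear positions are invented for illustration; this is the general technique, not the paper's exact ear-contour mapping.

    import numpy as np

    def retargeted_hand(hand, start, physical_target, virtual_target):
        """Position at which to render the virtual hand (all 3D arrays).
        The warp reaches the full offset exactly at physical contact."""
        total = np.linalg.norm(physical_target - start)
        travelled = np.linalg.norm(hand - start)
        alpha = np.clip(travelled / max(total, 1e-6), 0.0, 1.0)
        return hand + alpha * (virtual_target - physical_target)

    start = np.array([0.0, 0.0, 0.0])
    physical_ear = np.array([0.0, 0.3, 0.0])   # where the real ear is
    elf_ear_tip = np.array([0.0, 0.35, 0.02])  # where the virtual tip is
    # Halfway through the reach, half of the offset is applied:
    print(retargeted_hand(np.array([0.0, 0.15, 0.0]), start, physical_ear, elf_ear_tip))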
Designing notifications in Augmented Reality (AR) that are noticeable yet unobtrusive is challenging, since achieving this balance depends heavily on the user's context. However, current AR systems tend to be context-agnostic and require explicit feedback to determine whether a user has noticed a notification. This limitation prevents AR systems from providing timely notifications that integrate with users' activities. To address this challenge, we studied how sensors can infer whether users notice notifications while working in an office setting. We collected 98 hours of data from 12 users, including their gaze, head position, computer interactions, and engagement levels. Combining gaze and engagement data classified noticeability most accurately (AUC = 0.81); even without engagement data, accuracy remained high (AUC = 0.76). Our study also examines time-windowing methods and compares general and personalized models.
https://dl.acm.org/doi/10.1145/3706598.3713511
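As a hedged illustration of the kind of pipeline the abstract implies (not the authors' code), the Python sketch below aggregates per-window features, trains a binary noticed/not-noticed classifier, and scores it with ROC AUC, the metric reported above. The feature names, window contents, and data are synthetic assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 400
    # One row per time window around a notification: e.g. mean gaze angle
    # to the notification, fixation count near it, and engagement level.
    X = rng.normal(size=(n, 3))
    # Synthetic labels loosely tied to the gaze features, for demo only.
    y = ((-X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n)) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = LogisticRegression().fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))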