Perception and Input in Immersive Environments

Conference Name
CHI 2024
Big or Small, It’s All in Your Head: Visuo-Haptic Illusion of Size-Change Using Finger-Repositioning
Abstract

Haptic perception of physical sizes increases the realism and immersion in Virtual Reality (VR). Prior work rendered sizes by exerting pressure on the user’s fingertips or employing tangible, shape-changing devices. These interfaces are constrained by the physical shapes they can assume, making it challenging to simulate objects growing larger or smaller than the perceived size of the interface. Motivated by literature on pseudo-haptics describing the strong influence of visuals over haptic perception, this work investigates modulating the perception of size beyond this range. We developed a fixed-sized VR controller leveraging finger-repositioning to create a visuo-haptic illusion of dynamic size-change of handheld virtual objects. Through two user studies, we found that with an accompanying size-changing visual context, users can perceive virtual object sizes up to 44.2% smaller to 160.4% larger than the perceived size of the device. Without the accompanying visuals, a constant size (141.4% of device size) was perceived.
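The core idea of clamping a requested virtual size to the perceivable range around a fixed-size device can be sketched as follows. This is an illustrative assumption, not the authors' implementation: the device span, the linear ratio-to-gap mapping, and the function name are all hypothetical; only the 44.2%/160.4% bounds come from the abstract.

```python
# Hypothetical sketch of mapping a target virtual object size onto a
# fixed-size controller via finger-repositioning. Only the percentage
# bounds are from the paper; everything else is a placeholder.

DEVICE_SIZE_MM = 60.0          # assumed physical grip span of the controller
MIN_RATIO = 1.0 - 0.442        # up to 44.2% smaller than the device
MAX_RATIO = 1.0 + 1.604        # up to 160.4% larger than the device

def finger_gap_for_virtual_size(virtual_size_mm: float) -> float:
    """Clamp the requested virtual size to the perceivable range and
    return the finger gap (mm) the device would render, assuming
    (illustratively) a linear mapping from size ratio to gap."""
    ratio = virtual_size_mm / DEVICE_SIZE_MM
    ratio = max(MIN_RATIO, min(MAX_RATIO, ratio))
    return ratio * DEVICE_SIZE_MM

print(finger_gap_for_virtual_size(200.0))  # clamped to 60 * 2.604
```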

Award
Honorable Mention
Authors
Myung Jin Kim
KAIST, Daejeon, Korea, Republic of
Eyal Ofek
Microsoft Research, Redmond, Washington, United States
Michel Pahud
Microsoft Research, Redmond, Washington, United States
Mike J. Sinclair
University of Washington, Seattle, Washington, United States
Andrea Bianchi
KAIST, Daejeon, Korea, Republic of
Paper URL

doi.org/10.1145/3613904.3642254

Video
STMG: A Machine Learning Microgesture Recognition System for Supporting Thumb-Based VR/AR Input
Abstract

AR/VR devices have started to adopt hand tracking, in lieu of controllers, to support user interaction. However, today's hand input relies primarily on one gesture: pinch. Moreover, current mappings of hand motion to use cases like VR locomotion and content scrolling involve more complex and larger arm motions than joystick or trackpad usage. STMG increases the gesture space by recognizing additional small thumb-based microgestures from skeletal tracking running on a headset. We take a machine learning approach and achieve a 95.1% recognition accuracy across seven thumb gestures performed on the index finger surface: four directional thumb swipes (left, right, forward, backward), thumb tap, and fingertip pinch start and pinch end. We detail the components of our machine learning pipeline and highlight our design decisions and lessons learned in producing a well-generalized model. We then demonstrate how these microgestures simplify and reduce arm motions for hand-based locomotion and scrolling interactions.
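To make the gesture set concrete, here is a rule-based stand-in for labeling the four directional thumb swipes from tracked thumb-tip positions. STMG itself uses a learned model; this sketch only shows the shape of the problem, and the coordinate convention and function name are illustrative assumptions.

```python
# Illustrative-only sketch: classify a thumb-tip trajectory (x, y in the
# index-finger surface plane) by its dominant displacement axis.
# STMG uses machine learning; this heuristic is just a stand-in.

from typing import List, Tuple

def classify_swipe(trajectory: List[Tuple[float, float]]) -> str:
    """Return left/right/forward/backward from start-to-end displacement."""
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "forward" if dy > 0 else "backward"

print(classify_swipe([(0.0, 0.0), (0.01, 0.001), (0.02, 0.002)]))  # right
```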

Authors
Kenrick Kin
Meta, Redmond, Washington, United States
Chengde Wan
Meta, Redmond, Washington, United States
Ken Koh
Meta Reality Labs, Burlingame, California, United States
Andrei Marin
Meta, Zurich, Switzerland
Necati Cihan Camgöz
Meta, Zurich, Switzerland
Yubo Zhang
Meta, Burlingame, California, United States
Yujun Cai
Meta, Redmond, Washington, United States
Fedor Kovalev
Meta, Zurich, Switzerland
Moshe Ben-Zacharia
Meta, Seattle, Washington, United States
Shannon Hoople
Meta, Seattle, Washington, United States
Marcos Nunes-Ueno
Meta, Seattle, Washington, United States
Mariel Sanchez-Rodriguez
Meta Reality Labs, Redmond, Washington, United States
Ayush Bhargava
Meta, Burlingame, California, United States
Robert Wang
Meta Reality Labs, Redmond, Washington, United States
Eric Sauser
Meta, Zurich, Switzerland
Shugao Ma
Meta, Redmond, Washington, United States
Paper URL

doi.org/10.1145/3613904.3642702

Video
Beyond the Blink: Investigating Combined Saccadic & Blink-Suppressed Hand Redirection in Virtual Reality
Abstract

In pursuit of hand redirection techniques that are ever more tailored to human perception, we propose the first algorithm for hand redirection in virtual reality that makes use of saccades, i.e., fast ballistic eye movements that are accompanied by the perceptual phenomenon of change blindness. Our technique combines the previously proposed approaches of gradual hand warping and blink-suppressed hand redirection with the novel approach of saccadic redirection in one unified yet simple algorithm. We compare three variants of the proposed Saccadic & Blink-Suppressed Hand Redirection (SBHR) technique with the conventional approach to redirection in a psychophysical study (N=25). Our results highlight the great potential of our proposed technique for comfortable redirection by showing that SBHR allows for significantly greater magnitudes of unnoticeable redirection while being perceived as significantly less intrusive and less noticeable than commonly employed techniques that only use gradual hand warping.
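The combined idea can be sketched as a per-frame step size that depends on whether vision is currently suppressed: apply a larger offset increment during a saccade or blink, and only a small gradual-warping increment otherwise. All thresholds, step sizes, and names below are made-up placeholders, not the paper's algorithm or tuned values.

```python
# Hedged sketch of suppression-gated hand redirection: large steps while
# vision is suppressed (saccade or blink), small gradual-warping steps
# otherwise. Constants are illustrative assumptions only.

def redirection_step(offset_remaining: float,
                     eye_speed_deg_s: float,
                     eye_closed: bool,
                     saccade_threshold: float = 180.0,  # deg/s, placeholder
                     suppressed_step: float = 0.02,     # metres/frame, placeholder
                     gradual_step: float = 0.002) -> float:
    """Return how much of the remaining hand offset (metres) to apply this frame."""
    suppressed = eye_closed or eye_speed_deg_s >= saccade_threshold
    step = suppressed_step if suppressed else gradual_step
    return min(step, offset_remaining)
```

The gating on eye speed stands in for saccade detection, and `eye_closed` for blink detection; a real implementation would also need latency compensation for the eye tracker.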

Authors
André Zenner
Saarland University, Saarland Informatics Campus, Saarbrücken, Germany
Chiara Karr
Saarland University, Saarland Informatics Campus, Saarbrücken, Germany
Martin Feick
German Research Center for Artificial Intelligence (DFKI), Saarland Informatics Campus, Saarbrücken, Germany
Oscar Ariza
Universität Hamburg, Hamburg, Germany
Antonio Krüger
Saarland University, Saarland Informatics Campus, Saarbrücken, Germany
Paper URL

doi.org/10.1145/3613904.3642073

Video
TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only
Abstract

TriPad enables opportunistic touch interaction in Augmented Reality using hand tracking only. Users declare the surface they want to appropriate with a simple hand tap gesture. They can then use this surface at will for direct and indirect touch input. TriPad only involves analyzing hand movements and postures, without the need for additional instrumentation, scene understanding or machine learning. TriPad thus works on a variety of flat surfaces, including glass. It also ensures low computational overhead on devices that typically have a limited power budget. We describe the approach, and report on two user studies. The first study demonstrates the robustness of TriPad's hand movement interpreter on different surface materials. The second study compares TriPad against direct mid-air AR input techniques on both discrete and continuous tasks and with different surface orientations. TriPad achieves a better speed-accuracy trade-off overall, improves comfort and minimizes fatigue.
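A minimal sketch of the "declare a surface, then detect touch" idea: once the tap gesture has established a plane, later fingertip positions count as touches when they lie within a small distance of it. This is an assumption-laden illustration, not the TriPad implementation; the threshold and function names are hypothetical.

```python
# Minimal sketch (assumptions, not TriPad's code): after a tap declares a
# surface, store its plane and classify fingertips within a small signed
# distance of the plane as touching.

from typing import Tuple

Vec3 = Tuple[float, float, float]

def plane_distance(point: Vec3, plane_point: Vec3, plane_normal: Vec3) -> float:
    """Signed distance from point to the plane (normal assumed unit-length)."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

def is_touching(fingertip: Vec3, plane_point: Vec3, plane_normal: Vec3,
                threshold_m: float = 0.01) -> bool:
    return abs(plane_distance(fingertip, plane_point, plane_normal)) <= threshold_m

print(is_touching((0.0, 0.005, 0.0), (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))  # True
```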

Authors
Camille Dupré
Université Paris-Saclay, CNRS, Inria, Gif-sur-Yvette, France
Caroline Appert
Université Paris-Saclay, CNRS, Inria, Orsay, France
Stéphanie Rey
Berger-Levrault, Toulouse, France
Houssem Saidi
Carl Berger-Levrault, Paris, France
Emmanuel Pietriga
Université Paris-Saclay, CNRS, Inria, Gif-sur-Yvette, France
Paper URL

doi.org/10.1145/3613904.3642323

Video
Flicker Augmentations: Rapid Brightness Modulation for Real-World Visual Guidance using Augmented Reality
Abstract

Providing attention guidance, such as assisting in search tasks, is a prominent use for Augmented Reality. Typically, this is achieved by graphically overlaying geometrical shapes such as arrows. However, providing visual guidance can cause side effects such as attention tunnelling or scene occlusions, and introduce additional visual clutter. Alternatively, visual guidance can adjust saliency, but this comes with different challenges such as hardware requirements and environment-dependent parameters. In this work we advocate for using flicker as an alternative for real-world guidance using Augmented Reality. We provide evidence for the effectiveness of flicker from two user studies. The first compared flicker against alternative approaches in a highly controlled setting, demonstrating efficacy (N = 28). The second investigated flicker in a practical task, demonstrating feasibility with higher ecological validity (N = 20). Finally, our discussion highlights the opportunities and challenges when using flicker to provide real-world visual guidance using Augmented Reality.
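Flicker as rapid brightness modulation can be sketched as a periodic gain applied to the brightness of the region to highlight. The square-wave form, frequency, and modulation depth below are placeholder assumptions, not the parameters evaluated in the paper.

```python
# Illustrative sketch of flicker as rapid brightness modulation: a
# square-wave multiplier applied to a highlighted region's brightness.
# Frequency and depth are placeholders, not the paper's values.

def flicker_gain(t_seconds: float, frequency_hz: float = 10.0,
                 depth: float = 0.5) -> float:
    """Return a brightness multiplier alternating between 1.0 and 1 - depth."""
    phase = (t_seconds * frequency_hz) % 1.0
    return 1.0 if phase < 0.5 else 1.0 - depth

print(flicker_gain(0.0), flicker_gain(0.07))  # 1.0 0.5
```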

Authors
Jonathan Sutton
University of Copenhagen, Copenhagen, Denmark
Tobias Langlotz
University of Otago, Dunedin, New Zealand
Alexander Plopski
TU Graz, Graz, Austria
Kasper Hornbæk
University of Copenhagen, Copenhagen, Denmark
Paper URL

doi.org/10.1145/3613904.3642085

Video