Vibration Vibes

Conference Name
CHI 2025
ReachPad: Interacting with Multiple Virtual Screens using a Single Physical Pad through Haptic Retargeting
Abstract

The advancement of Virtual Reality (VR) has expanded 2D user interfaces into 3D space. This change has introduced richer interaction modalities but also brought challenges, especially the lack of haptic feedback in mid-air interactions. Previous research has explored various methods to provide feedback for interface interactions, but most approaches require specialized haptic devices. We introduce haptic retargeting to enable users to control multiple virtual screens in VR using a simple flat pad, which serves as a single physical proxy to support seamless interaction across multiple virtual screens. We conducted user studies to explore the appropriate virtual screen size and positioning under our retargeting method and then compared various drag-and-drop methods for cross-screen interaction. Finally, we compared our method with controller-based interaction in application scenarios.
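Haptic retargeting of this kind is often implemented as body warping: the rendered virtual hand is gradually offset toward the targeted virtual screen as the physical hand approaches the pad, so both arrive at the same moment. The sketch below shows that standard formulation only; the paper's actual warping function is not given in the abstract, and the function name, positions, and `shift_start` radius are illustrative assumptions.

```python
import numpy as np

def retargeted_hand(hand, phys_target, virt_target, shift_start=0.4):
    """Body-warping haptic retargeting (illustrative sketch).

    The rendered hand is shifted toward the virtual target as the
    physical hand nears the physical pad, reaching the full offset
    exactly at contact. `shift_start` (metres, assumed) is the
    distance at which warping begins.
    """
    d = np.linalg.norm(phys_target - hand)            # distance to the pad
    alpha = np.clip(1.0 - d / shift_start, 0.0, 1.0)  # 0 far away, 1 at contact
    offset = virt_target - phys_target                # pad-to-screen displacement
    return hand + alpha * offset
```

At contact the virtual hand coincides with the virtual screen; beyond `shift_start` the hand is rendered unwarped.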

Authors
Han Shi
Fudan University, Shanghai, China
Hanzhong Luo
Tsinghua University, Beijing, China
HyeonBeom Yi
Electronics and Telecommunications Research Institute, Daejeon, Korea, Republic of
Seungwoo Je
SUSTech, Shenzhen, China
DOI

10.1145/3706598.3713629

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713629

Video
ChatHAP: A Chat-Based Haptic System for Designing Vibrations through Conversation
Abstract

In contrast to design tools for graphics and audio generation from text prompts, haptic design tools lag behind due to challenges in constructing large-scale, high-quality datasets including vibrations and text descriptions. To address this gap, we propose ChatHAP, a conversational haptic system for designing vibrations. ChatHAP integrates various haptic design approaches using a large language model, including generating vibrations using signal parameters, navigating through libraries, and modifying existing vibrations. To further improve vibration navigation, we present an algorithm that adaptively learns user preferences for vibration features. A user study with novices (n=20) demonstrated that ChatHAP can serve as a practical design tool, and the proposed algorithm significantly reduced task completion time (38%), prompt quantity (25%), and verbosity (36%). The study found ChatHAP easy to use and identified requirements for chat-based haptic design as well as features for further improvement. Finally, we present key findings with ChatHAP and discuss implications for future work.
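"Generating vibrations using signal parameters" typically means synthesizing a waveform from amplitude, frequency, and duration. The sketch below is a minimal assumed stand-in for such a generator (sine carrier with a short fade envelope); it is not ChatHAP's actual synthesis code, and the 10 ms ramp is an illustrative choice.

```python
import numpy as np

def vibration(amplitude, frequency, duration, sr=8000, ramp=0.01):
    """Synthesize a vibration from basic signal parameters (sketch).

    A sine carrier at `frequency` Hz is shaped by a linear fade-in/out
    envelope (`ramp` seconds, assumed) to avoid clicks at the edges.
    """
    t = np.arange(int(sr * duration)) / sr
    env = np.clip(np.minimum(t, duration - t) / ramp, 0.0, 1.0)
    return amplitude * env * np.sin(2 * np.pi * frequency * t)
```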

Authors
Chungman Lim
Gwangju Institute of Science and Technology, Gwangju, Korea, Republic of
Kevin John
Arizona State University, Tempe, Arizona, United States
Gyungmin Jin
Gwangju Institute of Science and Technology, Gwangju, Korea, Republic of
Hasti Seifi
Arizona State University, Tempe, Arizona, United States
Gunhyuk Park
Gwangju Institute of Science and Technology, Gwangju, Korea, Republic of
DOI

10.1145/3706598.3713441

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713441

Video
Tendon Vibration for Creating Movement Illusions in Virtual Reality
Abstract

Tendon vibration can create movement illusions: vibrating the biceps tendon induces an illusion of extending the arm, while vibrating the triceps tendon induces an illusion of flexing the arm. However, it is unclear how to create and integrate such illusions shown in neuroscience to interaction techniques in virtual reality (VR). We first design a motor setup for tendon vibration. Study 1 validates that the setup induces movement illusions which on average create a 5.26 cm offset in active arm movements. Study 2 shows that tendon vibration improves the detection thresholds of visual motion gains often used in VR interaction techniques by 0.22. A model we developed in Study 2 predicts the effects of tendon vibration and is used in a biomechanical simulation to demonstrate the detection thresholds across typical reaching tasks in VR.
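Visual motion gain in VR reach techniques conventionally scales the rendered hand's displacement relative to the physical hand's: a gain of 1.2 makes the virtual hand travel 20% farther. The one-liner below shows that standard formulation; the abstract does not give the study's exact formula, so treat the function name and parameters as assumptions.

```python
def rendered_position(physical_pos, origin, gain):
    """Apply a visual motion gain (standard VR formulation, sketch).

    The virtual hand moves `gain` times as far from `origin` as the
    physical hand does; gain 1.0 is veridical rendering.
    """
    return origin + gain * (physical_pos - origin)
```

A detection-threshold improvement of 0.22 then means users fail to notice gains that are 0.22 farther from 1.0 than without tendon vibration.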

Award
Honorable Mention
Authors
Mantas Cibulskis
University of Copenhagen, Copenhagen, Denmark
Difeng Yu
University of Copenhagen, Copenhagen, Denmark
Erik Skjoldan Mortensen
University of Copenhagen, Copenhagen, Denmark
Waseem Hassan
University of Copenhagen, Copenhagen, Denmark
Mark Schram Christensen
University of Copenhagen, Copenhagen, Denmark
Joanna Bergström
University of Copenhagen, Copenhagen, Denmark
DOI

10.1145/3706598.3714003

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3714003

Video
Gesture and Audio-Haptic Guidance Techniques to Direct Conversations with Intelligent Voice Interfaces
Abstract

Advances in large language models (LLMs) empower new interactive capabilities for wearable voice interfaces, yet traditional voice-and-audio I/O techniques limit users' ability to flexibly navigate information and manage timing for complex conversational tasks. We developed a suite of gesture and audio-haptic guidance techniques that enable users to control conversation flows and maintain awareness of possible future actions, while simultaneously contributing and receiving conversation content through voice and audio. A 14-participant exploratory study compared our parallelized I/O techniques to a baseline of voice-only interaction. The results demonstrate the efficiency of gestures and haptics for information access, while allowing system speech to be redirected and interrupted in a socially acceptable manner. The techniques also raised user awareness of how to leverage intelligent capabilities. Our findings inform design recommendations to facilitate role-based collaboration between multimodal I/O techniques and reduce users' perception of time pressure when interleaving interactions with system speech.

Authors
Shwetha Rajaram
Meta, Toronto, Ontario, Canada
Hemant Bhaskar Surale
Meta, Toronto, Ontario, Canada
Codie McConkey
Meta, Toronto, Ontario, Canada
Carine Rognon
Meta, Redmond, Washington, United States
Hrim Mehta
Meta, Toronto, Ontario, Canada
Michael Glueck
Meta, Toronto, Ontario, Canada
Christopher Collins
Meta, Toronto, Ontario, Canada
DOI

10.1145/3706598.3714310

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3714310

Video
Motion-Coupled Asymmetric Vibration for Pseudo Force Rendering in Virtual Reality
Abstract

In Virtual Reality (VR), rendering realistic forces is crucial for immersion, but traditional vibrotactile feedback fails to convey force sensations effectively. Studies of asymmetric vibrations that elicit pseudo forces show promise but are inherently tied to unwanted vibrations, reducing realism. Leveraging sensory attenuation to reduce the perceived intensity of self-generated vibrations during user movement, we present a novel algorithm that couples asymmetric vibrations with user motion, which mimics self-generated sensations. Our psychophysics study with 12 participants shows that motion-coupled asymmetric vibration attenuates the experience of vibration (equivalent to a ~30% reduction in vibration-amplitude) while preserving the experience of force, compared to continuous asymmetric vibrations (state-of-the-art). We demonstrate the effectiveness of our approach in VR through three scenarios: shooting arrows, lifting weights, and simulating haptic magnets. Results revealed that participants preferred forces elicited by motion-coupled asymmetric vibration for tasks like shooting arrows and lifting weights. This research highlights the potential of motion-coupled asymmetric vibrations, offers new insights into sensory attenuation, and advances force rendering in VR.
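An asymmetric vibration in this literature is a zero-mean waveform with a short, strong phase in one direction and a longer, weaker return phase; the asymmetry biases the perceived force toward the strong phase while the actuator has no net displacement. The sketch below generates one such period under those well-known properties only; the paper's actual waveform, and the `sr`, `f`, and `duty` values, are not given in the abstract and are illustrative.

```python
import numpy as np

def asymmetric_pulse(sr=8000, f=40, duty=0.25):
    """One period of an asymmetric vibration waveform (illustrative).

    A short strong pull (`duty` fraction of the period) is balanced by
    a longer, weaker return whose amplitude is chosen so the period
    integrates to zero.
    """
    n = int(sr / f)                   # samples per period
    n_pull = int(n * duty)            # short strong phase
    n_ret = n - n_pull                # long weak phase
    a_pull = 1.0
    a_ret = a_pull * n_pull / n_ret   # zero-mean condition
    return np.concatenate([np.full(n_pull, a_pull),
                           np.full(n_ret, -a_ret)])
```

The motion coupling described in the abstract would additionally gate when such pulses play, triggering them only during user movement.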

Award
Honorable Mention
Authors
Nihar Sabnis
Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany
Maëlle Roche
École Supérieure d'Ingénieur Léonard de Vinci, Courbevoie, France
Dennis Wittchen
Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany
Donald Degraen
University of Canterbury, Christchurch, New Zealand
Paul Strohmeier
Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany
DOI

10.1145/3706598.3713358

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713358

Video
VibraForge: A Scalable Prototyping Toolkit For Creating Spatialized Vibrotactile Feedback Systems
Abstract

Spatialized vibrotactile feedback systems deliver tactile information by placing multiple vibrotactile actuators on the body. As increasing numbers of actuators are required to adequately convey information in complicated applications, haptic designers find it difficult to create such systems due to limited scalability of existing toolkits. We propose VibraForge, an open-source vibrotactile toolkit that supports up to 128 vibrotactile actuators. Each actuator is encapsulated within a self-contained vibration unit and driven by its own microcontroller. By leveraging a chain-connection method, each unit receives independent vibration commands from a control unit, with fine-grained control over intensity and frequency. We also designed a GUI Editor to expedite the authoring of spatial vibrotactile patterns. Technical evaluation showed that vibration units reliably reproduced audio waveforms with low-latency and high-bandwidth data communication. Case studies of a phonemic tactile display, virtual reality fitness training, and drone teleoperation demonstrated the potential usage of VibraForge within different domains. A usability study with non-expert users highlighted the low technical barrier and customizability of the toolkit.
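Chain-connected actuator buses like the one described typically address each unit with a per-unit command carrying an ID, intensity, and frequency. The framing below is purely a hypothetical illustration of such addressed commands; VibraForge's actual wire protocol is not described in the abstract, and the sync byte, field order, and one-byte fields are invented for the sketch.

```python
def encode_command(unit_id, intensity, frequency):
    """Pack a hypothetical per-unit vibration command (sketch).

    One sync byte, one address byte (0-127, matching the toolkit's
    128-unit limit), one intensity byte, one frequency byte.
    """
    if not 0 <= unit_id < 128:
        raise ValueError("unit_id must be in 0..127")
    return bytes([0xAA, unit_id, intensity & 0xFF, frequency & 0xFF])
```

In a chain topology, every unit would read the address byte and act only on commands matching its own ID, forwarding the rest downstream.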

Authors
Bingjian Huang
University of Toronto, Toronto, Ontario, Canada
Siyi Ren
University of Toronto, Toronto, Ontario, Canada
Yuewen Luo
University of Toronto, Toronto, Ontario, Canada
Qilong Cheng
University of Toronto, Toronto, Ontario, Canada
Hanfeng Cai
University of Toronto, Toronto, Ontario, Canada
Yeqi Sang
University of Toronto, Toronto, Ontario, Canada
Mauricio Sousa
Reality Labs Research at Meta, Toronto, Ontario, Canada
Paul H. Dietz
University of Toronto, Toronto, Ontario, Canada
Daniel Wigdor
University of Toronto, Toronto, Ontario, Canada
DOI

10.1145/3706598.3714273

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3714273

Video
CollabJam: Studying Collaborative Haptic Experience Design for On-Body Vibrotactile Patterns
Abstract

Designing vibrotactile experiences collaboratively requires communicating using multiple senses. This is challenging in remote scenarios as designers need to effectively express and communicate their intention while iteratively building and refining experiences, ideally in real-time. We formulate design considerations for collaborative haptic design tools, and propose CollabJam, a collaborative prototyping suite enabling remote synchronous design of vibrotactile experiences for on-body applications. We first outline CollabJam’s features and present a technical evaluation. Second, we use CollabJam to understand communication and design patterns used during haptic experience design. We performed an in-depth design evaluation spanning four sessions in which four pairs of participants designed and reviewed vibrotactile experiences remotely. A qualitative content analysis revealed how multi-sensory communication is essential to convey ideas, how stimulating the tactile sense can interfere with personal boundaries, and how freely placing actuators on the skin can provide both benefits and challenges.

Award
Honorable Mention
Authors
Dennis Wittchen
Dresden University of Applied Sciences, Dresden, Saxony, Germany
Alexander Ramian
Dresden University of Applied Sciences, Dresden, Saxony, Germany
Nihar Sabnis
Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany
Richard Böhme
Dresden University of Applied Sciences, Dresden, Saxony, Germany
Christopher Chlebowski
Dresden University of Applied Sciences, Dresden, Saxony, Germany
Georg Freitag
Dresden University of Applied Sciences, Dresden, Saxony, Germany
Bruno Fruchard
Univ. Lille, Inria, CNRS, Centrale Lille, F-59000 Lille, France
Donald Degraen
University of Canterbury, Christchurch, New Zealand
DOI

10.1145/3706598.3713469

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713469

Video