SensoryBlox: Plug-and-Feel Modular Multi-Sensory User Interface for Immersive Cardboard VR

Abstract

We present SensoryBlox, a modular, multi-sensory user interface designed for integration with cardboard-based virtual reality (VR) head-mounted displays (HMDs). SensoryBlox features interchangeable sensory modules—vibration, temperature, wind, and olfactory—that let users assemble customized multi-sensory configurations tailored to diverse VR contexts. The system includes in-VR interfaces for module scanning, spatial tracking, and real-time customization of feedback patterns. To inform the design of SensoryBlox, we conducted three user studies. The first study explored application scenarios and associated sensory modalities to identify design requirements for a modular multi-sensory VR system. Based on these findings, we developed the hardware modules and in-VR software interfaces. In the second study, we evaluated the usability and interaction experience of SensoryBlox across all of its functionalities. Finally, a comparison study examined the impact of multi-sensory feedback on user experience. Our findings demonstrate the potential of a modular multi-sensory system to enrich immersion and engagement in low-cost VR environments.

Authors
Hyunjae Gil
The University of Texas at Dallas, Richardson, Texas, United States
Abbas Khawaja
The University of Texas at Dallas, Richardson, Texas, United States
Ben Cressman
The University of Texas at Dallas, Richardson, Texas, United States
Andrew Gerungan
The University of Texas at Dallas, Richardson, Texas, United States
Jin Ryong Kim
The University of Texas at Dallas, Richardson, Texas, United States

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Embodied Interaction and Wearables

P1 - Room 133
7 presentations
2026-04-15, 18:00–19:30