Haptixel: Encoding Data through Cutaneous Force-based Encountered-Type Fingertip Haptics

Abstract

Data visualization benefits from non-visual cues that let people understand information by engaging with it multimodally, yet most approaches rely on cumbersome technologies or large-scale artifacts, making them difficult to adapt to dynamic or complex datasets. In this paper, we explore cutaneous haptics as a lightweight quantitative channel for visualization tasks, allowing users to feel data and interact with it dynamically. We present Haptixel, an open-source DIY encountered-type wearable that provides force feedback on the pulp of the user's fingertips. We propose an interaction framework illustrating how Haptixel can complement visualization tasks through combinations of force levels and contact types. We evaluate our approach in a pixel-art-like VR user study (n=16) in which pixel color/height is associated with force as a univariate value mapping. Results show that participants can retrieve information with Haptixel and significantly discriminate 3D data across at least four force levels, suggesting that cutaneous force feedback can support quantitative distinctions in visualization tasks.

Authors
Elodie Bouzbib
Universidad Publica de Navarra, Pamplona, Spain
Louis Badr
De Vinci Higher Education, Research Center, Courbevoie, France
Claudio Pacchierotti
CNRS, Rennes, France
Anatole Lécuyer
Inria, Rennes, France
Arnaud Prouzeau
Université Paris-Saclay, Inria, CNRS, Paris, France

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Physical and Tangible Data Visualizations

P1 - Room 119
7 presentations
2026-04-15, 18:00–19:30