Data visualization benefits from non-visual cues that let people engage with information through multiple modalities, yet most approaches rely on cumbersome technologies or large-scale artifacts, making them difficult to adapt to dynamic or complex datasets. In this paper, we explore cutaneous haptics as a lightweight quantitative channel for visualization tasks, allowing users to feel data and interact with it dynamically. We present Haptixel, an open-source DIY encountered-type wearable that provides force feedback on the pulp of the user's fingertip. We propose an interaction framework illustrating how Haptixel can complement visualization tasks through combinations of force levels and contact types. We evaluate our approach in a pixel-art-like VR user study (n=16) in which each pixel's color/height is associated with a force as a univariate value mapping. Results show that participants can retrieve information with Haptixel and can significantly discriminate 3D data rendered with at least four force levels, suggesting that cutaneous force feedback can support quantitative distinctions in visualization tasks.
ACM CHI Conference on Human Factors in Computing Systems
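The univariate value mapping described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function names, the normalized input range, and the quantization scheme are all assumptions; the only detail taken from the abstract is that a pixel's value (color/height) maps to one of a small number of discriminable force levels, with four levels shown to be distinguishable.

```python
# Hypothetical sketch of a univariate pixel-value-to-force mapping.
# Assumptions: pixel values are normalized to [0, 1], and values are
# quantized to n_levels discrete force levels (the study suggests
# participants can discriminate at least four such levels).

def value_to_force_level(value: float, n_levels: int = 4) -> int:
    """Quantize a normalized pixel value in [0, 1] to a force level index."""
    if not 0.0 <= value <= 1.0:
        raise ValueError("value must be normalized to [0, 1]")
    # Levels run from 0 to n_levels - 1; clamp the upper edge so 1.0
    # falls into the top level rather than overflowing.
    return min(int(value * n_levels), n_levels - 1)
```

For example, with the default four levels, a pixel value of 0.5 falls into level 2, while 0.0 and 1.0 map to the lowest and highest levels respectively.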