Sensory-substitution devices enable perceiving objects by translating one sensory modality (e.g., vision) into another (e.g., touch). While many prior works have explored the placement of the haptic output (e.g., on the torso or forehead), the camera's location remains largely unexplored: devices typically "see" from the eyes' perspective. Instead, we propose that seeing and feeling information from the hands' perspective could enhance the flexibility and expressivity of sensory-substitution devices to support manual interactions with physical objects. To this end, we engineered a back-of-the-hand electrotactile display that renders tactile images from a wrist-mounted camera, allowing the user's hand to feel objects while reaching and hovering. We conducted a study in which sighted and Blind-or-Low-Vision participants used our eyes vs. hands tactile perspectives to manipulate everyday objects, such as bottles and soldering irons. We found that while both tactile perspectives provided comparable performance, when offered the opportunity to choose, all participants found value in also using the hands' perspective. Moreover, when participants were "seeing with the hands," we observed behaviors that suggest more ergonomic object manipulation. We believe these insights extend the landscape of sensory-substitution devices.
https://dl.acm.org/doi/10.1145/3706598.3713419
Published at the ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)