Providing tactile feedback when users touch virtual interfaces has been a seminal advance. However, we posit that these advances have been explored in isolation from users’ physical interactions with surrounding objects. Most touch devices were designed to optimize virtual interfaces but rarely consider that users also need to feel physical interfaces (e.g., tools, or putting on/taking off a headset). We argue against this being the sole design objective driving haptic interfaces; instead, we propose also optimizing the fidelity of the real-world sensations that users feel while wearing a haptic device. We propose a framework that classifies touch devices by measuring not only their ability to deliver virtual feedback but also how much they impair physical feedback—we argue this balancing act is an urgent mainstream need, given the success of Mixed Reality. Thus, to accelerate research in this area, we synthesize existing techniques into new conceptual categories: feel-through, on-demand, relocated, and remote actuators. Finally, we present their pros and cons and discuss a possible roadmap.
ACM CHI Conference on Human Factors in Computing Systems