Unlike visual and auditory media, physical sensations are difficult to create and capture, limiting the availability of diverse haptic content. Converting common media formats such as video into haptics offers a promising solution, but existing video-to-haptics methods depend on specific video characteristics, such as camera motion or predefined actions, and rely on spatial haptic hardware (e.g., a motion chair or haptic vest). We introduce HapticLens, an interactive method for creating haptics from video, supported by an open-source GUI and two vision algorithms. Our method works with arbitrary video content, detects subtle motion, and requires only a single vibrotactile actuator. We evaluate HapticLens through technical experiments and a study with 22 participants. Results demonstrate that it supports interactive vibration design, with high designer satisfaction with its usability and with the overall quality and relevance of the resulting haptic signals. This work broadens the accessibility of video-driven haptics, offering a practical method to create and experience tactile content.
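The pipeline the abstract describes — detecting motion in video and driving a single vibrotactile actuator — can be illustrated with a minimal sketch. The frame-differencing below is a generic stand-in assumed for illustration, not HapticLens's actual vision algorithms, and the function name `motion_envelope` is hypothetical:

```python
def motion_envelope(frames):
    """Map per-frame motion magnitude to a vibration amplitude envelope.

    frames: list of grayscale frames, each a 2D list of pixel values (0-255).
    Returns one amplitude in [0, 1] per consecutive frame pair, scaled so
    the strongest motion maps to full vibration intensity.

    NOTE: frame differencing is a simple illustrative proxy for motion;
    the paper's vision algorithms are more sophisticated.
    """
    diffs = []
    for prev, curr in zip(frames, frames[1:]):
        # Mean absolute pixel difference, normalized to [0, 1].
        total = sum(
            abs(c - p)
            for prev_row, curr_row in zip(prev, curr)
            for p, c in zip(prev_row, curr_row)
        )
        n_pixels = len(prev) * len(prev[0])
        diffs.append(total / (n_pixels * 255.0))
    peak = max(diffs) or 1.0  # avoid division by zero for static video
    return [d / peak for d in diffs]


# Tiny synthetic example: a 2x2 video where motion grows over time.
frames = [
    [[0, 0], [0, 0]],          # static start
    [[255, 0], [0, 0]],        # one pixel changes
    [[255, 255], [255, 0]],    # two pixels change
]
envelope = motion_envelope(frames)  # larger motion -> higher amplitude
```

In a real system, each envelope value would modulate the drive amplitude of the vibrotactile actuator over the corresponding frame interval.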
ACM CHI Conference on Human Factors in Computing Systems