Emerging AR applications require seamless integration of the virtual and physical worlds, which calls for tools that support both passive perception and active manipulation of the environment, enabling bidirectional interaction. We introduce EchoSight, a system for AR glasses that enables efficient look-and-control bidirectional interaction. EchoSight exploits optical wireless communication to instantaneously connect virtual data with its physical counterpart. Its dual-element optical design leverages beam directionality to automatically align the user's gaze with target objects, cutting overhead in both target identification and subsequent communication. This streamlines user interaction, lowering cognitive load and enhancing engagement. Our evaluations demonstrate EchoSight's effectiveness for room-scale communication, achieving distances up to 5 m and viewing angles up to 120 degrees. A study with 12 participants confirms EchoSight's improved efficiency and user experience over traditional methods, such as QR code scanning and voice control, in AR IoT applications.
https://dl.acm.org/doi/10.1145/3706598.3713925
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)