The People's Gaze: Co-Designing and Refining Gaze Gestures with Users and Experts

Abstract

As eye-tracking becomes increasingly common in modern mobile devices, the potential for hands-free, gaze-based interaction grows, but current gesture sets are largely expert-designed and often misaligned with how users naturally move their eyes. To address this gap, we introduce a two-phase methodology for developing intuitive gaze gestures. First, four co-design workshops with 20 non-expert participants generated 102 initial concepts. Next, four gaze interaction experts reviewed and refined these into a set of 32 gestures. We found that non-experts, after a brief introduction, intuitively anchor gestures in familiar metaphors and develop a compositional grammar, i.e., activation (dwell) + action (gaze gesture or blink), to ensure intentionality and mitigate the classic Midas Touch problem. Experts prioritized gestures that are ergonomically sound, aligned with natural saccades, and reliably distinguishable. The resulting user-grounded, expert-validated gesture set, along with actionable design principles, provides a foundation for developing intuitive, hands-free interfaces for gaze-enabled devices.

Award
Honorable Mention
Authors
Yaxiong Lei
University of St Andrews, St Andrews, United Kingdom
Xinya Gong
University of St Andrews, Fife, United Kingdom
Shijing He
King's College London, London, United Kingdom
Yafei Wang
Dalian Maritime University, Dalian, Liaoning, China
Mohamed Khamis
University of Glasgow, Glasgow, United Kingdom
Juan Ye
University of St Andrews, St Andrews, United Kingdom

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Gaze as Input

P1 - Room 124
6 presentations
2026-04-15, 18:00–19:30