User authentication on head-mounted displays (HMDs) typically relies on passwords, which are cumbersome to input and susceptible to shoulder-surfing attacks. Recent research has shown that behavioral signals collected during common HMD tasks are highly distinctive across users. Building on these findings, this paper presents a knowledge-driven behavioral authentication system for HMDs. Our system leverages user-defined gestures as cues and trains a per-user anomaly detector on hand joint motion signals. To evaluate effectiveness, we conducted a comprehensive multi-session user study (n=20) and an observation attack study (n=10). The results show that gestures secured with joint motions are resilient to worst-case video-based observation attacks (AUC = 0.97, EER = 3.58%) and maintain high recall performance over one week (AUC = 0.93, EER = 9.82%). These findings suggest that user-generated biometric hand gestures offer a promising approach to securing HMDs.
ACM CHI Conference on Human Factors in Computing Systems