Camera-based facial-gesture interfaces offer hands-free access for people with motor impairments (PwM), yet most recognition models are trained on able-bodied data and implicitly assume normative motor control and proprioception. We conducted a mixed-methods empirical study of 37 above-neck gestures performed by 11 PwM and 11 non-impaired participants. Results reveal systematic mismatches between user intention and model recognition in the PwM group; these mismatches stem from diverse patterns of body perception and motor control, lead to allocative harms, and are concentrated in low-amplitude, asymmetric, and directional gestures. Building on these findings, we introduce FairGesture, a diagnostic auditing method for quantifying and interpreting such mismatches. FairGesture combines (1) the Perception Gap metric, (2) trajectory-based motion analysis, and (3) an analysis of users' sensorimotor feedback to explain why the mismatches arise. This work reframes accuracy in gesture recognition as a problem of sensorimotor alignment, advancing user-centred evaluation and inclusive model design.
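The abstract names the Perception Gap metric but does not define it. As a minimal illustrative sketch, assuming the metric measures, per gesture class, how often the model's recognized label disagrees with the user's intended label, it could be computed as below; the function name, signature, and example labels are hypothetical, not the authors' implementation.

```python
import numpy as np

def perception_gap(intended, recognized):
    """Hypothetical Perception Gap: per gesture class, the rate at which
    the model's recognized label disagrees with the user's intended label.

    intended, recognized: sequences of gesture-class labels, one entry per
    attempted gesture, aligned by trial.
    """
    intended = np.asarray(intended)
    recognized = np.asarray(recognized)
    gaps = {}
    for g in np.unique(intended):
        mask = intended == g
        # Fraction of attempts at gesture g that the model did not
        # recognize as g.
        gaps[str(g)] = float(np.mean(recognized[mask] != g))
    return gaps

# Illustrative usage: a low-amplitude brow raise is intended four times
# but recognized correctly only twice, yielding a gap of 0.5.
intended = ["brow_raise"] * 4 + ["smile"] * 3
recognized = ["brow_raise", "neutral", "brow_raise", "neutral",
              "smile", "smile", "smile"]
print(perception_gap(intended, recognized))
# {'brow_raise': 0.5, 'smile': 0.0}
```

A per-class (rather than aggregate) disagreement rate would let an audit surface exactly the low-amplitude, asymmetric, and directional gestures where the abstract reports mismatches concentrating.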