Beyond Accuracy: Auditing Allocative Harms in Facial-Gesture Recognition for People with Motor Impairments

To appear at ACM CHI 2026.

Siyu Zhang, Yelu Gu, Kirsten Cater & Oussama Metatla.

Camera-based facial-gesture interfaces offer hands-free access for people with motor impairments (PwM), yet most recognition models are trained on able-bodied data and implicitly assume normative motor control and proprioception. We conducted a mixed-methods empirical study of 37 above-neck gestures performed by 11 PwM and 11 non-impaired participants. Results reveal systematic mismatches between user intention and model recognition in the PwM group, stemming from diverse patterns of body perception and control and leading to allocative harms. These mismatches concentrated in low-amplitude, asymmetric, and directional gestures. Building on these findings, we introduce FairGesture, a diagnostic auditing method for quantifying and interpreting such mismatches. FairGesture combines (1) the Perception Gap metric, (2) trajectory-based motion analysis, and (3) an analysis of users' sensorimotor feedback, which explores the reasons behind these mismatches. This work reframes accuracy in gesture recognition as a problem of sensorimotor alignment, advancing user-centered evaluation and inclusive model design.
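The abstract does not define the Perception Gap metric. As a minimal illustrative sketch, assuming it captures the per-gesture rate at which a participant's intended gesture is not recognized as intended, and the difference in that rate between PwM and non-impaired groups (function names and the trial format are hypothetical, not from the paper):

```python
# Hypothetical sketch of a perception-gap-style audit metric.
# Assumption: each trial is a pair (intended_label, recognized_label);
# the group-level gap is the per-gesture difference in mismatch rate
# between PwM and non-impaired participants.

from collections import defaultdict

def mismatch_rate(trials):
    """Per-gesture fraction of attempts not recognized as intended."""
    attempts = defaultdict(int)
    misses = defaultdict(int)
    for intended, recognized in trials:
        attempts[intended] += 1
        if recognized != intended:
            misses[intended] += 1
    return {g: misses[g] / attempts[g] for g in attempts}

def perception_gap(pwm_trials, control_trials):
    """Per-gesture difference in mismatch rate: PwM minus control."""
    pwm = mismatch_rate(pwm_trials)
    ctl = mismatch_rate(control_trials)
    return {g: pwm.get(g, 0.0) - ctl.get(g, 0.0)
            for g in set(pwm) | set(ctl)}

pwm = [("blink", "blink"), ("blink", "none"), ("smile", "smile")]
ctl = [("blink", "blink"), ("smile", "smile")]
gap = perception_gap(pwm, ctl)  # blink: 0.5, smile: 0.0
```

A larger gap for a gesture class (here, "blink") would flag it as a site of potential allocative harm, consistent with the abstract's finding that mismatches concentrate in low-amplitude, asymmetric, and directional gestures.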