Hand Gesture Recognition for Blind Users by Tracking 3D Gesture Trajectory

Abstract

Hand gestures provide an alternative interaction modality for blind users and can be supported using commodity smartwatches without requiring specialized sensors. The enabling technology is an accurate gesture recognition algorithm, but almost all such algorithms are designed for sighted users. Our study shows that blind users' gestures differ considerably from those of sighted users, rendering current recognition algorithms unsuitable. Blind users' gestures also have high inter-user variance, making it difficult to learn gesture patterns without large-scale training data. Instead, we design a gesture recognition algorithm that operates on a 3D representation of the gesture trajectory, capturing motion in free space. Our insight is to extract a micro-movement in the gesture that is user-invariant and to use this micro-movement for gesture classification. To this end, we develop an ensemble classifier that combines image classification with geometric properties of the gesture. Our evaluation demonstrates 92% classification accuracy, surpassing the next best state-of-the-art method, which achieves 82%.
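
As a rough illustration of the ensemble idea described in the abstract, the Python sketch below soft-votes between a classifier over a rasterized image of the 3D gesture trajectory and a classifier over simple geometric descriptors. All function names, feature choices, and models here are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of an ensemble gesture classifier: one model sees a
# rasterized image of the 3D trajectory, the other sees geometric features.
# Feature and model choices are illustrative, not taken from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

def rasterize(trajectory, size=32):
    """Project a (T, 3) trajectory onto the x-y plane and draw it on a grid."""
    xy = np.asarray(trajectory, dtype=float)[:, :2]
    xy = xy - xy.min(axis=0)
    span = xy.max(axis=0)
    span[span == 0] = 1.0                      # avoid division by zero
    pix = ((xy / span) * (size - 1)).astype(int)
    img = np.zeros((size, size))
    img[pix[:, 1], pix[:, 0]] = 1.0
    return img.ravel()

def geometric_features(trajectory):
    """Simple shape descriptors: path length, net displacement, bounding box."""
    t = np.asarray(trajectory, dtype=float)
    path_len = np.linalg.norm(np.diff(t, axis=0), axis=1).sum()
    net = np.linalg.norm(t[-1] - t[0])
    bbox = t.max(axis=0) - t.min(axis=0)
    return np.concatenate([[path_len, net], bbox])

def fit_ensemble(trajectories, labels):
    """Train one classifier per view of the gesture."""
    X_img = np.stack([rasterize(t) for t in trajectories])
    X_geo = np.stack([geometric_features(t) for t in trajectories])
    img_clf = LogisticRegression(max_iter=1000).fit(X_img, labels)
    geo_clf = RandomForestClassifier(n_estimators=100).fit(X_geo, labels)
    return img_clf, geo_clf

def predict(img_clf, geo_clf, trajectory):
    """Soft-vote: average class probabilities from both views."""
    p_img = img_clf.predict_proba([rasterize(trajectory)])
    p_geo = geo_clf.predict_proba([geometric_features(trajectory)])
    return (p_img + p_geo).argmax()
```

Training the two views separately and averaging their class probabilities is one standard way to combine image and geometric evidence; the paper's user-invariant micro-movement extraction would precede or replace the hand-picked features assumed here.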

Authors
Prerna Khanna
Stony Brook University, Stony Brook, New York, United States
IV Ramakrishnan
Stony Brook University, Stony Brook, New York, United States
Shubham Jain
Stony Brook University, Stony Brook, New York, United States
Xiaojun Bi
Stony Brook University, Stony Brook, New York, United States
Aruna Balasubramanian
Stony Brook University, Stony Brook, New York, United States
Paper URL

doi.org/10.1145/3613904.3642602

Conference: CHI 2024

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)

Session: Hand Interaction

Room: 313B
5 presentations
2024-05-15 18:00–19:20