Gestures are a promising input modality for ambient computing, where conventional input modalities such as touchscreens are not available. Existing work has focused on gesture recognition using image sensors; however, their cost, high power consumption, and privacy concerns make them challenging as an always-on solution. This paper introduces an efficient gesture recognition technique using a miniaturized 60 GHz radar sensor. The technique recognizes four directional swipes and an omni-swipe using a radar chip (6.5 × 5.0 mm) integrated into a mobile phone. We developed a convolutional neural network model efficient enough for battery-powered, computationally constrained processors. Its model size and inference time are less than 1/5000 of those of an existing radar-based gesture recognition technique. Our evaluations with large-scale datasets consisting of 558,000 gesture samples and 3,920,000 negative samples demonstrate our algorithm’s efficiency, robustness, and readiness to be deployed outside of research laboratories.
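To make the idea of a compact CNN gesture classifier concrete, the sketch below shows a minimal model that maps a short window of radar frames to the five gesture classes mentioned in the abstract. This is not the paper's architecture: the input shape (8 frames of 16×16 range-Doppler bins), layer widths, and the handling of negative samples are illustrative assumptions only.

```python
# Hypothetical minimal sketch of a compact CNN over radar range-Doppler frames.
# Input shape, channel counts, and class handling are assumptions, not the
# paper's actual model.
import torch
import torch.nn as nn

NUM_CLASSES = 5  # four directional swipes + one omni-swipe


class TinyRadarCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Assumed input: (batch, frames=8, range_bins=16, doppler_bins=16),
        # treating the frame stack as input channels.
        self.features = nn.Sequential(
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),              # -> 16 x 8 x 8
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),      # -> 32 x 1 x 1
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)


if __name__ == "__main__":
    model = TinyRadarCNN()
    dummy = torch.randn(1, 8, 16, 16)     # one window of radar frames
    logits = model(dummy)
    print(logits.shape)                   # torch.Size([1, 5])
    print(sum(p.numel() for p in model.parameters()), "parameters")
```

A model of roughly this shape has only a few thousand parameters, which is the kind of footprint that would plausibly fit a battery-powered, always-on processor; the paper's actual model size and inference-time figures are those reported in the abstract, not this sketch.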
https://doi.org/10.1145/3411764.3445367
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2021.acm.org/)