This paper introduces PalateTouch, a hands-free earphone interaction system that uses acoustic sensing to detect gestures produced by tongue-to-palate contact. By transmitting Zadoff-Chu signals and analyzing ear canal transfer function features, PalateTouch captures subtle ear canal deformations and recognizes a variety of palate gestures for interaction. Our proposed palate-touch screening method keeps the system unaffected by unintended gestures arising from daily activities, and a calibration mechanism enables user-independent recognition. Using only an earphone's built-in microphone and speaker, the system distinguishes nine gestures with an average F1 score of 0.92 and a false alarm rate of 0.02 across diverse conditions with 16 participants. Additionally, we implemented real-time operation and conducted a user study with 11 participants to evaluate PalateTouch in a demo application. The results demonstrate the superior performance and high usability of PalateTouch.
https://dl.acm.org/doi/10.1145/3706598.3713211
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)
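As background for the sensing pipeline the abstract describes, below is a minimal Python sketch of how a Zadoff-Chu probe and frequency-domain transfer-function estimation could work in principle. The sequence length, root index, and toy channel are hypothetical illustrations chosen for the example; they are not the paper's actual probe parameters, which are not given here.

```python
import numpy as np

def zadoff_chu(u: int, n_zc: int) -> np.ndarray:
    """Generate a Zadoff-Chu sequence of odd length n_zc with root index u.

    ZC sequences have constant amplitude and an impulse-like periodic
    autocorrelation, which makes them well-suited as probe signals for
    channel (transfer function) estimation.
    """
    assert n_zc % 2 == 1 and np.gcd(u, n_zc) == 1
    n = np.arange(n_zc)
    return np.exp(-1j * np.pi * u * n * (n + 1) / n_zc)

# Hypothetical parameters for illustration only.
N_ZC, ROOT = 353, 7
tx = zadoff_chu(ROOT, N_ZC)

# Model the "ear canal" as a short FIR channel; circular convolution
# plus complex noise stands in for the speaker-to-microphone path.
rng = np.random.default_rng(0)
h_true = np.array([1.0, 0.5, 0.2, -0.1])        # toy impulse response
H_true = np.fft.fft(h_true, N_ZC)               # its transfer function
rx = np.fft.ifft(np.fft.fft(tx) * H_true)
rx = rx + 0.01 * (rng.standard_normal(N_ZC) + 1j * rng.standard_normal(N_ZC))

# Frequency-domain deconvolution: H(f) = RX(f) / TX(f). The ZC spectrum
# has constant magnitude, so the division is well conditioned.
H_est = np.fft.fft(rx) / np.fft.fft(tx)
print(np.max(np.abs(H_est - H_true)))           # small residual from noise
```

In a system like the one described, changes in the estimated transfer function over time (driven by ear canal deformation during tongue-palate gestures) would serve as the features for gesture screening and classification.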