In-ear EEG research has traditionally treated biological signals other than brainwaves, such as electromyography (EMG) and electrooculography (EOG), as unwanted noise to be removed. Instead of discarding these signals, we developed ID.EARS, a single-ear, dry-electrode device that uses them for real-time gesture input. We first identified the optimal position for EEG measurement around the ear using the Alpha Attenuation Response (AAR) test and collected the biological signals that occur alongside brainwaves at this location. Using these signals, we built a real-time artifact detection model that recognizes five gestures: blinking, left winking, right winking, teeth clenching, and chewing. The model achieved over 90% accuracy in cross-validation experiments. Building on this model and device, we propose several application scenarios, including music control, accessibility features, MR/XR control, and healthcare services. This approach extends the use of ear-EEG devices beyond healthcare, opening up possibilities for natural user interfaces.
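To make the described pipeline concrete, below is a minimal Python sketch of the two steps the abstract mentions: scoring a candidate electrode position with the Alpha Attenuation Response (alpha-band power with eyes closed versus eyes open) and cross-validating a five-class gesture classifier on windowed signal features. The sampling rate, window length, feature set, and random-forest classifier are assumptions made for illustration; the paper's actual model and parameters are not given here, and the placeholder data is used only so the sketch runs end to end.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 250  # sampling rate in Hz (assumed; not stated in the abstract)

def alpha_attenuation_ratio(eyes_closed, eyes_open, fs=FS):
    """Score one candidate electrode site with the Alpha Attenuation Response:
    ratio of 8-13 Hz band power with eyes closed vs. eyes open.
    A larger ratio indicates a stronger alpha signal at that position."""
    def alpha_power(x):
        f, pxx = welch(x, fs=fs, nperseg=min(len(x), 2 * fs))
        mask = (f >= 8) & (f <= 13)
        return np.trapz(pxx[mask], f[mask])
    return alpha_power(eyes_closed) / alpha_power(eyes_open)

def window_features(signal, fs=FS, win_s=0.5):
    """Slice a single-channel ear signal into fixed windows and compute simple
    features (RMS, line length, broad band powers). Illustrative only; the
    paper's feature set and window length are not given in the abstract."""
    win = int(win_s * fs)
    feats = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        f, pxx = welch(seg, fs=fs, nperseg=len(seg))
        powers = [np.trapz(pxx[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
                  for lo, hi in [(0, 4), (4, 8), (8, 13), (13, 30), (30, 45)]]
        feats.append([np.sqrt(np.mean(seg ** 2)),              # RMS amplitude
                      np.sum(np.abs(np.diff(seg)))] + powers)  # line length
    return np.array(feats)

if __name__ == "__main__":
    GESTURES = ["blink", "left_wink", "right_wink", "teeth_clench", "chew"]
    rng = np.random.default_rng(0)

    # AAR positioning check on placeholder eyes-closed / eyes-open segments.
    print("AAR ratio:", alpha_attenuation_ratio(rng.normal(size=10 * FS),
                                                rng.normal(size=10 * FS)))

    # Placeholder recordings: in practice each segment would be an ear-signal
    # recording of one gesture; random noise is used only so the sketch runs.
    X = np.vstack([window_features(rng.normal(size=30 * FS)) for _ in GESTURES])
    y = np.repeat(np.arange(len(GESTURES)), len(X) // len(GESTURES))

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(f"cross-validation accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```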
https://dl.acm.org/doi/10.1145/3706598.3714185
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)