Recent advances in muscle-computer interfaces (MCIs) have brought us closer to wearable EMG devices capable of accurate gesture recognition without the longstanding requirement for user-specific calibration data. However, much of this progress has relied on closed datasets, proprietary resources, and custom hardware, limiting accessibility for the broader research community. We take a step toward democratizing universal MCIs by showing that calibration-free gesture recognition can be achieved with open-source code, publicly available datasets, and commodity hardware. Using a 612-participant Myo Armband dataset to train foundational models, we demonstrate accurate cross-user performance for two real-time interaction tasks (inspired by recent closed-source state-of-the-art results): (1) 1D cursor control (mean acquisition time: 1.1 s) and (2) five-class discrete gesture recognition (error rate: 2%; response time: 1.0 s). For the first time, we contribute openly available calibration-free models and code for creating highly accurate MCIs, establishing a new foundation for future replication and extension.
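The core evaluation idea behind a calibration-free MCI — train on many users, then test on a held-out user with no calibration data from them — can be illustrated with a toy leave-one-user-out sketch. Everything below is a hypothetical illustration, not the paper's actual pipeline: the data are synthetic stand-ins for 8-channel Myo-style EMG features, and the nearest-centroid classifier is a deliberately simple placeholder for the foundational models described above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy dimensions (assumptions, not from the paper): 20 users,
# 5 gestures, 30 trials each, 8 EMG channels (Myo-style).
N_USERS, N_GESTURES, N_TRIALS, N_CH = 20, 5, 30, 8

# Synthetic features: each gesture has a template shared across users,
# plus a per-user offset (inter-user variability) and trial noise.
gesture_templates = rng.normal(0.0, 1.0, (N_GESTURES, N_CH))
user_offsets = rng.normal(0.0, 0.3, (N_USERS, N_CH))

def features(user, gesture, n):
    """Draw n synthetic feature vectors for one user/gesture pair."""
    return (gesture_templates[gesture] + user_offsets[user]
            + rng.normal(0.0, 0.2, (n, N_CH)))

def fit_centroids(X, y):
    """Per-gesture mean feature vector over the training pool."""
    return np.stack([X[y == g].mean(axis=0) for g in range(N_GESTURES)])

def predict(centroids, X):
    """Classify each row by its nearest gesture centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None], axis=-1)
    return d.argmin(axis=1)

# Leave-one-user-out: the held-out user contributes no training
# (i.e., no calibration) data at all.
accs = []
for held_out in range(N_USERS):
    Xtr, ytr, Xte, yte = [], [], [], []
    for u in range(N_USERS):
        for g in range(N_GESTURES):
            X = features(u, g, N_TRIALS)
            if u == held_out:
                Xte.append(X); yte += [g] * N_TRIALS
            else:
                Xtr.append(X); ytr += [g] * N_TRIALS
    centroids = fit_centroids(np.vstack(Xtr), np.array(ytr))
    accs.append((predict(centroids, np.vstack(Xte)) == np.array(yte)).mean())

print(f"mean cross-user accuracy: {np.mean(accs):.2f}")
```

Because the per-user offsets are small relative to the separation between gesture templates, a classifier fit on other users transfers to the held-out user — the same property that large multi-user datasets aim to provide for real EMG.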
ACM CHI Conference on Human Factors in Computing Systems