PianoBand: A Multimodal Wristband Interface for Portable Piano Interaction

Abstract

Traditional pianos are inherently non-portable, restricting everyday accessibility and on-demand creativity. Existing portable alternatives, which are largely vision-based with external cameras, suffer from limited range, occlusion, and unreliable contact detection. We present PianoBand, a wrist-worn system integrating an IMU, a miniature under-wrist RGB camera, and a printed keyboard sheet augmented with fiducial markers for reliable key mapping on any flat surface. Powered by a lightweight real-time IMU–vision pipeline, PianoBand enables high-fidelity piano interaction, supporting single notes, multi-finger chords, flexible fingering, dynamic velocity, and preliminary articulation techniques. Technical evaluation showed robust tap detection (over 99% accuracy) and accurate fingertip localization (8.90-pixel error), enabling precise note mapping. A comparative user study (N=15) further evaluated system performance, reporting high note accuracy, comparable to roll-up pianos and outperforming an XR piano, along with high ratings for portability, expressivity, and extensibility. Expert interviews highlighted broad application opportunities for piano-based experiences and music creation, suggesting future design directions.

Authors
Zhaoguo Wang
Tsinghua University, Beijing, China
Ziyuan Li
Tsinghua University, Beijing, China
Chentao Li
Department of Automation, Tsinghua University, Beijing, China
Zihang Ao
Tsinghua University, Beijing, China
Jianjiang Feng
Tsinghua University, Beijing, China
Jie Zhou
Department of Automation, BNRist, Tsinghua University, Beijing, China

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Music to My Ears

P1 - Room 132
7 presentations
2026-04-17, 20:15–21:45