New tangible input techniques are transforming human-computer interaction. Point-contact devices such as joysticks and buttons are simple and scalable, but they capture limited spatial information. In contrast, surface-based contact interfaces such as touchpads provide richer spatial input but require larger instrumented surfaces. We present MagBall, a magnetic-ball sensor that captures fine-grained interactions, including displacement and force, through the rotation of a magnet-embedded ball over a 3D Hall-effect sensor array. Our design localizes diverse physical interactions to a single point contact yet operates at multiple scales, from millimeters to meters. Our machine learning models infer displacement and force with root-mean-square errors of 0.15 mm and 0.67 N, respectively. Furthermore, our device supports interactions across diverse surfaces such as glass, metal, and human skin, without additional instrumentation. We demonstrate applications in stylus pens, wearable trackballs, and smart massage tools, which naturally align with the rolling mechanism of MagBall.
ACM CHI Conference on Human Factors in Computing Systems