We examine gestures performed with a class of input devices with distinctive properties in the wearables landscape, which we call "index-Finger Augmentation Devices" (iFADs). We introduce a four-level taxonomy to characterize the diversity of iFAD gestures, evaluate iFAD gesture articulation on a dataset of 6,369 gestures collected from 20 participants, and compute recognition accuracy rates. Our findings show that iFAD gestures are fast (1.84s on average), easy to articulate (an average rating of 1.52 on a difficulty scale from 1 to 5), and socially acceptable (81% willingness to use them in public places). We compare iFAD gestures with gestures performed using other devices (styli, touchscreens, game controllers) from several public datasets (39,263 gestures, 277 participants), and report that iFAD gestures are twice as fast as whole-body gestures and as fast as stylus and finger strokes performed on touchscreens.
DOI: https://doi.org/10.1145/3544548.3580928
Published at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2023, https://chi2023.acm.org/).