iFAD Gestures: Understanding Users' Gesture Input Performance with Index-Finger Augmentation Devices

Abstract

We examine gestures performed with a class of input devices with distinctive properties in the wearables landscape, which we call "index-Finger Augmentation Devices" (iFADs). We introduce a four-level taxonomy to characterize the diversity of iFAD gestures, evaluate iFAD gesture articulation on a dataset of 6,369 gestures collected from 20 participants, and compute recognition accuracy rates. Our findings show that iFAD gestures are fast (1.84 s on average), easy to articulate (an average rating of 1.52 on a difficulty scale from 1 to 5), and socially acceptable (81% willingness to use them in public places). We compare iFAD gestures with gestures performed using other devices (styli, touchscreens, game controllers) from several public datasets (39,263 gestures, 277 participants), and report that iFAD gestures are twice as fast as whole-body gestures and as fast as stylus and finger strokes performed on touchscreens.
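
As context for the recognition results mentioned above: a common way to compute recognition accuracy for stroke-gesture datasets is a nearest-neighbor template matcher in the style of the $-family recognizers, evaluated with cross-validation. The Python sketch below illustrates this general approach only; the gesture representation (2D point lists), the preprocessing steps, the function names, and the leave-one-out protocol are illustrative assumptions, not the recognizer or evaluation procedure used in the paper.

    import math

    def resample(points, n=32):
        # Drop consecutive duplicate points, then resample the path
        # to n points spaced equally along its arc length.
        pts = [points[0]] + [b for a, b in zip(points, points[1:]) if b != a]
        path_len = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
        if path_len == 0:
            return [pts[0]] * n
        interval = path_len / (n - 1)
        resampled, d, i = [pts[0]], 0.0, 1
        while i < len(pts):
            seg = math.dist(pts[i - 1], pts[i])
            if d + seg >= interval:
                t = (interval - d) / seg
                q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                     pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
                resampled.append(q)
                pts.insert(i, q)  # q becomes the next segment's start
                d = 0.0
            else:
                d += seg
            i += 1
        while len(resampled) < n:  # guard against floating-point shortfall
            resampled.append(pts[-1])
        return resampled

    def normalize(points):
        # Scale to a unit bounding box and move the centroid to the origin.
        xs, ys = zip(*points)
        w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
        scaled = [(x / w, y / h) for x, y in points]
        cx = sum(x for x, _ in scaled) / len(scaled)
        cy = sum(y for _, y in scaled) / len(scaled)
        return [(x - cx, y - cy) for x, y in scaled]

    def recognize(candidate, templates):
        # Nearest-neighbor match: return the label of the template whose
        # summed point-to-point distance to the candidate is smallest.
        c = normalize(resample(candidate))
        dist = lambda t: sum(math.dist(p, q) for p, q in zip(c, t))
        return min(templates, key=lambda lt: dist(lt[1]))[0]

    def accuracy(samples):
        # Leave-one-out accuracy over (label, points) samples; a true
        # user-independent evaluation would hold out whole participants.
        prepped = [(lbl, normalize(resample(pts))) for lbl, pts in samples]
        hits = sum(recognize(pts, prepped[:i] + prepped[i + 1:]) == lbl
                   for i, (lbl, pts) in enumerate(samples))
        return hits / len(samples)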

Award
Honorable Mention
Author
Radu-Daniel Vatavu
Ștefan cel Mare University of Suceava, Suceava, Romania
Paper URL

https://doi.org/10.1145/3544548.3580928

Conference: CHI 2023

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)

Session: Pointing and Icons

Hall D
6 presentations
2023-04-24 23:30:00 – 2023-04-25 00:55:00