Touchscreen Typing As Optimal Supervisory Control

Abstract

Traditionally, touchscreen typing has been studied in terms of motor performance. However, recent research has exposed a decisive role of visual attention being shared between the keyboard and the text area. Strategies for this are known to adapt to the task, design, and user. In this paper, we propose a unifying account of touchscreen typing, regarding it as optimal supervisory control. Under this theory, rules for controlling visuo-motor resources are learned via exploration in pursuit of maximal typing performance. The paper outlines the control problem and explains how visual and motor limitations affect it. We then present a model, implemented via reinforcement learning, that simulates co-ordination of eye and finger movements. Comparison with human data affirms that the model creates realistic finger- and eye-movement patterns and shows human-like adaptation. We demonstrate the model's utility for interface development in evaluating touchscreen keyboard designs.

Authors
Jussi P. P. Jokinen
Aalto University, Helsinki, Finland
Aditya Acharya
Aalto University, Espoo, Finland
Mohammad Uzair
Aalto University, Espoo, Finland
Xinhui Jiang
Kochi University of Technology, Kami, Kochi, Japan
Antti Oulasvirta
Aalto University, Helsinki, Finland
DOI

10.1145/3411764.3445483

Paper URL

https://doi.org/10.1145/3411764.3445483


Conference: CHI 2021

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2021.acm.org/)

Session: Smart Home, Bot, Robot, & Drone / Input & Measurement

[A] Paper Room 14, 2021-05-12 17:00:00~2021-05-12 19:00:00 / [B] Paper Room 14, 2021-05-13 01:00:00~2021-05-13 03:00:00 / [C] Paper Room 14, 2021-05-13 09:00:00~2021-05-13 11:00:00