ReflecTouch: Detecting Grasp Posture of Smartphone Using Corneal Reflection Images

Abstract

By sensing how a user is holding a smartphone, adaptive user interfaces become possible, such as those that automatically switch the displayed content and the position of graphical user interface (GUI) components to match the current grasp. We propose ReflecTouch, a novel method for detecting how a smartphone is being held by capturing images of the smartphone screen reflected on the cornea with the built-in front camera. In these images, the areas where the user places their fingers on the screen appear as shadows, which makes it possible to estimate the grasp posture. Since most smartphones have a front camera, this method can be used regardless of the device model, and no extra sensors or hardware are required. We conducted data collection experiments to verify the classification accuracy of the proposed method for six different grasp postures, and the accuracy was 85%.
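
The following is a minimal illustrative sketch, not the authors' implementation, of the pipeline the abstract describes: crop the corneal region from a front-camera frame and classify the reflected screen image into one of six grasp postures. The Haar-cascade eye detector, the small CNN (GraspNet), the helper functions crop_cornea and classify_grasp, the class labels in GRASP_CLASSES, and the input file name are all assumptions made for illustration; the paper's actual model and posture taxonomy may differ.

# Minimal sketch of a corneal-reflection grasp classifier (illustrative only).
import cv2
import torch
import torch.nn as nn

# Six assumed posture labels; the paper's exact taxonomy may differ.
GRASP_CLASSES = [
    "left_thumb", "right_thumb", "both_thumbs",
    "left_index", "right_index", "two_handed_landscape",
]

def crop_cornea(frame_bgr, out_size=64):
    """Detect an eye with OpenCV's Haar cascade and return a square crop
    roughly covering the cornea, resized to out_size x out_size."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None
    x, y, w, h = max(eyes, key=lambda e: e[2] * e[3])  # keep the largest detection
    return cv2.resize(frame_bgr[y:y + h, x:x + w], (out_size, out_size))

class GraspNet(nn.Module):
    """Small CNN mapping a 64x64 corneal crop to six grasp-posture logits."""
    def __init__(self, num_classes=len(GRASP_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def classify_grasp(frame_bgr, model):
    """Return the predicted grasp label for one front-camera frame,
    or None if no eye is detected."""
    crop = crop_cornea(frame_bgr)
    if crop is None:
        return None
    rgb = cv2.cvtColor(crop, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).float().permute(2, 0, 1).unsqueeze(0) / 255.0
    with torch.no_grad():
        logits = model(tensor)
    return GRASP_CLASSES[int(logits.argmax(dim=1).item())]

if __name__ == "__main__":
    model = GraspNet().eval()                      # untrained weights; illustration only
    frame = cv2.imread("front_camera_frame.jpg")   # hypothetical input image
    if frame is not None:
        print(classify_grasp(frame, model))

In practice the classifier would be trained on labeled corneal-reflection crops collected for each posture; the sketch only shows the inference path from a front-camera frame to a posture label.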

Authors
Xiang Zhang
Keio University, Yokohama City, Japan
Kaori Ikematsu
Yahoo Japan Corporation, Tokyo, Japan
Kunihiro Kato
Tokyo University of Technology, Tokyo, Japan
Yuta Sugiura
Keio University, Yokohama City, Japan
Paper URL

https://dl.acm.org/doi/abs/10.1145/3491102.3517440

Video

Conference: CHI 2022

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2022.acm.org/)

Session: Sensing

5 presentations
2022-05-05 01:15:00 – 02:30:00