GazeSwipe: Enhancing Mobile Touchscreen Reachability through Seamless Gaze and Finger-Swipe Integration

Abstract

Smartphones with large screens provide users with increased display and interaction space but pose challenges in reaching certain areas with the thumb when using the device with one hand. To address this, we introduce GazeSwipe, a multimodal interaction technique that combines eye gaze with finger-swipe gestures, enabling intuitive and low-friction reach on mobile touchscreens. Specifically, we design a gaze estimation method that eliminates the need for explicit gaze calibration. Our approach also avoids the use of additional eye-tracking hardware by leveraging the smartphone's built-in front-facing camera. Considering the potential decrease in gaze accuracy without dedicated eye trackers, we use finger-swipe gestures to compensate for any inaccuracies in gaze estimation. Additionally, we introduce a user-unaware auto-calibration method that improves gaze accuracy during interaction. Through extensive experiments on smartphones and tablets, we compare our technique with various methods for touchscreen reachability and evaluate the performance of our auto-calibration strategy. The results demonstrate that our method achieves high success rates and is preferred by users. The findings also validate the effectiveness of the auto-calibration strategy.
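No reference implementation accompanies this listing, so the Kotlin sketch below is only a rough illustration of the coarse-to-fine interaction the abstract describes: a front-camera gaze estimate supplies a coarse target, a thumb swipe nudges it to compensate for residual gaze error, and each confirmed selection doubles as an implicit calibration sample for the "user-unaware" auto-calibration. Every name here (GazeSwipeController, the bias terms, the alpha smoothing factor) is a hypothetical assumption, not the authors' code.

```kotlin
// Hypothetical sketch of gaze + swipe fusion; not the authors' implementation.
// Assumes a front-camera gaze estimator that returns a coarse on-screen point.

data class PointPx(val x: Float, val y: Float)

class GazeSwipeController(
    private val screenW: Float,
    private val screenH: Float
) {
    // Running bias correction, refined by user-unaware auto-calibration:
    // each confirmed selection serves as an implicit ground-truth gaze label.
    private var biasX = 0f
    private var biasY = 0f
    private val alpha = 0.1f  // smoothing factor for bias updates (assumed value)

    // Coarse target: where the bias-corrected gaze lands when the swipe begins.
    fun coarseTarget(rawGaze: PointPx): PointPx =
        PointPx(
            (rawGaze.x + biasX).coerceIn(0f, screenW),
            (rawGaze.y + biasY).coerceIn(0f, screenH)
        )

    // Fine target: the thumb swipe offsets the coarse gaze point, so the
    // thumb never has to physically reach the target region.
    fun fineTarget(coarse: PointPx, swipeDx: Float, swipeDy: Float): PointPx =
        PointPx(
            (coarse.x + swipeDx).coerceIn(0f, screenW),
            (coarse.y + swipeDy).coerceIn(0f, screenH)
        )

    // Auto-calibration: treat the confirmed selection as the true gaze target
    // and nudge the bias toward the observed residual, without any explicit
    // calibration step visible to the user.
    fun onSelectionConfirmed(rawGaze: PointPx, selected: PointPx) {
        biasX += alpha * ((selected.x - rawGaze.x) - biasX)
        biasY += alpha * ((selected.y - rawGaze.y) - biasY)
    }
}
```

A production version would presumably replace the constant-offset bias with a richer correction (per-region or affine, say) and gate the calibration update on high-confidence selections, but the skeleton above captures the division of labor between gaze and swipe that the abstract outlines.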

Award
Best Paper
Authors
Zhuojiang Cai
Beihang University, Beijing, China
Jingkai Hong
Beihang University, Beijing, China
Zhimin Wang
Beihang University, Beijing, China
Feng Lu
Beihang University, Beijing, China
DOI

10.1145/3706598.3713739

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713739

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: UI/UX Design

Room G304
7 presentations
2025-04-29 20:10–21:40