Palmpad: Enabling Real-Time Index-to-Palm Touch Interaction with a Single RGB Camera

Abstract

Index-to-palm interaction plays a crucial role in Mixed Reality (MR) interactions. However, achieving a satisfactory inter-hand interaction experience is challenging with existing vision-based hand-tracking technologies, especially in scenarios where only a single camera is available. We therefore introduce Palmpad, a novel sensing method that uses a single RGB camera to detect the touch of an index finger on the opposite palm. Our exploration reveals that incorporating optical flow to extract motion information of the index finger and palm between consecutive frames significantly improves touch-status determination. With these motion features, our CNN model achieves 97.0% recognition accuracy and a 96.1% F1 score. In a usability evaluation, we compare Palmpad with the Quest's built-in hand-gesture algorithms. Palmpad not only delivers superior accuracy (95.3%) but also reduces operational demands and significantly improves users' willingness and confidence. Palmpad aims to enable accurate touch detection for lightweight MR devices.
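The abstract describes the core pipeline only at a high level: optical flow between consecutive frames supplies motion cues for the index finger and palm, and a CNN classifies the touch state. The sketch below is not the authors' implementation; it is a minimal illustration of that idea, assuming OpenCV's Farneback dense optical flow, PyTorch for the classifier, and a hypothetical `TouchNet` architecture and 96x96 crop size chosen purely for demonstration.

```python
# Minimal sketch (not the paper's code): dense optical flow between consecutive
# frames stacked with the current frame and fed to a small CNN that outputs
# touch / no-touch logits. Crop size and network layout are illustrative.
import cv2
import numpy as np
import torch
import torch.nn as nn


def flow_features(prev_frame: np.ndarray, curr_frame: np.ndarray) -> np.ndarray:
    """Dense optical flow (dx, dy) between two consecutive BGR frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    return flow.transpose(2, 0, 1).astype(np.float32)  # (2, H, W)


class TouchNet(nn.Module):
    """Small CNN over stacked [flow_x, flow_y, grayscale] input."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(32, 2)  # logits for {no-touch, touch}

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


# Usage with two dummy 96x96 crops around the index fingertip / palm region.
prev_crop = np.zeros((96, 96, 3), dtype=np.uint8)
curr_crop = np.zeros((96, 96, 3), dtype=np.uint8)
flow = flow_features(prev_crop, curr_crop)                        # (2, 96, 96)
gray = cv2.cvtColor(curr_crop, cv2.COLOR_BGR2GRAY)[None] / 255.0  # (1, 96, 96)
x = torch.from_numpy(np.concatenate([flow, gray.astype(np.float32)], axis=0))
logits = TouchNet()(x.unsqueeze(0))                               # shape (1, 2)
print("touch probability:", torch.softmax(logits, dim=1)[0, 1].item())
```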

Authors
Zhe He
Tsinghua University, Beijing, China
Xiangyang Wang
Tsinghua University, Beijing, China
Yuanchun Shi
Tsinghua University, Beijing, China
Chi Hsia
Tsinghua University, Beijing, China
Chen Liang
The Hong Kong University of Science and Technology (Guangzhou), Guangzhou, Guangdong, China
Chun Yu
Tsinghua University, Beijing, China
DOI

10.1145/3706598.3714130

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3714130

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: Immersive Touch and Gesture Interaction

G303
7 presentations
2025-04-30 20:10:00 – 21:40:00