TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only

Abstract

TriPad enables opportunistic touch interaction in Augmented Reality using hand tracking only. Users declare the surface they want to appropriate with a simple hand tap gesture. They can then use this surface at will for direct and indirect touch input. TriPad only involves analyzing hand movements and postures, without the need for additional instrumentation, scene understanding or machine learning. TriPad thus works on a variety of flat surfaces, including glass. It also ensures low computational overhead on devices that typically have a limited power budget. We describe the approach, and report on two user studies. The first study demonstrates the robustness of TriPad's hand movement interpreter on different surface materials. The second study compares TriPad against direct mid-air AR input techniques on both discrete and continuous tasks and with different surface orientations. TriPad achieves a better speed-accuracy trade-off overall, improves comfort and minimizes fatigue.
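The core idea, declaring a flat surface with a tap and then treating fingertip proximity to that surface as touch, can be illustrated with a toy sketch. This is not the authors' actual movement interpreter; it is a minimal illustration assuming a hand tracker that reports 3D fingertip positions in metres, with hypothetical tap points used to fit the surface plane and a hypothetical 5 mm proximity threshold for touch detection.

```python
import numpy as np

def fit_plane(p1, p2, p3):
    """Fit a plane (unit normal n, offset d) through three tap points,
    so that points x on the plane satisfy n . x + d = 0."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    n /= np.linalg.norm(n)
    return n, -np.dot(n, p1)

def is_touching(fingertip, plane, threshold=0.005):
    """Report the fingertip as 'down' when it lies within
    `threshold` metres of the declared plane (assumed value)."""
    n, d = plane
    return abs(np.dot(n, np.asarray(fingertip, dtype=float)) + d) <= threshold

# Hypothetical tap points declaring a horizontal tabletop at z = 0.7 m.
plane = fit_plane([0.0, 0.0, 0.7], [0.3, 0.0, 0.7], [0.0, 0.2, 0.7])
print(is_touching([0.1, 0.1, 0.702], plane))  # 2 mm from plane -> True
print(is_touching([0.1, 0.1, 0.75], plane))   # 5 cm above     -> False
```

A real interpreter would of course also analyze hand posture and motion over time, as the paper describes; this sketch only captures the geometric touch test against an appropriated plane.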

Authors
Camille Dupré
Université Paris-Saclay, CNRS, Inria, Gif-sur-Yvette, France
Caroline Appert
Université Paris-Saclay, CNRS, Inria, Orsay, France
Stéphanie Rey
Berger-Levrault, Toulouse, France
Houssem Saidi
Berger-Levrault, Paris, France
Emmanuel Pietriga
Université Paris-Saclay, CNRS, Inria, Gif-sur-Yvette, France
Paper URL

doi.org/10.1145/3613904.3642323

Video

Conference: CHI 2024

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)

Session: Perception and Input in Immersive Environments

Room: 316C
5 presentations
2024-05-15, 18:00–19:20