Teachable Reality: Prototyping Tangible Augmented Reality with Everyday Objects by Leveraging Interactive Machine Teaching

Abstract

This paper introduces Teachable Reality, an augmented reality (AR) prototyping tool for creating interactive tangible AR applications with arbitrary everyday objects. Teachable Reality leverages vision-based interactive machine teaching (e.g., Teachable Machine), which captures real-world interactions for AR prototyping. It identifies user-defined tangible and gestural interactions using an on-demand computer vision model. Based on this, users can create functional AR prototypes without programming through a trigger-action authoring interface. Our approach therefore offers the flexibility, customizability, and generalizability of tangible AR applications, addressing the limitations of current marker-based approaches. We explore the design space and demonstrate various AR prototypes, including tangible and deformable interfaces, context-aware assistants, and body-driven AR applications. The results of our user study and expert interviews confirm that our approach can lower the barrier to creating functional AR prototypes while also allowing flexible and general-purpose prototyping experiences.
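The trigger-action pattern described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration only: the vision classifier is stubbed, and all names (TriggerActionRule, classify_frame, show_overlay) are assumptions for exposition, not the paper's actual API or implementation.

```python
# Hypothetical sketch of trigger-action authoring with a vision classifier.
# The classifier is a stub; Teachable Reality's on-demand, user-taught model
# and its no-code authoring UI are not reproduced here.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class TriggerActionRule:
    """Maps a detected state label (trigger) to an AR action (callback)."""
    trigger_label: str
    action: Callable[[], None]


def classify_frame(frame) -> str:
    """Stand-in for the on-demand vision model.

    In the real system this would return the user-taught class for the
    current camera frame (e.g., 'mug_present', 'lid_open').
    """
    return "mug_present"  # placeholder result


def show_overlay(text: str) -> Callable[[], None]:
    """Illustrative AR action: display a text overlay."""
    return lambda: print(f"[AR] overlay: {text}")


def run_prototype(frames: List[object], rules: Dict[str, TriggerActionRule]) -> None:
    """For each frame, fire the action of whichever rule matches the label."""
    for frame in frames:
        label = classify_frame(frame)
        rule = rules.get(label)
        if rule is not None:
            rule.action()


if __name__ == "__main__":
    rules = {
        "mug_present": TriggerActionRule("mug_present", show_overlay("Coffee detected")),
        "lid_open": TriggerActionRule("lid_open", show_overlay("Box opened")),
    }
    run_prototype(frames=[None], rules=rules)
```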

Authors
Kyzyl Monteiro
IIIT-Delhi, New Delhi, Delhi, India
Ritik Vatsal
IIIT-Delhi, New Delhi, Delhi, India
Neil Chulpongsatorn
University of Calgary, Calgary, Alberta, Canada
Aman Parnami
IIIT-Delhi, New Delhi, Delhi, India
Ryo Suzuki
University of Calgary, Calgary, Alberta, Canada
Paper URL

https://doi.org/10.1145/3544548.3581449

Video

Conference: CHI 2023

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)

Session: Making Realities

Room Y01+Y02
6 presentations
2023-04-26 01:35:00 to 2023-04-26 03:00:00