Real-world IoT-enhanced spaces involve diverse proximity- and gesture-based interactions between users and IoT devices/objects. Prototyping such interactions benefits various applications, such as the conceptual design of ubicomp spaces. Augmented reality (AR) prototyping provides a flexible way to achieve early-stage designs by overlaying digital content on real objects or environments. However, existing AR prototyping approaches have focused on prototyping AR experiences or context-aware interactions from the first-person view rather than full-body proxemic and gestural (pro-ges for short) interactions of real users in the real world. In this work, we conducted interviews to identify the challenges of prototyping pro-ges interactions in real-world IoT-enhanced spaces. Based on the findings, we present ProGesAR, a mobile AR tool for prototyping pro-ges interactions of a subject in a real environment from a third-person view, and for examining the prototyped interactions from both the first- and third-person views. Our interface supports the effects of virtual assets dynamically triggered by a single subject, with the triggering events based on four features: location, orientation, gesture, and distance. We conducted a preliminary study in which participants prototyped in a freeform manner using ProGesAR. The early-stage findings show that with ProGesAR, users can quickly and easily prototype their design ideas for pro-ges interactions.
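To make the trigger model concrete, the sketch below shows one plausible way such trigger events could be represented and evaluated against a tracked subject. This is a minimal illustration, not the paper's implementation: all names (Feature, SubjectState, Trigger, is_active) and the thresholds are hypothetical assumptions introduced here for clarity.

# Hypothetical sketch (not from the paper): one way a trigger rule over the four
# features -- location, orientation, gesture, and distance -- might be expressed.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional
import math


class Feature(Enum):
    LOCATION = auto()     # subject enters a region around a target object
    ORIENTATION = auto()  # subject faces a target within an angular threshold
    GESTURE = auto()      # a named gesture (e.g., "point") is recognized
    DISTANCE = auto()     # subject is within a distance threshold of a target


@dataclass
class SubjectState:
    """Tracked full-body state of the single subject."""
    position: tuple            # (x, z) on the floor plane, in meters
    facing_deg: float          # heading in degrees
    gesture: Optional[str]     # label from a gesture recognizer, if any


@dataclass
class Trigger:
    """A proximity/gesture condition that activates a virtual asset's effect."""
    feature: Feature
    target_pos: tuple = (0.0, 0.0)
    threshold: float = 1.0     # meters for LOCATION/DISTANCE, degrees for ORIENTATION
    gesture_name: str = ""

    def is_active(self, s: SubjectState) -> bool:
        dx = s.position[0] - self.target_pos[0]
        dz = s.position[1] - self.target_pos[1]
        dist = math.hypot(dx, dz)
        if self.feature in (Feature.LOCATION, Feature.DISTANCE):
            return dist <= self.threshold
        if self.feature is Feature.ORIENTATION:
            # bearing from subject to target, compared with the subject's heading
            bearing = math.degrees(math.atan2(dx, dz))
            return abs((s.facing_deg - bearing + 180) % 360 - 180) <= self.threshold
        return s.gesture == self.gesture_name  # Feature.GESTURE


# Usage: fire a virtual asset's effect when the subject walks within 1.5 m of a lamp.
lamp_trigger = Trigger(Feature.DISTANCE, target_pos=(2.0, 3.0), threshold=1.5)
state = SubjectState(position=(2.5, 3.8), facing_deg=90.0, gesture=None)
if lamp_trigger.is_active(state):
    print("play lamp 'turn on' effect")  # e.g., show the virtual light animation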
https://dl.acm.org/doi/abs/10.1145/3491102.3517689
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2022.acm.org/)