DanXeReflect: Interacting with the Spatio-Temporal Past Movements for Embodied, Reflective Choreographic Collaboration

Abstract

Choreographic reflection relies on iterative dialogue, where dancers and choreographers refine movement through embodied demonstration and shared feedback in studio rehearsal. With the shift to video, this exchange becomes constrained: annotations detach from the body, gestures lose spatial grounding, and subtle variations are difficult to capture. Advances in markerless motion capture enable 3D reconstruction from rehearsal video, allowing past recordings to be re-materialized for embodied interaction in XR. We present DanXeReflect, an XR system that transforms flat video into a virtual studio where movements appear as interactive avatars. Users can re-enact poses to search sequences, perform alternative revisions alongside originals, and attach annotations directly to body parts. A study with choreographers and dancers shows how these embodied interactions reposition spatio-temporal data as collaborative anchors, extending reflective dialogue beyond co-located rehearsal into asynchronous, distributed practice.

Award
Honorable Mention
Authors
Hyunju Kim
Cornell University, Ithaca, New York, United States
Francois Guimbretiere
Cornell University, Ithaca, New York, United States
Bokyung Lee
Yonsei University, Seoul, Korea, Republic of

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Collaborative/Shared XR

P1 - Room 134
7 presentations
2026-04-13, 20:15–21:45