Choreographic reflection relies on iterative dialogue, where dancers and choreographers refine movement through embodied demonstration and shared feedback in studio rehearsal. With the shift to video, this exchange becomes constrained: annotations detach from the body, gestures lose spatial grounding, and subtle variations are difficult to capture. Advances in markerless motion capture enable 3D reconstruction from rehearsal video, allowing past recordings to be re-materialized for embodied interaction in XR. We present DanXeReflect, an XR system that transforms flat video into a virtual studio where movements appear as interactive avatars. Users can re-enact poses to search sequences, perform alternative revisions alongside originals, and attach annotations directly to body parts. A study with choreographers and dancers shows how these embodied interactions reposition spatio-temporal data as collaborative anchors, extending reflective dialogue beyond co-located rehearsal into asynchronous, distributed practice.
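To make the pose-driven search concrete, the sketch below shows one plausible way such matching could work: center each pose on its root joint and return the recorded frame whose joints lie closest, on average, to the re-enacted query pose. The data layout, joint convention, and distance metric here are illustrative assumptions, not the system's actual implementation.

```python
# Hypothetical pose-based search over a recorded sequence: match a
# re-enacted query pose to the nearest recorded frame. Assumes each
# pose is a list of (x, y, z) joint positions with joint 0 as the root.
from math import sqrt

Pose = list[tuple[float, float, float]]

def center(pose: Pose) -> Pose:
    """Translate a pose so its root joint sits at the origin,
    making the comparison invariant to where the dancer stands."""
    rx, ry, rz = pose[0]
    return [(x - rx, y - ry, z - rz) for (x, y, z) in pose]

def pose_distance(a: Pose, b: Pose) -> float:
    """Mean Euclidean distance between corresponding joints
    of two root-centered poses."""
    ca, cb = center(a), center(b)
    dists = [sqrt((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2)
             for (xa, ya, za), (xb, yb, zb) in zip(ca, cb)]
    return sum(dists) / len(dists)

def search(sequence: list[Pose], query: Pose) -> int:
    """Return the index of the frame whose pose best matches the query."""
    return min(range(len(sequence)),
               key=lambda i: pose_distance(sequence[i], query))
```

A query pose performed anywhere in the room would match the frame with the most similar body configuration, since root-centering discards global position; a full system would likely also normalize for body scale and facing direction.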
ACM CHI Conference on Human Factors in Computing Systems