Mixed Reality (MR) research increasingly explores how virtual elements can shape physical behavior, yet how MR objects guide group movement remains underexplored. We address this gap by examining how virtual objects can nudge collective, co-located movement without relying on explicit instructions or choreography. Through a research-through-design process, we developed GravField, a co-located MR performance system in which an “object jockey” live-configures virtual objects (e.g., ropes, springs, magnetic fields) with real-time, parameterized “digital physics” (e.g., weight, elasticity, force) to influence the movement of headset-wearing participants, made perceptible through augmented visual and audio feedback that serves as cognitive-somatic cues. Our bricolage analysis of the performances, drawing on video recordings, interviews, soma trajectories, and field notes, indicates that these live nudges support emergent intercorporeal coordination, and that ambiguity and real-time configuration sustain open-ended, exploratory engagement. Ultimately, our work offers empirical insights and design principles for MR systems that guide group movement through embodied, felt dynamics while preserving participants’ sense of agency.
ACM CHI Conference on Human Factors in Computing Systems