Recent advances in Artificial Intelligence have enabled powerful generative models, yet few are tailored to dancers’ practices. We present a long-term collaboration with a Voguing and Dancehall collective to design movement generation models trained on their repertoire. Our initial study with the dancers revealed that, despite limited physical realism, the generated movements inspired them. Iterative development led to Korai, an interactive tool for monitoring training, visualizing motion data, and prompting generation, which improved output quality. A subsequent structured observation study compared three model variants with high, medium, and low fidelity to the original dataset's style. Results show that dancers favored either highly faithful or highly unfaithful outputs, rejecting medium fidelity as neither authentic to their style nor creatively stimulating. Our findings highlight how direct collaboration with dancers not only informs model design but also deepens understanding of AI’s role in supporting creative movement practices.
ACM CHI Conference on Human Factors in Computing Systems