VR is often used to host virtual events such as meetings, conferences, and concerts, yet most existing VR tools offer little support for live production. We present XCam, a toolkit enabling mixed-initiative control over virtual camera systems, ranging from fully manual control by users to increasingly automated, system-driven control with minimal user intervention. XCam's architectural design separates the concerns of object tracking, camera motion, and scene transition, giving operators more degrees of freedom: they can adjust the level of automation along each of the three dimensions independently. We conducted two studies: (1) interviews with six VR content creators, grounded in six applications developed with XCam, probing which aspects should and should not be automated; and (2) three workshops with experts exploring XCam's utility in the live production of an interactive VR film sequence, a lecture on cinematography, and an alumni meeting in social VR. Expert feedback from these studies suggests how to balance automation and control, and points to the opportunities and limits of future AI-driven tools.
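
To illustrate the separation of concerns described above, the sketch below models each of the three dimensions (tracking, motion, transition) as an independent controller with its own adjustable automation level. All names and structures here are hypothetical illustrations, not XCam's actual API.

    # Hypothetical sketch of a mixed-initiative camera architecture:
    # tracking, motion, and transition are independent controllers whose
    # automation levels can be tuned separately. Names are illustrative only.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class AutomationLevel(Enum):
        MANUAL = 0     # operator drives this dimension directly
        ASSISTED = 1   # system proposes, operator confirms or overrides
        AUTOMATIC = 2  # system acts with minimal user intervention

    @dataclass
    class TrackingController:
        level: AutomationLevel = AutomationLevel.AUTOMATIC
        target: Optional[str] = None  # e.g. an avatar or prop to follow

    @dataclass
    class MotionController:
        level: AutomationLevel = AutomationLevel.ASSISTED
        path: str = "orbit"           # e.g. orbit, dolly, handheld

    @dataclass
    class TransitionController:
        level: AutomationLevel = AutomationLevel.MANUAL
        style: str = "cut"            # e.g. cut, fade, match-cut

    @dataclass
    class VirtualCamera:
        tracking: TrackingController
        motion: MotionController
        transition: TransitionController

        def set_automation(self, dimension: str, level: AutomationLevel) -> None:
            # Operators tune automation per dimension, not globally.
            getattr(self, dimension).level = level

    # Example: automatic tracking, assisted motion, manual scene transitions.
    cam = VirtualCamera(TrackingController(), MotionController(), TransitionController())
    cam.set_automation("transition", AutomationLevel.MANUAL)

The point of the per-dimension interface is that an operator can hand one concern (say, tracking) to the system while retaining full manual control over another (say, transitions), rather than toggling a single global autopilot.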
https://dl.acm.org/doi/10.1145/3706598.3713305
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)