Automated embodied moderation has the potential to create safer spaces for children in social VR, providing a protective figure that takes action to mitigate harmful interactions. However, little is known about how such moderation should be employed in practice. Through interviews with 16 experts in online child safety and psychology, and workshops with 8 guardians and 13 children, we contribute a comprehensive overview of how Automated Embodied Moderators (AEMs) can safeguard children in social VR. We explore perceived concerns, benefits, and preferences across the stakeholder groups and gather first-of-their-kind recommendations and reflections on AEM design. The results stress the need to adapt AEMs to children, whether victims or harassers, based on age and developmental stage, emphasising empowerment, psychological impact, and keeping humans and guardians in the loop. Our work provokes new participatory design-led directions to consider in the development of AEMs for children in social VR, taking child, guardian, and expert insights into account.
https://doi.org/10.1145/3613904.3642144
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)