Virtual reality (VR) has become a valuable tool for social and educational support for autistic people, as its flexible environments can create a wide variety of experiences. A growing body of recent research has examined the behaviors of autistic people through sensor-based data, both to better understand them and to investigate the effectiveness of VR. Comprehensive analysis of the many signals that can be easily collected in a VR environment can deepen this understanding. While such quantitative evidence has the potential to help both autistic people and others (e.g., autism experts) understand the behaviors of autistic people, existing studies have focused on analyzing single signals and have not examined whether autistic people themselves find the analysis results acceptable. To facilitate the use of multiple sensor signals in VR by autistic people and experts, we introduce V-DAT (Virtual Reality Data Analysis Tool), designed to support a VR sensor data handling pipeline. V-DAT covers four sensor modalities actively used in current VR research on autism: head position and rotation, eye movement, audio, and physiological signals. We describe the characteristics and processing methods of the data for each modality, as well as V-DAT's analysis and comprehensive visualizations. We also conduct a case study to investigate the feasibility of V-DAT as a way of broadening the understanding of autistic people, from the perspectives of both autistic people and autism experts. Finally, we discuss issues that arose during the development of V-DAT and complementary measures for the applicability and scalability of a sensor data management system for autistic people.
https://doi.org/10.1145/3586183.3606797
ACM Symposium on User Interface Software and Technology