Wall displays are well suited for collaborative work and are often placed in rooms with ample space in front of them that remains largely unused. Augmented Reality (AR) headsets can seamlessly extend the collaboration space around the Wall. Nevertheless, it is unclear whether extending Walls with AR is effective and how it may affect collaboration. We first present a prototype combining a Wall and AR headsets to extend the Wall workspace. We then use this prototype to study how users utilize the virtual space created in AR. In an experiment with 24 participants, we compare how pairs solve collaborative tasks with the Wall alone and with Wall+AR. Our qualitative and quantitative results highlight that with Wall+AR, participants make extensive use of the physical space in front of and around the Wall, and while this creates interaction overhead, it does not hurt performance and improves the user experience.
https://doi.org/10.1145/3544548.3580752
With face-to-face music collaboration severely limited during the recent pandemic, mixed reality technologies, with their potential to give musicians a feeling of "being there" with their musical partner, offer tremendous opportunities. To assess this potential, we conducted a laboratory study in which musicians made music together in real time while seeing their jamming partner's mixed reality point cloud via a head-mounted display, and we compared mental effects such as flow, affect, and co-presence to an audio-only baseline. In addition, we tracked the musicians' physiological signals and evaluated their features during times of self-reported flow. For users jamming in mixed reality, we observed a significant increase in co-presence. Regardless of the condition (mixed reality or audio-only), we observed an increase in positive affect after jamming remotely. Furthermore, we identified heart rate and HF/LF as promising features for classifying the flow state musicians experienced while making music together.
https://doi.org/10.1145/3544548.3581162
Mixed-reality telepresence allows local and remote users to feel as if they are present together in the same space. In this paper we report on a mixed-reality volumetric telepresence system that is adaptable, multi-user, and cross-modal, i.e., combining augmented and virtual reality technologies with face-to-face interactions. The system extends the state of the art by creating full-body and environmental volumetric renderings in real time over local enterprise networks. We report findings of an evaluation in a training scenario that was adapted for remote delivery and led by an industry professional. Analysis of interviews and observed behaviours identifies varying attitudes towards virtually mediated full-body experiences and highlights the potential of volumetric mixed-reality telepresence to facilitate personal experiences of co-presence and to ground communication with interlocutors.
https://doi.org/10.1145/3544548.3581277
Social virtual reality (VR) platforms have increased in popularity with many people turning to these platforms to experience social connection, including a rapid influx of users during the COVID-19 pandemic. However, there is limited understanding of how people appropriate and use emerging social VR applications to actively support their mental health and wellbeing in daily life. Through an online questionnaire and exploratory interviews conducted within the social VR app VRChat during the COVID-19 pandemic, we document how social VR is being used explicitly as a mental health support tool. Participants reported positive wellbeing benefits, mostly attributed to the anonymity provided by avatars and perceived safety within digital worlds and communities of practice. We also report how people use social VR to practice social interaction, reduce negative thoughts and form strong social bonds and connections with others.
https://doi.org/10.1145/3544548.3581103
Transitional Interfaces are an emerging, as yet underexplored class of cross-reality user interfaces that enable users to move freely along the reality-virtuality continuum during collaboration. To analyze and understand how such collaboration unfolds, we propose four analytical lenses derived from an exploratory study of transitional collaboration with 15 dyads. While solving a complex spatial optimization task, participants could freely switch between three contexts, each with different displays (desktop screens, tablet-based augmented reality, head-mounted virtual reality), input techniques (mouse, touch, handheld controllers), and visual representations (monoscopic and allocentric 2D/3D maps, stereoscopic egocentric views). Using the rich qualitative and quantitative data from our study, we evaluated participants' perceptions of transitional collaboration and identified commonalities and differences between dyads. We then derived four lenses, including metrics and visualizations, to analyze key aspects of transitional collaboration: (1) place and distance, (2) temporal patterns, (3) group use of contexts, and (4) individual use of contexts.
https://doi.org/10.1145/3544548.3580879
The HCI community has explored new interaction designs for collaborative AR interfaces in terms of usability and feasibility; however, security & privacy (S&P) are often not considered in the design process and left to S&P professionals. To produce interaction proposals with S&P in mind, we extend the user-driven elicitation method with a scenario-based approach that incorporates a threat model involving access control in multi-user AR. We conducted an elicitation study in two conditions, pairing AR/AR experts in one condition and AR/S&P experts in the other, to investigate the impact of each pairing. We contribute a set of expert-elicited interactions for sharing AR content enhanced with access control provisions, analyze the benefits and tradeoffs of pairing AR and S&P experts, and present recommendations for designing future multi-user AR interactions that better balance competing design goals of usability, feasibility, and S&P in collaborative AR.
https://doi.org/10.1145/3544548.3581089