Causality-preserving Asynchronous Reality


Mixed Reality is gaining interest as a platform for collaboration and focused work to a point where it may supersede current office settings in future workplaces. At the same time, we expect that interaction with physical objects and face-to-face communication will remain crucial for future work environments, which is a particular challenge in fully immersive Virtual Reality. In this work, we reconcile those requirements through a user's individual Asynchronous Reality, which enables seamless physical interaction across time. When a user is unavailable, e.g., focused on a task or in a call, our approach captures co-located or remote physical events in real-time, constructs a causality graph of co-dependent events, and lets immersed users revisit them at a suitable time in a causally accurate way. Enabled by our system AsyncReality, we present a workplace scenario that includes walk-in interruptions during a person's focused work, physical deliveries, and transient spoken messages. We then generalize our approach to a use-case agnostic concept and system architecture. We conclude by discussing the implications of an Asynchronous Reality for future offices.
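The paper's causality graph is not specified in this abstract, but the core idea of replaying captured physical events only after the events they depend on can be sketched as a dependency graph with a topological-order replay. The event names and the `Event`/`replay_order` structures below are hypothetical illustrations, not the authors' implementation:

```python
from collections import defaultdict, deque
from dataclasses import dataclass, field

@dataclass
class Event:
    """A captured physical event and the events it causally depends on."""
    name: str
    depends_on: list = field(default_factory=list)

def replay_order(events):
    """Return a causally valid replay order via topological sort (Kahn's algorithm)."""
    indegree = {e.name: 0 for e in events}
    children = defaultdict(list)
    for e in events:
        for dep in e.depends_on:
            children[dep].append(e.name)
            indegree[e.name] += 1
    queue = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for child in children[n]:
            indegree[child] -= 1
            if indegree[child] == 0:
                queue.append(child)
    if len(order) != len(events):
        raise ValueError("cycle in causality graph")
    return order

# Hypothetical workplace scenario: a package is delivered, a colleague puts a
# note on it, then leaves. Replaying in this order keeps causality intact:
# the note can never appear before the package it is attached to.
events = [
    Event("package_delivered"),
    Event("note_placed", depends_on=["package_delivered"]),
    Event("colleague_leaves", depends_on=["note_placed"]),
]
print(replay_order(events))  # → ['package_delivered', 'note_placed', 'colleague_leaves']
```

Any replay order consistent with the graph's edges is causally accurate; a topological sort is one straightforward way to produce such an order when the user becomes available.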

Best Paper
Andreas Rene Fender
ETH Zürich, Zurich, Switzerland
Christian Holz
ETH Zürich, Zurich, Switzerland


Conference: CHI 2022

The ACM CHI Conference on Human Factors in Computing Systems (CHI)

Session: Immersion

5 presentations
2022-05-02 23:15:00 – 2022-05-03 00:30:00