SituationAdapt: Contextual UI Optimization in Mixed Reality with Situation Awareness via LLM Reasoning

Abstract

Mixed Reality is increasingly used in mobile settings beyond controlled home and office spaces. This mobility introduces the need for user interface layouts that adapt to varying contexts. However, existing adaptive systems are designed only for static environments. In this paper, we introduce SituationAdapt, a system that adjusts Mixed Reality UIs to real-world surroundings by considering environmental and social cues in shared settings. Our system consists of perception, reasoning, and optimization modules for UI adaptation. Our perception module identifies objects and individuals around the user, while our reasoning module leverages a Vision-and-Language Model to assess the placement of interactive UI elements. This ensures that adapted layouts do not obstruct relevant environmental cues or interfere with social norms. Our optimization module then generates Mixed Reality interfaces that account for these considerations as well as temporal constraints. The evaluation of SituationAdapt is two-fold: We first validate our reasoning component's capability of assessing UI contexts comparably to human expert users. In an online user study, we then establish our system's capability of producing context-aware MR layouts, where it outperforms adaptive methods from previous work. We further demonstrate the versatility and applicability of SituationAdapt with a set of application scenarios.

Authors
Zhipeng Li
ETH Zürich, Zurich, Switzerland
Christoph Gebhardt
ETH Zürich, Zurich, Switzerland
Yves Inglin
ETH Zürich, Zurich, Switzerland
Nicolas Steck
ETH Zürich, Zurich, Switzerland
Paul Streli
ETH Zürich, Zurich, Switzerland
Christian Holz
ETH Zürich, Zurich, Switzerland
Paper URL

https://doi.org/10.1145/3654777.3676470

Conference: UIST 2024

ACM Symposium on User Interface Software and Technology

Session: 2. Shared Spaces

Venue: Westin, Allegheny 2
5 presentations
2024-10-15, 18:00–19:15