Sensing Noticeability in Ambient Information Environments

Abstract

Designing Augmented Reality (AR) notifications that are noticeable yet unobtrusive is challenging because the right balance depends heavily on the user's context. Current AR systems, however, tend to be context-agnostic and require explicit feedback to determine whether a user has noticed a notification. This limitation prevents AR systems from delivering timely notifications that integrate with users' activities. To address this challenge, we studied how sensor data can be used to infer whether users notice notifications while working in an office setting. We collected 98 hours of data from 12 users, including their gaze, head position, computer interactions, and engagement levels. Combining gaze and engagement data classified noticeability most accurately (AUC = 0.81); even without engagement data, accuracy remained high (AUC = 0.76). Our study also examines time-windowing methods and compares general and personalized models.
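
At a high level, the task described in the abstract can be framed as aggregating the sensor streams recorded around each notification over a fixed time window into features, then training a binary classifier whose output is scored by AUC. The Python sketch below illustrates this framing under an assumed data layout; the column names (notification_id, user_id, t, gaze_dist, head_yaw, engagement, noticed) and the feature choices are hypothetical, and this is a minimal stand-in rather than the authors' implementation.

# Minimal sketch of a noticeability classifier; NOT the authors' pipeline.
# Assumes a pandas DataFrame of sensor samples around each notification with
# hypothetical columns: notification_id, user_id, t (seconds since the
# notification appeared), gaze_dist, head_yaw, engagement, and a
# per-notification binary label `noticed`.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold


def window_features(samples: pd.DataFrame, window_s: float = 5.0) -> pd.DataFrame:
    """Aggregate raw samples into one feature row per notification,
    using only the first `window_s` seconds after notification onset."""
    rows = []
    for nid, g in samples.groupby("notification_id"):
        w = g[(g["t"] >= 0) & (g["t"] < window_s)]
        rows.append({
            "notification_id": nid,
            "user_id": w["user_id"].iloc[0],
            "gaze_dist_mean": w["gaze_dist"].mean(),    # average gaze distance to the notification
            "gaze_dist_min": w["gaze_dist"].min(),      # closest the gaze ever got
            "head_yaw_std": w["head_yaw"].std(),        # head-movement variability
            "engagement_mean": w["engagement"].mean(),  # mean engagement estimate
            "noticed": int(w["noticed"].iloc[0]),       # ground-truth label
        })
    return pd.DataFrame(rows)


def general_model_auc(features: pd.DataFrame, n_splits: int = 4) -> float:
    """Cross-validate a 'general' model: folds are grouped by user, so the
    classifier is always evaluated on users it did not see during training."""
    X = features.drop(columns=["notification_id", "user_id", "noticed"]).to_numpy()
    y = features["noticed"].to_numpy()
    groups = features["user_id"].to_numpy()
    aucs = []
    for train_idx, test_idx in GroupKFold(n_splits=n_splits).split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        scores = clf.predict_proba(X[test_idx])[:, 1]
        aucs.append(roc_auc_score(y[test_idx], scores))
    return float(np.mean(aucs))

Grouping cross-validation folds by user approximates the "general" model mentioned in the abstract, which must work for users it has never seen; a "personalized" variant would instead train and evaluate within each individual user's own data.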

Authors
Yi Fei Cheng
Carnegie Mellon University, Pittsburgh, Pennsylvania, United States
David Lindlbauer
Carnegie Mellon University, Pittsburgh, Pennsylvania, United States
DOI

10.1145/3706598.3713511

Paper URL

https://dl.acm.org/doi/10.1145/3706598.3713511

Conference: CHI 2025

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)

Session: Perception in VR

G318+G319
7 presentations
2025-04-29 01:20:00 – 2025-04-29 02:50:00