Modern vehicles use AI and increasingly sophisticated sensor suites to improve Advanced Driver Assistance Systems (ADAS) and support automated driving capabilities. Heads-Up Displays (HUDs) provide an opportunity to visually inform drivers of the vehicle’s perception and interpretation of the driving environment. One approach to HUD design is to reveal the vehicle’s full contextual understanding to drivers, though it is not clear whether the benefits of this additional information outweigh the drawbacks of added complexity, or whether this balance holds across drivers. We designed and tested an Augmented Reality (AR) HUD in an online study ($N=298$), focusing on how HUD visualizations influence drivers’ situation awareness and perceptions. Participants viewed two driving scenes under one of three HUD conditions. Results were nuanced: situation awareness declined as driving context complexity increased and, contrary to expectation, also declined when a HUD was present compared to no HUD. Significant differences emerged when HUD complexity was varied, which led us to explore different characterizations of complexity, including counts of scene items, item categories, and illuminated pixels. Our analysis finds that driving style interacts with driving context and HUD complexity, warranting further study.
https://doi.org/10.1145/3411764.3445575
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2021.acm.org/)