ESCAPE: Countering Systematic Errors from Machine's Blind Spots via Interactive Visual Analysis

Abstract

Classification models learn to generalize the associations between data samples and their target classes. However, researchers have increasingly observed that machine learning practice easily leads to systematic errors in AI applications, a phenomenon referred to as "AI blindspots." Such blindspots arise when a model is trained on samples (e.g., for cat/dog classification) where important patterns (e.g., black cats) are missing, or where peripheral/undesirable patterns (e.g., dogs with grass backgrounds) are misleadingly associated with a certain class. Even sophisticated techniques cannot guarantee to capture, reason about, and prevent such spurious associations. In this work, we propose ESCAPE, a visual analytics system that promotes a human-in-the-loop workflow for countering systematic errors. By allowing users to easily inspect spurious associations, the system helps them spontaneously recognize concepts associated with misclassifications and evaluate mitigation strategies that can reduce biased associations. We also propose two statistical approaches: relative concept association, which better quantifies the association between a concept and instances, and a debiasing method that mitigates spurious associations. We demonstrate the utility of the proposed ESCAPE system and statistical measures through extensive evaluation, including quantitative experiments, usage scenarios, expert interviews, and controlled user experiments.

Authors
Yongsu Ahn
University of Pittsburgh, Pittsburgh, Pennsylvania, United States
Yu-Ru Lin
University of Pittsburgh, Pittsburgh, Pennsylvania, United States
Panpan Xu
Amazon AWS, Santa Clara, California, United States
Zeng Dai
Bosch Research, Sunnyvale, California, United States
Paper URL

https://doi.org/10.1145/3544548.3581373

Conference: CHI 2023

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2023.acm.org/)

Session: Visualization for AI/ML

Room X11+X12
6 presentations
2023-04-25 01:35:00 – 2023-04-25 03:00:00