Learning to Denoise Raw Mobile UI Layouts for Improving Datasets at Scale

Abstract

The layout of a mobile screen is a critical data source for UI design research and semantic understanding of the screen. However, UI layouts in existing datasets are often noisy, have mismatches with their visual representation, or consist of generic or app-specific types that are difficult to analyze and model. In this paper, we propose the CLAY pipeline that uses a deep learning approach for denoising UI layouts, allowing us to automatically improve existing mobile UI layout datasets at scale. Our pipeline takes both the screenshot and the raw UI layout, and annotates the raw layout by removing incorrect nodes and assigning a semantically meaningful type to each node. To experiment with our data-cleaning pipeline, we create the CLAY dataset of 59,555 human-annotated screen layouts, based on screenshots and raw layouts from Rico, a public mobile UI corpus. Our deep models achieve high accuracy, with F1 scores of 82.7% for detecting layout objects that do not have a valid visual representation and 85.9% for recognizing object types, which significantly outperforms a heuristic baseline. Our work lays a foundation for creating large-scale, high-quality UI layout datasets for data-driven mobile UI research and reduces the need for manual labeling efforts that are prohibitively expensive.
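The abstract frames annotation as two decisions per layout node: whether the node has a valid visual representation in the screenshot, and which semantic type it belongs to. The sketch below illustrates only that two-stage flow; the function and parameter names (denoise_layout, validity_model, type_model, validity_threshold) and the example label vocabulary are assumptions made for illustration, not the authors' implementation or the CLAY label set.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

# Hypothetical semantic vocabulary for illustration; the actual CLAY label set
# is defined in the paper.
SEMANTIC_TYPES = ["TEXT", "IMAGE", "BUTTON", "ICON", "LIST_ITEM", "CONTAINER"]


@dataclass
class LayoutNode:
    bbox: Tuple[int, int, int, int]   # (left, top, right, bottom) in screen pixels
    raw_class: str                    # raw type, e.g. "android.widget.TextView"
    semantic_type: Optional[str] = None


def denoise_layout(
    screenshot,
    nodes: List[LayoutNode],
    validity_model: Callable[[object, LayoutNode], float],
    type_model: Callable[[object, LayoutNode], str],
    validity_threshold: float = 0.5,
) -> List[LayoutNode]:
    """Two-stage cleaning sketch: drop nodes that have no valid visual
    counterpart in the screenshot, then assign a semantic type to each
    surviving node."""
    cleaned: List[LayoutNode] = []
    for node in nodes:
        # Stage 1: score whether the node is actually rendered on screen.
        p_valid = validity_model(screenshot, node)
        if p_valid < validity_threshold:
            continue  # discard nodes without a valid visual representation
        # Stage 2: classify the remaining node into a semantic type.
        node.semantic_type = type_model(screenshot, node)
        cleaned.append(node)
    return cleaned
```

In this sketch the two models are passed in as callables, so the same cleaning loop could be driven by the heuristic baseline or by learned models; the paper's reported F1 scores (82.7% for invalid-object detection, 85.9% for type recognition) refer to the learned approach.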

Authors
Gang Li
Google Research, Mountain View, California, United States
Gilles Baechler
Google Research, Zurich, ZH, Switzerland
Manuel Tragut
Google, Zurich, Switzerland
Yang Li
Google Research, Mountain View, California, United States
Paper URL

https://dl.acm.org/doi/abs/10.1145/3491102.3502042

Video

Conference: CHI 2022

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2022.acm.org/)

Session: Tools for Programmers/Developers

5 presentations
2022-05-04 23:15:00 – 2022-05-05 00:30:00