Dark patterns are malicious UI design strategies that nudge users toward decisions that go against their best interests. To create technical countermeasures, dark patterns must first be detectable automatically. While researchers have devised algorithms to detect some patterns, little work has used those detection results to technically counter the effects of dark patterns when users encounter them on their devices. To address this gap, we tested three visual countermeasures against 13 common dark patterns in an interactive lab study. The countermeasures either (a) highlighted and explained the manipulation, (b) hid it from the user, or (c) let the user switch between the original view and the hidden version. From our data, we extracted multiple clusters of dark patterns for which participants preferred specific countermeasures for similar reasons. To support the creation of effective countermeasures, we discuss our findings in relation to a recent ontology of dark patterns.
https://doi.org/10.1145/3613904.3642661
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)