Detecting fraud during online exams using proctoring software comes with substantial privacy challenges. Previous work argues students experience heightened anxiety and have privacy concerns. However, little is known about which specific aspects of online proctoring cause these concerns. This study contributes such insights by using the Contextual Integrity (CI) framework to discover how students (N = 456) rate the acceptability of 1064 proctoring information flows with varying information types, recipients, and transmission principles. We find that the acceptability varies considerably depending on the context. Besides exposing obvious privacy violations, we find that, under certain conditions, students consider it acceptable to share data with teachers - despite their lack of involvement in proctoring. Also, the acceptability of sharing highly sensitive information - which should under no circumstances be shared - sometimes increases. We discuss the implications of these and other findings and provide concrete recommendations for educational institutions using online proctoring.
https://doi.org/10.1145/3544548.3581181
Across academia, government, and industry, data stewards are facing increasing pressure to make datasets more openly accessible for researchers while also protecting the privacy of data subjects. Differential privacy (DP) is one promising way to offer privacy along with open access, but further inquiry is needed into the tensions between DP and data science. In this study, we conduct interviews with 19 data practitioners who are non-experts in DP as they use a DP data analysis prototype to release privacy-preserving statistics about sensitive data, in order to understand perceptions, challenges, and opportunities around using DP. We find that while DP is promising for providing wider access to sensitive datasets, it also introduces challenges into every stage of the data science workflow. We identify ethics and governance questions that arise when socializing data scientists around new privacy constraints and offer suggestions to better integrate DP and data science.
https://doi.org/10.1145/3544548.3580791
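The abstract above does not describe the internals of the DP prototype, but the standard way to release a privacy-preserving statistic under differential privacy is the Laplace mechanism. The sketch below is illustrative only (the function name and parameters are ours, not the study's tool), assuming a numeric query with known L1 sensitivity:

```python
import math
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return true_value plus Laplace(0, sensitivity/epsilon) noise.

    This satisfies epsilon-differential privacy for a query whose output
    changes by at most `sensitivity` when one individual's record changes.
    """
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of a Laplace(0, scale) variate.
    u = random.uniform(-0.5, 0.5)
    return true_value - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# Example: privately release a count (sensitivity 1) at a privacy budget of epsilon = 1.0.
noisy_count = laplace_mechanism(42.0, sensitivity=1.0, epsilon=1.0)
```

Smaller values of epsilon give stronger privacy but add more noise, which is one concrete form of the accuracy tension the study's participants encountered.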
Apple’s App Tracking Transparency framework allows users to decide whether they want to allow their activity to be tracked for advertising purposes. In this work we examine the tracking decisions made by 312 participants and their associations with privacy concern and personality factors, and conduct a thematic analysis on participants’ reasons for choosing to accept or reject tracking requests. Despite 51% of participants reporting that they had rejected tracking for privacy reasons, higher privacy concern scores did not correlate with a lower rate of tracking acceptance. Additionally, 43% of participants held incorrect beliefs about what tracking does, including nearly a quarter who mistakenly believed that accepting a tracking request would share their location with the requesting app. We suggest explanations for these misconceptions and provide recommendations that may improve usability of both App Tracking Transparency and future Privacy Enhancing Technologies.
https://doi.org/10.1145/3544548.3580654
Websites implement cookie consent interfaces to obtain users’ permission to use non-essential cookies, as required by privacy regulations. We extend prior research evaluating the impact of interface design on cookie consent through an online behavioral experiment (n = 1359) in which we prompted mobile and desktop users from the UK and US to make cookie consent decisions using one of 14 interfaces implemented with the OneTrust consent management platform (CMP). We found significant effects on user behavior and sentiment for multiple explanatory variables, including more negative sentiment towards the consent process among UK participants and lower comprehension of interface information among mobile users. The design factor that had the largest effect on user behavior was the initial set of options displayed in the cookie banner. In addition to providing more evidence of the inadequacy of current cookie consent processes, our results have implications for website operators and CMPs.
https://doi.org/10.1145/3544548.3580725
To improve user experience, Alexa now allows users to consent to data sharing via voice rather than directing them to the companion smartphone app. While verbal consent mechanisms for voice assistants (VAs) can increase usability, they can also undermine principles core to informed consent. We conducted a Delphi study with experts from academia, industry, and the public sector on requirements for verbal consent in VAs. Candidate requirements were drawn from the literature, regulations, and research ethics guidelines; participants rated them based on their relevance to the consent process, actionability by platforms, and usability by end-users, discussing their reasoning as the study progressed. We highlight key areas of (dis)agreement between experts, deriving recommendations for regulators, skill developers, and VA platforms towards crafting meaningful verbal consent mechanisms. Key themes include approaching permissions according to the user's ability to opt out, minimising consent decisions, and ensuring platforms follow established consent principles.
https://doi.org/10.1145/3544548.3580967
While the literature on permissions from the end-user perspective is rich, there is a lack of empirical research on why developers request permissions, their conceptualization of permissions, and how their perspectives compare with end-users' perspectives. Our study aims to address these gaps using a mixed-methods approach. Through interviews with 19 app developers and a survey of 309 Android and iOS end-users, we found that both groups shared similar concerns about unnecessary permissions breaking trust, damaging the app's reputation, and potentially allowing access to sensitive data. We also found that developer participants sometimes requested multiple permissions due to confusion about the scope of certain permissions or third-party library requirements. Additionally, most end-user participants believed they were responsible for granting a permission request, and it was their choice to do so, a belief shared by many developer participants. Our findings have implications for improving the permission ecosystem for both developers and end-users.
https://doi.org/10.1145/3544548.3581060