This study session has ended. Thank you for participating.
With the increasing usage of mental health applications (MHAs), there is growing concern regarding their data privacy practices. Analyzing 437 user reviews from 83 apps, we outline users' predominant privacy and security concerns with currently available apps. We then compare those concerns to criteria from two prominent app evaluation websites -- Privacy Not Included and One Mind PsyberGuide. Our findings show that MHA users have myriad data privacy and security concerns, including control over their own data, but these concerns rarely overlap with those of experts from evaluation websites, who focus more on issues such as required password strength. We highlight this disconnect and propose ways the mental health care ecosystem can better guide MHA users in choosing, and privacy, security, and mental health technology experts in evaluating, potentially useful mental health apps.
The web uses permission prompts to moderate access to certain capabilities. We present the first investigation of user behavior around, and sentiment toward, this security and privacy measure on the web, using 28 days of telemetry data from more than 100M Chrome installations on desktop platforms and experience-sampling responses from 25,706 Chrome users. Based on this data, we find that ignoring and dismissing permission prompts are most common for geolocation and notifications. Permission prompts are perceived as more annoying and interrupting when they are not allowed, and most respondents cite a rational reason for the decision they made. Our data also indicate that the perceived availability of contextual information from the requesting website is associated with allowing access to a requested capability. More usable permission controls could facilitate adoption of best practices that address several of the identified challenges and could ultimately lead to better user experiences and a safer web.
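The prompts studied here are surfaced by standard web APIs. As a minimal sketch (not the study's instrumentation), the snippet below shows how a page queries and requests the two capabilities the study found most often ignored or dismissed, geolocation and notifications:

```typescript
// Query the current permission state without triggering a prompt.
const geoStatus = await navigator.permissions.query({ name: "geolocation" });
console.log("geolocation state:", geoStatus.state); // "granted" | "denied" | "prompt"

// Invoking the capability itself is what surfaces the geolocation prompt.
navigator.geolocation.getCurrentPosition(
  (pos) => console.log("allowed:", pos.coords.latitude, pos.coords.longitude),
  (err) => console.log("denied or dismissed:", err.message),
);

// Notifications expose an explicit request call; "default" corresponds to a
// prompt that was dismissed or ignored rather than decided.
const decision = await Notification.requestPermission();
console.log("notification decision:", decision); // "granted" | "denied" | "default"
```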
With increased interest in leveraging personal data collected through 24/7 mobile sensing for digital healthcare research, supporting user-friendly consent to data collection has become important for protecting user privacy. This work proposes PriviAware, a mobile app that promotes flexible user consent to data collection through data exploration and contextual filters that let users turn off data collection at times and places they consider privacy-sensitive. We conducted a user study (N = 58) to explore how users leverage the data exploration and contextual filter functions to explore and manage their data, and whether our system design helped users mitigate their privacy concerns. Our findings indicate that offering fine-grained control is a promising approach to raising users' privacy awareness given the dynamic nature of pervasive sensing contexts. We provide practical privacy-by-design guidelines for mobile sensing research.
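To illustrate what such contextual filters might look like, here is a hypothetical sketch; the types, names, and distance approximation are assumptions for exposition, not PriviAware's actual implementation:

```typescript
// Hypothetical contextual filters: collection pauses inside user-defined
// time windows and geofenced places marked as privacy-sensitive.
interface TimeFilter { startHour: number; endHour: number }          // e.g. 22 to 7
interface PlaceFilter { lat: number; lon: number; radiusM: number }  // geofence

function withinTime(f: TimeFilter, now: Date): boolean {
  const h = now.getHours();
  // Handle windows that wrap past midnight (e.g. 22:00-07:00).
  return f.startHour <= f.endHour
    ? h >= f.startHour && h < f.endHour
    : h >= f.startHour || h < f.endHour;
}

function withinPlace(f: PlaceFilter, lat: number, lon: number): boolean {
  // Equirectangular approximation; adequate at geofence scale.
  const R = 6371000; // Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const x = toRad(lon - f.lon) * Math.cos(toRad((lat + f.lat) / 2));
  const y = toRad(lat - f.lat);
  return Math.sqrt(x * x + y * y) * R <= f.radiusM;
}

// Sensing proceeds only if no privacy-sensitive filter currently matches.
function shouldCollect(
  times: TimeFilter[], places: PlaceFilter[],
  now: Date, lat: number, lon: number,
): boolean {
  return !times.some((t) => withinTime(t, now)) &&
         !places.some((p) => withinPlace(p, lat, lon));
}
```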
Users need to configure default apps when they first start using their devices, yet these apps' privacy configurations do not always match what users think they have enabled. We first explored the privacy configurations of eight default apps: Safari, Siri, Family Sharing, iMessage, FaceTime, Location Services, Find My, and Touch ID, and discovered serious issues with their documentation. Based on this, we studied users' experiences in an interview study (N=15). We show that the instructions for setting the privacy configurations of default apps are vague and lack required steps; users were unable to stop default apps from accessing their personal information; users assumed they were being tracked by some default apps; and default apps may cause tensions in family relationships because of information sharing. Our results illuminate the privacy and security implications of configuring the privacy of default apps and how users understand the mobile ecosystem.
The widespread sharing of consumers' personal information with third parties raises significant privacy concerns. The California Consumer Privacy Act (CCPA) mandates that online businesses offer consumers the option to opt out of the sale and sharing of personal information. Our study automatically tracks the presence of the opt-out link longitudinally across multiple states after the California Privacy Rights Act (CPRA) went into effect. We categorize websites based on whether they are subject to CCPA and investigate cases of potential non-compliance. We find a number of websites that implement the opt-out link early and across all examined states, but also a significant number of CCPA-subject websites that fail to offer any opt-out method even when CCPA is in effect. Our findings shed light on how websites are reacting to the CCPA and identify potential gaps in compliance and in opt-out method designs that hinder consumers from exercising their CCPA opt-out rights.
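To illustrate the kind of measurement such a study automates, below is a hypothetical sketch that fetches a homepage and searches for common opt-out link labels; the patterns and helper are assumptions for exposition, not the authors' pipeline:

```typescript
// Common labels for the CCPA/CPRA-mandated opt-out link. Real crawls would
// also render pages, follow footers into privacy pages, and handle variants.
const OPT_OUT_PATTERNS = [
  /do\s+not\s+sell(\s+or\s+share)?\s+my\s+personal\s+information/i,
  /your\s+privacy\s+choices/i, // label commonly used after CPRA took effect
];

// Fetch the page and report whether any opt-out label appears in its HTML.
async function hasOptOutLink(url: string): Promise<boolean> {
  const res = await fetch(url, { redirect: "follow" });
  const html = await res.text();
  return OPT_OUT_PATTERNS.some((p) => p.test(html));
}

// Example usage: console.log(await hasOptOutLink("https://example.com"));
```

Repeating such a check from vantage points in different states, before and after enforcement dates, yields the longitudinal presence data the abstract describes.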