This paper examines the tensions between neighborhood gentrification and community surveillance posts on Nextdoor, a hyperlocal social media platform for neighborhoods. We created a privacy-preserving pipeline to gather research data from public Nextdoor posts in Atlanta, Georgia, and filtered these to a dataset of 1,537 community surveillance posts. We developed a qualitative codebook to label observed patterns of community surveillance and deployed a large language model to tag these posts at scale. Ultimately, we present an extensible and empirically tested typology of the modes of community surveillance that occur on hyperlocal platforms. We find a complex relationship between community surveillance posts and neighborhood gentrification, which indicates that publicly disclosing information about perceived outsiders, especially for petty crimes, is most prevalent in gentrifying neighborhoods. Our empirical evidence informs critical perspectives which posit that community surveillance on platforms like Nextdoor can exclude and marginalize minoritized populations, particularly in gentrifying neighborhoods. Our findings carry broader implications for hyperlocal social platforms and their potential to amplify and exacerbate social tensions and exclusion.
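To make the tagging step concrete, here is a minimal sketch of how an LLM could assign codebook labels to posts at scale. The labels, prompt, and model choice below are illustrative assumptions, not the authors' actual codebook or pipeline.

```python
# Hypothetical sketch: labeling posts with an LLM against a fixed codebook.
# The labels and prompt are invented for illustration.
from openai import OpenAI

# Illustrative labels standing in for the paper's qualitative codebook.
CODEBOOK = ["suspicious_person", "petty_crime", "property_crime", "not_surveillance"]

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def tag_post(post_text: str) -> str:
    """Ask the model to assign exactly one codebook label to a post."""
    prompt = (
        "You are labeling neighborhood posts for a study of community surveillance.\n"
        f"Choose exactly one label from: {', '.join(CODEBOOK)}.\n\n"
        f"Post: {post_text}\n\nLabel:"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",          # placeholder model choice
        messages=[{"role": "user", "content": prompt}],
        temperature=0,                # deterministic labeling for reproducibility
    )
    label = resp.choices[0].message.content.strip()
    # Fall back to a safe default if the model returns an unexpected string.
    return label if label in CODEBOOK else "not_surveillance"
```

In practice such a tagger would be validated against the human-coded subset before being applied to the full dataset.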
Everyday Augmented Reality (AR) headsets pose significant privacy risks, potentially allowing prolonged sensitive data collection about both users and bystanders (e.g., members of the public). While users control data access through permissions, current AR systems inherit smartphone permission prompts, which may be less appropriate for all-day AR. This constrains informed choices and risks over-privileged access to sensors. We propose a novel AR permission control system that allows better-informed privacy decisions and evaluate it in a user study (N=20) using five mock application contexts. Our system's novelty lies in enabling users to experience the varying impacts of permission levels on not only a) privacy, but also b) application functionality. This empowers users to better understand what data an application depends on and how its functionalities are impacted by limiting said data. Participants found that our method allows for making better-informed privacy decisions, and deemed it more transparent and trustworthy than state-of-the-art AR and smartphone permission systems taken from Android and iOS. Our results offer insights into new and necessary AR permission systems, improving user understanding and control over data access.
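The core idea, letting users see how each permission level trades privacy against functionality, can be illustrated with a small data structure. The tiers and affected features below are assumptions for the sake of the sketch; the paper's actual system and granularity may differ.

```python
# Illustrative sketch of tiered AR sensor permissions and their functional cost.
# Permission tiers and affected features are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class PermissionLevel:
    name: str
    description: str
    disabled_features: tuple[str, ...]  # functionality lost at this level

# Hypothetical camera-permission tiers for an all-day AR headset.
CAMERA_LEVELS = [
    PermissionLevel("full", "raw camera frames", ()),
    PermissionLevel("abstracted", "object labels only, no imagery",
                    ("photo capture", "scene recording")),
    PermissionLevel("off", "no camera access",
                    ("photo capture", "scene recording", "object overlays")),
]

def preview_impact(level: PermissionLevel) -> str:
    """Summarize what the user gives up at a given permission level,
    mirroring the idea of letting users experience the trade-off."""
    if not level.disabled_features:
        return f"'{level.name}': all features available."
    return f"'{level.name}': loses {', '.join(level.disabled_features)}."

for lvl in CAMERA_LEVELS:
    print(preview_impact(lvl))
```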
Active and Assisted Living (AAL) technologies aim to enhance the quality of life of older adults and promote successful aging. While video-based AAL solutions offer rich capabilities for better healthcare management in older age, they pose significant privacy risks. To mitigate these risks, we developed a video-based monitoring system that incorporates different privacy-preserving filters. We deployed the system in one assistive technology center and conducted a qualitative study with older adults and other stakeholders involved in care provision. Our study demonstrates diverse users' perceptions and experiences with video-monitoring technology and offers valuable insights for the system's further development. The findings unpack the privacy-versus-safety trade-off inherent in video-based technologies and show how the privacy-preserving mechanisms within the system mitigate privacy-related concerns. The study also identifies varying stakeholder perspectives towards the system in general and highlights potential avenues for developing video-based monitoring technologies in the AAL context.
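As a concrete example of one kind of privacy-preserving filter such a system might apply, the sketch below pixelates frames before they leave the camera pipeline, assuming OpenCV is available. The filter choice and parameters are illustrative; the deployed system's actual filters may differ.

```python
# Minimal sketch of a privacy-preserving video filter: pixelation.
# Filter choice and parameters are illustrative assumptions.
import cv2

def pixelate(frame, blocks: int = 16):
    """Downscale then upscale a frame so identities are obscured but
    coarse activity (e.g., a fall) remains visible to caregivers."""
    h, w = frame.shape[:2]
    small = cv2.resize(frame, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

cap = cv2.VideoCapture(0)  # placeholder: any camera source
ok, frame = cap.read()
if ok:
    cv2.imwrite("filtered.png", pixelate(frame))
cap.release()
```

Stronger or weaker filters (blurring, silhouettes, avatars) sit at different points on the privacy-versus-safety trade-off the study examines.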
To address privacy concerns with Internet of Things (IoT) devices, researchers have proposed enhancements in data collection transparency and user control. However, managing privacy preferences for shared devices with multiple stakeholders remains challenging. We introduce ThingPoll, a system that helps users negotiate privacy configurations for IoT devices in shared settings. We designed ThingPoll by observing twelve participants verbally negotiating privacy preferences, from which we identified potentially successful and inefficient negotiation patterns. ThingPoll bootstraps a preference model from a custom crowdsourced privacy preferences dataset. During negotiations, ThingPoll strategically scaffolds the process by eliciting users' privacy preferences, providing helpful contexts, and suggesting feasible configuration options. We evaluated ThingPoll with 30 participants negotiating the privacy settings of 4 devices. Using ThingPoll, participants reached an agreement in 97.5% of scenarios within an average of 3.27 minutes. Participants reported a high overall satisfaction rate of 83.3% with ThingPoll, compared to baseline approaches.
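The core of suggesting feasible configuration options can be sketched as picking the device setting that best satisfies all stakeholders' elicited preferences. The configuration space, scores, and max-min rule below are invented for illustration and are not ThingPoll's actual preference model.

```python
# Hedged sketch: suggest the configuration maximizing the minimum
# stakeholder satisfaction. Options and scores are hypothetical.
from itertools import product

# Hypothetical configuration space for a shared smart camera.
OPTIONS = {
    "recording": ["always", "only_when_away", "never"],
    "audio": ["on", "off"],
}

# Elicited preference scores per stakeholder (0 = unacceptable .. 2 = preferred).
PREFERENCES = {
    "resident": {("recording", "only_when_away"): 2, ("recording", "always"): 0,
                 ("recording", "never"): 1, ("audio", "off"): 2, ("audio", "on"): 1},
    "roommate": {("recording", "only_when_away"): 1, ("recording", "always"): 2,
                 ("recording", "never"): 0, ("audio", "off"): 1, ("audio", "on"): 2},
}

def suggest_configuration():
    """Return the configuration maximizing the minimum stakeholder score,
    so no one is left with an unacceptable setting."""
    best, best_score = None, -1
    for combo in product(*OPTIONS.values()):
        config = dict(zip(OPTIONS.keys(), combo))
        # Each stakeholder's score is their worst rating across settings.
        scores = [min(prefs[(k, v)] for k, v in config.items())
                  for prefs in PREFERENCES.values()]
        if min(scores) > best_score:
            best, best_score = config, min(scores)
    return best

print(suggest_configuration())
```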
In today's digital age, searching for information online is considered a ubiquitous task that can be accomplished in just a few moments using various web-based technologies. Yet, information seeking carries geopolitical burdens for users who are racialized and marginalized by the nation-state and other structures of power. In our paper, we conducted a qualitative interview study with 15 Muslim participants living in the US, mostly of South Asian origin and with varying citizenship or (non)immigration statuses, about their information needs, their privacy concerns as Muslims, and the resulting restrictive patterns of information seeking on various Internet platforms. We argue that our findings on the barriers faced and strategies employed by Muslim residents toward information access suggest a broader pattern of digital manifestations of border imperialism. We posit that HCI researchers should pay attention to how "digital borders" have epistemic implications for people marginalized by geopolitical boundaries.