As Smart Home Personal Assistants (SPAs) evolve into social agents, understanding user privacy necessitates interpersonal communication frameworks such as Privacy Boundary Theory (PBT). To ground our investigation, a three-phase preliminary study (1) identified transmission and sharing ranges as key boundary-related risk factors, (2) categorized relevant SPA functions and data types, and (3) analyzed commercial practices, revealing widespread data sharing and non-transparent safeguards. A subsequent mixed-methods study (N=412 survey; N=40 interviews drawn from the survey participants) assessed users' perceived privacy risks across data types, transmission ranges, and sharing ranges. Results demonstrate a significant, non-linear escalation in perceived risk when data crosses two critical boundaries: the `public network' (transmission) and `third parties' (sharing). This boundary effect holds across data types and demographics. Furthermore, risk perception is modulated by data attributes and contextual privacy calculus. Conversely, anonymization shows limited efficacy, especially for third-party sharing, a finding we attribute to user distrust. These findings empirically ground PBT in the SPA context and inform the design of boundary-aware privacy protection.
ACM CHI Conference on Human Factors in Computing Systems