Advancements in artificial intelligence are challenging current policy frameworks. Both the human-computer interaction (HCI) community and policymakers recognize that technologies are designed better when their societal impact is taken into account, and that policies are more effective when they are grounded in technical knowledge. Design research can be a powerful lens for supporting policy design processes. Motivated by the potential of design research to inform technology policy development, the Robot Policy Design Toolkit (RPDT) was designed to support forecasting of robot technology policy and to facilitate policy design experiences through a speculative design approach, centering the design principles of forecasting, compromise, and simplicity. This paper introduces the toolkit's design, reveals insights into how technologists design policies around social robots, and offers reflections from technology policy experts on the value and potential of design research tools, such as the RPDT, in policymaking contexts.
This paper presents a framework of ethical encounters in design practice, grounded in 98 accounts of practitioners' experiences with ethics in their work. While HCI and design scholarship have produced a growing body of empirical work on design ethics, less attention has been given to concept-building informed by practice. Building on practice-oriented design ethics research in HCI, we define ethical encounters as practitioner-identified situations that expose tensions and value-laden decisions, emerging from the situated realities of day-to-day design work. Our analysis reveals three key dimensions of these encounters: the perceived issues, the actions taken to navigate them, and the new capacities emerging through them. Our analysis also considers how these dimensions play out differently across project phases. The framework offers a shared language and practical guidance for understanding and engaging with ethical challenges, and it contributes by framing ethical encounters as generative for relationships, ideas, and directions in HCI and design practice.
Live commerce platforms frequently employ algorithmic recommendations and time-limited promotions to trigger impulsive purchases, challenging rational consumer decision-making. While existing research has identified manipulative design patterns in live commerce, significant gaps remain in understanding consumers' psychological motivations and in developing counter-persuasion interventions. We conducted a multi-stage formative study involving surveys (N = 116), interviews (N = 21), and co-design workshops (N = 16) to explore user preferences for rational consumption support systems. Informed by these insights, we designed BuyMate, which provides gentle, real-time rational interventions through product comparison and persuasive speech reframing. A user evaluation (N = 35) demonstrates that the system effectively reduces impulsive purchases, enhances decision autonomy, and promotes sustainable consumption. This work contributes an AI-driven counter-persuasion approach, identifies user-centered principles for adaptive interventions, and offers practical guidance for responsible AI in digital commerce.
Research on dark patterns has grown rapidly, but challenges remain in situating these practices within broader socio-technical, legal, and design contexts. In this essay, I introduce the concept of the "dark patterns knowledge stack" as a new way of synthesizing evidence about manipulative, coercive, and deceptive design practices. Inspired by Alexander's notion of a pattern language, I demonstrate how the knowledge stack aligns multiple layers of analysis and evidence, from interfaces and user characteristics to the socio-technical landscape and user intentions, revealing how manipulative practices interrelate across scales, are perpetuated through key business metrics, and evolve over time. Use of the knowledge stack is demonstrated through two case studies, followed by provocations for scholars, regulators, and practitioners to work together to more effectively identify harms, negotiate accountability, and chart pathways toward more just and transparent digital systems.
Live streaming platforms are rapidly emerging as popular forms of online entertainment. Yet many users report challenges, most notably a loss of autonomy in managing their spending and consumption. This erosion of autonomy is often reinforced by deceptive patterns embedded in platform design. In this paper, we examine how users experience and perceive autonomy on live streaming platforms. Through a systematic analysis of Douyin, a popular live streaming platform, we identified 27 deceptive patterns, complemented by 15 semi-structured interviews with platform users. Our findings reveal that users experience significant losses of autonomy, including compulsive overspending, disrupted daily routines, forced labor patterns, and increased anxiety, and express strong desires for autonomy-supportive alternatives. We provide insights into the design of future live streaming platforms as autonomy-supportive installations, introducing an installation theory–derived framing that helps identify where autonomy breaks down and envision user-centered, layered alternatives within entrenched platform infrastructures.
Dark patterns are design practices that undermine users' ability to make autonomous and informed choices in digital experiences. The EU Digital Services Act (DSA) seeks to protect users from such designs and their effects, with Article 25 prohibiting three autonomy violation types: deception, manipulation, and distortion/impairment. Demonstrating such regulatory violations, however, requires design-oriented reasoning to articulate why an observed design practice constitutes a specific autonomy violation type. This paper maps 59 known dark patterns onto the three autonomy violation types from the DSA and identifies eight new design factors that can help determine when a dark pattern violates autonomy. Our mapping of dark patterns to autonomy violations grounds ongoing regulatory debates in design while opening pathways for translational research that reimagines how HCI engages with the governance of design practices.