Detecting rhetoric that manipulates readers' emotions requires distinguishing intrinsically emotional content (IEC; e.g., a parent losing a child) from emotionally manipulative language (EML; e.g., using fear-inducing language to spread anti-vaccine propaganda). However, this remains an open classification challenge for both automatic and crowdsourcing approaches. Machine learning approaches work only in narrow domains where labeled training data is available, and non-expert annotators tend to conflate IEC with EML. We introduce an approach, anchor comparison, that leverages workers' ability to identify and remove instances of EML in text to create a paraphrased "anchor text", which is then used as a comparison point for classifying EML in the original content. We evaluate our approach with a dataset of news-style text snippets and show that precision and recall can be tuned to system builders' needs. Our contribution is a crowdsourcing approach that enables non-expert annotators to disentangle intrinsically emotional content from emotionally manipulative language.
Attending to breath is a self-awareness practice found in many contemplative and reflective traditions and recognized for its benefits to well-being. The current technological landscape includes a large body of systems that use breath data to foster self-awareness. This paper seeks to deepen our understanding of the design space of systems that perceptually extend breath awareness. We review and critically analyze 31 breath-based interactive systems, identifying 4 theoretical frameworks and 3 design strategies for interactive systems that perceptually extend breath awareness. Our contribution is twofold: (1) our analysis reveals how the underlying theoretical frameworks shape system design and evaluation, and (2) it shows how system design features support the perceptual extension of breath awareness. We reflect on this design space from both a theoretical and a system design perspective, and propose future design directions for systems that "listen to" breath and perceptually extend it.
We describe a Research through Design project, Curious Cycles: a collection of objects and interactions that encourage people to be in close contact with their menstruating bodies. Throughout a full menstrual cycle, five participants used Curious Cycles to look at their bodies in unfamiliar ways and to touch their bodily fluids, specifically menstrual blood, saliva, and cervical mucus. The act of touching and looking led to the construction of new knowledge about the self and to a nurturing appreciation for the changing body. Yet participants encountered and reflected upon frictions within themselves, their homes, and their social surroundings, which stem from societal stigma and preconceptions about menstruation and bodily fluids. We call for and show how interaction design can engage with technologies that mediate self-touch as a first step towards reconfiguring the way menstruating bodies are treated in society.
Biosensing technologies are increasingly available as off-the-shelf products, yet for many designers, artists, and non-engineers these technologies remain difficult to design with. Through a soma design stance, we devised a novel approach for exploring qualities in biodata. Our explorative process culminated in the design of three artefacts, coupling biosignals to tangible actuation formats. Making biodata perceivable as sound, in tangible form, or directly on the skin made it possible to link qualities of the measurements to our own somatics, that is, our felt experience of our bodily bioprocesses as they dynamically unfold, spurring somatically grounded design discoveries of novel possible interactions. We show that making biodata available to felt experience (or, as we frame it, turning biodata into somadata) not only enables first-person encounters but also supports collaborative design processes, as somadata can be shared and experienced dynamically, right at the moment when design ideas are explored.
The subjective experience of emotion is notoriously difficult to communicate interpersonally. We believe that technology can challenge this notion through the design of neuroresponsive systems for interpersonal communication. We explore this through "Neo-Noumena", a communicative neuroresponsive system that uses brain-computer interfacing and artificial intelligence to read one's emotional states and dynamically represent them to others in mixed reality through two head-mounted displays. In our study, five participant pairs were given Neo-Noumena for three days and used the system freely. Measures of emotional competence showed a statistically significant increase in participants' ability to regulate emotions interpersonally. Furthermore, participant interviews revealed themes regarding Spatiotemporal Actualization, Objective Representation, and Preternatural Transmission. We also suggest design strategies for future augmented emotion communication systems. We intend this work to give guidance towards a future in which our ability to communicate emotion interpersonally is augmented beyond traditional experience.