Musical emotion modulation is central to mental well-being, yet existing affective haptic systems often prioritize technical feasibility over investigating where stimuli should be applied and which stimuli are optimal.
This paper systematically examines the emotional effects of vibrotactile, thermotactile, and combined stimuli based on the Valence–Arousal model across the wrist, neck, and ear under two music conditions, indexed by EEG measures and subjective ratings.
We found that both body site and stimulus type significantly influenced emotional responses. The ear strongly enhanced arousal, the neck produced context-dependent effects, and the wrist primarily modulated pleasantness. Vibration primarily boosted arousal, thermal cues enhanced valence, and their combination enabled a more balanced, immersive experience.
Our findings provide scientific guidance for future affective wearable design in various contexts.
Haptic perception on touchscreens varies across fingers, yet little is known about how finger identity and multi-finger use shape tactile discrimination and user experience. We conducted two experiments with four types of haptic feedback. In Experiment 1, right-handed participants explored the feedback with each of their ten fingers individually under stationary and moving conditions. Experiment 2 examined two-finger sequences with the same participants. Results showed that moving exploration enhanced accuracy, confidence, and enjoyment, while stationary touch increased cognitive and physical load, especially for weaker fingers such as the left ring and pinky. The right thumb and index consistently performed best. In dual-finger trials, moving exploration improved second-finger performance, and adjacent same-hand pairs (e.g., Left Index–Left Thumb, Right Thumb–Right Index) yielded higher synergy. These findings highlight the role of finger anatomy, motion, and coordination, and provide concrete guidelines on which fingers (or combinations) and exploration modes to assign for haptic surfaces that optimize accuracy, comfort, and engagement.
Decades of online fashion retail and investment in its usability have led to a seemingly refined user experience. Yet, our study shows that female online shoppers, who make up the largest user group, experience a conflicted love-hate relationship when shopping online. Adopting a feminist HCI perspective, we contribute insights from a multi-step qualitative approach involving probes, co-design, iterative prototyping and body maps. We demonstrate that even screen-based website designs are deeply entangled with users’ embodied experiences. Through our analysis, we identify where such designs contribute to heightened emotional labour and negative user experiences. Our work offers concrete design implications centred around inclusivity, the predictive user experience of wearing and caring for garments, and transparency of information. We embody these implications in an interactive prototype and use it to validate our recommendations for a body-centred approach to UX design.
Occupational exoskeletons are designed to support workers in strenuous tasks and to promote health, yet their implementation and use often present challenges due to the close interaction between wearer and device. This study explored user perceptions of occupational exoskeletons through qualitative focus groups conducted after participants had gained hands-on experience with 16 different devices in four-hour trials. Key findings capture users' feedback on system sound, design, and support, as well as movement restriction and wearer comfort, and underscore the important role of bodily sensations, alongside factors such as usability and appearance, in exoskeleton user experience. A central discovery was the existence of conflicts between user preferences, for instance between lightweight designs and effective user support. Based on these insights, we highlight implications for human-centered design of exoskeletons and aim to inspire further research within the human-computer interaction community.
Virtual Reality (VR) emphasizes immersive experiences, while text entry often requires hands or visual attention, which may disrupt interaction flow in VR. We present AnkleType, a hands- and eyes-free text-entry technique that leverages ankle-based gestures for both standing and sitting situations. We began with two preliminary studies: one investigated the movement range of users' ankles, and the other elicited user-preferred ankle gestures for text-entry-related operations. The findings of these two studies guided our design of AnkleType. To optimize AnkleType's keyboard layout for eyes-free input, we conducted a user study to capture users' natural ankle spatial awareness with a computer-simulated language test. Through a pairwise comparison study, we designed a bipedal input strategy for sitting (BPSit) and a unipedal input strategy for standing (UPStand). We further evaluated our design in a 7-day longitudinal study with 12 participants. Participants achieved an average typing speed of 15.05 WPM with UPStand and 16.70 WPM with BPSit in the visual condition, and 11.15 WPM and 12.87 WPM, respectively, in the eyes-free condition.
Shape-changing wearables are known to convey emotions to wearers and observers, and jewelry is commonly worn for self-expression and to be seen by others.
But how do individual shape change parameters impact the emotions communicated?
In a first study, participants observed a shape-changing necklace; in a second, they additionally wore it.
The necklace uses pneumatic finger actuators; fabrication details are provided.
We systematically varied motion type, speed, amplitude and repetition, and exterior material, and analyzed the resulting emotions using Russell's circumplex model. Additionally, we asked users what they associated with each shape change. We found surprising relationships between shape change parameters and the valence and arousal levels of the emotions wearers and observers perceived.
Symmetrical actuations were recognized more accurately and received higher valence and arousal ratings.
Interestingly, even when wearers (who only felt the motions) misidentified them, their ratings matched those of observers.
Our findings support creating shape-changing interfaces that communicate emotions more precisely.
Everyday object-based interactions (EOIs) and mid-air gesture interactions (MAIs) have been widely explored, yet prior work on their integration often targets narrow use cases or specific technologies, leaving designers and developers with limited guidance that generalizes across diverse EOIs and MAIs. We introduce Objestures (“Obj” + “Gestures”)—five interaction types spanning EOIs and MAIs, forming a design space for expressive uni- and bimanual interaction. To evaluate the usefulness of Objestures, we conducted an exploratory user study (N=12) on basic 3D tasks (rotation and scaling), which showed performance comparable to the headset's native freehand manipulation. To understand the user experience, we conducted case studies with the same participants across three applications (Sound, Draw, and Shadow), where participants found the interactions intuitive, engaging, and expressive, and indicated interest in everyday use. We further demonstrate the potential of Objestures across diverse contexts through 30 examples, and discuss limitations and implications.