We present research on a walking meditation mat that leverages targeted heat to help meditators focus attention inward. The mat, measuring three meters in length, is designed with 10 visual signifiers and 10 corresponding heater pads arranged in a step-by-step pattern. Walking meditation is challenging, as it requires both inward and outward attention. In a qualitative study, we examined the walking meditation experience with and without heat, evaluating the impact of the mat’s visual signifiers and the gentle, feet-focused targeted heat during the walking experience. Our findings reveal the tension participants experience between external design factors and their internal meditation process. Visual signifiers were more commonly associated with outward attention, dizziness, and imbalance, while targeted heat affordances were more commonly associated with attention to bodily sensations, calmness, grounding, and reflection. We conclude with insights regarding the role of targeted heat in balancing inward and outward attention in walking meditation and introspective processes.
In redirected walking techniques, curvature gain and bending gain, collectively referred to as curvature manipulation, are important redirection gains. The applied gains can differ when multiple paths are mapped, and sudden changes in gain may cause discomfort. This study proposes quadratic curvature manipulation (QCM), based on the habituation mechanism, to effectively reduce discomfort. This method adjusts the path curvature quadratically, thereby reducing users’ perception of curvature changes. Furthermore, we introduce the segmented curvature change (SCC) mode, which combines QCM with linear curvature manipulation to facilitate more natural gain transitions and thereby further reduce discomfort. Two experiments were conducted. Experiment 1 examined the relationship between QCM parameters and the gains at which users felt discomfort. Experiment 2 further examined the effects of different curvature change modes on discomfort. The results indicate that using the SCC mode in curvature manipulations is more effective than the other methods at reducing discomfort.
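The gain-transition idea can be sketched as follows. This is a minimal illustration, not the paper's formulation: the exact easing functions, the `split` parameter, and the piecewise SCC construction are our assumptions.

```python
def linear_gain(g0, g1, s):
    """Linear curvature manipulation: the gain moves from g0 to g1
    at a constant rate over the transition (s in [0, 1])."""
    return g0 + (g1 - g0) * s

def quadratic_gain(g0, g1, s):
    """Quadratic curvature manipulation (QCM, sketch): the gain changes
    slowly at onset, making the start of the change less perceptible."""
    return g0 + (g1 - g0) * s ** 2

def scc_gain(g0, g1, s, split=0.5):
    """Segmented curvature change (SCC, sketch): a quadratic onset up to
    `split`, then a linear segment to the target gain; the two pieces
    meet continuously at s = split."""
    e = s ** 2 / split if s < split else s  # piecewise easing value
    return g0 + (g1 - g0) * e
```

The quadratic onset keeps the early rate of change small (its slope at s = 0 is zero), which is the hypothesized reason sudden gain changes feel less abrupt under QCM.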
Occlusion, often caused by the user's body or fingers, can significantly reduce the efficiency and usability of touch interfaces.
As foot-based interactions in HMDs become more prevalent, self-occlusion becomes a more pronounced issue due to the involvement of the body and legs.
This work presents SeeThroughBody, a body-rendering approach designed to mitigate occlusion and enhance touch interactions between the foot and interactive floor in virtual environments.
Our user study revealed two main findings. First, changing VisualizationStyles and BodyPartsVisibility can improve objective performance (e.g., time, movement) by reducing occlusion.
Second, these modifications also affect the subjective user experience (e.g., embodiment, usability). Different VisualizationStyles and BodyPartsVisibility settings have varying impacts, presenting trade-offs between performance and experience.
Based on these insights, we recommend Transparent-Foot and Outline-Foot for interactions focused on efficiency, and Transparent-All and Transparent-Thigh for enhancing overall user experience.
Finally, we demonstrate the application of these recommendations in a map browsing scenario using foot touch.
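A condition in this design space can be read as a pair (VisualizationStyle, BodyPartsVisibility) that determines how each body part is drawn. The sketch below illustrates that idea; the style names, part names, and opacity values are our assumptions, not the system's actual rendering parameters.

```python
def body_part_alpha(style, visible_parts, part):
    """Return a render opacity for `part` under a given condition
    (a hypothetical reading of the VisualizationStyle x
    BodyPartsVisibility design space)."""
    if part not in visible_parts:
        return 0.0   # occluding parts outside the visibility set are hidden
    if style == "Transparent":
        return 0.3   # visible parts drawn semi-transparent
    if style == "Outline":
        return 0.0   # fill hidden; only a silhouette would be drawn
    return 1.0       # opaque baseline rendering
```

Under this reading, Transparent-Foot keeps only the foot in the visibility set, while Transparent-All keeps every part semi-transparent.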
Walking is among the most common human activities where the feet can gather rich tactile information from the ground. The dynamic contact between the feet and the ground generates vibration signals that can be sensed by the foot skin. While existing research focuses on foot pressure sensing and lower-limb interactions, methods of decoding tactile information from foot vibrations remain underexplored. Here, we propose a foot-equipped wearable system capable of recording wideband vibration signals during walking activities. By enabling location-based recording, our system generates maps of haptic data that encode information on ground materials, lower-limb activities, and road conditions. Its efficacy was demonstrated through studies involving 31 users walking over 18 different ground textures, achieving an overall identification accuracy exceeding 95% (cross-user accuracy of 87%). Our system allows pedestrians to map haptic information through their daily walking activities, which has potential applications in creating digitalized walking experiences and monitoring road conditions.
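Texture identification from vibration windows can be sketched as feature extraction followed by classification. The feature set (RMS energy and zero-crossing rate) and the nearest-centroid classifier below are illustrative assumptions; the abstract does not specify the actual pipeline.

```python
import math

def features(window):
    """Two simple features per vibration window (assumed feature set):
    RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    zcr = sum(1 for a, b in zip(window, window[1:]) if a * b < 0) / (len(window) - 1)
    return (rms, zcr)

def identify_texture(window, centroids):
    """Nearest-centroid identification: `centroids` maps a ground-material
    label to its mean feature vector from training walks."""
    f = features(window)
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2 for a, b in zip(f, centroids[lbl])))
```

For example, a high-energy, rapidly oscillating window would land near a rough-surface centroid, while a low-energy window would land near a soft-surface one.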
Walking on inclined surfaces is common in some Virtual Reality (VR) scenarios, for instance, when moving between floors of a building, climbing a tower, or ascending a virtual mountain. Existing approaches enabling realistic walking experiences in such settings typically require the user to use bulky walking-in-place hardware or to walk through a large physical area. Addressing this challenge, we present RedirectedStepper, a locomotion technique leveraging a novel device based on a mini exercise stepper that provides realistic VR staircase walking experiences by alternating the tilt of the two stepper pedals.
RedirectedStepper employs a new exponential mapping function to visually morph the user's real foot motion to a corresponding curved path in the virtual environment (VE).
Combining this stepper with the visual mapping function provides an in-place locomotion technique that allows users to virtually ascend an infinite staircase or slope while walking-in-place (WIP). We conducted three within-subject user studies (n=36) comparing RedirectedStepper with a Kinect-based WIP locomotion technique. Our studies indicate that RedirectedStepper improves users' sense of realism when walking on staircases in VR. Based on a set of design implications derived from the user studies, we developed SnowRun, a VR exergame application demonstrating the use of the RedirectedStepper concept.
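An exponential mapping of in-place foot motion onto a curved virtual path might look like the sketch below. The saturating form `1 - exp(-k*d)`, the parameter names, and the values are our assumptions; the actual mapping function used by RedirectedStepper is not given in the abstract.

```python
import math

def virtual_pose(d, k=0.5, r=3.0):
    """Hypothetical exponential mapping: accumulated in-place step
    distance d (meters) is bent onto a curved virtual path. The
    lateral offset grows toward a saturation radius r as the user
    keeps stepping, while forward progress continues unbounded."""
    lateral = r * (1.0 - math.exp(-k * d))  # bounded lateral curve
    forward = d                             # unbounded virtual progress
    return forward, lateral
```

A bounded lateral term of this kind is one way in-place stepping can be morphed into a smoothly curving virtual trajectory without requiring matching physical space.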
To interact with Augmented Reality (AR) content while walking, user interfaces (UIs) need to move along with the user without obstructing their field of view. This paper investigates on-hand reference frames for AR interaction while walking. First, we conduct a user study evaluating six on-hand reference frames. Results show that Pinch Grip With Offset (PGWO), which anchors UIs to the pinch grip while floating at a distance, outperforms the other on-hand reference frames in speed, accuracy, workload, and user preference. Next, we conduct a follow-up study comparing PGWO’s performance with the head and torso reference frames commonly used in previous studies, to see whether PGWO’s benefits hold up against well-established reference frames. Results revealed better performance and higher user preference for PGWO than for both reference frames. Finally, we present design recommendations for developing future AR systems that are more efficient and user-friendly for on-the-go interaction.
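The PGWO anchoring idea can be sketched as placing the UI a fixed offset in front of the pinch point along the hand's forward axis. The function name, vector representation, and 0.25 m offset are illustrative assumptions.

```python
def pgwo_anchor(pinch_pos, hand_forward, offset=0.25):
    """Pinch Grip With Offset (sketch): position the UI panel a fixed
    offset (meters, assumed value) in front of the pinch point along
    the hand's unit forward axis, so the UI follows the hand while
    walking yet floats at a comfortable viewing distance."""
    return tuple(p + offset * f for p, f in zip(pinch_pos, hand_forward))
```

Recomputing this pose each frame keeps the UI hand-anchored, unlike head or torso frames that move with the body rather than the interacting hand.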
Existing measurements of driver distraction in laboratory settings lack construct and ecological validity and therefore cannot provide reliable estimates of in-car tasks’ distraction effects. In this paper, we operationalize driver distraction in a novel way with the help of Drive-In Lab, where any passenger car can be connected to a driving simulation. The operationalization is based on drivers’ headway maintenance during in-car tasks as compared to baseline driving, while accommodating situational and driver-specific variables such as brake response times. Realistic visual looming cues enable evaluation of distraction effects on cognitive processes crucial for safe driving. Validation studies with two 2024 car models indicate that the method can reliably differentiate the distraction effects of cars, in-car tasks, and drivers as having a large, medium, small, or no effect on crash potential. The method supports the design of in-car interactions by providing valid means to reveal the worst and best practices in in-car user interface design.
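The headway-maintenance comparison can be illustrated with a simple effect-size computation. Using Cohen's d with the conventional 0.2/0.5/0.8 cut-offs is our assumption for the sketch; the paper's actual operationalization additionally accommodates situational and driver-specific variables such as brake response times.

```python
from statistics import mean, stdev

def distraction_effect(baseline_headways, task_headways):
    """Sketch of the headway-maintenance idea: compare headway (seconds)
    during an in-car task to baseline driving and report a standardized
    effect size with a large/medium/small/none label (assumed metric)."""
    pooled = ((stdev(baseline_headways) ** 2 + stdev(task_headways) ** 2) / 2) ** 0.5
    d = (mean(task_headways) - mean(baseline_headways)) / pooled
    size = abs(d)
    if size >= 0.8:
        return d, "large"
    if size >= 0.5:
        return d, "medium"
    if size >= 0.2:
        return d, "small"
    return d, "none"
```

A large positive effect here would mean the driver opens up substantially more headway during the task than at baseline, a compensation pattern consistent with distraction.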