This study session has ended. Thank you for participating.
Simultaneous declines in visual function (e.g., dynamic visual acuity), cognitive ability (e.g., cognitive control/multitasking), and physical function (e.g., balance) are major symptoms of aging. Integrating stimulation of these sensory channels into a game could be a suitable way for older adults to engage in long-term health interventions. However, existing game designs have not considered the relationship and synergistic impact of the multisensory channels of dynamic visual acuity, cognitive ability, and physical function for older adults. We therefore developed the first multisensory VR game system prototype based on cognitive psychology paradigms (e.g., multitasking and Go/No-Go tasks), full-body movement (limb movement), and dynamic visual acuity exercises (horizontal, vertical, and forward-backward eye movements) in a VR environment. We then conducted an experiment to measure the acceptability (in terms of, e.g., cybersickness and mental workload) of our VR game for older adults. Younger adults and a PC task were included for comparison. Qualitative and quantitative results showed that older adults did not experience cybersickness in either sitting or standing postures during the VR gameplay, and they found the workload of the VR game acceptable compared to the PC task. Our findings revealed that the design combination of three sensory channels shows synergistic benefits for older adults. Our game encourages older adults to engage in extensive body movement in both sitting and standing postures, which is particularly important for people with disabilities who cannot stand. Design implications are provided for the future development and implementation of VR game design for older adults. Our work provides empirical support for the acceptability of multisensory VR systems among older adults and contributes to the future design of VR games for this population.
The global aging trend compels older adults to navigate the evolving digital landscape, presenting a substantial challenge in mastering smartphone applications. While Augmented Reality (AR) holds promise for enhancing learning and user experience, its role in aiding older adults' smartphone app exploration remains insufficiently explored. Therefore, we conducted a two-phase study: (1) a workshop with 18 older adults to identify app exploration challenges and potential AR interventions, and (2) tech-probe participatory design sessions with 15 participants to co-create AR support tools. Our research highlights AR's effectiveness in reducing physical and cognitive strain among older adults during app exploration, especially during multi-app usage and the trial-and-error learning process. We also examined their interactional experiences with AR, yielding design considerations on tailoring AR tools for smartphone app exploration. Ultimately, our study unveils the prospective landscape of AR in supporting the older demographic, both presently and in future scenarios.
As mobile user interfaces (UIs) become feature-rich, navigation gets more complex. Finding features quickly starts demanding information-intensive strategies for decision-making, which can be challenging for older adults. When searching for information among a large number of alternatives, older adults examine fewer details, a strategy that requires fewer cognitive resources. In this paper, we first systematically examine various ways to convey a reduced feature space. Visually emphasizing three relevant options helped older adults find a specific feature more quickly, on par with younger adults. Older users were more efficient when options were highlighted along with their context or with a weighted zoom than when merely highlighted, and they also preferred these two techniques the most. We then present Nav Nudge, an interaction technique that uses voice input and large language models to visually reduce the feature search space on demand, and discuss how older adults use it within a mobile map application.
Recent studies show the promise of VR in improving the physical, cognitive, and emotional health of older adults. However, prior work on optimizing object selection and manipulation performance in VR was mostly conducted among younger adults. It remains unclear how older adults would perform such tasks compared to younger adults and what challenges they might face. To fill this gap, we conducted two studies with both older and younger adults to understand their performance and user experience in object selection and object manipulation in VR, respectively. Based on the results, we delineated the interaction difficulties that older adults exhibited in VR and identified multiple factors, such as headset-related neck fatigue, extra head movements from out-of-view interactions, and slow spatial perception, that significantly decreased the motor performance of older adults. We further proposed design recommendations for improving the accessibility of direct interaction experiences in VR for older adults.
Older adults commonly rely on younger family members for remote tech support, but the current general-purpose video-conferencing platforms fall short of effectively catering to their needs. We introduce the design concept and prototypes for HelpCall, an augmentation of these platforms that provides aids for learning computer tasks, including a step-by-step visual guide automatically generated from synchronous human instruction. Through observations and interviews with older adults (N=14), we assessed the potential of the HelpCall concept and compared its two design candidates: Tooltip with numbered location markers and List of written steps. All participants acknowledged HelpCall's potential to improve the comfort and efficiency of synchronous tech support. Tooltip emerged as more promising and could be enhanced by incorporating the well-received features from List. Our findings provide clear directions for advancing HelpCall design and new insights into designing synchronous software help for older adults, taking a step towards universal accessibility of digital technology.