Audio media -- radio, podcasts, audiobooks -- structures everyday life: we keep up, wind down, and share moments through long-form listening. Yet for people living with aphasia -- a communication disability that affects audio comprehension -- unsupported audio often means losing the thread and having the experience marred. While accessibility advances have focused on print, web, and audiovisual content, audio-only media remains largely unaddressed, often optimised for marketability rather than sustained understanding. We report a three-week in-situ deployment of the Re-Connect app, an audio media player that meets people at the moment of comprehension difficulty. With ten adults living with aphasia, we show how people assemble personal repertoires of small, co-present communication cues that support in-the-moment repair and later recall. Grounded in lived experience, we argue for personal, source-proximate scaffolds that help make long-form audio more understandable and enjoyable.
ACM CHI Conference on Human Factors in Computing Systems