Online platforms provide support for many kinds of distress, including suicidal thoughts and behaviors. However, because many platforms restrict suicidal talk, volunteers on these platforms struggle with how to help suicidal people who come for support. We interviewed 11 volunteer counselors on a large online support platform, including after they role-played conversations involving varying severities of suicidality, to explore their practices and challenges in identifying and responding to suicidality. We then presented Speed Dating design concepts around emotional preparation and support, real-time guidance, training, and suicide detection. Participants wanted more support and preparation for conversations with suicidal people, but were conflicted about AI-based technologies: they weighed the potential benefits of conversational agents for training against the limitations of prediction and real-time response suggestions, given the sensitive, context-dependent decisions that volunteers must make. Our work offers nuanced considerations and design implications for developing digital mental health technologies.
doi.org/10.1145/3613904.3641922
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)