Caregivers often experience emotional difficulties and social isolation due to their demanding caregiving duties. Conversational AI has the potential to provide emotional support, yet it currently lacks effective support for emotional regulation. In this study, we conducted focus groups and semi-structured interviews with mental health professionals and caregivers (n = 17) to explore the potential benefits, challenges, and user concerns regarding the application of conversational AI for caregivers’ emotional support. Our findings suggest that, while current text-based conversational AI is deemed valuable for emotional support, participants desire a more empathic AI: one that actively listens, takes cultural, religious, and linguistic context into consideration, and makes humans feel heard. We examined the dimensions of empathic AI in mental health, from authenticity and trust to over-reliance, misuse, and even the exacerbation of mental health problems, and how these issues can potentially be addressed to improve caregivers’ well-being.
ACM CHI Conference on Human Factors in Computing Systems