When AI Gets It Wrong: Scaffolding AI Hallucination Detection for Children Through Chatbot Creation

Abstract

Children increasingly interact with generative AI systems that can produce hallucinated content, potentially reinforcing misconceptions and undermining critical thinking skills. We investigate how children detect and respond to hallucinations while building and testing LLM-powered chatbots in a development environment. We integrated hallucination-awareness scaffolds into the environment, including confidence indicators, fact-checking, repeated questioning, and model comparison. In a study with 48 middle-school learners (ages 10–14), participants showed significant pre-to-post gains in AI knowledge, hallucination awareness, and confidence in building trustworthy chatbots. They developed multi-layered strategies, including probing for inconsistencies and cross-checking with external sources. Key challenges included over-reliance on visible cues, fragmented use of scaffolds, and a tension between creativity and reliability. These findings yield design implications for fostering children’s AI literacy and responsible AI development: supporting proactive, iterative engagement across the development cycle, integrating scaffolds into coherent workflows, and balancing creativity with accuracy.
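The abstract names the scaffolds only at a high level. As an illustration of one of them, the repeated-questioning scaffold, a minimal self-consistency check might look like the sketch below; ask_model, the agreement threshold, and the exact-match answer comparison are hypothetical assumptions for illustration, not the authors' implementation.

    # Illustrative sketch only; the paper's scaffolds are not published here.
    # "Repeated questioning" re-asks the same question and flags answers that
    # change across asks, a common heuristic for spotting possible hallucinations.
    from collections import Counter

    def ask_model(question: str) -> str:
        """Hypothetical stand-in for a call to an LLM-backed chatbot."""
        raise NotImplementedError("wire this to your chatbot backend")

    def repeated_questioning(question: str, n_asks: int = 5,
                             min_agreement: float = 0.6):
        """Ask the same question n_asks times and flag low answer agreement."""
        answers = [ask_model(question).strip().lower() for _ in range(n_asks)]
        top_answer, count = Counter(answers).most_common(1)[0]
        agreement = count / n_asks
        # Inconsistent answers across re-asks suggest a possible hallucination.
        flagged = agreement < min_agreement
        return top_answer, agreement, flagged

In a child-facing tool, the returned agreement score could drive a visible confidence indicator, tying this scaffold to the others the abstract lists.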

Authors
Xiaoyi Tian
North Carolina State University, Raleigh, North Carolina, United States
Deniz Ozturk
North Carolina State University, Raleigh, North Carolina, United States
Sreekar Edula
The University of North Carolina at Charlotte, Charlotte, North Carolina, United States
Jibran Adil
The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, United States
Qiao Jin
North Carolina State University, Raleigh, North Carolina, United States
Yang Shi
Utah State University, Logan, Utah, United States
Tiffany Barnes
North Carolina State University, Raleigh, North Carolina, United States

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: AI Literacy, Ethics, and Critical AI Understanding

Auditorium
7 presentations
2026-04-15, 18:00–19:30