This work presents the iterative design and evaluation of an LLM-based teachable agent, MatlabTutee, that facilitates learning-by-teaching (LBT) experiences in university computer science courses. We detail four experiments, involving a total of 119 students, in which we refine our system, compare it to human-facilitated LBT experiences, and deploy it in two month-long in-the-wild settings. We find that our system successfully conveys a learner persona comparable to that of a human pretending to be a novice while providing comparable LBT benefits. These benefits include helping students identify areas for improvement, develop a more accurate assessment of their own abilities, and improve their overall attitudes toward computer science. We also explore how students choose to incorporate our system into their study habits while situated in real university courses.
https://dl.acm.org/doi/10.1145/3706598.3713644
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)