Capturing Team Cognition: A Multimodal Dataset for Adaptive Collaborative Interfaces

Abstract

We introduce a multimodal dataset and experimental setup designed to support the development of adaptive collaborative systems. Data were collected from distributed teams working simultaneously across two continents, demonstrating the feasibility of sensing team cognition in geographically dispersed settings. The dataset includes synchronized EEG, audio transcripts, screen recordings, and behavioral annotations, enabling fine-grained analysis of collaboration in naturalistic settings. Our setup integrates neural and behavioral sensing to model team processes, using metrics such as task engagement, neural synchrony, and interaction patterns. These analyses reveal relationships between cognitive states and team dynamics, suggesting new directions for brain-computer interfaces that respond to team-level signals. By providing a shareable dataset, robust sensing infrastructure, and techniques for modeling distributed collaboration, this work enables future interactive systems that sense and support distributed teamwork in real time.

Authors
Christopher Micek
Worcester Polytechnic Institute, Worcester, Massachusetts, United States
Lasse Warnke
University of Bremen, Bremen, Germany
Lourenço Abrunhosa Rodrigues
University of Bremen, Bremen, Germany
Felix Putze
University of Bremen, Bremen, Germany
Erin Solovey
Worcester Polytechnic Institute, Worcester, Massachusetts, United States

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Modeling Spatial, Linguistic, and Sensory Errors

P1 - Room 128
6 presentations
2026-04-14, 20:15–21:45