We introduce a multimodal dataset and experimental setup designed to support the development of adaptive collaborative systems. Data were collected from distributed teams working simultaneously across two continents, demonstrating the feasibility of sensing team cognition in geographically dispersed settings. The dataset includes synchronized EEG, audio transcripts, screen recordings, and behavioral annotations, enabling fine-grained analysis of naturalistic collaboration. Our setup integrates neural and behavioral sensing to model team processes, using metrics such as task engagement, neural synchrony, and interaction patterns. These analyses reveal relationships between cognitive states and team dynamics, suggesting new directions for brain-computer interfaces that respond to team-level signals. By providing a shareable dataset, robust sensing infrastructure, and techniques for modeling distributed collaboration, this work enables future interactive systems that sense and support distributed teamwork in real time.
ACM CHI Conference on Human Factors in Computing Systems