As augmented reality devices (e.g., smartphones and headsets) proliferate in the market, multi-user AR scenarios are set to become more common. Co-located users will want to share coherent and synchronized AR experiences, but this is surprisingly cumbersome with current methods. In response, we developed PatternTrack, a novel tracking approach that repurposes the structured infrared light patterns emitted by VCSEL-driven depth sensors, like those found in the Apple Vision Pro, iPhone, iPad, and Meta Quest 3. Our approach is infrastructure-free, requires no pre-registration, works on featureless surfaces, and provides the real-time 3D position and orientation of other users' devices. In our evaluation --- tested on six different surfaces and with inter-device distances of up to 260 cm --- we found a mean 3D positional tracking error of 11.02 cm and a mean angular error of 6.81°.
https://dl.acm.org/doi/10.1145/3706598.3713388
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2025.acm.org/)
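To make the underlying geometry concrete, here is a rough, illustrative sketch only (not PatternTrack's actual pipeline, which is described in the linked paper): if we assume the other device's emitted dot-pattern ray directions are known from a one-time calibration, and our own depth camera gives the 3D positions where those dots land on a surface, then recovering the emitter's position and orientation reduces to a perspective-n-point problem with the emitter treated as a "camera". The function name `estimate_emitter_pose`, the known-ray-calibration assumption, and the given dot correspondences are all hypothetical simplifications.

```python
# Illustrative sketch, NOT the paper's algorithm. Assumes:
#  - dot_rays_emitter: known unit ray directions of the emitted dots,
#    expressed in the emitting device's frame (hypothetical calibration input)
#  - dot_points_observer: 3D positions of the same dots on the surface,
#    measured in the observing device's frame by its own depth camera
#  - the correspondence between rays and observed dots is already known
import numpy as np
import cv2


def estimate_emitter_pose(dot_rays_emitter, dot_points_observer):
    """Return (R, t): the emitter's orientation and position in the observer's frame."""
    rays = np.asarray(dot_rays_emitter, dtype=np.float64)
    pts = np.asarray(dot_points_observer, dtype=np.float64)

    # Treat the emitter as a pinhole camera: its ray directions become
    # normalized image coordinates (x/z, y/z). Assumes forward-facing rays (z > 0).
    image_pts = rays[:, :2] / rays[:, 2:3]

    # Solve the PnP problem: observed 3D surface points vs. the emitter's
    # normalized "image" coordinates (identity camera matrix, no distortion).
    ok, rvec, tvec = cv2.solvePnP(
        pts.reshape(-1, 1, 3),
        image_pts.reshape(-1, 1, 2),
        np.eye(3),
        None,
    )
    if not ok:
        raise RuntimeError("PnP failed; need at least four non-degenerate dots")

    # solvePnP returns the transform from the observer frame into the emitter
    # frame; invert it to express the emitter's pose in the observer frame.
    R_obs_to_emit, _ = cv2.Rodrigues(rvec)
    R = R_obs_to_emit.T
    t = -R @ tvec.reshape(3)
    return R, t
```

In a real system, the difficult parts this sketch glosses over are detecting the other device's dots in the first place and establishing which observed dot corresponds to which emitted ray; the sketch simply assumes those correspondences are handed to it.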