Full-body tracking in virtual reality improves presence, enables interaction via body postures, and facilitates richer social expression among users. However, today's full-body tracking systems require a complex setup fixed to the environment (e.g., multiple lighthouses or cameras) and a laborious calibration process, which runs counter to the trend toward more portable, self-contained VR systems. We present HybridTrak, which provides accurate, real-time full-body tracking by augmenting inside-out upper-body VR tracking with a single external off-the-shelf RGB webcam. HybridTrak lifts users' 2D full-body poses from the webcam into 3D poses by leveraging the inside-out upper-body tracking data in a fully neural pipeline. We showed that HybridTrak is more accurate than RGB- or depth-based tracking methods on the MPI-INF-3DHP dataset. We also tested HybridTrak in the popular VRChat app and showed that body postures rendered by HybridTrak are more distinguishable and more natural than those of a solution using an RGBD camera.
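The core fusion idea described above (combining 2D webcam keypoints with 3D inside-out upper-body tracking and regressing a full 3D pose with a neural network) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual architecture: the joint counts, network size, and random weights are all assumptions; a real system would train the weights on motion-capture data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions, not from the paper):
N_JOINTS_2D = 25   # 2D keypoints detected in the webcam image (x, y each)
N_TRACKED = 3      # inside-out tracked upper-body points (e.g., head + hands), 3D
N_JOINTS_3D = 25   # predicted full-body 3D joints

def relu(x):
    return np.maximum(x, 0.0)

class PoseLifter:
    """Toy two-layer MLP: fuses 2D webcam keypoints with 3D inside-out
    upper-body tracking data to predict a full-body 3D pose.
    Weights here are random placeholders; training is out of scope."""

    def __init__(self, hidden=128):
        d_in = N_JOINTS_2D * 2 + N_TRACKED * 3
        d_out = N_JOINTS_3D * 3
        self.w1 = rng.normal(0.0, 0.1, (d_in, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, d_out))
        self.b2 = np.zeros(d_out)

    def __call__(self, kp2d, upper3d):
        # Concatenate both modalities into one feature vector, then regress.
        x = np.concatenate([kp2d.ravel(), upper3d.ravel()])
        h = relu(x @ self.w1 + self.b1)
        return (h @ self.w2 + self.b2).reshape(N_JOINTS_3D, 3)

kp2d = rng.uniform(0.0, 1.0, (N_JOINTS_2D, 2))    # normalized webcam detections
upper3d = rng.uniform(-1.0, 1.0, (N_TRACKED, 3))  # headset/controller positions
pose3d = PoseLifter()(kp2d, upper3d)
print(pose3d.shape)  # (25, 3): one 3D position per body joint
```

The design point this sketch captures is that the upper-body 3D signal resolves the depth and scale ambiguity inherent in monocular 2D keypoints, which is why the fused input can outperform a webcam-only lifter.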
https://dl.acm.org/doi/abs/10.1145/3491102.3502045
The ACM CHI Conference on Human Factors in Computing Systems (https://chi2022.acm.org/)