QuestEnvSim: Environment-Aware Simulated Motion Tracking from Sparse Sensors

(1) Seoul National University (2) Reality Labs Research, Meta


Replicating a user’s pose from only wearable sensors is important for many AR/VR applications. Most existing motion-tracking methods avoid environment interactions beyond foot–floor contact because of their complex dynamics and hard constraints. In daily life, however, people regularly interact with their environment, e.g., by sitting on a couch or leaning on a desk. Using reinforcement learning, we show that headset and controller poses, combined with physics simulation and environment observations, can generate realistic full-body poses even in highly constrained environments. The physics simulation automatically enforces the various constraints necessary for realistic poses, rather than requiring them to be specified manually as in many kinematic approaches. These hard constraints allow us to achieve high-quality interaction motions without typical artifacts such as penetration or contact sliding. We discuss three features crucial to the performance of the method: the environment representation, the contact reward, and scene randomization. We demonstrate the generality of the approach through various examples, such as sitting on chairs, a couch, and boxes, stepping over boxes, rocking a chair, and turning an office chair. We believe these are among the highest-quality results achieved to date for motion tracking from sparse sensors with scene interaction.
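To illustrate the kind of contact reward the abstract refers to, the following is a minimal hypothetical sketch (not the paper's actual formulation): it rewards body points that are expected to be in contact for staying close to the surface while penalizing tangential motion, which discourages penetration and contact sliding. All names and weights here are illustrative assumptions.

```python
import numpy as np

def contact_reward(contact_pairs, w_pos=10.0, w_slide=5.0):
    """Hypothetical contact reward sketch.

    contact_pairs: list of (body_point, surface_point, body_velocity)
    tuples, each element a 3-vector. For every expected contact, the
    reward decays exponentially with the body-to-surface distance
    (discouraging penetration/separation) and with the body point's
    speed (discouraging contact sliding).
    """
    if not contact_pairs:
        return 1.0  # no contacts expected this frame: neutral reward
    total = 0.0
    for body_p, surf_p, body_v in contact_pairs:
        dist = np.linalg.norm(np.asarray(body_p) - np.asarray(surf_p))
        speed = np.linalg.norm(np.asarray(body_v))
        total += np.exp(-w_pos * dist) * np.exp(-w_slide * speed)
    return total / len(contact_pairs)
```

In an RL tracking setup, a term like this would typically be multiplied or summed with sensor-tracking rewards (headset and controller pose error) so the policy balances following the user against maintaining physically plausible scene contact.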


Sunmin Lee, Sebastian Starke, Yuting Ye, Jungdam Won and Alexander Winkler
QuestEnvSim: Environment-Aware Simulated Motion Tracking from Sparse Sensors
SIGGRAPH 2023 (to appear)



@inproceedings{Lee2023QuestEnvSim,
  author    = {Lee, Sunmin and Starke, Sebastian and Ye, Yuting and Won, Jungdam and Winkler, Alexander},
  title     = {QuestEnvSim: Environment-Aware Simulated Motion Tracking from Sparse Sensors},
  year      = {2023},
  doi       = {10.1145/3588432.3591504},
  booktitle = {SIGGRAPH 2023 Conference Papers},
  location  = {Los Angeles, USA},
}