Physical Motion Generation

This module learns a privileged generalist policy in simulation to transform noisy, artifact-heavy references into physically plausible motions.

It enforces robot dynamics consistency and produces stable intermediate motion commands for downstream tracking.
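The refinement idea can be sketched in miniature: a privileged policy chases the noisy reference inside a simulator, and the simulated states, which by construction obey the dynamics, become the physically consistent motion. The 1-D point mass, the PD gains, and the acceleration limit below are all illustrative placeholders, not the actual robot model or policy.

```python
import random

# Hypothetical 1-D point-mass "simulator": state (pos, vel) with a bounded
# acceleration standing in for full robot dynamics constraints.
MAX_ACC = 0.5
DT = 0.1

def step(pos, vel, acc):
    acc = max(-MAX_ACC, min(MAX_ACC, acc))  # enforce the dynamics limit
    vel += acc * DT
    pos += vel * DT
    return pos, vel

def privileged_track(reference):
    """Roll out a simple PD 'privileged policy' that chases the noisy
    reference; the resulting simulated states form the feasible motion."""
    pos, vel = reference[0], 0.0
    feasible = [pos]
    for target in reference[1:]:
        acc = 4.0 * (target - pos) - 2.0 * vel  # placeholder PD gains
        pos, vel = step(pos, vel, acc)
        feasible.append(pos)
    return feasible

random.seed(0)
clean = [0.01 * t for t in range(100)]
noisy = [x + random.gauss(0.0, 0.05) for x in clean]  # artifact-heavy input
refined = privileged_track(noisy)
```

However jittery the input, every step of `refined` satisfies the acceleration bound, which is the sense in which simulator rollouts enforce dynamics consistency.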

General Motion Tracking

This module trains the general control policy to track physically plausible references under domain randomization and sensor noise.

The resulting policy outputs robust actions for long-horizon and dynamic real-world humanoid motion tracking.
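Domain randomization and sensor noise during training can be sketched as per-episode parameter sampling plus per-step observation corruption. The parameter names and ranges below are illustrative assumptions, not the values used by the actual training setup.

```python
import random

def randomized_env_params(rng):
    """Sample per-episode dynamics for domain randomization; the ranges
    here are placeholders, not the method's actual values."""
    return {
        "mass_scale": rng.uniform(0.8, 1.2),
        "friction": rng.uniform(0.5, 1.5),
        "motor_delay_steps": rng.randint(0, 2),
    }

def noisy_observation(true_obs, rng, sigma=0.01):
    # Simple sensor-noise model: additive Gaussian on each channel.
    return [x + rng.gauss(0.0, sigma) for x in true_obs]

rng = random.Random(42)
params = randomized_env_params(rng)           # fixed for one episode
obs = noisy_observation([0.1, -0.3, 0.05], rng)  # corrupted every step
```

Training the tracker across many such sampled environments is what lets a single policy transfer to real hardware whose exact dynamics are never known.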

Application / Inference

OmniTrack forms a two-stage pipeline for both offline and online use: the physical motion generation stage filters one or more motion clips via simulator rollouts into physically feasible, dynamics-consistent trajectories, and the general motion tracking stage then delivers stable long-horizon tracking across diverse behaviors.

For online inference, real-time commands from mocap, VR headsets, or other sources are first refined in simulation (e.g., IsaacLab or MuJoCo) and then fed to the tracking policy for joint-level real-robot control, enabling robust, high-dynamic teleoperation under continuously varying user input.
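The online loop amounts to composing the two stages per control step. The sketch below is a minimal stand-in under stated assumptions: `refine_in_sim` replaces a real IsaacLab/MuJoCo rollout with a clamp, and `tracking_policy` replaces the learned network with fixed linear feedback; both function names are hypothetical.

```python
def refine_in_sim(raw_command):
    """Stage 1 (placeholder): clamp the raw teleoperation command to a
    feasible range, standing in for a simulator rollout."""
    return [max(-1.0, min(1.0, x)) for x in raw_command]

def tracking_policy(obs, reference):
    """Stage 2 (placeholder): fixed linear feedback toward the refined
    reference, standing in for the learned tracking policy."""
    return [0.5 * (r - o) for o, r in zip(obs, reference)]

def omnitrack_step(obs, raw_command):
    # One control step: refine the raw command, then track it.
    reference = refine_in_sim(raw_command)
    return tracking_policy(obs, reference)

action = omnitrack_step([0.0, 0.2], [1.5, -0.3])
```

Running this composition at the control rate, with a fresh command from mocap or a VR headset each step, is what the online teleoperation mode amounts to.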