I am a second-year Ph.D. student in the IIIS at Tsinghua University,
advised by Prof. Li Yi.
Currently, I conduct research on humanoid robot learning at Galbot. We are
actively looking for interns and full-time employees for humanoid research.
I maintain several open-source projects based on our research.
Feel free to use them; we welcome any feedback from the community.
LATENT. A full-stack pipeline, from motion tracking to latent action space construction to high-level policy learning, enabling the humanoid to learn tennis and potentially many other athletic skills.
Click and Traverse. A humanoid teleoperation system with local spatial intelligence. With just a single click, users can guide the humanoid through cluttered indoor scenes while avoiding collisions.
OpenTrack. A humanoid motion tracking training framework, based on our work Any2Track. Key features: trains directly in MuJoCo and supports multi-GPU parallel training.
OpenWBT. A cross-embodiment, easy-to-deploy, VR-based
humanoid whole-body teleoperation system, based on our work R2S2. Try it!
We present Click and Traverse. It enables collision-free humanoid traversal in cluttered indoor scenes via the Humanoid Potential Field (HumanoidPF),
a general form of guidance for learning obstacle avoidance skills.
We present Any2Track. It learns to track any motion (over 40 hours of highly dynamic, contact-rich data) under any disturbance (terrain, external forces, payloads) by introducing a dynamics world model for policy fine-tuning.
We present Real-world-Ready Skill Space (R2S2). It constructs a latent action space from a set of pre-trained motor skills to
solve loco-manipulation tasks with a large reachable workspace.