GROVE is a generalized reward framework that enables open-vocabulary physical skill learning without manual engineering or task-specific demonstrations.
- Release the training and inference code for Pose2CLIP.
- Release the pretrained Pose2CLIP model.
- Release the training data for the low-level controller.
- Release the training code for the basic RL agents.
Download Isaac Gym from the website, then follow the installation instructions.
Once Isaac Gym is installed, install the external dependencies for this repo:
```bash
pip install -r requirements.txt
```
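To sanity-check the setup, you can try importing the core dependencies. This is a minimal check, not part of the repo; note that Isaac Gym must be imported before torch, otherwise its bindings raise an import error.

```python
# Minimal environment sanity check (not part of this repo).
# Isaac Gym must be imported before torch.
import isaacgym  # noqa: F401
import torch

print(f"PyTorch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")
```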
We release all of the training motions for the low-level controller, located in `calm/data/motions/`. Individual motion clips are stored as `.npy` files. Motion datasets are specified by `.yaml` files, which contain a list of the motion clips to include (see the example dataset file below). Motion clips can be visualized with the following command:
```bash
python calm/run.py \
  --test \
  --task HumanoidViewMotion \
  --num_envs 1 \
  --cfg_env calm/data/cfg/humanoid.yaml \
  --cfg_train calm/data/cfg/train/rlg/amp_humanoid.yaml \
  --motion_file [Your file path].npy
```
The `--motion_file` argument accepts either a single motion clip (`.npy`) or a motion dataset (`.yaml`).
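For reference, a motion dataset `.yaml` is essentially a weighted list of clips. The sketch below follows the format used by CALM-style datasets; the clip names and weights are placeholders, so check the released `.yaml` files in `calm/data/motions/` for the exact schema.

```yaml
# Hypothetical dataset file; clip names and weights are placeholders.
motions:
  - file: "amp_humanoid_walk.npy"
    weight: 0.5
  - file: "amp_humanoid_run.npy"
    weight: 0.5
```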
If you want to retarget new motion clips to the character, see the example retargeting script in `calm/poselib/retarget_motion.py`.
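As a starting point, the snippet below sketches how a converted clip can be loaded and inspected with the bundled poselib. It assumes the `SkeletonMotion` API used by `retarget_motion.py`, and the file path is a placeholder; consult that script for the full retargeting pipeline.

```python
# Hedged sketch: load and inspect a motion clip with CALM's bundled poselib.
# Assumes the SkeletonMotion API used by calm/poselib/retarget_motion.py.
from poselib.skeleton.skeleton3d import SkeletonMotion

motion = SkeletonMotion.from_file("calm/data/motions/your_clip.npy")  # placeholder path
num_frames = motion.global_translation.shape[0]
print(f"frames: {num_frames}, fps: {motion.fps}")
```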
Our code builds on CALM, CLIP, and AnySkill. We thank the authors of these great projects.
```bibtex
@inproceedings{cui2025grove,
  title={GROVE: A Generalized Reward for Learning Open-Vocabulary Physical Skill},
  author={Cui, Jieming and Liu, Tengyu and Meng, Ziyu and Yu, Jiale and Song, Ran and Zhang, Wei and Zhu, Yixin and Huang, Siyuan},
  booktitle={Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2025}
}
```