- Clone the repository:
  ```
  git clone --recursive [email protected]:facebookresearch/NGDF.git
  ```
- Create a conda environment and install package dependencies. Note: mamba is highly recommended as a drop-in replacement for conda.
  ```
  cd NGDF
  bash install.sh
  ```
- Install PyTorch separately, based on your CUDA driver version. The command below was tested on a 3080/3090 with CUDA 11.1:
  ```
  source prepare.sh
  pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
  ```
- Run `source prepare.sh` before running any `ngdf` training or evaluation code to activate the environment and set environment variables.
```
NGDF
├── acronym                      # Submodule with utilities for the ACRONYM dataset
├── contact_graspnet             # Submodule with Contact-GraspNet for baselines
├── data                         # Datasets, models, and evaluation output
├── differentiable-robot-model   # Submodule for differentiable FK
├── ndf_robot                    # Submodule for pre-trained shape embedding
├── ngdf                         # Code for training and evaluating NGDF networks
├── OMG-Planner                  # Submodule with pybullet env, reach and grasp evaluation
├── scripts                      # Scripts for running training and evaluation
└── theseus                      # Submodule for differentiable FK and SE(3) ops
```
Download the `acronym_perobj` and `acronym_multobj` datasets from this Google Drive link and place them in `data/`. The datasets are required to compute the closest grasp metric and are also used in training.
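The closest grasp metric measures how far a predicted grasp pose is from the nearest ground-truth grasp in the dataset. The exact definition lives in the `ngdf` code; the sketch below is only a minimal stdlib illustration of the idea, with a hypothetical pose format (translation plus unit quaternion) and an illustrative rotation weight `w_rot`:

```python
import math

def quat_angle(q1, q2):
    """Angular distance (radians) between two unit quaternions (w, x, y, z)."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    return 2.0 * math.acos(min(1.0, dot))

def closest_grasp_distance(pred, grasp_set, w_rot=0.1):
    """Distance from a predicted grasp to its nearest grasp in grasp_set.

    Each grasp is (translation xyz, unit quaternion wxyz); translation and
    rotation terms are combined with an illustrative weight w_rot.
    """
    def dist(g):
        (t1, q1), (t2, q2) = pred, g
        return math.dist(t1, t2) + w_rot * quat_angle(q1, q2)
    return min(dist(g) for g in grasp_set)

# an identical grasp is at distance 0
g = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
print(closest_grasp_distance(g, [g]))  # → 0.0
```

The translation/rotation weighting here is a common convention, not necessarily the one used in the paper.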
Run evaluation

- Download pre-trained models and configs into `data/models` from this link
- Download object rotations into `data` from this link
- Run grasp level set evaluations:
  ```
  bash scripts/eval/grasp_level_set/perobj.sh
  bash scripts/eval/grasp_level_set/multobj.sh
  ```
  Results are stored in `eval/` in each model dir.
- To evaluate the grasps in pybullet, install the code in the following section, then run the above commands with a `-p` flag:
  ```
  bash scripts/eval/grasp_level_set/perobj.sh -p
  ```
Set up dependencies

- OMG-Planner: follow the instructions in the OMG-Planner README (`OMG-Planner/README.md`)
- pytorch3d:
  ```
  pip install "git+https://github.com/facebookresearch/pytorch3d.git@stable"
  ```
- differentiable-robot-model:
  ```
  cd differentiable-robot-model
  git remote add parent https://github.com/facebookresearch/differentiable-robot-model.git
  git fetch parent
  python setup.py develop
  ```
- Contact-GraspNet:
  ```
  cd contact_graspnet
  conda env update -f contact_graspnet_env_tf25.yml
  sh compile_pointnet_tfops.sh
  pip install -e .
  ```
  Download the trained model `scene_test_2048_bs3_hor_sigma_001` from here and copy it into the `checkpoints/` folder.
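Both `differentiable-robot-model` and `theseus` provide differentiable forward kinematics for the planner. As a plain-Python reminder of what forward kinematics computes, here is a toy two-link planar arm (an illustration only, not the robot model NGDF actually uses):

```python
import math

def fk_2link(theta1, theta2, l1=1.0, l2=1.0):
    """End-effector (x, y) of a toy 2-link planar arm with joint angles
    theta1, theta2 (radians) and link lengths l1, l2."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# fully extended arm along the x-axis
print(fk_2link(0.0, 0.0))  # → (2.0, 0.0)
```

The submodules implement the same map for the full robot, with gradients through it so grasp costs can be optimized in joint space.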
Run the evaluation script:

```
bash scripts/eval/reach_and_grasp/perobj.sh
```

The results are saved in `data/pybullet_eval`. Get summary results in a jupyter notebook:

```
jupyter notebook --notebook-dir=scripts/eval/reach_and_grasp
```
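The exact schema of the saved results is defined by the evaluation scripts and notebook. As a hedged illustration of the kind of aggregation the notebook performs, a success rate over hypothetical per-trial JSON files with a `{"success": bool}` field could be computed like this:

```python
import json
import pathlib
import tempfile

def summarize(results_dir):
    """Aggregate per-trial JSON files (hypothetical {"success": bool} schema)
    into (num_trials, success_rate)."""
    trials = [json.loads(p.read_text())
              for p in sorted(pathlib.Path(results_dir).glob("*.json"))]
    if not trials:
        return 0, 0.0
    successes = sum(1 for t in trials if t.get("success"))
    return len(trials), successes / len(trials)

# demo on a throwaway directory with 2 successes out of 3 trials
with tempfile.TemporaryDirectory() as d:
    for i, ok in enumerate([True, True, False]):
        (pathlib.Path(d) / f"trial_{i}.json").write_text(json.dumps({"success": ok}))
    n, rate = summarize(d)
    print(n, round(rate, 2))  # → 3 0.67
```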
- Single-object model training:
  ```
  bash scripts/train/perobj_Bottle.sh
  bash scripts/train/perobj_Bowl.sh
  bash scripts/train/perobj_Mug.sh
  ```
- Multi-object model training:
  ```
  bash scripts/train/multobj_Bottle.sh
  ```
- Build the docker image:
  ```
  cd NGDF
  docker build -t ngdf .
  ```
- Run docker and prepare the environment:
  ```
  bash docker_run.sh
  source prepare.sh
  ```
- Run the same training commands in the container under `root:/workspace/NGDF#`.
```bibtex
@article{weng2022ngdf,
  title={Neural Grasp Distance Fields for Robot Manipulation},
  author={Weng, Thomas and Held, David and Meier, Franziska and Mukadam, Mustafa},
  journal={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2023}
}
```
The majority of NGDF is licensed under the MIT license; however, a portion of the project is available under separate license terms: Contact-GraspNet is licensed under a non-commercial NVIDIA license.
We actively welcome your pull requests! Please see CONTRIBUTING.md and CODE_OF_CONDUCT.md for more info.

