Temporal Decoupling Graph Convolutional Network for Skeleton-based Gesture Recognition

Jinfu Liu, Xinshun Wang, Can Wang, Yuan Gao, Mengyuan Liu
Sun Yat-sen University, Kiel University, Tampere University, Peking University

IEEE Transactions on Multimedia (TMM), 2024


Prerequisites

You can install all dependencies by running pip install -r requirements.txt.
Then, install torchlight by running pip install -e torchlight.
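After installation, you can run a quick sanity check. The snippet below is a minimal sketch (check_env.py is a hypothetical helper, not part of this repository); it only assumes that PyTorch is among the installed requirements:

# check_env.py -- hypothetical helper, not part of this repository
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device 0:", torch.cuda.get_device_name(0))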

Data Preparation

Download four datasets:

  1. SHREC’17 Track dataset from http://www-rech.telecom-lille.fr/shrec2017-hand/
  2. DHG-14/28 dataset from http://www-rech.telecom-lille.fr/DHGdataset/
  3. NTU RGB+D 60 Skeleton dataset from https://rose1.ntu.edu.sg/dataset/actionRecognition/
  4. NW-UCLA dataset from the Download NW-UCLA dataset link

Put the downloaded data into the following directory structure:
- data/
  - shrec/
    - shrec17_dataset/
      - HandGestureDataset_SHREC2017/
        - gesture_1
          ...
  - DHG14-28/
    - DHG14-28_dataset/
      - gesture_1
        ...
  - NW-UCLA/
    - all_sqe
      ...
  - ntu/
    - nturgbd_raw/
      - nturgb+d_skeletons
        ...
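Before running the preprocessing scripts, you can verify the layout above with a short check. This is a hypothetical helper (check_data_layout.py), not part of the repository; the paths simply mirror the directory tree shown above:

# check_data_layout.py -- hypothetical helper; paths mirror the tree above
import os

expected = [
    "./data/shrec/shrec17_dataset/HandGestureDataset_SHREC2017",
    "./data/DHG14-28/DHG14-28_dataset",
    "./data/NW-UCLA/all_sqe",
    "./data/ntu/nturgbd_raw/nturgb+d_skeletons",
]
for path in expected:
    status = "OK     " if os.path.isdir(path) else "MISSING"
    print(status, path)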

Download from cloud drive:

  1. SHREC’17 Track dataset from Baidu Drive, Password is TDGC. Download from Google Drive.
  2. DHG-14/28 dataset from Baidu Drive, Password is TDGC. Download from Google Drive.
  3. NTU RGB+D 60 dataset from Baidu Drive, Password is TDGC.

SHREC’17 Track dataset:

  1. First, extract all files to ./data/shrec/shrec17_dataset
  2. Then, run python gen_traindataset.py and python gen_testdataset.py

DHG-14/28 dataset:

  1. First, extract all files to ./data/DHG14-28/DHG14-28_dataset
  2. Then, run python gen_dhgdataset.py

NTU RGB+D 60 dataset:

  1. First, extract all skeleton files to ./data/ntu/nturgbd_raw
  2. Then, run python get_raw_skes_data.py, python get_raw_denoised_data.py and python seq_transformation.py in sequence

NW-UCLA dataset:

  1. Move folder all_sqe to ./data/NW-UCLA

Training

You can change the configuration in the yaml files or directly in the main function. We also provide four default yaml configuration files.
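To see what a run will use, you can print the settings of a config file before training. This is a minimal sketch (show_config.py is a hypothetical helper); it only assumes the config is standard YAML readable with PyYAML:

# show_config.py -- hypothetical helper; prints the settings in a yaml config
import yaml

with open("./config/shrec17/shrec17.yaml", "r") as f:
    cfg = yaml.safe_load(f)
for key, value in cfg.items():
    print(f"{key}: {value}")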

SHREC’17 Track dataset:

Run python main.py --device 0 --config ./config/shrec17/shrec17.yaml

DHG-14/28 dataset:

Run python main.py --device 0 --config ./config/dhg14-28/DHG14-28.yaml

NTU RGB+D 60 dataset:

For the cross-view benchmark, run python main.py --device 0 --config ./config/nturgbd-cross-view/default.yaml
For the cross-subject benchmark, run python main.py --device 0 --config ./config/nturgbd-cross-subject/default.yaml

NW-UCLA dataset:

Run python main.py --device 0 --config ./config/ucla/nw-ucla.yaml

Testing

We provide several trained weight files in the checkpoints folder.

python main.py --device 0 --config <config.yaml> --phase test --weights <work_dir>/<weight.pt>
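The ensemble step below consumes .pkl files of per-sample class scores, typically saved during testing. The snippet below is an illustrative sketch only (inspect_scores.py is hypothetical); it assumes the .pkl stores a dict mapping sample names to class-score vectors, which may differ from the actual saved format:

# inspect_scores.py -- hypothetical helper; assumes {sample_name: score_vector}
import pickle
import numpy as np

with open("./joint.pkl", "rb") as f:  # placeholder path
    scores = pickle.load(f)

names = list(scores.keys())
print("samples:", len(names))
first = np.asarray(scores[names[0]])
print("num classes:", first.shape[-1])
print("predicted class of first sample:", int(np.argmax(first)))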

Ensemble

1. Set Rate (the fusion weights applied to the joint, bone, and joint-motion scores)
2. Run:
python gesture_ensemble.py \
--joint_Score <joint_path> \
--bone_Score <bone_path> \
--jointmotion_Score <jointmotion_path> \
--val_sample <val_path> \
--benchmark <benchmark>

# Example for Shrec_28
1. Download the .pkl score files from: https://drive.google.com/drive/folders/1ux87mUirBQjmA4b4fEWtb9tuj-8wSYHt
2. Set Rate to [0.5, 0.5, 0.5] or [0.5, 0.3, 0.2]
3. Run:
python gesture_ensemble.py \
--joint_Score ./joint.pkl \
--bone_Score ./bone.pkl \
--jointmotion_Score ./jointmotion.pkl \
--val_sample ./shrec17_28.txt \
--benchmark Shrec_28
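For reference, this kind of ensembling is a weighted sum of each stream's class scores followed by an argmax. The sketch below only illustrates the idea; it is not the repository's gesture_ensemble.py, and it assumes each .pkl holds a dict mapping sample names to class-score vectors:

# ensemble_sketch.py -- illustrative only, not the repository's gesture_ensemble.py
# Assumes each .pkl holds a dict {sample_name: class_score_vector}.
import pickle
import numpy as np

def load_scores(path):
    with open(path, "rb") as f:
        return pickle.load(f)

joint = load_scores("./joint.pkl")
bone = load_scores("./bone.pkl")
motion = load_scores("./jointmotion.pkl")

rate = [0.5, 0.5, 0.5]  # fusion weights, e.g. [0.5, 0.3, 0.2]
predictions = {}
for name in joint:
    fused = (rate[0] * np.asarray(joint[name])
             + rate[1] * np.asarray(bone[name])
             + rate[2] * np.asarray(motion[name]))
    predictions[name] = int(np.argmax(fused))

print("fused predictions for", len(predictions), "samples")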

Citation

# Results on the SHREC’17 Track, DHG-14/28, NTU RGB+D 60, and NW-UCLA datasets.
@ARTICLE{10113233,
  author={Liu, Jinfu and Wang, Xinshun and Wang, Can and Gao, Yuan and Liu, Mengyuan},
  title={Temporal Decoupling Graph Convolutional Network for Skeleton-based Gesture Recognition}, 
  journal={IEEE Transactions on Multimedia (TMM)}, 
  year={2024}
}

# Results on the UAV-Human dataset.
@inproceedings{liu2024HDBN,
  author={Liu, Jinfu and Yin, Baiqiao and Lin, Jiaying and Wen, Jiajun and Li, Yue and Liu, Mengyuan},
  title={HDBN: A Novel Hybrid Dual-branch Network for Robust Skeleton-based Action Recognition}, 
  booktitle={Proceedings of the IEEE International Conference on Multimedia and Expo Workshop (ICMEW)}, 
  year={2024}
}

Our project is based on DSTA-Net and CTR-GCN.

Contact

For any questions, feel free to contact: liujf69@mail2.sysu.edu.cn
