
A toolbox for skeleton-based action recognition.

PYSKL


PYSKL is a toolbox focusing on action recognition based on SKeLeton data with PYTorch. It supports various algorithms for skeleton-based action recognition and is built on top of the open-source project MMAction2.

This repo is the official implementation of PoseConv3D and STGCN++.


Left: Skeleton-based Action Recognition Results on NTU-RGB+D-120; Right: CPU Realtime Skeleton-based Gesture Recognition Results

News

  • Support PyTorch 2.0: when --compile is set for the training/testing scripts and torch.__version__ >= '2.0.0' is detected, torch.compile will be used to compile the model before training/testing. Experimental feature, with absolutely no performance guarantee (2023-03-16).
  • Provide a real-time gesture recognition demo based on skeleton-based action recognition with ST-GCN++, check Demo for more details and instructions (2023-02-10).
  • Provide scripts to estimate the inference speed of each model (2022-12-30).
  • Support RGBPoseConv3D, a two-stream 3D-CNN for action recognition based on RGB & Human Skeleton. Follow the guide to train and test RGBPoseConv3D on NTURGB+D (2022-12-29).
  • We provide a script (ntu_preproc.py) to generate PYSKL-style annotation files from the official NTURGB+D skeleton files (2022-12-20).
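The --compile behavior described above can be illustrated with a minimal sketch. This is our simplification, not PYSKL's actual training loop; the helper name `maybe_compile` is ours:

```python
import torch
import torch.nn as nn

def maybe_compile(model: nn.Module, compile_flag: bool) -> nn.Module:
    # Only compile when the flag is set and the runtime is PyTorch 2.x.
    # The string comparison mirrors the description above; a robust check
    # would parse the version properly.
    if compile_flag and torch.__version__ >= "2.0.0":
        return torch.compile(model)
    return model

model = maybe_compile(nn.Linear(4, 2), compile_flag=False)
x = torch.randn(3, 4)
print(model(x).shape)  # torch.Size([3, 2])
```

With compile_flag=True on PyTorch 2.x, the first forward pass triggers compilation, so expect a one-time warm-up cost.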

Supported Algorithms

Supported Skeleton Datasets

Installation

git clone https://github.com/kennymckormick/pyskl.git
cd pyskl
# Please first install PyTorch according to the instructions on the official website: https://pytorch.org/get-started/locally/. Please use a PyTorch version >= 1.5.0 and < 1.11.0.
# The following command will install mmcv-full 1.5.0 from source, which might be very slow (take ~10 minutes). You can also follow the instruction at https://github.com/open-mmlab/mmcv to install mmcv-full from pre-built wheels, which will be much faster.
pip install -r requirements.txt
pip install -e .
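The version pin above (1.5.0 <= torch < 1.11.0) can be checked programmatically before installing. This helper is our own sketch, not part of pyskl:

```python
def torch_version_ok(version: str) -> bool:
    """Return True if `version` satisfies the pin 1.5.0 <= torch < 1.11.0."""
    # Strip any local build suffix such as "+cu113" before parsing.
    parts = version.split("+")[0].split(".")
    major, minor = int(parts[0]), int(parts[1])
    return (1, 5) <= (major, minor) < (1, 11)

print(torch_version_ok("1.10.2+cu113"))  # True
print(torch_version_ok("1.11.0"))        # False
```

In practice you would call it with torch.__version__ right after importing torch.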

Demo

Check demo.md.

Data Preparation

We provide HRNet 2D skeletons for every supported dataset and Kinect 3D skeletons for the NTURGB+D and NTURGB+D 120 datasets. To obtain the human skeleton annotations, you can:

  1. Use our pre-processed skeleton annotations: we directly provide the processed skeleton data for all datasets as pickle files (which can be directly used for training and testing), check Data Doc for the download links and descriptions of the annotation format.
  2. For NTURGB+D 3D skeletons, you can download the official annotations from https://github.com/shahroudy/NTURGB-D and use our provided script to generate the processed pickle files. The generated files are the same as the provided ntu60_3danno.pkl and ntu120_3danno.pkl. For detailed instructions, follow the Data Doc.
  3. We also provide scripts to extract 2D HRNet skeletons from RGB videos; you can follow the diving48_example to extract 2D skeletons from an arbitrary RGB video dataset.

You can use vis_skeleton to visualize the provided skeleton data.
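The pickle annotation files are plain Python dictionaries. The sketch below builds and reads a tiny stand-in file whose field names follow the Data Doc conventions; treat the exact keys and array shapes as assumptions and consult the Data Doc for the authoritative format:

```python
import pickle
import numpy as np

# A tiny stand-in annotation file: a dict holding dataset splits plus
# per-video skeleton annotations.
sample = {
    "split": {"train": ["vid_000"], "val": []},
    "annotations": [{
        "frame_dir": "vid_000",          # video identifier
        "label": 3,                      # action class index
        "img_shape": (1080, 1920),       # (height, width)
        "total_frames": 2,
        # keypoint: (num_persons, num_frames, num_joints, 2) for 2D skeletons
        "keypoint": np.zeros((1, 2, 17, 2), dtype=np.float32),
        "keypoint_score": np.ones((1, 2, 17), dtype=np.float32),
    }],
}
with open("demo_anno.pkl", "wb") as f:
    pickle.dump(sample, f)

# Loading works the same way for the provided dataset pickles.
with open("demo_anno.pkl", "rb") as f:
    data = pickle.load(f)
print(len(data["annotations"]), data["annotations"][0]["label"])  # 1 3
```

Inspecting a downloaded pickle this way is a quick sanity check before launching training.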

Training & Testing

You can use the following commands for training and testing. We support distributed training on a single server with multiple GPUs.

# Training
bash tools/dist_train.sh {config_name} {num_gpus} {other_options}
# Testing
bash tools/dist_test.sh {config_name} {checkpoint} {num_gpus} --out {output_file} --eval top_k_accuracy mean_class_accuracy

For specific examples, please go to the README for each specific algorithm we support.
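The two metrics passed to --eval above can be sketched in NumPy. These are our simplified re-implementations for illustration, not PYSKL's code:

```python
import numpy as np

def top_k_accuracy(scores: np.ndarray, labels: np.ndarray, k: int = 1) -> float:
    """Fraction of samples whose true label is among the k highest scores."""
    topk = np.argsort(scores, axis=1)[:, -k:]
    return float(np.mean([lab in row for lab, row in zip(labels, topk)]))

def mean_class_accuracy(scores: np.ndarray, labels: np.ndarray) -> float:
    """Accuracy averaged over classes, so rare classes count equally."""
    preds = scores.argmax(axis=1)
    accs = [np.mean(preds[labels == c] == c) for c in np.unique(labels)]
    return float(np.mean(accs))

scores = np.array([[0.1, 0.9], [0.8, 0.2], [0.3, 0.7]])
labels = np.array([1, 0, 0])
print(round(top_k_accuracy(scores, labels, k=1), 4))  # 0.6667
print(mean_class_accuracy(scores, labels))            # 0.75
```

mean_class_accuracy differs from plain accuracy on imbalanced test sets, which is why both are reported.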

Citation

If you use PYSKL in your research or wish to refer to the baseline results published in the Model Zoo, please use the following BibTeX entry and the BibTeX entry corresponding to the specific algorithm you used.

@inproceedings{duan2022pyskl,
  title={Pyskl: Towards good practices for skeleton action recognition},
  author={Duan, Haodong and Wang, Jiaqi and Chen, Kai and Lin, Dahua},
  booktitle={Proceedings of the 30th ACM International Conference on Multimedia},
  pages={7351--7354},
  year={2022}
}

Contributing

PYSKL is an open-source project under the Apache 2.0 license. Any contribution from the community to improve PYSKL is appreciated. For significant contributions (like supporting a novel & important task), a corresponding part will be added to our updated tech report, and the contributor will be added to the author list.

Any user can open a PR to contribute to PYSKL. The PR will be reviewed before being merged into the master branch. If you want to open a large PR, we recommend first reaching out (via email: [email protected]) to discuss the design, which helps save a large amount of time in the review stage.

Contact

For any questions, feel free to contact: [email protected]
