
Gait Recognition: Static and Dynamic Features Analysis (PyTorch)

This is the code for the paper Static and Dynamic Features Analysis from Human Skeletons for Gait Recognition, published at the 2021 IEEE International Joint Conference on Biometrics (IJCB). If you have any questions, you can contact me at [email protected]. The introduction video is available on Bilibili and YouTube. Paper link here.

About The Project

Gait recognition is an effective way to identify a person because gait can be acquired without contact and from a long distance. The lengths of human limbs and human motion patterns derived from skeletons have been shown to be effective features for gait recognition. However, such features are hand-crafted from prior knowledge, so important or detailed information may be missed. Our method instead learns dynamic and static information from human skeletons through disentanglement learning, and our experiments show that the extracted features are effective.

Data Preparation

Download the CASIA-B dataset and extract skeletons with AlphaPose or OpenPose. The network input x has shape [Batch, 30, 64]; 30 = 15 * 2, where 15 is the number of keypoints and 2 covers the x and y coordinates of each keypoint.
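As a concrete illustration, here is a minimal sketch of shaping an extracted pose sequence into the input tensor. This is an assumption, not the repository's actual data loader; in particular, treating the 64 dimension as a frame axis (with crop/pad to length 64) is assumed:

```python
import numpy as np

def to_network_input(keypoints):
    """keypoints: array of shape (T, 15, 2) -- T frames, 15 joints, (x, y) each."""
    T = keypoints.shape[0]
    seq = keypoints.reshape(T, 30).T              # (30, T): flatten joints, channels first
    if T >= 64:
        seq = seq[:, :64]                         # crop long sequences to 64 frames
    else:
        seq = np.pad(seq, ((0, 0), (0, 64 - T)))  # zero-pad short sequences
    return seq[None]                              # add batch dim -> (1, 30, 64)

x = to_network_input(np.random.rand(80, 15, 2))
print(x.shape)  # (1, 30, 64)
```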

Getting Started

Environment

python 3.6.9
pytorch 
tensorboard

Train

We train the disentanglement module and recognition module at the same time.

CUDA_VISIBLE_DEVICES=2 python train.py --config configs/train.yaml --phase train
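To make the joint scheme concrete, here is a toy sketch of optimizing an autoencoder and a recognition classifier together with a combined reconstruction + identification loss. All module shapes, sizes, and loss choices here are assumptions for illustration, not the repository's actual architecture:

```python
import torch
import torch.nn as nn

# Toy stand-ins (assumed shapes): an encoder/decoder pair and an identity classifier.
encoder = nn.Linear(30 * 64, 128)
decoder = nn.Linear(128, 30 * 64)
fc = nn.Linear(128, 74)  # e.g. 74 training subjects in a CASIA-B split

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()) + list(fc.parameters()),
    lr=1e-4,
)

x = torch.randn(8, 30, 64)            # dummy batch of skeleton sequences
labels = torch.randint(0, 74, (8,))   # dummy identity labels

code = encoder(x.flatten(1))          # latent code
recon = decoder(code).view_as(x)      # reconstruction from the code
loss = (nn.functional.mse_loss(recon, x)
        + nn.functional.cross_entropy(fc(code), labels))  # joint objective

opt.zero_grad()
loss.backward()
opt.step()
print(float(loss))  # a positive scalar
```

One optimizer over both modules is what lets the reconstruction and recognition objectives shape the same latent code simultaneously.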

Tensorboard

You can visualize the training process in TensorBoard; replace PortId with a port of your choice, for example 8008.

tensorboard --logdir out/logs/ --port PortId

Test

The pretrained model parameters can be downloaded from BaiduYunPan (extraction code: 8652). Download the fc and autoencoder parameters to the directory "out/checkpoints/".

python test.py --config configs/test.yaml --ae_checkpoint out/checkpoints/autoencoder_00050000.pt --fc_checkpoint out/checkpoints/fc_00050000.pt

The test results will be saved to an Excel file.

Visualization

python visualize.py --config configs/test.yaml --checkpoint out/checkpoints/autoencoder_00050000.pt --heatmap 1 --exchange 1

"heatmap" and "exchange" can be set to 0 if you don't want to generate the results.

The generated visualizations include: heatmap, motion difference, body and view features, and view exchange.
