## Note

The original GitHub repository is https://github.com/thohemp/6DRepNet.git.
This project modifies the original so that it seamlessly supports the annotation tool from https://github.com/HelloWorld158/HeadPoseAnnotation.git. The model architecture is slightly changed by adding a channel that predicts the confidence of the estimated rotation; when the confidence is very low, the rotation estimate is treated as unreliable.
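The snippet below is a minimal sketch, not the repository's actual code, of how such a confidence channel can sit next to the 6D rotation output; the class name and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HeadWithConfidence(nn.Module):
    """Illustrative head: 6 values for the rotation representation
    plus 1 value interpreted as prediction confidence."""

    def __init__(self, in_features: int):
        super().__init__()
        # 6 rotation values + 1 confidence channel
        self.fc = nn.Linear(in_features, 6 + 1)

    def forward(self, features: torch.Tensor):
        out = self.fc(features)
        rot6d = out[:, :6]                      # 6D rotation representation
        confidence = torch.sigmoid(out[:, 6:])  # confidence in [0, 1]
        return rot6d, confidence
```

At inference time, rotations whose confidence falls below a chosen threshold can simply be discarded (see the Inference section).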

## Install

pip install -r requirements.txt

## Training

Download the pre-trained RepVGG model 'RepVGG-B1g2-train.pth' from here.
Place the pre-trained weights in the models folder.
Place the annotation files generated by the tool from https://github.com/HelloWorld158/HeadPoseAnnotation.git into the headDir/train and headDir/val folders for training and validation, respectively (see the layout sketch below).
Run python sixdrepnet/train.py --num_epochs 100 --batch_size 8. The code in train.py is relatively simple and can be modified directly.
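A rough sketch of the expected directory layout; the annotation file names depend on what HeadPoseAnnotation writes, so this is an assumption based on the steps above.

```
models/
└── RepVGG-B1g2-train.pth   # pre-trained RepVGG weights
headDir/
├── train/                  # annotation files used for training
└── val/                    # annotation files used for validation
```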

## Inference

Clone https://github.com/open-mmlab/mmdetection.git and https://github.com/open-mmlab/mmyolo.git and place their contents into MMDetecTVT/mmdetection and MMDetecTVT/mmyolodet, respectively.
Place your JPG images under valTestData, then run python inferwithdetect.py.
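Below is a hedged sketch of how the confidence channel can be used to drop unreliable estimates after detection and head-pose prediction; the model, preprocess, and threshold names are placeholders, not the actual API of inferwithdetect.py.

```python
import torch

CONF_THRESHOLD = 0.5  # illustrative cut-off, tune on validation data

def filter_by_confidence(rot6d: torch.Tensor, confidence: torch.Tensor,
                         threshold: float = CONF_THRESHOLD):
    """Keep only rotations whose predicted confidence exceeds the threshold."""
    keep = confidence.squeeze(-1) > threshold
    return rot6d[keep], confidence[keep]

# Hypothetical usage (model and preprocess are placeholders):
# rot6d, conf = model(preprocess("valTestData/example.jpg"))
# rot6d, conf = filter_by_confidence(rot6d, conf)
```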