This repository has been archived by the owner on Jan 20, 2024. It is now read-only.
I trained a custom YOLOv2 detector with 2 classes using the tools provided in this repository, but I had to change the input image size to 416x416 to get better results for one of the classes during training. Then I used the export.py script to convert my model to ONNX, which the logs reported was built successfully; the same script also converted the ONNX model to NCNN and gave me two files (.bin and .param). For some reason, when I uploaded my files to the AWNN online converter tool I got the following error. Is the error caused by the input size of the image?
Here is a description of what my .param file looks like and the structure of the folder I'm trying to upload to the converter.
Can you please help me and provide some more information on how you trained a custom YOLOv2 detector for the MaixII-V831? I checked the repository https://github.com/sipeed/maix_train/tree/master/pytorch and there isn't a lot of information.
Hello @Q2Learn, here are the steps you need for training:
Arrange your dataset in a folder with this structure:

- `Annotations` (all `.xml` files)
- `ImageSets/Main` (two `.txt` files with the filenames for training and validation)
- `JPEGImages` (all your images)
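The layout above is the standard PASCAL VOC structure. As a sketch, it could be set up like this (`my_dataset` and the sample filenames are placeholders):

```shell
# Create the VOC-style folder layout expected by the training script
mkdir -p my_dataset/Annotations
mkdir -p my_dataset/ImageSets/Main
mkdir -p my_dataset/JPEGImages

# train.txt / val.txt list sample names (one per line, no extension);
# each name must match an .xml in Annotations and an image in JPEGImages
printf 'img_0001\nimg_0002\n' > my_dataset/ImageSets/Main/train.txt
printf 'img_0003\n' > my_dataset/ImageSets/Main/val.txt
```

The exact names of the two `.txt` files may differ depending on what train.py looks for, so check the script before running it.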
In the train.py script from the repo, modify these lines for your custom classes (this step is mandatory):
`classes = ["right", "left", "back", "front", "others"]` (replace with your custom classes)
`dataset_name = "lobster_5classes"` (replace with the name of the folder that contains your dataset)
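For the two-class case from the original question, those two lines in train.py might look like this (the class names and folder name are placeholders, not values from the repo):

```python
# Replace the default five classes with your own labels (2 classes here)
classes = ["class_a", "class_b"]  # placeholder names; use your real labels

# Name of the folder that holds the VOC-style dataset described above
dataset_name = "my_dataset"  # placeholder folder name
```

The number of entries in `classes` must match the number of object classes annotated in your `.xml` files.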
In the same script you can also modify the following lines; this step is optional.
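The optional values are typically training hyperparameters. The variable names below are illustrative only, so check train.py for the actual ones:

```python
# Illustrative hyperparameters -- the real variable names in train.py may differ
batch_size = 32        # reduce if you run out of GPU memory
learning_rate = 1e-3   # initial learning rate
max_epochs = 100       # total number of training epochs
input_size = 224       # input resolution; note the V831 AWNN converter may
                       # reject sizes it was not built for (see the question
                       # above about 416x416 causing a conversion error)
```

If changing the input size improves accuracy during training but breaks the AWNN conversion, it is usually safer to keep the size the converter expects and improve results through more data or longer training instead.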