pfisters/MLonMCU
About The Project

This project was developed in the context of the course "Machine Learning on Microcontrollers" at ETH Zurich. It contains a reduced Keras implementation of MTCNN, trained on the WIDER FACE dataset. It also contains a quantization stage in which the networks (weights and activations) are quantized from float32 to int8 without a major loss of accuracy.
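The float32-to-int8 step can be illustrated with a minimal affine (scale/zero-point) quantization sketch, the scheme TensorFlow Lite uses for weights and activations. The helper names below are illustrative and not taken from this repository; the actual conversion here is performed by the quantize_*net.py scripts.

```python
# Minimal sketch of affine (scale/zero-point) int8 quantization.
# Helper names are illustrative, not taken from this repository.

def quant_params(xs, qmin=-128, qmax=127):
    """Derive scale and zero-point mapping [min(xs), max(xs)] onto int8."""
    lo, hi = min(min(xs), 0.0), max(max(xs), 0.0)  # range must include 0
    scale = (hi - lo) / (qmax - qmin)
    zero_point = round(qmin - lo / scale)
    return scale, zero_point

def quantize(xs, scale, zero_point):
    return [max(-128, min(127, round(x / scale + zero_point))) for x in xs]

def dequantize(qs, scale, zero_point):
    return [(q - zero_point) * scale for q in qs]

weights = [-0.42, 0.0, 0.17, 0.93, -1.2]
s, z = quant_params(weights)
q = quantize(weights, s, z)
recovered = dequantize(q, s, z)
# Round-tripping loses at most about one quantization step per value.
```

Each weight survives the round trip to within the quantization step, which is why int8 inference can match float32 accuracy closely when the value range is well covered.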

Installation

  1. Clone the repo
  2. Set up an environment according to the environment.yml file
  3. Add the project root to your PYTHONPATH environment variable
    export PYTHONPATH="path to your directory"

Usage

  1. Download the data with download_data.py
    python data/download_data.py
  2. Generate the training images for the pnet, rnet and onet (with arguments 12, 24 and 48 respectively) with generate_training_data.py
    python data/generate_training_data.py 12
  3. Train the models with train_pnet.py, train_rnet.py and train_onet.py
    python models/train_pnet.py
  4. Quantize the models with quantize_pnet.py, quantize_rnet.py and quantize_onet.py
    python models/quantize_pnet.py
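Step 2 crops candidate windows from the WIDER FACE images; MTCNN pipelines typically label each crop positive, part, or negative by its IoU with the ground-truth box (0.65/0.4/0.3 are the thresholds commonly used in MTCNN implementations; whether generate_training_data.py uses exactly these values is an assumption). A minimal sketch of that labeling:

```python
# Intersection-over-union between two boxes given as (x1, y1, x2, y2).
# The thresholds below are common MTCNN defaults; the exact values in
# generate_training_data.py are an assumption here.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def label_crop(crop, gt):
    v = iou(crop, gt)
    if v >= 0.65:
        return "positive"
    if v >= 0.4:
        return "part"      # partial face, used for bbox regression
    if v < 0.3:
        return "negative"
    return "ignore"        # ambiguous overlap, discarded

gt = (50, 50, 150, 150)    # hypothetical ground-truth face box
```

The argument (12, 24 or 48) only sets the side length the labeled crops are resized to for the respective network input.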

In models, you will find *.h5, *.tflite and *.h files: the Keras model, the TensorFlow Lite model and a hex representation of them. The quantization files also give you the possibility to generate validation data for the STM32 X-CUBE-AI expansion (use the Keras models *.h5 and the *net_bbx.csv, *net_cat.csv and *net_data.csv files).

To increase the performance, you can also mine hard samples by forwarding data through the previously trained stages with data/generate_hard_samples, for the rnet and onet with the arguments 24 and 48 respectively.
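Hard-sample mining of this kind typically ranks candidates by the current network's loss and keeps only the hardest ones; the MTCNN paper's online hard example mining keeps the top 70% of losses. A pure-Python sketch of that selection (the 0.7 keep-ratio and helper names are assumptions, not read from generate_hard_samples):

```python
# Keep the hardest fraction of samples, ranked by per-sample loss.
# The 0.7 keep-ratio follows the MTCNN paper's online hard example
# mining; helper names are illustrative.

def hard_samples(samples, losses, keep_ratio=0.7):
    ranked = sorted(zip(losses, samples), key=lambda p: p[0], reverse=True)
    n_keep = max(1, int(len(ranked) * keep_ratio))
    return [s for _, s in ranked[:n_keep]]

samples = ["a", "b", "c", "d", "e"]
losses = [0.05, 2.3, 0.9, 0.01, 1.4]
hard = hard_samples(samples, losses)  # the 3 highest-loss samples
```

Easy, well-classified crops contribute little gradient, so discarding them focuses the rnet and onet training on the failure cases of the previous stage.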

To test the performance, run
    python detect_faces.py "path to your image"
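Detection cascades like MTCNN merge overlapping candidate boxes with non-maximum suppression between stages. A minimal greedy NMS sketch (the exact thresholds used by detect_faces are not known here; 0.5 is a common default):

```python
# Greedy non-maximum suppression over (x1, y1, x2, y2) boxes.
# The 0.5 IoU threshold is a common default, assumed here.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)                       # highest remaining score
        keep.append(i)
        order = [j for j in order              # drop boxes overlapping it
                 if iou(boxes[i], boxes[j]) <= iou_thresh]
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
kept = nms(boxes, scores)  # the second box overlaps the first and is dropped
```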

Acknowledgements

You will recognise some code sections or even entire files from the following repositories:
