OptuMNIST reaches 75% to 85% accuracy on MNIST with only 355 parameters. Can you do better?
If so, contribute!
We aim to better understand the Pareto front of accuracy versus number of parameters on MNIST.
This may help clarify hyperparameter tuning for frugal models.
Do you have ideas for improvements?
Install torch, then TorchUncertainty with
pip install torch-uncertainty
Then run the model with
python optumnist/optumnist-v1.py
The dataset will be downloaded automatically.
Alternatively, install torch, build your own trainer, and take the model and optimization procedure from optumnist/optumnist-v1.py.
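If you go the custom-trainer route, the sketch below shows the general shape of such a loop. Note that TinyNet is a hypothetical placeholder architecture, not the actual OptuMNIST-v1 model, and the SGD learning rate is an arbitrary choice; the real model and optimization procedure live in optumnist/optumnist-v1.py.

```python
import torch
from torch import nn

class TinyNet(nn.Module):
    # Hypothetical small network for illustration only;
    # NOT the OptuMNIST-v1 architecture.
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2), nn.ReLU(),
            nn.Conv2d(8, 10, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # one value per class channel
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # (batch, 10) logits

def train_step(model, optimizer, images, labels):
    # One standard supervised step: forward, loss, backward, update.
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

model = TinyNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)  # arbitrary lr
n_params = sum(p.numel() for p in model.parameters())
print(n_params)

# A random batch stands in for a real MNIST batch from a DataLoader.
loss = train_step(model, optimizer,
                  torch.randn(4, 1, 28, 28), torch.randint(0, 10, (4,)))
```

In a real run you would iterate train_step over batches from a torchvision MNIST DataLoader and track validation accuracy per epoch.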
OptuMNIST-v1 was found with Optuna:
Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, Masanori Koyama. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD 2019.