This is code that classifies the MNIST dataset, which here provides 50,000 handwritten digit images for training. I used a deep neural network with layer sizes 784, 684, 584, 484, 392, 100, 10. I use sigmoid activation functions, but have experimented with ReLU.
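As a quick illustration of the two activation functions mentioned above (a sketch, not the crate's actual code):

```rust
// Sigmoid squashes any input into (0, 1); ReLU zeroes out negatives.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn relu(x: f64) -> f64 {
    x.max(0.0)
}

fn main() {
    println!("sigmoid(0) = {}", sigmoid(0.0)); // 0.5
    println!("relu(-2)   = {}", relu(-2.0));   // 0
}
```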
I watched 3Blue1Brown's neural network video. I thought that was really cool, so I tried to do it from scratch.
Pretty much, this program takes the brightness value of each pixel of the written digit, flattens the 28x28 image into a 784-element vector, and uses that as the input to a neural network. Then, we feedforward and backpropagate.
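The feedforward step described above can be sketched in Rust (a minimal illustration with toy weights, not the crate's actual types):

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// One feedforward step for a single layer: out[i] = sigmoid(w_i . input + b_i).
fn feed_forward(weights: &[Vec<f64>], biases: &[f64], input: &[f64]) -> Vec<f64> {
    weights
        .iter()
        .zip(biases)
        .map(|(row, b)| {
            let z: f64 = row.iter().zip(input).map(|(w, x)| w * x).sum::<f64>() + b;
            sigmoid(z)
        })
        .collect()
}

fn main() {
    // A "flattened image" of 4 pixels feeding a 2-neuron layer.
    let input = vec![0.0, 0.5, 1.0, 0.25];
    let weights = vec![vec![0.1; 4], vec![-0.2; 4]];
    let biases = vec![0.0, 0.1];
    let out = feed_forward(&weights, &biases, &input);
    println!("{:?}", out); // two activations, each in (0, 1)
}
```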
I was also learning Rust at the time, so I did it in Rust! It was a good learning experience for both Rust and neural network theory. When I got stuck, or my net seemed like it was just guessing, I got tips from my friend Jun. He let me use his dad's lecture notes, which helped, and sauced me a page of differentials for the backpropagation. This was really fun to develop. Getting the dataset into Rust was hard, though. That is what getData.py is for: it parses the pkl file into a JSON, and then Rust can read the JSON.
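On the Rust side, the handoff boils down to reading the JSON that getData.py wrote. A hand-rolled sketch for a flat array of floats (a real project would more likely use a crate like serde_json):

```rust
// Parse a flat JSON array of numbers, e.g. "[0.0, 0.5, 1.0]".
// Toy parser for illustration only — no nesting, no error reporting.
fn parse_pixels(json: &str) -> Vec<f64> {
    json.trim()
        .trim_start_matches('[')
        .trim_end_matches(']')
        .split(',')
        .filter_map(|s| s.trim().parse().ok())
        .collect()
}

fn main() {
    let pixels = parse_pixels("[0.0, 0.5, 1.0]");
    println!("{:?}", pixels);
}
```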
Everything is in the crate! Well, almost everything.
- You first need to install Rust, see the documentation.
- For our parsing tool (getData.py), you will need these Python dependencies (`gzip` and `pickle` ship with Python's standard library):
  - `gzip`
  - `pickle`
  - `matplotlib`
  - `numpy`
- Additionally, if you want to run the PyTorch solution to MNIST, you will obviously need `torch`.
- Clone the repo:
  `git clone https://github.com/Sentientplatypus/digit-recognition-dnn.git`
- Run the crate with `cargo run`.
- Install any additional requirements based upon your needs, as mentioned in the previous section.
To load the dataset, you will need to create a new `Dataset` with `Dataset::generate_full(file_path: String)`.
The network architecture is stored in a `Network` struct. The `Network` aggregates the `Layer` struct, and `Layer` in turn aggregates `Neuron`. For running, you can create a new instance of `Network` with `Network::new()`, where you pass in the layer sizes as an array and whether it is binary output or not.
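That aggregation might look roughly like this (a sketch with hypothetical field names, not the crate's actual definitions; weights get a placeholder init here instead of random values):

```rust
struct Neuron {
    weights: Vec<f64>,
    bias: f64,
}

struct Layer {
    neurons: Vec<Neuron>,
}

struct Network {
    layers: Vec<Layer>,
    binary_output: bool,
}

impl Network {
    /// Build a network from layer sizes, e.g. &[784, 100, 10].
    /// Each window of two sizes defines one weight layer.
    fn new(sizes: &[usize], binary_output: bool) -> Self {
        let layers = sizes
            .windows(2)
            .map(|pair| Layer {
                neurons: (0..pair[1])
                    .map(|_| Neuron {
                        weights: vec![0.01; pair[0]], // placeholder init
                        bias: 0.0,
                    })
                    .collect(),
            })
            .collect();
        Network { layers, binary_output }
    }
}

fn main() {
    let net = Network::new(&[784, 100, 10], false);
    println!("layers: {}", net.layers.len()); // 2 weight layers
    let _ = net.binary_output;
}
```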
You can run the training through the `Network.sgd()` method. A GUI like this should pop up. After your training, you can choose to save your network parameters with the `Network.to_file(path: String)` method.
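What saving parameters boils down to, sketched with the standard library (illustrative only; the crate's actual `to_file` format may differ):

```rust
use std::fs;

// Write parameters as comma-separated text, and read them back.
fn save_params(path: &str, params: &[f64]) -> std::io::Result<()> {
    let text: Vec<String> = params.iter().map(|p| p.to_string()).collect();
    fs::write(path, text.join(","))
}

fn load_params(path: &str) -> std::io::Result<Vec<f64>> {
    let text = fs::read_to_string(path)?;
    Ok(text.split(',').filter_map(|s| s.trim().parse().ok()).collect())
}

fn main() -> std::io::Result<()> {
    let params = vec![0.25, -0.5, 1.0];
    save_params("params.txt", &params)?;
    let loaded = load_params("params.txt")?;
    assert_eq!(loaded, params); // round-trip preserves the values
    Ok(())
}
```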
For more examples, please refer to the `main.rs` file.
- Create classes and research
- Binary Classification
- Multiclass Classification
- Softmax + Cross Entropy Loss
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Geneustace Wicaksono - My Website - [email protected]
Project Link: https://github.com/Sentientplatypus/digit-recognition-dnn