A completely from-scratch implementation of a neural network, including gradient descent, activation functions, and backpropagation.
Run it with `python main.py`.
The following are implemented:
- ReLU and Sigmoid activation functions (sketched after this list)
- Forward propagation
- Backpropagation
- Cross-entropy loss (forward pass, backward pass, and loss are sketched together below)
- Derivatives of each of the above functions
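
As a rough sketch of what the activation functions and their derivatives might look like, assuming a NumPy-based implementation (the actual function names in `main.py` may differ):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied elementwise
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Derivative of ReLU: 1 where x > 0, else 0
    return (x > 0).astype(x.dtype)

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^(-x)), applied elementwise
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # Derivative of sigmoid: sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)
```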
All of the code is batched: inputs and intermediate values carry the batch in dimension 0.
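
Below is a minimal sketch of one batched training step (forward pass, cross-entropy loss, backpropagation, gradient descent update), again assuming NumPy. The hidden layer here uses ReLU and the output uses a softmax with one-hot labels; `main.py` may wire its layers and loss differently, and all names and shapes here are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy shapes for illustration: batch of 4 samples (batch is dimension 0),
# 3 input features, 5 hidden units, 2 classes.
X = rng.normal(size=(4, 3))
Y = np.eye(2)[rng.integers(0, 2, 4)]   # one-hot labels, shape (4, 2)

W1 = rng.normal(size=(3, 5)) * 0.1
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 2)) * 0.1
b2 = np.zeros(2)

def softmax(z):
    # Numerically stable softmax over the class dimension
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Forward propagation
z1 = X @ W1 + b1
a1 = np.maximum(0.0, z1)               # ReLU hidden layer
logits = a1 @ W2 + b2
probs = softmax(logits)

# Cross-entropy loss, averaged over the batch
loss = -np.mean(np.sum(Y * np.log(probs + 1e-12), axis=1))

# Backpropagation: softmax + cross-entropy gives dlogits = probs - Y
batch = X.shape[0]
dlogits = (probs - Y) / batch
dW2 = a1.T @ dlogits
db2 = dlogits.sum(axis=0)
da1 = dlogits @ W2.T
dz1 = da1 * (z1 > 0)                   # ReLU derivative
dW1 = X.T @ dz1
db1 = dz1.sum(axis=0)

# One gradient descent step
lr = 0.1
for param, grad in [(W1, dW1), (b1, db1), (W2, dW2), (b2, db2)]:
    param -= lr * grad                 # in-place update of each parameter
```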