GaganCodes/neural_network_scratch

This project implements a basic neural network from scratch and compares it with Scikit-Learn's MLPRegressor.

Study 1: Basic performance comparison. This study compares the from-scratch stochastic gradient descent implementation with a similarly configured Scikit-Learn model. Scikit-Learn outperforms the scratch implementation because of the optimizations built into its implementation.
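As a minimal sketch of the Scikit-Learn side of such a comparison, assuming synthetic data and illustrative hyperparameters rather than this repository's exact settings, an MLPRegressor can be configured to mirror the study's architecture (one hidden layer of four neurons, sigmoid activation, plain SGD):

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic regression data with three features, matching the study setup.
# Sample count, noise level, learning rate and iteration cap are assumptions.
X, y = make_regression(n_samples=500, n_features=3, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPRegressor(
    hidden_layer_sizes=(4,),   # one hidden layer with four neurons
    activation="logistic",     # sigmoid activation
    solver="sgd",              # plain stochastic gradient descent
    learning_rate_init=0.01,
    max_iter=2000,
    random_state=0,
)
mlp.fit(X_train, y_train)
print("R^2 on held-out data:", mlp.score(X_test, y_test))
```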

Study 2: Performance as a function of dataset size. Scikit-Learn takes more time as the dataset grows because it persistently searches for an optimal solution.
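A hedged sketch of how such a dataset-size sweep could be timed; the sample sizes and hyperparameters below are illustrative assumptions, not the repository's exact settings:

```python
import time

from sklearn.datasets import make_regression
from sklearn.neural_network import MLPRegressor

# Train at increasing dataset sizes and record wall-clock time plus the
# number of SGD iterations Scikit-Learn actually ran before stopping.
for n in (100, 1_000, 10_000):
    X, y = make_regression(n_samples=n, n_features=3, noise=5.0, random_state=0)
    mlp = MLPRegressor(hidden_layer_sizes=(4,), activation="logistic",
                       solver="sgd", max_iter=2000, random_state=0)
    start = time.perf_counter()
    mlp.fit(X, y)
    elapsed = time.perf_counter() - start
    print(f"n={n}: {elapsed:.2f} s, iterations run: {mlp.n_iter_}")
```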

Study parameters (a minimal from-scratch sketch of this architecture follows the list):

  1. Sigmoid activation function
  2. One input layer with three features
  3. One hidden layer with four neurons
  4. One output layer (regression task)
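Below is a minimal from-scratch sketch of this architecture, assuming per-sample SGD on a squared-error loss with a linear output; the class and variable names are illustrative, not the repository's:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ScratchNet:
    """3 inputs -> 4 sigmoid hidden neurons -> 1 linear output, trained with SGD."""

    def __init__(self, n_in=3, n_hidden=4, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1 + self.b1)  # hidden activations
        return self.h @ self.W2 + self.b2        # linear output for regression

    def sgd_step(self, x, y):
        err = self.forward(x) - y                # gradient of 0.5 * (y_hat - y)^2
        # Output-layer gradients.
        dW2 = np.outer(self.h, err)
        db2 = err
        # Hidden-layer gradients; the sigmoid derivative is h * (1 - h).
        dh = (self.W2 @ err) * self.h * (1.0 - self.h)
        dW1 = np.outer(x, dh)
        db1 = dh
        # One stochastic gradient descent update.
        self.W2 -= self.lr * dW2
        self.b2 -= self.lr * db2
        self.W1 -= self.lr * dW1
        self.b1 -= self.lr * db1
        return 0.5 * float(err @ err)            # squared-error loss for this sample

# Example: one epoch of per-sample SGD over a small synthetic dataset.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
net = ScratchNet()
for xi, yi in zip(X, y):
    net.sgd_step(xi, yi)
```

The output layer is kept linear because the task is regression; the sigmoid is applied only in the hidden layer.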

Figures: Figure_1 and Figure_2.
