This repository contains a collection of Jupyter notebooks that cover fundamental concepts and techniques in neural networks and deep learning. Each notebook provides a hands-on approach to understanding and implementing various neural network architectures and optimization strategies.
- 1 - Perceptron Learning Algorithm.ipynb
- 2 - Activation Functions.ipynb
- 3 - Fully Connected Neural network.ipynb
- 4 - MNIST Digit Classification using ANN.ipynb
- 5 - Optimizers.ipynb
- 6 - CNN.ipynb
- 7 - Transfer Learning.ipynb
This notebook introduces the Perceptron Learning Algorithm, one of the earliest models of neural networks. It covers the theory behind perceptrons, their implementation, and how they can be used for binary classification tasks.
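A minimal sketch of the perceptron learning rule, assuming NumPy and binary labels in {0, 1} (the notebook's own implementation may differ in details such as the label encoding or stopping criterion):

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=10):
    """Perceptron learning rule for labels in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0   # step activation
            error = target - pred                # 0 if correct, +/-1 if wrong
            w += lr * error * xi                 # shift the decision boundary
            b += lr * error
    return w, b

# Example: learn the logical AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([int(xi @ w + b > 0) for xi in X])  # expected: [0, 0, 0, 1]
```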
Activation functions are crucial in neural networks for introducing non-linearity. This notebook explores various activation functions such as Sigmoid, Tanh, and ReLU, explaining their roles and impact on the performance of neural networks.
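For reference, the three functions can be defined in a few lines of NumPy (a sketch, not the notebook's exact code):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))   # squashes values into (0, 1)

def tanh(x):
    return np.tanh(x)             # squashes into (-1, 1), zero-centred

def relu(x):
    return np.maximum(0, x)       # passes positives, zeroes out negatives

x = np.linspace(-5, 5, 11)
print(sigmoid(x), tanh(x), relu(x), sep="\n")
```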
This notebook delves into the architecture of Fully Connected Neural Networks (FCNNs). It covers the construction and training of FCNNs, and demonstrates how they can be applied to various tasks including classification and regression.
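A small fully connected model in Keras illustrates the idea; the input size, layer widths, and output layer here are placeholders, assuming a binary classification task and TensorFlow as the backend:

```python
import tensorflow as tf

# Two hidden layers of 64 units; the output layer depends on the task:
# a single sigmoid unit for binary classification, softmax for multi-class,
# or a linear unit for regression.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),            # 20 input features (placeholder)
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```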
In this notebook, you'll learn how to build an Artificial Neural Network (ANN) to classify handwritten digits from the MNIST dataset. It walks through the entire process, from data preprocessing to model evaluation.
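A condensed version of that pipeline, assuming TensorFlow/Keras and its built-in MNIST loader (hyperparameters such as layer size and epoch count are illustrative):

```python
import tensorflow as tf

# Preprocess: scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 image -> 784 vector
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```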
Optimizers are key to training neural networks efficiently and effectively. This notebook reviews different optimization algorithms such as Gradient Descent, Adam, and RMSprop, and compares their performance in various scenarios.
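One way to set up such a comparison in Keras is to train the same model with each optimizer and look at the resulting accuracy; this is a sketch (the MNIST setup, learning rates, and epoch count are assumptions, not the notebook's exact experiment):

```python
import tensorflow as tf

def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 255.0

optimizers = {
    "sgd": tf.keras.optimizers.SGD(learning_rate=0.01),
    "rmsprop": tf.keras.optimizers.RMSprop(learning_rate=0.001),
    "adam": tf.keras.optimizers.Adam(learning_rate=0.001),
}

for name, opt in optimizers.items():
    model = build_model()  # fresh weights for a fair comparison
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train, epochs=3, verbose=0)
    print(name, history.history["accuracy"][-1])
```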
Convolutional Neural Networks (CNNs) are powerful tools for image processing tasks. This notebook provides an introduction to CNNs, explaining the concepts of convolutional layers, pooling layers, and how CNNs are applied in image classification.
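A small CNN in Keras showing the alternating convolution and pooling layers described above (the filter counts and input shape are illustrative, assuming 28x28 grayscale images):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),               # grayscale image
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),  # learn local filters
    tf.keras.layers.MaxPooling2D((2, 2)),                   # downsample feature maps
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```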
Transfer Learning allows you to leverage pre-trained models for new tasks. This notebook explains the principles of Transfer Learning and shows how to apply it using popular models like VGG, ResNet, and others to improve performance on small datasets.
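The typical recipe is to freeze a pre-trained backbone and train a new classification head on top; here is a sketch using VGG16 from Keras Applications (the input shape, head layers, and 5-class output are placeholders to adapt to your own dataset):

```python
import tensorflow as tf

# Load VGG16 pre-trained on ImageNet, without its original classification head
base = tf.keras.applications.VGG16(weights="imagenet",
                                   include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False   # freeze the pre-trained convolutional layers

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),  # adjust to your number of classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```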
To run these notebooks, you will need the following installed:
- Python 3.x
- Jupyter Notebook
- TensorFlow or PyTorch (depending on your preference)
- NumPy
- Matplotlib
- Scikit-learn
You can install the required libraries using pip:
```bash
pip install tensorflow numpy matplotlib scikit-learn jupyter
```
If you'd like to contribute to this project, please fork the repository and use a feature branch. Pull requests are welcome.
For any questions, feel free to reach out via [email protected]