# Neural Networks and Deep Learning Notebooks

This repository contains a collection of Jupyter notebooks that cover fundamental concepts and techniques in neural networks and deep learning. Each notebook provides a hands-on approach to understanding and implementing various neural network architectures and optimization strategies.

## Table of Contents

  1. 1 - Perceptron Learning Algorithm.ipynb
  2. 2 - Activation Functions.ipynb
  3. 3 - Fully Connected Neural network.ipynb
  4. 4 - MNIST Digit Classification using ANN.ipynb
  5. 5 - Optimizers.ipynb
  6. 6 - CNN.ipynb
  7. 7 - Transfer Learning.ipynb

## Notebooks Overview

### 1 - Perceptron Learning Algorithm.ipynb

This notebook introduces the Perceptron Learning Algorithm, one of the earliest models of neural networks. It covers the theory behind perceptrons, their implementation, and how they can be used for binary classification tasks.
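
A minimal NumPy sketch of the perceptron update rule, shown here learning the logical AND function, gives a feel for the algorithm; the toy dataset, learning rate, and epoch count are illustrative and not taken from the notebook.

```python
# Perceptron sketch (NumPy): learn the logical AND function.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # inputs
y = np.array([0, 0, 0, 1])                      # AND labels

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b > 0 else 0  # step activation
        error = target - pred
        w += lr * error * xi                      # perceptron update rule
        b += lr * error

print("weights:", w, "bias:", b)
```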

### 2 - Activation Functions.ipynb

Activation functions are crucial in neural networks for introducing non-linearity. This notebook explores various activation functions such as Sigmoid, Tanh, and ReLU, explaining their roles and impact on the performance of neural networks.
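
For reference, the three functions mentioned above can be written in a few lines of NumPy; this is an illustrative sketch rather than the notebook's exact code.

```python
# Common activation functions (NumPy, illustrative).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))  # maps input to (0, 1)

def tanh(x):
    return np.tanh(x)                # maps input to (-1, 1)

def relu(x):
    return np.maximum(0, x)          # zero for negatives, identity otherwise

x = np.linspace(-5, 5, 11)
print(sigmoid(x))
print(tanh(x))
print(relu(x))
```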

### 3 - Fully Connected Neural network.ipynb

This notebook delves into the architecture of Fully Connected Neural Networks (FCNNs). It covers the construction and training of FCNNs and demonstrates how they can be applied to tasks such as classification and regression.
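
A minimal sketch of such a network using the Keras Sequential API (assuming the TensorFlow option listed under Prerequisites) is shown below; the layer sizes and random toy data are placeholders, not the notebook's actual configuration.

```python
# Small fully connected network sketch (Keras); sizes and data are placeholders.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),                     # 10 input features (placeholder)
    tf.keras.layers.Dense(32, activation="relu"),    # hidden layer 1
    tf.keras.layers.Dense(16, activation="relu"),    # hidden layer 2
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classification output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random toy data, just to show the training API.
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=(100,))
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
```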

### 4 - MNIST Digit Classification using ANN.ipynb

In this notebook, you'll learn how to build an Artificial Neural Network (ANN) to classify handwritten digits from the MNIST dataset. It walks through the entire process, from data preprocessing to model evaluation.
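
A condensed Keras version of that workflow might look like the sketch below; the architecture and hyperparameters are illustrative and may differ from the notebook.

```python
# MNIST digit classification with a plain feed-forward network (Keras sketch).
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),                        # 28x28 image -> 784-vector
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # one probability per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, validation_split=0.1)
model.evaluate(x_test, y_test)
```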

### 5 - Optimizers.ipynb

Optimizers are key to training efficient and effective neural networks. This notebook reviews different optimization algorithms such as Gradient Descent, Adam, and RMSprop, and compares their performance in various scenarios.
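
One way to run such a comparison is to train the same small model with each optimizer and record the final loss, as in this Keras sketch (the toy data and model are placeholders, not the notebook's setup):

```python
# Compare optimizers on the same toy model (Keras sketch).
import numpy as np
import tensorflow as tf

X = np.random.rand(200, 8)                # placeholder features
y = np.random.randint(0, 2, size=(200,))  # placeholder binary labels

def make_model():
    return tf.keras.Sequential([
        tf.keras.Input(shape=(8,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

for opt in ["sgd", "adam", "rmsprop"]:
    model = make_model()
    model.compile(optimizer=opt, loss="binary_crossentropy", metrics=["accuracy"])
    history = model.fit(X, y, epochs=5, verbose=0)
    print(opt, "final loss:", history.history["loss"][-1])
```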

### 6 - CNN.ipynb

Convolutional Neural Networks (CNNs) are powerful tools for image processing tasks. This notebook provides an introduction to CNNs, explaining the concepts of convolutional layers, pooling layers, and how CNNs are applied in image classification.
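
A minimal Keras CNN for 28x28 grayscale images is sketched below; the filter counts and layer choices are illustrative, not the notebook's exact architecture.

```python
# Small CNN sketch for 28x28 grayscale images (Keras).
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),  # learn local image features
    tf.keras.layers.MaxPooling2D((2, 2)),                   # downsample spatially
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),         # 10-class output
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```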

### 7 - Transfer Learning.ipynb

Transfer Learning allows you to leverage pre-trained models for new tasks. This notebook explains the principles of Transfer Learning and shows how to apply it using popular models such as VGG and ResNet to improve performance on small datasets.
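
A typical feature-extraction setup, freezing a pre-trained VGG16 base and training a new classification head in Keras, might look like the sketch below; the input size, head layers, and class count are placeholders.

```python
# Transfer-learning sketch: frozen VGG16 base + new classification head (Keras).
import tensorflow as tf

base = tf.keras.applications.VGG16(weights="imagenet",
                                   include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False   # freeze the pre-trained convolutional weights

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),   # e.g. 5 target classes (placeholder)
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)  # train_ds/val_ds are hypothetical datasets
```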

## Prerequisites

To run these notebooks, you will need the following installed:

- Python 3.x
- Jupyter Notebook
- TensorFlow or PyTorch (depending on your preference)
- NumPy
- Matplotlib
- Scikit-learn

You can install the required libraries using pip:

    pip install tensorflow numpy matplotlib scikit-learn jupyter

## Contributing

If you'd like to contribute to this project, please fork the repository and use a feature branch. Pull requests are welcome.

## Contact

For any questions, feel free to reach out via [email protected]