updated README and added documentation for #1 Single Layer Perceptrons
	modified:   README.md
	modified:   perceptrons.ipynb
feed0 committed Sep 24, 2024
1 parent 645f07e commit bbfe2d6
Showing 3 changed files with 37 additions and 3 deletions.
38 changes: 36 additions & 2 deletions README.md
@@ -2,6 +2,40 @@

This project implements artificial neural networks (ANNs) using perceptrons, as part of a university course on ANNs.

0. [Sum Function](#0-sum-function)
1. [Single Layer](#1-single-layer-perceptron)
2. Single Layer and Training
3. Multi Layer

## 0. Sum Function
The `sum_function` computes the sum of the products of corresponding elements from two lists, `inputs` and `weights`. This is a common operation in neural networks, particularly when computing the net input to a neuron.

```python
def sum_function(inputs, weights) -> float:
'''
Sum of the product of the inputs by the weights
@return float: The Net Input / Pre-Activation result
'''

...
```
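The function body is elided above (the implementation presumably lives in `perceptrons.ipynb`). A minimal sketch consistent with the docstring, assuming plain Python lists of numbers, might look like:

```python
def sum_function(inputs, weights) -> float:
    # Pair each input with its weight and accumulate the products
    return sum(i * w for i, w in zip(inputs, weights))

# Example: two inputs of 1 with weights 0.8 and 0.2 (illustrative values)
print(sum_function([1, 1], [0.8, 0.2]))  # 1.0
```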

## 1. Single Layer Perceptron
In this section, we implement a simple step function, the activation often used in academic settings to illustrate the basic principles of neural networks and perceptrons.

```python
def step_function(net_input) -> int:
'''
1, if net_input >= 1
0, if net_input < 1
@return int: The Activation result
'''

...
```
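Here too the body is elided; assuming the threshold of 1 described in the docstring, a minimal sketch might be:

```python
def step_function(net_input) -> int:
    # Fire (1) only when the net input reaches the threshold of 1
    return 1 if net_input >= 1 else 0

print(step_function(1.0))   # 1
print(step_function(0.99))  # 0
```

Chained with `sum_function`, this gives a complete forward pass for a single neuron: the weighted sum produces the net input, and the step function turns it into a binary activation.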
> The following chart was generated using both the `sum_function` and `step_function` above:
![Step Function Plot](step_function_plot.png)
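A sketch of how such a chart could be produced, assuming `matplotlib` is available and using illustrative weights (both the weights and the input sweep below are assumptions, not taken from the notebook):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

def sum_function(inputs, weights) -> float:
    # Sketch implementation: weighted sum of the inputs
    return sum(i * w for i, w in zip(inputs, weights))

def step_function(net_input) -> int:
    # Sketch implementation: threshold activation at 1
    return 1 if net_input >= 1 else 0

weights = [0.8, 0.2]                 # illustrative weights (assumption)
xs = [x / 10 for x in range(21)]     # sweep both inputs from 0.0 to 2.0
ys = [step_function(sum_function([x, x], weights)) for x in xs]

plt.step(xs, ys, where="post")
plt.xlabel("input value")
plt.ylabel("activation")
plt.title("Step Function")
plt.savefig("step_function_plot.png")
```

With these weights the net input equals the swept value, so the activation flips from 0 to 1 exactly at 1.0, producing the characteristic step shape.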

2 changes: 1 addition & 1 deletion perceptrons.ipynb


Binary file added step_function_plot.png
