diff --git a/1-introduction.html b/1-introduction.html
index 391652ff..961cf575 100644
--- a/1-introduction.html
+++ b/1-introduction.html
@@ -392,27 +392,100 @@
 A neural network is an artificial intelligence technique loosely
-based on the way neurons in the brain work.
+based on the way neurons in the brain work. A neural network consists of
+connected computational units called neurons. Let’s
+look at the operations of a single neuron.
-A neural network consists of connected computational units called
-neurons. Each neuron …
+Each neuron …
+The goal of the activation function is to convert the weighted sum of
+the inputs to the output signal of the neuron. This output is then
+passed on to the next layer of the network. There are many different
+activation functions; three of them are introduced in the exercise
+below.
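The weighted-sum-then-activation step described above can be sketched in a few lines of Python (the function and parameter names here are illustrative, not from the lesson itself):

```python
import math

# A minimal sketch of a single neuron: multiply each input by its
# weight, add a bias, and pass the weighted sum through an
# activation function (sigmoid here, one of the options below).
def neuron_output(inputs, weights, bias):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# With zero weights and zero bias the weighted sum is 0,
# and sigmoid(0) = 0.5.
print(neuron_output([1.0, 2.0], [0.0, 0.0], 0.0))
```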
+Look at the following activation functions:
+A. Sigmoid activation function
+The sigmoid activation function is given by:
+\[ f(x) = \frac{1}{1 + e^{-x}} \]
+B. ReLU activation function
+The Rectified Linear Unit (ReLU) activation function is defined as:
+\[ f(x) = \max(0, x) \]
+This involves a simple comparison and maximum calculation, which are
+basic operations that are computationally inexpensive. It is also simple
+to compute the gradient: 1 for positive inputs and 0 for negative
+inputs.
+C. Linear (or identity) activation function (output = input)
+The linear activation function is simply the identity function:
+\[ f(x) = x \]
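The three activation functions above, and the ReLU gradient rule just mentioned, can be sketched in plain Python (the function names are ours, not defined by the lesson):

```python
import math

def sigmoid(x):
    # A. Sigmoid: f(x) = 1 / (1 + e^(-x)), squashes any input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # B. ReLU: f(x) = max(0, x), a single cheap comparison
    return max(0.0, x)

def linear(x):
    # C. Linear (identity): output equals input
    return x

def relu_gradient(x):
    # The ReLU gradient: 1 for positive inputs, 0 for negative inputs
    return 1.0 if x > 0 else 0.0

print(sigmoid(0.0))        # 0.5: sigmoid is centred on 0.5 at x = 0
print(relu(-2.0), relu(3.0))
print(linear(1.5))
```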
+Match the following statements to the correct activation
+function:
+Activation function plots by Laughsinthestocks - Own work, CC
+BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=44920411,
+https://commons.wikimedia.org/w/index.php?curid=44920600,
+https://commons.wikimedia.org/w/index.php?curid=44920533
+