From 6cd2e8ff48462bef1d80d0f9e657967579b05b91 Mon Sep 17 00:00:00 2001
From: Morten Hjorth-Jensen
Date: Mon, 15 Jan 2024 21:27:12 +0100
Subject: [PATCH] updating slides
---
doc/pub/week1/html/week1-bs.html | 187 +++----
doc/pub/week1/html/week1-reveal.html | 148 ++----
doc/pub/week1/html/week1-solarized.html | 168 ++----
doc/pub/week1/html/week1.html | 168 ++----
doc/pub/week1/ipynb/ipynb-week1-src.tar.gz | Bin 2624762 -> 2624762 bytes
doc/pub/week1/ipynb/week1.ipynb | 574 ++++++++++-----------
doc/pub/week1/pdf/week1.pdf | Bin 3154397 -> 3152919 bytes
doc/src/week1/week1.do.txt | 122 ++---
8 files changed, 516 insertions(+), 851 deletions(-)
diff --git a/doc/pub/week1/html/week1-bs.html b/doc/pub/week1/html/week1-bs.html
index fb5e53be..b3796b40 100644
--- a/doc/pub/week1/html/week1-bs.html
+++ b/doc/pub/week1/html/week1-bs.html
@@ -259,6 +259,10 @@
None,
'setting-up-the-equations-for-a-neural-network'),
('Definitions', 2, None, 'definitions'),
+ ('Inputs to the activation function',
+ 2,
+ None,
+ 'inputs-to-the-activation-function'),
('Derivatives and the chain rule',
2,
None,
@@ -271,6 +275,10 @@
2,
None,
'bringing-it-together-first-back-propagation-equation'),
+ ('Analyzing the last results',
+ 2,
+ None,
+ 'analyzing-the-last-results'),
('More considerations', 2, None, 'more-considerations'),
('Derivatives in terms of $z_j^L$',
2,
@@ -281,11 +289,15 @@
2,
None,
'final-back-propagating-equation'),
- ('Setting up the Back propagation algorithm',
+ ('Using the chain rule and summing over all $k$ entries',
+ 2,
+ None,
+ 'using-the-chain-rule-and-summing-over-all-k-entries'),
+ ('Setting up the back propagation algorithm',
2,
None,
'setting-up-the-back-propagation-algorithm'),
- ('Setting up the Back propagation algorithm, part 2',
+ ('Setting up the back propagation algorithm, part 2',
2,
None,
'setting-up-the-back-propagation-algorithm-part-2'),
@@ -293,11 +305,6 @@
2,
None,
'setting-up-the-back-propagation-algorithm-part-3'),
- ('Setting up the Back propagation algorithm, final '
- 'considerations',
- 2,
- None,
- 'setting-up-the-back-propagation-algorithm-final-considerations'),
('Updating the gradients', 2, None, 'updating-the-gradients')]}
end of tocinfo -->
@@ -402,17 +409,19 @@
Other parameters
Setting up the equations for a neural network
Definitions
+ Inputs to the activation function
Derivatives and the chain rule
Derivative of the cost function
Bringing it together, first back propagation equation
+ Analyzing the last results
More considerations
Derivatives in terms of \( z_j^L \)
Bringing it together
Final back propagating equation
- Setting up the Back propagation algorithm
- Setting up the Back propagation algorithm, part 2
+ Using the chain rule and summing over all \( k \) entries
+ Setting up the back propagation algorithm
+ Setting up the back propagation algorithm, part 2
Setting up the back propagation algorithm, part 3
- Setting up the Back propagation algorithm, final considerations
Updating the gradients
@@ -1580,22 +1589,19 @@ Setting up
$$
-{\cal C}(\hat{W}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2,
$$
-where the $t_i$s are our \( n \) targets (the values we want to
+
where the $y_i$s are our \( n \) targets (the values we want to
reproduce), while the outputs of the network after having propagated
-all inputs \( \hat{x} \) are given by \( y_i \). Below we will demonstrate
-how the basic equations arising from the back propagation algorithm
-can be modified in order to study classification problems with \( K \)
-classes.
+all inputs \( \boldsymbol{x} \) are given by \( \tilde{y}_i \).
Definitions
-With our definition of the targets \( \hat{t} \), the outputs of the
-network \( \hat{y} \) and the inputs \( \hat{x} \) we
+
With our definition of the targets \( \boldsymbol{y} \), the outputs of the
+network \( \boldsymbol{\tilde{y}} \) and the inputs \( \boldsymbol{x} \) we
define now the activation \( z_j^l \) of node/neuron/unit \( j \) of the
\( l \)-th layer as a function of the bias, the weights which add up from
the previous layer \( l-1 \) and the forward passes/outputs
@@ -1616,8 +1622,12 @@
Definitions
\hat{z}^l = \left(\hat{W}^l\right)^T\hat{a}^{l-1}+\hat{b}^l.
$$
-With the activation values \( \hat{z}^l \) we can in turn define the
-output of layer \( l \) as \( \hat{a}^l = f(\hat{z}^l) \) where \( f \) is our
+
+
+
+
+With the activation values \( \boldsymbol{z}^l \) we can in turn define the
+output of layer \( l \) as \( \boldsymbol{a}^l = f(\boldsymbol{z}^l) \) where \( f \) is our
activation function. In the examples here we will use the sigmoid
function discussed in our logistic regression lectures. We will also use the same activation function \( f \) for all layers
and their nodes. It means we have
@@ -1654,18 +1664,18 @@
Derivative of the cost f
Let us specialize to the output layer \( l=L \). Our cost function is
$$
-{\cal C}(\hat{W^L}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}^L) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - y_i\right)^2,
$$
The derivative of this function with respect to the weights is
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
$$
The last partial derivative can easily be computed and reads (by applying the chain rule)
$$
-\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1}.
$$
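As a quick numerical check of the \( a_j^L(1-a_j^L) \) form of the sigmoid derivative used above, here is a minimal numpy sketch (our illustration, not part of this patch; the helper name sigmoid is an assumption):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-5.0, 5.0, 11)
a = sigmoid(z)
analytic = a * (1.0 - a)                                  # the a(1-a) form above
numeric = (sigmoid(z + 1e-6) - sigmoid(z - 1e-6)) / 2e-6  # central difference
print(np.max(np.abs(analytic - numeric)))                 # ~1e-10, the forms agree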
@@ -1674,19 +1684,23 @@ Bri
We have thus
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)a_j^L(1-a_j^L)a_k^{L-1},
$$
Defining
$$
-\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - t_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
+\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - y_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
$$
and using the Hadamard product of two vectors we can write this as
$$
-\hat{\delta}^L = f'(\hat{z}^L)\circ\frac{\partial {\cal C}}{\partial (\hat{a}^L)}.
+\boldsymbol{\delta}^L = f'(\boldsymbol{z}^L)\circ\frac{\partial {\cal C}}{\partial (\boldsymbol{a}^L)}.
$$
+
+
+Analyzing the last results
+
This is an important expression. The second term on the right-hand side
measures how fast the cost function is changing as a function of the $j$th
output activation. If, for example, the cost function doesn't depend
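To make this concrete, a small hedged numpy sketch (ours, not from the slides; the numbers are made up) of the Hadamard expression for \( \boldsymbol{\delta}^L \) with the quadratic cost:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z_L = np.array([0.2, -1.0, 0.5])     # output-layer activations z^L
a_L = sigmoid(z_L)                   # outputs a^L = f(z^L)
y = np.array([0.0, 1.0, 0.0])        # targets

dC_da = a_L - y                      # dC/da^L for C = (1/2) sum_i (a_i^L - y_i)^2
delta_L = a_L * (1.0 - a_L) * dC_da  # f'(z^L) o dC/da^L, elementwise
print(delta_L)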
@@ -1714,7 +1728,7 @@
More considerations
With the definition of \( \delta_j^L \) we have a more compact definition of the derivative of the cost function in terms of the weights, namely
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
+\frac{\partial{\cal C}}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
$$
@@ -1739,10 +1753,6 @@ Bringing it together
We now have three equations that are essential for the computations of the derivatives of the cost function at the output layer. These equations are needed to start the algorithm and they are
-
-
-
-
$$
\begin{equation}
\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1},
@@ -1766,8 +1776,6 @@
Bringing it together
\label{_auto3}
\end{equation}
$$
-
-
@@ -1778,8 +1786,12 @@ Final back propagating e
\delta_j^l =\frac{\partial {\cal C}}{\partial z_j^l}.
$$
-
We want to express this in terms of the equations for layer \( l+1 \). Using the chain rule and summing over all \( k \) entries we have
+We want to express this in terms of the equations for layer \( l+1 \).
+
+
+Using the chain rule and summing over all \( k \) entries
+We obtain
$$
\delta_j^l =\sum_k \frac{\partial {\cal C}}{\partial z_k^{l+1}}\frac{\partial z_k^{l+1}}{\partial z_j^{l}}=\sum_k \delta_k^{l+1}\frac{\partial z_k^{l+1}}{\partial z_j^{l}},
$$
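One intermediate step, spelled out here since the slides jump straight to the result: with \( z_k^{l+1}=\sum_j w_{kj}^{l+1}a_j^{l}+b_k^{l+1} \) and \( a_j^l=f(z_j^l) \), as in the component equations above, we have

$$
\frac{\partial z_k^{l+1}}{\partial z_j^{l}} = w_{kj}^{l+1}f'(z_j^{l}),
$$

which, inserted in the sum, gives the final back propagating equation below.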
@@ -1799,65 +1811,43 @@ Final back propagating e
We are now ready to set up the algorithm for back propagation and learning the weights and biases.
-Setting up the Back propagation algorithm
+Setting up the back propagation algorithm
The four equations provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
First, we set up the input data \( \hat{x} \) and the activations
+
First, we set up the input data \( \hat{x} \) and the activations
\( \hat{z}^1 \) of the input layer and compute the activation function and
the pertinent outputs \( \hat{a}^1 \).
-
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
+
Secondly, we perform the feed forward till we reach the output
layer, computing all \( \hat{z}^l \) and the
pertinent activation-function outputs \( \hat{a}^l \) for
\( l=2,3,\dots,L \).
-
-
-
-Setting up the Back propagation algorithm, part 2
+Setting up the back propagation algorithm, part 2
-
-
-
Thereafter we compute the output error \( \hat{\delta}^L \) by computing all
$$
\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
$$
-
-
-
-
-
-
Then we compute the back-propagated error for each \( l=L-1,L-2,\dots,2 \) as
$$
\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
$$
-
-
Setting up the back propagation algorithm, part 3
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
Finally, we update the weights and the biases using gradient descent
+for each \( l=L-1,L-2,\dots,1 \) according to the rules
+
+
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
$$
@@ -1866,71 +1856,18 @@
Setting
$$
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
-
-
-Setting up the Back propagation algorithm, final considerations
-
-The four equations above provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
-
First, we set up the input data \( \boldsymbol{x} \) and the activations
-\( \boldsymbol{z}_1 \) of the input layer and compute the activation function and
-the pertinent outputs \( \boldsymbol{a}^1 \).
-
-
-
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
-layer and compute all \( \boldsymbol{z}_l \) of the input layer and compute the
-activation function and the pertinent outputs \( \boldsymbol{a}^l \) for
-\( l=2,3,\dots,L \).
-
-
-
-
-
-
-
-
-
Thereafter we compute the ouput error \( \boldsymbol{\delta}^L \) by computing all
-$$
-\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
-$$
-
-
+with \( \eta \) being the learning rate.
Updating the gradients
-
-
-
-
Then we compute the back propagate error for each \( l=L-1,L-2,\dots,2 \) as
+
With the back-propagated error for each \( l=L-1,L-2,\dots,1 \) given as
$$
-\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
+\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l),
$$
-
-
-
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,1 \) according to the rules
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
$$
@@ -1939,13 +1876,7 @@
Updating the gradients
$$
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
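To tie the four steps above together, here is a hedged numpy sketch of one complete back-propagation pass (our illustration only, not code from this repository), assuming sigmoid activations in all layers, the quadratic cost, and the component convention \( z_j^l=\sum_k w_{jk}^l a_k^{l-1}+b_j^l \) used in the derivative formulas; layer sizes, names and data are made up:

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

sizes = [4, 5, 3]                      # input, hidden and output layer widths
W = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(m) for m in sizes[1:]]
eta = 0.1                              # learning rate

x = rng.normal(size=sizes[0])          # one input vector
y = np.array([1.0, 0.0, 0.0])          # its target

# 1) Feed forward, storing the output a^l of every layer (a[0] is the input).
a = [x]
for Wl, bl in zip(W, b):
    a.append(sigmoid(Wl @ a[-1] + bl))

# 2) Output error: delta^L = f'(z^L) o (a^L - y) = a^L(1 - a^L)(a^L - y).
delta = a[-1] * (1.0 - a[-1]) * (a[-1] - y)

# 3)-4) Back propagate the error and update weights and biases layer by layer.
for l in range(len(W) - 1, -1, -1):
    grad_W = np.outer(delta, a[l])     # dC/dw_{jk}^l = delta_j^l a_k^{l-1}
    grad_b = delta                     # dC/db_j^l   = delta_j^l
    if l > 0:                          # error for the previous layer, old weights
        delta = (W[l].T @ delta) * a[l] * (1.0 - a[l])
    W[l] -= eta * grad_W
    b[l] -= eta * grad_b

In practice one wraps such an update in a loop over mini-batches and epochs, in the spirit of the stochastic gradient descent methods discussed earlier.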
diff --git a/doc/pub/week1/html/week1-reveal.html b/doc/pub/week1/html/week1-reveal.html
index c9df4e9d..bceb433b 100644
--- a/doc/pub/week1/html/week1-reveal.html
+++ b/doc/pub/week1/html/week1-reveal.html
@@ -1481,24 +1481,21 @@ Setting up the equations
$$
-{\cal C}(\hat{W}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2,
$$
-
where the $t_i$s are our \( n \) targets (the values we want to
+
where the $y_i$s are our \( n \) targets (the values we want to
reproduce), while the outputs of the network after having propagated
-all inputs \( \hat{x} \) are given by \( y_i \). Below we will demonstrate
-how the basic equations arising from the back propagation algorithm
-can be modified in order to study classification problems with \( K \)
-classes.
+all inputs \( \boldsymbol{x} \) are given by \( \tilde{y}_i \).
Definitions
-With our definition of the targets \( \hat{t} \), the outputs of the
-network \( \hat{y} \) and the inputs \( \hat{x} \) we
+
With our definition of the targets \( \boldsymbol{y} \), the outputs of the
+network \( \boldsymbol{\tilde{y}} \) and the inputs \( \boldsymbol{x} \) we
define now the activation \( z_j^l \) of node/neuron/unit \( j \) of the
\( l \)-th layer as a function of the bias, the weights which add up from
the previous layer \( l-1 \) and the forward passes/outputs
@@ -1522,9 +1519,13 @@
Definitions
\hat{z}^l = \left(\hat{W}^l\right)^T\hat{a}^{l-1}+\hat{b}^l.
$$
+
+
+
+
-With the activation values \( \hat{z}^l \) we can in turn define the
-output of layer \( l \) as \( \hat{a}^l = f(\hat{z}^l) \) where \( f \) is our
+
With the activation values \( \boldsymbol{z}^l \) we can in turn define the
+output of layer \( l \) as \( \boldsymbol{a}^l = f(\boldsymbol{z}^l) \) where \( f \) is our
activation function. In the examples here we will use the sigmoid
function discussed in our logistic regression lectures. We will also use the same activation function \( f \) for all layers
and their nodes. It means we have
@@ -1570,7 +1571,7 @@
Derivative of the cost function
Let us specialize to the output layer \( l=L \). Our cost function is
$$
-{\cal C}(\hat{W^L}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}^L) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - y_i\right)^2,
$$
@@ -1578,14 +1579,14 @@
Derivative of the cost function
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
$$
The last partial derivative can easily be computed and reads (by applying the chain rule)
$$
-\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1}.
$$
@@ -1596,23 +1597,27 @@ Bringing it togeth
We have thus
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)a_j^L(1-a_j^L)a_k^{L-1},
$$
Defining
$$
-\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - t_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
+\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - y_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
$$
and using the Hadamard product of two vectors we can write this as
$$
-\hat{\delta}^L = f'(\hat{z}^L)\circ\frac{\partial {\cal C}}{\partial (\hat{a}^L)}.
+\boldsymbol{\delta}^L = f'(\boldsymbol{z}^L)\circ\frac{\partial {\cal C}}{\partial (\boldsymbol{a}^L)}.
$$
+
+
+
+Analyzing the last results
This is an important expression. The second term on the right-hand side
measures how fast the cost function is changing as a function of the $j$th
@@ -1645,7 +1650,7 @@
More considerations
With the definition of \( \delta_j^L \) we have a more compact definition of the derivative of the cost function in terms of the weights, namely
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
+\frac{\partial{\cal C}}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
$$
@@ -1676,10 +1681,6 @@
Bringing it together
We now have three equations that are essential for the computations of the derivatives of the cost function at the output layer. These equations are needed to start the algorithm and they are
-
-
The starting equations
-
-
$$
\begin{equation}
@@ -1709,7 +1710,6 @@
Bringing it together
\end{equation}
$$
-
@@ -1722,8 +1722,13 @@ Final back propagating equation
$$
-
We want to express this in terms of the equations for layer \( l+1 \). Using the chain rule and summing over all \( k \) entries we have
+We want to express this in terms of the equations for layer \( l+1 \).
+
+
+
+Using the chain rule and summing over all \( k \) entries
+We obtain
$$
\delta_j^l =\sum_k \frac{\partial {\cal C}}{\partial z_k^{l+1}}\frac{\partial z_k^{l+1}}{\partial z_j^{l}}=\sum_k \delta_k^{l+1}\frac{\partial z_k^{l+1}}{\partial z_j^{l}},
@@ -1750,65 +1755,48 @@
Final back propagating equation
-Setting up the Back propagation algorithm
+Setting up the back propagation algorithm
The four equations provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
First, we set up the input data \( \hat{x} \) and the activations
+
First, we set up the input data \( \hat{x} \) and the activations
\( \hat{z}^1 \) of the input layer and compute the activation function and
the pertinent outputs \( \hat{a}^1 \).
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
+
Secondly, we perform the feed forward till we reach the output
layer, computing all \( \hat{z}^l \) and the
pertinent activation-function outputs \( \hat{a}^l \) for
\( l=2,3,\dots,L \).
-
-Setting up the Back propagation algorithm, part 2
+Setting up the back propagation algorithm, part 2
-
-
-
Thereafter we compute the output error \( \hat{\delta}^L \) by computing all
$$
\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
$$
-
-
-
-
-
Then we compute the back-propagated error for each \( l=L-1,L-2,\dots,2 \) as
$$
\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
$$
-
Setting up the back propagation algorithm, part 3
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
Finally, we update the weights and the biases using gradient descent
+for each \( l=L-1,L-2,\dots,1 \) according to the rules
+
+
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
@@ -1820,70 +1808,21 @@
Setting up the Back pr
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
-
-
-Setting up the Back propagation algorithm, final considerations
-
-The four equations above provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
-
First, we set up the input data \( \boldsymbol{x} \) and the activations
-\( \boldsymbol{z}_1 \) of the input layer and compute the activation function and
-the pertinent outputs \( \boldsymbol{a}^1 \).
-
-
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
-layer and compute all \( \boldsymbol{z}_l \) of the input layer and compute the
-activation function and the pertinent outputs \( \boldsymbol{a}^l \) for
-\( l=2,3,\dots,L \).
-
-
-
-
-
-
-
-
Thereafter we compute the ouput error \( \boldsymbol{\delta}^L \) by computing all
-
-$$
-\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
-$$
-
-
+with \( \eta \) being the learning rate.
Updating the gradients
-
-
-
-
Then we compute the back propagate error for each \( l=L-1,L-2,\dots,2 \) as
+
With the back-propagated error for each \( l=L-1,L-2,\dots,1 \) given as
$$
-\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
+\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l),
$$
-
-
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,1 \) according to the rules
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
@@ -1895,11 +1834,6 @@
Updating the gradients
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
diff --git a/doc/pub/week1/html/week1-solarized.html b/doc/pub/week1/html/week1-solarized.html
index 83978612..a4f37c4f 100644
--- a/doc/pub/week1/html/week1-solarized.html
+++ b/doc/pub/week1/html/week1-solarized.html
@@ -286,6 +286,10 @@
None,
'setting-up-the-equations-for-a-neural-network'),
('Definitions', 2, None, 'definitions'),
+ ('Inputs to the activation function',
+ 2,
+ None,
+ 'inputs-to-the-activation-function'),
('Derivatives and the chain rule',
2,
None,
@@ -298,6 +302,10 @@
2,
None,
'bringing-it-together-first-back-propagation-equation'),
+ ('Analyzing the last results',
+ 2,
+ None,
+ 'analyzing-the-last-results'),
('More considerations', 2, None, 'more-considerations'),
('Derivatives in terms of $z_j^L$',
2,
@@ -308,11 +316,15 @@
2,
None,
'final-back-propagating-equation'),
- ('Setting up the Back propagation algorithm',
+ ('Using the chain rule and summing over all $k$ entries',
+ 2,
+ None,
+ 'using-the-chain-rule-and-summing-over-all-k-entries'),
+ ('Setting up the back propagation algorithm',
2,
None,
'setting-up-the-back-propagation-algorithm'),
- ('Setting up the Back propagation algorithm, part 2',
+ ('Setting up the back propagation algorithm, part 2',
2,
None,
'setting-up-the-back-propagation-algorithm-part-2'),
@@ -320,11 +332,6 @@
2,
None,
'setting-up-the-back-propagation-algorithm-part-3'),
- ('Setting up the Back propagation algorithm, final '
- 'considerations',
- 2,
- None,
- 'setting-up-the-back-propagation-algorithm-final-considerations'),
('Updating the gradients', 2, None, 'updating-the-gradients')]}
end of tocinfo -->
@@ -1475,22 +1482,19 @@ Setting up the equations
$$
-{\cal C}(\hat{W}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2,
$$
-
where the $t_i$s are our \( n \) targets (the values we want to
+
where the $y_i$s are our \( n \) targets (the values we want to
reproduce), while the outputs of the network after having propagated
-all inputs \( \hat{x} \) are given by \( y_i \). Below we will demonstrate
-how the basic equations arising from the back propagation algorithm
-can be modified in order to study classification problems with \( K \)
-classes.
+all inputs \( \boldsymbol{x} \) are given by \( \tilde{y}_i \).
Definitions
-With our definition of the targets \( \hat{t} \), the outputs of the
-network \( \hat{y} \) and the inputs \( \hat{x} \) we
+
With our definition of the targets \( \boldsymbol{y} \), the outputs of the
+network \( \boldsymbol{\tilde{y}} \) and the inputs \( \boldsymbol{x} \) we
define now the activation \( z_j^l \) of node/neuron/unit \( j \) of the
\( l \)-th layer as a function of the bias, the weights which add up from
the previous layer \( l-1 \) and the forward passes/outputs
@@ -1511,8 +1515,12 @@
Definitions
\hat{z}^l = \left(\hat{W}^l\right)^T\hat{a}^{l-1}+\hat{b}^l.
$$
-With the activation values \( \hat{z}^l \) we can in turn define the
-output of layer \( l \) as \( \hat{a}^l = f(\hat{z}^l) \) where \( f \) is our
+
+
+
+
+With the activation values \( \boldsymbol{z}^l \) we can in turn define the
+output of layer \( l \) as \( \boldsymbol{a}^l = f(\boldsymbol{z}^l) \) where \( f \) is our
activation function. In the examples here we will use the sigmoid
function discussed in our logistic regression lectures. We will also use the same activation function \( f \) for all layers
and their nodes. It means we have
@@ -1549,18 +1557,18 @@
Derivative of the cost function
Let us specialize to the output layer \( l=L \). Our cost function is
$$
-{\cal C}(\hat{W^L}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}^L) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - y_i\right)^2,
$$
The derivative of this function with respect to the weights is
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
$$
The last partial derivative can easily be computed and reads (by applying the chain rule)
$$
-\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1}.
$$
@@ -1569,19 +1577,23 @@ Bringing it togeth
We have thus
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)a_j^L(1-a_j^L)a_k^{L-1},
$$
Defining
$$
-\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - t_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
+\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - y_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
$$
and using the Hadamard product of two vectors we can write this as
$$
-\hat{\delta}^L = f'(\hat{z}^L)\circ\frac{\partial {\cal C}}{\partial (\hat{a}^L)}.
+\boldsymbol{\delta}^L = f'(\boldsymbol{z}^L)\circ\frac{\partial {\cal C}}{\partial (\boldsymbol{a}^L)}.
$$
+
+
+Analyzing the last results
+
This is an important expression. The second term on the right-hand side
measures how fast the cost function is changing as a function of the $j$th
output activation. If, for example, the cost function doesn't depend
@@ -1609,7 +1621,7 @@
More considerations
With the definition of \( \delta_j^L \) we have a more compact definition of the derivative of the cost function in terms of the weights, namely
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
+\frac{\partial{\cal C}}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
$$
@@ -1634,10 +1646,6 @@ Bringing it together
We now have three equations that are essential for the computations of the derivatives of the cost function at the output layer. These equations are needed to start the algorithm and they are
-
-
The starting equations
-
-
$$
\begin{equation}
\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1},
@@ -1661,7 +1669,6 @@
Bringing it together
\label{_auto3}
\end{equation}
$$
-
@@ -1672,8 +1679,12 @@ Final back propagating equation
\delta_j^l =\frac{\partial {\cal C}}{\partial z_j^l}.
$$
-We want to express this in terms of the equations for layer \( l+1 \). Using the chain rule and summing over all \( k \) entries we have
+We want to express this in terms of the equations for layer \( l+1 \).
+
+
+Using the chain rule and summing over all \( k \) entries
+We obtain
$$
\delta_j^l =\sum_k \frac{\partial {\cal C}}{\partial z_k^{l+1}}\frac{\partial z_k^{l+1}}{\partial z_j^{l}}=\sum_k \delta_k^{l+1}\frac{\partial z_k^{l+1}}{\partial z_j^{l}},
$$
@@ -1693,61 +1704,43 @@ Final back propagating equation
We are now ready to set up the algorithm for back propagation and learning the weights and biases.
-Setting up the Back propagation algorithm
+Setting up the back propagation algorithm
The four equations provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
First, we set up the input data \( \hat{x} \) and the activations
+
First, we set up the input data \( \hat{x} \) and the activations
\( \hat{z}^1 \) of the input layer and compute the activation function and
the pertinent outputs \( \hat{a}^1 \).
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
+
Secondly, we perform the feed forward till we reach the output
layer, computing all \( \hat{z}^l \) and the
pertinent activation-function outputs \( \hat{a}^l \) for
\( l=2,3,\dots,L \).
-
-
-Setting up the Back propagation algorithm, part 2
+Setting up the back propagation algorithm, part 2
-
-
-
Thereafter we compute the output error \( \hat{\delta}^L \) by computing all
$$
\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
$$
-
-
-
-
-
Then we compute the back-propagated error for each \( l=L-1,L-2,\dots,2 \) as
$$
\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
$$
-
Setting up the back propagation algorithm, part 3
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
Finally, we update the weights and the biases using gradient descent
+for each \( l=L-1,L-2,\dots,1 \) according to the rules
+
+
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
$$
@@ -1756,66 +1749,18 @@
Setting up the Back pr
$$
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
-
-
-Setting up the Back propagation algorithm, final considerations
-
-The four equations above provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
-
First, we set up the input data \( \boldsymbol{x} \) and the activations
-\( \boldsymbol{z}_1 \) of the input layer and compute the activation function and
-the pertinent outputs \( \boldsymbol{a}^1 \).
-
-
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
-layer and compute all \( \boldsymbol{z}_l \) of the input layer and compute the
-activation function and the pertinent outputs \( \boldsymbol{a}^l \) for
-\( l=2,3,\dots,L \).
-
-
-
-
-
-
-
-
Thereafter we compute the ouput error \( \boldsymbol{\delta}^L \) by computing all
-$$
-\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
-$$
-
+with \( \eta \) being the learning rate.
Updating the gradients
-
-
-
-
Then we compute the back propagate error for each \( l=L-1,L-2,\dots,2 \) as
+
With the back-propagated error for each \( l=L-1,L-2,\dots,1 \) given as
$$
-\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
+\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l),
$$
-
-
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,1 \) according to the rules
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
$$
@@ -1824,12 +1769,7 @@
Updating the gradients
$$
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
© 1999-2024, Morten Hjorth-Jensen. Released under CC Attribution-NonCommercial 4.0 license
diff --git a/doc/pub/week1/html/week1.html b/doc/pub/week1/html/week1.html
index 7c06db08..41b60bae 100644
--- a/doc/pub/week1/html/week1.html
+++ b/doc/pub/week1/html/week1.html
@@ -363,6 +363,10 @@
None,
'setting-up-the-equations-for-a-neural-network'),
('Definitions', 2, None, 'definitions'),
+ ('Inputs to the activation function',
+ 2,
+ None,
+ 'inputs-to-the-activation-function'),
('Derivatives and the chain rule',
2,
None,
@@ -375,6 +379,10 @@
2,
None,
'bringing-it-together-first-back-propagation-equation'),
+ ('Analyzing the last results',
+ 2,
+ None,
+ 'analyzing-the-last-results'),
('More considerations', 2, None, 'more-considerations'),
('Derivatives in terms of $z_j^L$',
2,
@@ -385,11 +393,15 @@
2,
None,
'final-back-propagating-equation'),
- ('Setting up the Back propagation algorithm',
+ ('Using the chain rule and summing over all $k$ entries',
+ 2,
+ None,
+ 'using-the-chain-rule-and-summing-over-all-k-entries'),
+ ('Setting up the back propagation algorithm',
2,
None,
'setting-up-the-back-propagation-algorithm'),
- ('Setting up the Back propagation algorithm, part 2',
+ ('Setting up the back propagation algorithm, part 2',
2,
None,
'setting-up-the-back-propagation-algorithm-part-2'),
@@ -397,11 +409,6 @@
2,
None,
'setting-up-the-back-propagation-algorithm-part-3'),
- ('Setting up the Back propagation algorithm, final '
- 'considerations',
- 2,
- None,
- 'setting-up-the-back-propagation-algorithm-final-considerations'),
('Updating the gradients', 2, None, 'updating-the-gradients')]}
end of tocinfo -->
@@ -1552,22 +1559,19 @@ Setting up the equations
$$
-{\cal C}(\hat{W}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2,
$$
-
where the $t_i$s are our \( n \) targets (the values we want to
+
where the $y_i$s are our \( n \) targets (the values we want to
reproduce), while the outputs of the network after having propagated
-all inputs \( \hat{x} \) are given by \( y_i \). Below we will demonstrate
-how the basic equations arising from the back propagation algorithm
-can be modified in order to study classification problems with \( K \)
-classes.
+all inputs \( \boldsymbol{x} \) are given by \( \tilde{y}_i \).
Definitions
-With our definition of the targets \( \hat{t} \), the outputs of the
-network \( \hat{y} \) and the inputs \( \hat{x} \) we
+
With our definition of the targets \( \boldsymbol{y} \), the outputs of the
+network \( \boldsymbol{\tilde{y}} \) and the inputs \( \boldsymbol{x} \) we
define now the activation \( z_j^l \) of node/neuron/unit \( j \) of the
\( l \)-th layer as a function of the bias, the weights which add up from
the previous layer \( l-1 \) and the forward passes/outputs
@@ -1588,8 +1592,12 @@
Definitions
\hat{z}^l = \left(\hat{W}^l\right)^T\hat{a}^{l-1}+\hat{b}^l.
$$
-With the activation values \( \hat{z}^l \) we can in turn define the
-output of layer \( l \) as \( \hat{a}^l = f(\hat{z}^l) \) where \( f \) is our
+
+
+
+
+With the activation values \( \boldsymbol{z}^l \) we can in turn define the
+output of layer \( l \) as \( \boldsymbol{a}^l = f(\boldsymbol{z}^l) \) where \( f \) is our
activation function. In the examples here we will use the sigmoid
function discussed in our logistic regression lectures. We will also use the same activation function \( f \) for all layers
and their nodes. It means we have
@@ -1626,18 +1634,18 @@
Derivative of the cost function
Let us specialize to the output layer \( l=L \). Our cost function is
$$
-{\cal C}(\hat{W^L}) = \frac{1}{2}\sum_{i=1}^n\left(y_i - t_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - t_i\right)^2,
+{\cal C}(\boldsymbol{\Theta}^L) = \frac{1}{2}\sum_{i=1}^n\left(y_i - \tilde{y}_i\right)^2=\frac{1}{2}\sum_{i=1}^n\left(a_i^L - y_i\right)^2,
$$
The derivative of this function with respect to the weights is
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)\frac{\partial a_j^L}{\partial w_{jk}^{L}},
$$
The last partial derivative can easily be computed and reads (by applying the chain rule)
$$
-\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial a_j^L}{\partial w_{jk}^{L}} = \frac{\partial a_j^L}{\partial z_{j}^{L}}\frac{\partial z_j^L}{\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1}.
$$
@@ -1646,19 +1654,23 @@ Bringing it togeth
We have thus
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \left(a_j^L - t_j\right)a_j^L(1-a_j^L)a_k^{L-1},
+\frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{jk}^L} = \left(a_j^L - y_j\right)a_j^L(1-a_j^L)a_k^{L-1},
$$
Defining
$$
-\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - t_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
+\delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - y_j\right) = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)},
$$
and using the Hadamard product of two vectors we can write this as
$$
-\hat{\delta}^L = f'(\hat{z}^L)\circ\frac{\partial {\cal C}}{\partial (\hat{a}^L)}.
+\boldsymbol{\delta}^L = f'(\boldsymbol{z}^L)\circ\frac{\partial {\cal C}}{\partial (\boldsymbol{a}^L)}.
$$
+
+
+Analyzing the last results
+
This is an important expression. The second term on the right-hand side
measures how fast the cost function is changing as a function of the $j$th
output activation. If, for example, the cost function doesn't depend
@@ -1686,7 +1698,7 @@
More considerations
With the definition of \( \delta_j^L \) we have a more compact definition of the derivative of the cost function in terms of the weights, namely
$$
-\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
+\frac{\partial{\cal C}}{\partial w_{jk}^L} = \delta_j^La_k^{L-1}.
$$
@@ -1711,10 +1723,6 @@ Bringing it together
We now have three equations that are essential for the computations of the derivatives of the cost function at the output layer. These equations are needed to start the algorithm and they are
-
-
The starting equations
-
-
$$
\begin{equation}
\frac{\partial{\cal C}(\hat{W^L})}{\partial w_{jk}^L} = \delta_j^La_k^{L-1},
@@ -1738,7 +1746,6 @@
Bringing it together
\label{_auto3}
\end{equation}
$$
-
@@ -1749,8 +1756,12 @@ Final back propagating equation
\delta_j^l =\frac{\partial {\cal C}}{\partial z_j^l}.
$$
-We want to express this in terms of the equations for layer \( l+1 \). Using the chain rule and summing over all \( k \) entries we have
+We want to express this in terms of the equations for layer \( l+1 \).
+
+
+Using the chain rule and summing over all \( k \) entries
+We obtain
$$
\delta_j^l =\sum_k \frac{\partial {\cal C}}{\partial z_k^{l+1}}\frac{\partial z_k^{l+1}}{\partial z_j^{l}}=\sum_k \delta_k^{l+1}\frac{\partial z_k^{l+1}}{\partial z_j^{l}},
$$
@@ -1770,61 +1781,43 @@ Final back propagating equation
We are now ready to set up the algorithm for back propagation and learning the weights and biases.
-Setting up the Back propagation algorithm
+Setting up the back propagation algorithm
The four equations provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
First, we set up the input data \( \hat{x} \) and the activations
+
First, we set up the input data \( \hat{x} \) and the activations
\( \hat{z}^1 \) of the input layer and compute the activation function and
the pertinent outputs \( \hat{a}^1 \).
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
+
Secondly, we perform the feed forward till we reach the output
layer, computing all \( \hat{z}^l \) and the
pertinent activation-function outputs \( \hat{a}^l \) for
\( l=2,3,\dots,L \).
-
-
-Setting up the Back propagation algorithm, part 2
+Setting up the back propagation algorithm, part 2
-
-
-
Thereafter we compute the output error \( \hat{\delta}^L \) by computing all
$$
\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
$$
-
-
-
-
-
Then we compute the back-propagated error for each \( l=L-1,L-2,\dots,2 \) as
$$
\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
$$
-
Setting up the back propagation algorithm, part 3
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
Finally, we update the weights and the biases using gradient descent
+for each \( l=L-1,L-2,\dots,1 \) according to the rules
+
+
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
$$
@@ -1833,66 +1826,18 @@
Setting up the Back pr
$$
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
-
-
-Setting up the Back propagation algorithm, final considerations
-
-The four equations above provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.
-
-
-
-
-
First, we set up the input data \( \boldsymbol{x} \) and the activations
-\( \boldsymbol{z}_1 \) of the input layer and compute the activation function and
-the pertinent outputs \( \boldsymbol{a}^1 \).
-
-
-
-
-
-
-
-
Secondly, we perform then the feed forward till we reach the output
-layer and compute all \( \boldsymbol{z}_l \) of the input layer and compute the
-activation function and the pertinent outputs \( \boldsymbol{a}^l \) for
-\( l=2,3,\dots,L \).
-
-
-
-
-
-
-
-
Thereafter we compute the ouput error \( \boldsymbol{\delta}^L \) by computing all
-$$
-\delta_j^L = f'(z_j^L)\frac{\partial {\cal C}}{\partial (a_j^L)}.
-$$
-
+with \( \eta \) being the learning rate.
Updating the gradients
-
-
-
-
Then we compute the back propagate error for each \( l=L-1,L-2,\dots,2 \) as
+
With the back-propagated error for each \( l=L-1,L-2,\dots,1 \) given as
$$
-\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).
+\delta_j^l = \sum_k \delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l),
$$
-
-
-
-
-
-
Finally, we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,2 \) and update the weights and biases according to the rules
+
we update the weights and the biases using gradient descent for each \( l=L-1,L-2,\dots,1 \) according to the rules
$$
w_{jk}^l\leftarrow w_{jk}^l- \eta \delta_j^la_k^{l-1},
$$
@@ -1901,12 +1846,7 @@
Updating the gradients
$$
b_j^l \leftarrow b_j^l-\eta \frac{\partial {\cal C}}{\partial b_j^l}=b_j^l-\eta \delta_j^l,
$$
-
-
-The parameter \( \eta \) is the learning parameter discussed in connection with the gradient descent methods.
-Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training.
-
© 1999-2024, Morten Hjorth-Jensen. Released under CC Attribution-NonCommercial 4.0 license
diff --git a/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz b/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz
index 88082c5e85172ab520b000227a432a3e2ef07bfd..6cd7c61433ce12c6a7102d305280df3cf050293d 100644
GIT binary patch
delta 139
zcmWN_y8?j#06(Q73slkU+W;N>BRI
z&Ojm=%1Fi%%S5J<$V^h1%R(|)%1YL`Q_K*HrweoA{8|06u^?TltK-$ug
zuJojtP$KEeK!y^_NX9aeM5Z#6R5F>%LYA_UwQOW7JK4)Yaw+5}Cpk+gm0aZd;r5OH
Dq_-hY
diff --git a/doc/pub/week1/ipynb/week1.ipynb b/doc/pub/week1/ipynb/week1.ipynb
index 81f9445e..74996037 100644
--- a/doc/pub/week1/ipynb/week1.ipynb
+++ b/doc/pub/week1/ipynb/week1.ipynb
@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "markdown",
- "id": "4320b4a1",
+ "id": "fcc6f7f7",
"metadata": {
"editable": true
},
@@ -14,7 +14,7 @@
},
{
"cell_type": "markdown",
- "id": "4e28dae4",
+ "id": "1e7100ab",
"metadata": {
"editable": true
},
@@ -27,7 +27,7 @@
},
{
"cell_type": "markdown",
- "id": "3e300193",
+ "id": "eef90275",
"metadata": {
"editable": true
},
@@ -43,7 +43,7 @@
},
{
"cell_type": "markdown",
- "id": "66a491f7",
+ "id": "4b59a81b",
"metadata": {
"editable": true
},
@@ -61,7 +61,7 @@
},
{
"cell_type": "markdown",
- "id": "d8907311",
+ "id": "e33689b5",
"metadata": {
"editable": true
},
@@ -97,7 +97,7 @@
},
{
"cell_type": "markdown",
- "id": "1aa2e78e",
+ "id": "5da650f6",
"metadata": {
"editable": true
},
@@ -113,7 +113,7 @@
},
{
"cell_type": "markdown",
- "id": "0e013cd9",
+ "id": "d30dc02c",
"metadata": {
"editable": true
},
@@ -136,7 +136,7 @@
},
{
"cell_type": "markdown",
- "id": "6829fcc9",
+ "id": "5b072b26",
"metadata": {
"editable": true
},
@@ -152,7 +152,7 @@
},
{
"cell_type": "markdown",
- "id": "722521f9",
+ "id": "08decd18",
"metadata": {
"editable": true
},
@@ -170,7 +170,7 @@
},
{
"cell_type": "markdown",
- "id": "eae20b92",
+ "id": "41440cf8",
"metadata": {
"editable": true
},
@@ -186,7 +186,7 @@
},
{
"cell_type": "markdown",
- "id": "06cebda5",
+ "id": "57fa58bc",
"metadata": {
"editable": true
},
@@ -206,7 +206,7 @@
},
{
"cell_type": "markdown",
- "id": "09675d7a",
+ "id": "faa8a182",
"metadata": {
"editable": true
},
@@ -225,7 +225,7 @@
},
{
"cell_type": "markdown",
- "id": "e4b40616",
+ "id": "458087cf",
"metadata": {
"editable": true
},
@@ -243,7 +243,7 @@
},
{
"cell_type": "markdown",
- "id": "12913b33",
+ "id": "c16e338d",
"metadata": {
"editable": true
},
@@ -265,7 +265,7 @@
},
{
"cell_type": "markdown",
- "id": "df48c2e7",
+ "id": "599bb230",
"metadata": {
"editable": true
},
@@ -288,7 +288,7 @@
},
{
"cell_type": "markdown",
- "id": "c5a59f0c",
+ "id": "0fddab7e",
"metadata": {
"editable": true
},
@@ -304,7 +304,7 @@
},
{
"cell_type": "markdown",
- "id": "0a8f1137",
+ "id": "0e6656d4",
"metadata": {
"editable": true
},
@@ -330,7 +330,7 @@
},
{
"cell_type": "markdown",
- "id": "fd888494",
+ "id": "4781b12f",
"metadata": {
"editable": true
},
@@ -346,7 +346,7 @@
},
{
"cell_type": "markdown",
- "id": "25e974bc",
+ "id": "90177fea",
"metadata": {
"editable": true
},
@@ -362,7 +362,7 @@
},
{
"cell_type": "markdown",
- "id": "62edc1e1",
+ "id": "de16686e",
"metadata": {
"editable": true
},
@@ -382,7 +382,7 @@
},
{
"cell_type": "markdown",
- "id": "6a7ede58",
+ "id": "4ffb766a",
"metadata": {
"editable": true
},
@@ -398,7 +398,7 @@
},
{
"cell_type": "markdown",
- "id": "ecaa3c0d",
+ "id": "f3183d87",
"metadata": {
"editable": true
},
@@ -416,7 +416,7 @@
},
{
"cell_type": "markdown",
- "id": "66a2360b",
+ "id": "51f13830",
"metadata": {
"editable": true
},
@@ -432,7 +432,7 @@
},
{
"cell_type": "markdown",
- "id": "d8f2772d",
+ "id": "fbfd3a76",
"metadata": {
"editable": true
},
@@ -444,7 +444,7 @@
},
{
"cell_type": "markdown",
- "id": "c4564c51",
+ "id": "16865eff",
"metadata": {
"editable": true
},
@@ -462,7 +462,7 @@
},
{
"cell_type": "markdown",
- "id": "e10b17a7",
+ "id": "20995064",
"metadata": {
"editable": true
},
@@ -474,7 +474,7 @@
},
{
"cell_type": "markdown",
- "id": "9dbfd14c",
+ "id": "2dbc3418",
"metadata": {
"editable": true
},
@@ -486,7 +486,7 @@
},
{
"cell_type": "markdown",
- "id": "c4c97e30",
+ "id": "8db378c8",
"metadata": {
"editable": true
},
@@ -498,7 +498,7 @@
},
{
"cell_type": "markdown",
- "id": "713dc432",
+ "id": "d629aae3",
"metadata": {
"editable": true
},
@@ -508,7 +508,7 @@
},
{
"cell_type": "markdown",
- "id": "3581a429",
+ "id": "8569ed0f",
"metadata": {
"editable": true
},
@@ -520,7 +520,7 @@
},
{
"cell_type": "markdown",
- "id": "39f963a3",
+ "id": "d7afc473",
"metadata": {
"editable": true
},
@@ -530,7 +530,7 @@
},
{
"cell_type": "markdown",
- "id": "6040d9f8",
+ "id": "25c09815",
"metadata": {
"editable": true
},
@@ -542,7 +542,7 @@
},
{
"cell_type": "markdown",
- "id": "057958de",
+ "id": "44a7fd46",
"metadata": {
"editable": true
},
@@ -554,7 +554,7 @@
},
{
"cell_type": "markdown",
- "id": "839d28b6",
+ "id": "f86d02ac",
"metadata": {
"editable": true
},
@@ -564,7 +564,7 @@
},
{
"cell_type": "markdown",
- "id": "f2a17acd",
+ "id": "37912f62",
"metadata": {
"editable": true
},
@@ -576,7 +576,7 @@
},
{
"cell_type": "markdown",
- "id": "3d4a645f",
+ "id": "f761828c",
"metadata": {
"editable": true
},
@@ -586,7 +586,7 @@
},
{
"cell_type": "markdown",
- "id": "f80b8072",
+ "id": "d8b42487",
"metadata": {
"editable": true
},
@@ -598,7 +598,7 @@
},
{
"cell_type": "markdown",
- "id": "e3b48953",
+ "id": "d21a6415",
"metadata": {
"editable": true
},
@@ -610,7 +610,7 @@
},
{
"cell_type": "markdown",
- "id": "78067d8b",
+ "id": "86fbebd4",
"metadata": {
"editable": true
},
@@ -620,7 +620,7 @@
},
{
"cell_type": "markdown",
- "id": "af1edd1f",
+ "id": "9dcb1dcb",
"metadata": {
"editable": true
},
@@ -633,7 +633,7 @@
},
{
"cell_type": "markdown",
- "id": "99d1c97d",
+ "id": "81a077d7",
"metadata": {
"editable": true
},
@@ -643,7 +643,7 @@
},
{
"cell_type": "markdown",
- "id": "17c04539",
+ "id": "9585b276",
"metadata": {
"editable": true
},
@@ -655,7 +655,7 @@
},
{
"cell_type": "markdown",
- "id": "c831d1a3",
+ "id": "f7e3f8f7",
"metadata": {
"editable": true
},
@@ -670,7 +670,7 @@
},
{
"cell_type": "markdown",
- "id": "c2871973",
+ "id": "4a165419",
"metadata": {
"editable": true
},
@@ -683,7 +683,7 @@
},
{
"cell_type": "markdown",
- "id": "7f51cbbb",
+ "id": "b9bd529d",
"metadata": {
"editable": true
},
@@ -695,7 +695,7 @@
},
{
"cell_type": "markdown",
- "id": "0429287d",
+ "id": "a85ad75f",
"metadata": {
"editable": true
},
@@ -707,7 +707,7 @@
},
{
"cell_type": "markdown",
- "id": "f5fd3ebb",
+ "id": "4ca9ef86",
"metadata": {
"editable": true
},
@@ -719,7 +719,7 @@
},
{
"cell_type": "markdown",
- "id": "ab377e04",
+ "id": "9719144a",
"metadata": {
"editable": true
},
@@ -729,7 +729,7 @@
},
{
"cell_type": "markdown",
- "id": "20effe57",
+ "id": "c3e13391",
"metadata": {
"editable": true
},
@@ -742,7 +742,7 @@
},
{
"cell_type": "markdown",
- "id": "09ec86f6",
+ "id": "62ea75aa",
"metadata": {
"editable": true
},
@@ -753,7 +753,7 @@
},
{
"cell_type": "markdown",
- "id": "6b302097",
+ "id": "5a7f8000",
"metadata": {
"editable": true
},
@@ -765,7 +765,7 @@
},
{
"cell_type": "markdown",
- "id": "3b41c760",
+ "id": "0317aa3e",
"metadata": {
"editable": true
},
@@ -776,7 +776,7 @@
},
{
"cell_type": "markdown",
- "id": "b9b21938",
+ "id": "25ce2c4c",
"metadata": {
"editable": true
},
@@ -790,7 +790,7 @@
},
{
"cell_type": "markdown",
- "id": "02b7197c",
+ "id": "0540e4f8",
"metadata": {
"editable": true
},
@@ -800,7 +800,7 @@
},
{
"cell_type": "markdown",
- "id": "1d3fee6d",
+ "id": "612e87aa",
"metadata": {
"editable": true
},
@@ -812,7 +812,7 @@
},
{
"cell_type": "markdown",
- "id": "e6ff02fd",
+ "id": "867e4f08",
"metadata": {
"editable": true
},
@@ -822,7 +822,7 @@
},
{
"cell_type": "markdown",
- "id": "7cd823d8",
+ "id": "f26bcd09",
"metadata": {
"editable": true
},
@@ -836,7 +836,7 @@
},
{
"cell_type": "markdown",
- "id": "2b40d9fc",
+ "id": "5b739e65",
"metadata": {
"editable": true
},
@@ -848,7 +848,7 @@
},
{
"cell_type": "markdown",
- "id": "16c99897",
+ "id": "d1277c88",
"metadata": {
"editable": true
},
@@ -859,7 +859,7 @@
},
{
"cell_type": "markdown",
- "id": "7dbe5801",
+ "id": "2edba55a",
"metadata": {
"editable": true
},
@@ -873,7 +873,7 @@
},
{
"cell_type": "markdown",
- "id": "2021e74c",
+ "id": "48ad1100",
"metadata": {
"editable": true
},
@@ -884,7 +884,7 @@
},
{
"cell_type": "markdown",
- "id": "ba22386d",
+ "id": "5b2a972a",
"metadata": {
"editable": true
},
@@ -896,7 +896,7 @@
},
{
"cell_type": "markdown",
- "id": "45b474b6",
+ "id": "f01c985e",
"metadata": {
"editable": true
},
@@ -906,7 +906,7 @@
},
{
"cell_type": "markdown",
- "id": "443dea3b",
+ "id": "9e189ab9",
"metadata": {
"editable": true
},
@@ -918,7 +918,7 @@
},
{
"cell_type": "markdown",
- "id": "afd29081",
+ "id": "c0bad804",
"metadata": {
"editable": true
},
@@ -928,7 +928,7 @@
},
{
"cell_type": "markdown",
- "id": "87a24bf9",
+ "id": "ba855e4b",
"metadata": {
"editable": true
},
@@ -943,7 +943,7 @@
},
{
"cell_type": "markdown",
- "id": "91b91975",
+ "id": "45026dd0",
"metadata": {
"editable": true
},
@@ -955,7 +955,7 @@
},
{
"cell_type": "markdown",
- "id": "0b975bac",
+ "id": "041d9a49",
"metadata": {
"editable": true
},
@@ -966,7 +966,7 @@
},
{
"cell_type": "markdown",
- "id": "d43a8352",
+ "id": "f26ae58b",
"metadata": {
"editable": true
},
@@ -978,7 +978,7 @@
},
{
"cell_type": "markdown",
- "id": "5513e434",
+ "id": "4b045765",
"metadata": {
"editable": true
},
@@ -989,7 +989,7 @@
},
{
"cell_type": "markdown",
- "id": "54676b42",
+ "id": "2272c38f",
"metadata": {
"editable": true
},
@@ -1001,7 +1001,7 @@
},
{
"cell_type": "markdown",
- "id": "ab329304",
+ "id": "43b8f49b",
"metadata": {
"editable": true
},
@@ -1011,7 +1011,7 @@
},
{
"cell_type": "markdown",
- "id": "6dade8f1",
+ "id": "ebbc9e2d",
"metadata": {
"editable": true
},
@@ -1023,7 +1023,7 @@
},
{
"cell_type": "markdown",
- "id": "dc5cb70d",
+ "id": "26f242cb",
"metadata": {
"editable": true
},
@@ -1033,7 +1033,7 @@
},
{
"cell_type": "markdown",
- "id": "2e6c935f",
+ "id": "8b788b85",
"metadata": {
"editable": true
},
@@ -1045,7 +1045,7 @@
},
{
"cell_type": "markdown",
- "id": "f4645d72",
+ "id": "b4729e36",
"metadata": {
"editable": true
},
@@ -1055,7 +1055,7 @@
},
{
"cell_type": "markdown",
- "id": "b936614e",
+ "id": "bebfd109",
"metadata": {
"editable": true
},
@@ -1072,7 +1072,7 @@
},
{
"cell_type": "markdown",
- "id": "927d9ce8",
+ "id": "ed851734",
"metadata": {
"editable": true
},
@@ -1088,7 +1088,7 @@
},
{
"cell_type": "markdown",
- "id": "0b83882c",
+ "id": "4676d2f8",
"metadata": {
"editable": true
},
@@ -1104,7 +1104,7 @@
},
{
"cell_type": "markdown",
- "id": "4b167325",
+ "id": "0aa50d69",
"metadata": {
"editable": true
},
@@ -1128,7 +1128,7 @@
},
{
"cell_type": "markdown",
- "id": "1e5c9c72",
+ "id": "14fac089",
"metadata": {
"editable": true
},
@@ -1144,7 +1144,7 @@
},
{
"cell_type": "markdown",
- "id": "174ffd64",
+ "id": "85066d35",
"metadata": {
"editable": true
},
@@ -1159,7 +1159,7 @@
},
{
"cell_type": "markdown",
- "id": "a2bffedb",
+ "id": "1ff54312",
"metadata": {
"editable": true
},
@@ -1183,7 +1183,7 @@
},
{
"cell_type": "markdown",
- "id": "416e2acf",
+ "id": "8fcc30c7",
"metadata": {
"editable": true
},
@@ -1194,7 +1194,7 @@
},
{
"cell_type": "markdown",
- "id": "706e047f",
+ "id": "f8208016",
"metadata": {
"editable": true
},
@@ -1206,7 +1206,7 @@
},
{
"cell_type": "markdown",
- "id": "95ebfc88",
+ "id": "a7333648",
"metadata": {
"editable": true
},
@@ -1216,7 +1216,7 @@
},
{
"cell_type": "markdown",
- "id": "788f0513",
+ "id": "04b809f2",
"metadata": {
"editable": true
},
@@ -1228,7 +1228,7 @@
},
{
"cell_type": "markdown",
- "id": "983ec1b3",
+ "id": "35c6d449",
"metadata": {
"editable": true
},
@@ -1238,7 +1238,7 @@
},
{
"cell_type": "markdown",
- "id": "0478cfae",
+ "id": "02ed1e44",
"metadata": {
"editable": true
},
@@ -1255,7 +1255,7 @@
},
{
"cell_type": "markdown",
- "id": "2f7a6f35",
+ "id": "2726c365",
"metadata": {
"editable": true
},
@@ -1271,7 +1271,7 @@
},
{
"cell_type": "markdown",
- "id": "5e4cd821",
+ "id": "1a158d2b",
"metadata": {
"editable": true
},
@@ -1283,7 +1283,7 @@
},
{
"cell_type": "markdown",
- "id": "01e1e34b",
+ "id": "92e435b2",
"metadata": {
"editable": true
},
@@ -1293,7 +1293,7 @@
},
{
"cell_type": "markdown",
- "id": "130dc6ae",
+ "id": "5ca8cb95",
"metadata": {
"editable": true
},
@@ -1307,7 +1307,7 @@
},
{
"cell_type": "markdown",
- "id": "073f6823",
+ "id": "f7ccba75",
"metadata": {
"editable": true
},
@@ -1319,7 +1319,7 @@
},
{
"cell_type": "markdown",
- "id": "f4293d54",
+ "id": "9e6a39dd",
"metadata": {
"editable": true
},
@@ -1333,7 +1333,7 @@
},
{
"cell_type": "markdown",
- "id": "73625652",
+ "id": "8bc5b83b",
"metadata": {
"editable": true
},
@@ -1345,7 +1345,7 @@
},
{
"cell_type": "markdown",
- "id": "507af02f",
+ "id": "6cacf1ae",
"metadata": {
"editable": true
},
@@ -1355,7 +1355,7 @@
},
{
"cell_type": "markdown",
- "id": "ce5a5a85",
+ "id": "72fa175a",
"metadata": {
"editable": true
},
@@ -1367,7 +1367,7 @@
},
{
"cell_type": "markdown",
- "id": "ae408292",
+ "id": "e69baaf2",
"metadata": {
"editable": true
},
@@ -1377,7 +1377,7 @@
},
{
"cell_type": "markdown",
- "id": "d04eb159",
+ "id": "2bf65fb9",
"metadata": {
"editable": true
},
@@ -1389,7 +1389,7 @@
},
{
"cell_type": "markdown",
- "id": "c5cfcaf7",
+ "id": "167833fc",
"metadata": {
"editable": true
},
@@ -1408,7 +1408,7 @@
},
{
"cell_type": "markdown",
- "id": "fe04f2ad",
+ "id": "54a4c7c6",
"metadata": {
"editable": true
},
@@ -1421,7 +1421,7 @@
},
{
"cell_type": "markdown",
- "id": "1e32111c",
+ "id": "8ae2f0fc",
"metadata": {
"editable": true
},
@@ -1431,7 +1431,7 @@
},
{
"cell_type": "markdown",
- "id": "6649f0b8",
+ "id": "9572a7da",
"metadata": {
"editable": true
},
@@ -1443,7 +1443,7 @@
},
{
"cell_type": "markdown",
- "id": "ba618c2c",
+ "id": "7cdbb0a0",
"metadata": {
"editable": true
},
@@ -1462,7 +1462,7 @@
},
{
"cell_type": "markdown",
- "id": "6dd790c2",
+ "id": "5ff0320b",
"metadata": {
"editable": true
},
@@ -1484,7 +1484,7 @@
},
{
"cell_type": "markdown",
- "id": "f706146b",
+ "id": "2b30a102",
"metadata": {
"editable": true
},
@@ -1496,7 +1496,7 @@
},
{
"cell_type": "markdown",
- "id": "8ebeabbc",
+ "id": "9259245f",
"metadata": {
"editable": true
},
@@ -1508,7 +1508,7 @@
},
{
"cell_type": "markdown",
- "id": "a97aa34b",
+ "id": "0a01b003",
"metadata": {
"editable": true
},
@@ -1523,7 +1523,7 @@
},
{
"cell_type": "markdown",
- "id": "048eb9d4",
+ "id": "59ed49ed",
"metadata": {
"editable": true
},
@@ -1541,7 +1541,7 @@
},
{
"cell_type": "markdown",
- "id": "121bf07e",
+ "id": "ad9a1b18",
"metadata": {
"editable": true
},
@@ -1561,7 +1561,7 @@
},
{
"cell_type": "markdown",
- "id": "5d8e9037",
+ "id": "0b3cd96e",
"metadata": {
"editable": true
},
@@ -1573,7 +1573,7 @@
},
{
"cell_type": "markdown",
- "id": "51bc5f5b",
+ "id": "40fa1452",
"metadata": {
"editable": true
},
@@ -1585,7 +1585,7 @@
},
{
"cell_type": "markdown",
- "id": "2d5e294d",
+ "id": "58692225",
"metadata": {
"editable": true
},
@@ -1595,7 +1595,7 @@
},
{
"cell_type": "markdown",
- "id": "939d0a55",
+ "id": "fe9f809e",
"metadata": {
"editable": true
},
@@ -1607,7 +1607,7 @@
},
{
"cell_type": "markdown",
- "id": "f570ac90",
+ "id": "439867db",
"metadata": {
"editable": true
},
@@ -1617,7 +1617,7 @@
},
{
"cell_type": "markdown",
- "id": "009f4e96",
+ "id": "f2cb0e82",
"metadata": {
"editable": true
},
@@ -1629,7 +1629,7 @@
},
{
"cell_type": "markdown",
- "id": "a8139423",
+ "id": "2aba0c7e",
"metadata": {
"editable": true
},
@@ -1639,7 +1639,7 @@
},
{
"cell_type": "markdown",
- "id": "20390079",
+ "id": "1dfab70b",
"metadata": {
"editable": true
},
@@ -1651,7 +1651,7 @@
},
{
"cell_type": "markdown",
- "id": "9e8fc2e5",
+ "id": "0ff4e530",
"metadata": {
"editable": true
},
@@ -1665,7 +1665,7 @@
},
{
"cell_type": "markdown",
- "id": "7ff12a50",
+ "id": "4ef9710b",
"metadata": {
"editable": true
},
@@ -1684,7 +1684,7 @@
},
{
"cell_type": "markdown",
- "id": "6da7044a",
+ "id": "b37c800f",
"metadata": {
"editable": true
},
@@ -1694,7 +1694,7 @@
},
{
"cell_type": "markdown",
- "id": "9b65a19d",
+ "id": "a8bb360d",
"metadata": {
"editable": true
},
@@ -1706,7 +1706,7 @@
},
{
"cell_type": "markdown",
- "id": "7d5c7ed6",
+ "id": "8364ba08",
"metadata": {
"editable": true
},
@@ -1725,7 +1725,7 @@
},
{
"cell_type": "markdown",
- "id": "38810b88",
+ "id": "3e5040e3",
"metadata": {
"editable": true
},
@@ -1735,7 +1735,7 @@
},
{
"cell_type": "markdown",
- "id": "a371bcfc",
+ "id": "b5096aa2",
"metadata": {
"editable": true
},
@@ -1748,7 +1748,7 @@
},
{
"cell_type": "markdown",
- "id": "9f305f27",
+ "id": "16fdafee",
"metadata": {
"editable": true
},
@@ -1758,7 +1758,7 @@
},
{
"cell_type": "markdown",
- "id": "e385a971",
+ "id": "10e4e6fe",
"metadata": {
"editable": true
},
@@ -1769,7 +1769,7 @@
},
{
"cell_type": "markdown",
- "id": "9da2f063",
+ "id": "604468f7",
"metadata": {
"editable": true
},
@@ -1781,7 +1781,7 @@
},
{
"cell_type": "markdown",
- "id": "073a76c0",
+ "id": "96f6b399",
"metadata": {
"editable": true
},
@@ -1791,7 +1791,7 @@
},
{
"cell_type": "markdown",
- "id": "1fbe78bc",
+ "id": "43608437",
"metadata": {
"editable": true
},
@@ -1816,7 +1816,7 @@
},
{
"cell_type": "markdown",
- "id": "664e154e",
+ "id": "40aaa76b",
"metadata": {
"editable": true
},
@@ -1844,7 +1844,7 @@
},
{
"cell_type": "markdown",
- "id": "517f499c",
+ "id": "7658ab1c",
"metadata": {
"editable": true
},
@@ -1870,7 +1870,7 @@
},
{
"cell_type": "markdown",
- "id": "b063094b",
+ "id": "23bb1a54",
"metadata": {
"editable": true
},
@@ -1882,7 +1882,7 @@
},
{
"cell_type": "markdown",
- "id": "b009e439",
+ "id": "597c8f3c",
"metadata": {
"editable": true
},
@@ -1894,7 +1894,7 @@
},
{
"cell_type": "markdown",
- "id": "a03f04b3",
+ "id": "99a7a59e",
"metadata": {
"editable": true
},
@@ -1904,7 +1904,7 @@
},
{
"cell_type": "markdown",
- "id": "983fa2a5",
+ "id": "251b9027",
"metadata": {
"editable": true
},
@@ -1916,7 +1916,7 @@
},
{
"cell_type": "markdown",
- "id": "d89f4b01",
+ "id": "c9b81034",
"metadata": {
"editable": true
},
@@ -1928,7 +1928,7 @@
},
{
"cell_type": "markdown",
- "id": "f6493197",
+ "id": "8bfec184",
"metadata": {
"editable": true
},
@@ -1940,7 +1940,7 @@
},
{
"cell_type": "markdown",
- "id": "7417e746",
+ "id": "b8d4568b",
"metadata": {
"editable": true
},
@@ -1950,7 +1950,7 @@
},
{
"cell_type": "markdown",
- "id": "ae684067",
+ "id": "73970bf7",
"metadata": {
"editable": true
},
@@ -1962,7 +1962,7 @@
},
{
"cell_type": "markdown",
- "id": "5094a5a8",
+ "id": "f28dea6e",
"metadata": {
"editable": true
},
@@ -1972,7 +1972,7 @@
},
{
"cell_type": "markdown",
- "id": "19810edd",
+ "id": "ca070216",
"metadata": {
"editable": true
},
@@ -1984,7 +1984,7 @@
},
{
"cell_type": "markdown",
- "id": "649d5c0c",
+ "id": "4f91c2de",
"metadata": {
"editable": true
},
@@ -1996,7 +1996,7 @@
},
{
"cell_type": "markdown",
- "id": "1b684c8d",
+ "id": "8f8f3863",
"metadata": {
"editable": true
},
@@ -2008,7 +2008,7 @@
},
{
"cell_type": "markdown",
- "id": "628cdc31",
+ "id": "9fd62926",
"metadata": {
"editable": true
},
@@ -2018,7 +2018,7 @@
},
{
"cell_type": "markdown",
- "id": "9e9adf2c",
+ "id": "c4ad376f",
"metadata": {
"editable": true
},
@@ -2030,7 +2030,7 @@
},
{
"cell_type": "markdown",
- "id": "1474846d",
+ "id": "4db27b21",
"metadata": {
"editable": true
},
@@ -2041,7 +2041,7 @@
},
{
"cell_type": "markdown",
- "id": "1eca1e5c",
+ "id": "55a2b442",
"metadata": {
"editable": true
},
@@ -2053,7 +2053,7 @@
},
{
"cell_type": "markdown",
- "id": "19147098",
+ "id": "4abbd2ad",
"metadata": {
"editable": true
},
@@ -2063,7 +2063,7 @@
},
{
"cell_type": "markdown",
- "id": "67bf9119",
+ "id": "1ac87f54",
"metadata": {
"editable": true
},
@@ -2075,7 +2075,7 @@
},
{
"cell_type": "markdown",
- "id": "604ff323",
+ "id": "141392b6",
"metadata": {
"editable": true
},
@@ -2085,7 +2085,7 @@
},
{
"cell_type": "markdown",
- "id": "3b6c39c6",
+ "id": "c9cb1f95",
"metadata": {
"editable": true
},
@@ -2097,7 +2097,7 @@
},
{
"cell_type": "markdown",
- "id": "077837d9",
+ "id": "a59c8aaf",
"metadata": {
"editable": true
},
@@ -2118,7 +2118,7 @@
},
{
"cell_type": "markdown",
- "id": "02639901",
+ "id": "bf5b964d",
"metadata": {
"editable": true
},
@@ -2140,7 +2140,7 @@
},
{
"cell_type": "markdown",
- "id": "3898692a",
+ "id": "d10460e6",
"metadata": {
"editable": true
},
@@ -2167,7 +2167,7 @@
},
{
"cell_type": "markdown",
- "id": "91b801ce",
+ "id": "764b98ce",
"metadata": {
"editable": true
},
@@ -2184,7 +2184,7 @@
},
{
"cell_type": "markdown",
- "id": "fee3f920",
+ "id": "650b2586",
"metadata": {
"editable": true
},
@@ -2196,7 +2196,7 @@
},
{
"cell_type": "markdown",
- "id": "2c75a68c",
+ "id": "1349d83b",
"metadata": {
"editable": true
},
@@ -2216,7 +2216,7 @@
},
{
"cell_type": "markdown",
- "id": "61fcad7e",
+ "id": "96406bf0",
"metadata": {
"editable": true
},
@@ -2245,7 +2245,7 @@
},
{
"cell_type": "markdown",
- "id": "ab4441a1",
+ "id": "13cc0146",
"metadata": {
"editable": true
},
@@ -2259,7 +2259,7 @@
},
{
"cell_type": "markdown",
- "id": "b062353c",
+ "id": "53b94748",
"metadata": {
"editable": true
},
@@ -2276,42 +2276,39 @@
},
{
"cell_type": "markdown",
- "id": "aaef8a3d",
+ "id": "41d0322a",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "{\\cal C}(\\hat{W}) = \\frac{1}{2}\\sum_{i=1}^n\\left(y_i - t_i\\right)^2,\n",
+ "{\\cal C}(\\boldsymbol{\\Theta}) = \\frac{1}{2}\\sum_{i=1}^n\\left(y_i - \\tilde{y}_i\\right)^2,\n",
"$$"
]
},
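As a quick numerical aside, this quadratic cost is a one-liner in NumPy. A minimal sketch; the helper name `quadratic_cost` and the sample arrays are illustrative, not part of the notebook:

```python
import numpy as np

def quadratic_cost(y, y_tilde):
    """C(Theta) = 1/2 * sum_i (y_i - y_tilde_i)^2."""
    return 0.5 * np.sum((y - y_tilde) ** 2)

y = np.array([1.0, 0.0, 1.0])        # targets y_i
y_tilde = np.array([0.9, 0.2, 0.7])  # network outputs
print(quadratic_cost(y, y_tilde))    # 0.5*(0.01 + 0.04 + 0.09) = 0.07
```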
{
"cell_type": "markdown",
- "id": "0522096e",
+ "id": "f5c84939",
"metadata": {
"editable": true
},
"source": [
- "where the $t_i$s are our $n$ targets (the values we want to\n",
+ "where the $y_i$s are our $n$ targets (the values we want to\n",
"reproduce), while the outputs of the network after having propagated\n",
- "all inputs $\\hat{x}$ are given by $y_i$. Below we will demonstrate\n",
- "how the basic equations arising from the back propagation algorithm\n",
- "can be modified in order to study classification problems with $K$\n",
- "classes."
+ "all inputs $\\boldsymbol{x}$ are given by $\\boldsymbol{\\tilde{y}}_i$."
]
},
{
"cell_type": "markdown",
- "id": "5c26c834",
+ "id": "6b755763",
"metadata": {
"editable": true
},
"source": [
"## Definitions\n",
"\n",
- "With our definition of the targets $\\hat{t}$, the outputs of the\n",
- "network $\\hat{y}$ and the inputs $\\hat{x}$ we\n",
+ "With our definition of the targets $\\boldsymbol{y}$, the outputs of the\n",
+ "network $\\boldsymbol{\\tilde{y}}$ and the inputs $\\boldsymbol{x}$ we\n",
"define now the activation $z_j^l$ of node/neuron/unit $j$ of the\n",
"$l$-th layer as a function of the bias, the weights which add up from\n",
"the previous layer $l-1$ and the forward passes/outputs\n",
@@ -2320,7 +2317,7 @@
},
{
"cell_type": "markdown",
- "id": "5acfb987",
+ "id": "b7ae4c93",
"metadata": {
"editable": true
},
@@ -2332,7 +2329,7 @@
},
{
"cell_type": "markdown",
- "id": "a09a55ea",
+ "id": "95b179ce",
"metadata": {
"editable": true
},
@@ -2345,7 +2342,7 @@
},
{
"cell_type": "markdown",
- "id": "b2e5b321",
+ "id": "7020d8b0",
"metadata": {
"editable": true
},
@@ -2357,13 +2354,15 @@
},
{
"cell_type": "markdown",
- "id": "d72fef8d",
+ "id": "8366fb38",
"metadata": {
"editable": true
},
"source": [
- "With the activation values $\\hat{z}^l$ we can in turn define the\n",
- "output of layer $l$ as $\\hat{a}^l = f(\\hat{z}^l)$ where $f$ is our\n",
+ "## Inputs to tje activation function\n",
+ "\n",
+ "With the activation values $\\boldsymbol{z}^l$ we can in turn define the\n",
+ "output of layer $l$ as $\\boldsymbol{a}^l = f(\\boldsymbol{z}^l)$ where $f$ is our\n",
"activation function. In the examples here we will use the sigmoid\n",
"function discussed in our logistic regression lectures. We will also use the same activation function $f$ for all layers\n",
"and their nodes. It means we have"
@@ -2371,7 +2370,7 @@
},
{
"cell_type": "markdown",
- "id": "985fa159",
+ "id": "d740a7bf",
"metadata": {
"editable": true
},
@@ -2383,7 +2382,7 @@
},
{
"cell_type": "markdown",
- "id": "f604e895",
+ "id": "1d2ebbd5",
"metadata": {
"editable": true
},
@@ -2395,7 +2394,7 @@
},
{
"cell_type": "markdown",
- "id": "30552830",
+ "id": "ec6341d5",
"metadata": {
"editable": true
},
@@ -2407,7 +2406,7 @@
},
{
"cell_type": "markdown",
- "id": "dedb85bb",
+ "id": "b6bdbe01",
"metadata": {
"editable": true
},
@@ -2417,7 +2416,7 @@
},
{
"cell_type": "markdown",
- "id": "e2b0f10a",
+ "id": "cd92ba61",
"metadata": {
"editable": true
},
@@ -2429,7 +2428,7 @@
},
{
"cell_type": "markdown",
- "id": "90b643f0",
+ "id": "0f4860e8",
"metadata": {
"editable": true
},
@@ -2439,7 +2438,7 @@
},
{
"cell_type": "markdown",
- "id": "4f4b11a3",
+ "id": "4c3a18b3",
"metadata": {
"editable": true
},
@@ -2451,7 +2450,7 @@
},
{
"cell_type": "markdown",
- "id": "02ec7cbb",
+ "id": "4ece12a5",
"metadata": {
"editable": true
},
@@ -2465,19 +2464,19 @@
},
{
"cell_type": "markdown",
- "id": "2078e934",
+ "id": "4104e482",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "{\\cal C}(\\hat{W^L}) = \\frac{1}{2}\\sum_{i=1}^n\\left(y_i - t_i\\right)^2=\\frac{1}{2}\\sum_{i=1}^n\\left(a_i^L - t_i\\right)^2,\n",
+ "{\\cal C}(\\boldsymbol{\\Theta}^L) = \\frac{1}{2}\\sum_{i=1}^n\\left(y_i - \\tilde{y}_i\\right)^2=\\frac{1}{2}\\sum_{i=1}^n\\left(a_i^L - y_i\\right)^2,\n",
"$$"
]
},
{
"cell_type": "markdown",
- "id": "5d6fe1a6",
+ "id": "4fa9192d",
"metadata": {
"editable": true
},
@@ -2487,19 +2486,19 @@
},
{
"cell_type": "markdown",
- "id": "b1bfd605",
+ "id": "c3d0038b",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\frac{\\partial{\\cal C}(\\hat{W^L})}{\\partial w_{jk}^L} = \\left(a_j^L - t_j\\right)\\frac{\\partial a_j^L}{\\partial w_{jk}^{L}},\n",
+ "\\frac{\\partial{\\cal C}(\\boldsymbol{\\Theta}^L)}{\\partial w_{jk}^L} = \\left(a_j^L - y_j\\right)\\frac{\\partial a_j^L}{\\partial w_{jk}^{L}},\n",
"$$"
]
},
{
"cell_type": "markdown",
- "id": "b03519e9",
+ "id": "c2f5dcfa",
"metadata": {
"editable": true
},
@@ -2509,19 +2508,19 @@
},
{
"cell_type": "markdown",
- "id": "5348b212",
+ "id": "e49090ce",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\frac{\\partial a_j^L}{\\partial w_{jk}^{L}} = \\frac{\\partial a_j^L}{\\partial z_{j}^{L}}\\frac{\\partial z_j^L}{\\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1},\n",
+ "\\frac{\\partial a_j^L}{\\partial w_{jk}^{L}} = \\frac{\\partial a_j^L}{\\partial z_{j}^{L}}\\frac{\\partial z_j^L}{\\partial w_{jk}^{L}}=a_j^L(1-a_j^L)a_k^{L-1}.\n",
"$$"
]
},
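The factor $a_j^L(1-a_j^L)$ is just the sigmoid identity $f'(z) = f(z)(1-f(z))$. A quick numerical check of that identity against a central finite difference; the values are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = 0.3
a = sigmoid(z)
analytic = a * (1.0 - a)                               # f'(z) = a(1 - a)
h = 1e-6
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)  # central difference
print(abs(analytic - numeric))                         # agrees to ~1e-12
```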
{
"cell_type": "markdown",
- "id": "7c3fd36c",
+ "id": "344365f7",
"metadata": {
"editable": true
},
@@ -2533,19 +2532,19 @@
},
{
"cell_type": "markdown",
- "id": "a86efd84",
+ "id": "9145df31",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\frac{\\partial{\\cal C}(\\hat{W^L})}{\\partial w_{jk}^L} = \\left(a_j^L - t_j\\right)a_j^L(1-a_j^L)a_k^{L-1},\n",
+ "\\frac{\\partial{\\cal C}((\\boldsymbol{\\Theta}^L)}{\\partial w_{jk}^L} = \\left(a_j^L - y_j\\right)a_j^L(1-a_j^L)a_k^{L-1},\n",
"$$"
]
},
{
"cell_type": "markdown",
- "id": "6039061e",
+ "id": "7af5f61c",
"metadata": {
"editable": true
},
@@ -2555,19 +2554,19 @@
},
{
"cell_type": "markdown",
- "id": "3cd929b7",
+ "id": "908bec69",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\delta_j^L = a_j^L(1-a_j^L)\\left(a_j^L - t_j\\right) = f'(z_j^L)\\frac{\\partial {\\cal C}}{\\partial (a_j^L)},\n",
+ "\\delta_j^L = a_j^L(1-a_j^L)\\left(a_j^L - y_j\\right) = f'(z_j^L)\\frac{\\partial {\\cal C}}{\\partial (a_j^L)},\n",
"$$"
]
},
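For all output nodes at once, this error is a single vectorized expression; a sketch with illustrative outputs and targets:

```python
import numpy as np

a_L = np.array([0.9, 0.2, 0.7])  # outputs a_j^L of the last layer (illustrative)
y = np.array([1.0, 0.0, 1.0])    # targets y_j

# delta_j^L = a_j^L (1 - a_j^L) (a_j^L - y_j), for all j at once
delta_L = a_L * (1.0 - a_L) * (a_L - y)
```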
{
"cell_type": "markdown",
- "id": "19bc6c08",
+ "id": "0277e537",
"metadata": {
"editable": true
},
@@ -2577,23 +2576,25 @@
},
{
"cell_type": "markdown",
- "id": "567909ed",
+ "id": "5f5eb1a8",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\hat{\\delta}^L = f'(\\hat{z}^L)\\circ\\frac{\\partial {\\cal C}}{\\partial (\\hat{a}^L)}.\n",
+ "\\boldsymbol{\\delta}^L = f'(\\hat{z}^L)\\circ\\frac{\\partial {\\cal C}}{\\partial (\\boldsymbol{a}^L)}.\n",
"$$"
]
},
{
"cell_type": "markdown",
- "id": "4af79549",
+ "id": "b3cfe16b",
"metadata": {
"editable": true
},
"source": [
+ "## Analyzing the last results\n",
+ "\n",
"This is an important expression. The second term on the right handside\n",
"measures how fast the cost function is changing as a function of the $j$th\n",
"output activation. If, for example, the cost function doesn't depend\n",
@@ -2605,7 +2606,7 @@
},
{
"cell_type": "markdown",
- "id": "c6cfb413",
+ "id": "1b750f1d",
"metadata": {
"editable": true
},
@@ -2623,7 +2624,7 @@
},
{
"cell_type": "markdown",
- "id": "0edb4ed3",
+ "id": "52df8c1e",
"metadata": {
"editable": true
},
@@ -2635,7 +2636,7 @@
},
{
"cell_type": "markdown",
- "id": "ff4a9c6e",
+ "id": "dc70df7c",
"metadata": {
"editable": true
},
@@ -2645,19 +2646,19 @@
},
{
"cell_type": "markdown",
- "id": "bbf3a975",
+ "id": "94070518",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\frac{\\partial{\\cal C}(\\hat{W^L})}{\\partial w_{jk}^L} = \\delta_j^La_k^{L-1}.\n",
+ "\\frac{\\partial{\\cal C}}{\\partial w_{jk}^L} = \\delta_j^La_k^{L-1}.\n",
"$$"
]
},
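Because the gradient factorizes into the error of the receiving node times the output of the sending node, the full weight gradient of the output layer is an outer product; a sketch with illustrative arrays:

```python
import numpy as np

delta_L = np.array([0.1, -0.2, 0.05])    # output errors delta_j^L (illustrative)
a_prev = np.array([0.3, 0.8, 0.1, 0.5])  # outputs a_k^{L-1} of the layer below

# dC/dw_{jk}^L = delta_j^L a_k^{L-1}: one entry per weight w_{jk}^L
dW = np.outer(delta_L, a_prev)           # shape (3, 4), same as W^L
```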
{
"cell_type": "markdown",
- "id": "bba3716b",
+ "id": "a8006c9c",
"metadata": {
"editable": true
},
@@ -2669,7 +2670,7 @@
},
{
"cell_type": "markdown",
- "id": "b973d14f",
+ "id": "fb90872d",
"metadata": {
"editable": true
},
@@ -2681,7 +2682,7 @@
},
{
"cell_type": "markdown",
- "id": "c08afe8e",
+ "id": "048e84fd",
"metadata": {
"editable": true
},
@@ -2691,7 +2692,7 @@
},
{
"cell_type": "markdown",
- "id": "198a2d36",
+ "id": "cdd867fd",
"metadata": {
"editable": true
},
@@ -2703,7 +2704,7 @@
},
{
"cell_type": "markdown",
- "id": "8d58a7ea",
+ "id": "88415bff",
"metadata": {
"editable": true
},
@@ -2713,21 +2714,19 @@
},
{
"cell_type": "markdown",
- "id": "ce9c4ffb",
+ "id": "e748189f",
"metadata": {
"editable": true
},
"source": [
"## Bringing it together\n",
"\n",
- "We have now three equations that are essential for the computations of the derivatives of the cost function at the output layer. These equations are needed to start the algorithm and they are\n",
- "\n",
- "**The starting equations.**"
+ "We have now three equations that are essential for the computations of the derivatives of the cost function at the output layer. These equations are needed to start the algorithm and they are"
]
},
{
"cell_type": "markdown",
- "id": "6633f82a",
+ "id": "7a40af52",
"metadata": {
"editable": true
},
@@ -2745,7 +2744,7 @@
},
{
"cell_type": "markdown",
- "id": "ee2956f9",
+ "id": "8cf04e16",
"metadata": {
"editable": true
},
@@ -2755,7 +2754,7 @@
},
{
"cell_type": "markdown",
- "id": "888a5c5e",
+ "id": "e1c592fd",
"metadata": {
"editable": true
},
@@ -2773,7 +2772,7 @@
},
{
"cell_type": "markdown",
- "id": "770ca792",
+ "id": "e5f442c2",
"metadata": {
"editable": true
},
@@ -2783,7 +2782,7 @@
},
{
"cell_type": "markdown",
- "id": "6450f307",
+ "id": "5c5b0b75",
"metadata": {
"editable": true
},
@@ -2801,7 +2800,7 @@
},
{
"cell_type": "markdown",
- "id": "0b0a6b2f",
+ "id": "2a46a89c",
"metadata": {
"editable": true
},
@@ -2813,7 +2812,7 @@
},
{
"cell_type": "markdown",
- "id": "3c82c708",
+ "id": "1e8f5355",
"metadata": {
"editable": true
},
@@ -2825,17 +2824,29 @@
},
{
"cell_type": "markdown",
- "id": "7859a3cf",
+ "id": "092c449a",
+ "metadata": {
+ "editable": true
+ },
+ "source": [
+ "We want to express this in terms of the equations for layer $l+1$."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "cfc2e20b",
"metadata": {
"editable": true
},
"source": [
- "We want to express this in terms of the equations for layer $l+1$. Using the chain rule and summing over all $k$ entries we have"
+ "## Using the chain rule and summing over all $k$ entries\n",
+ "\n",
+ "We obtain"
]
},
{
"cell_type": "markdown",
- "id": "ce1ecc95",
+ "id": "6538d6f5",
"metadata": {
"editable": true
},
@@ -2847,7 +2858,7 @@
},
{
"cell_type": "markdown",
- "id": "08817904",
+ "id": "e8e0f5dd",
"metadata": {
"editable": true
},
@@ -2857,7 +2868,7 @@
},
{
"cell_type": "markdown",
- "id": "5a18714a",
+ "id": "11160d18",
"metadata": {
"editable": true
},
@@ -2869,7 +2880,7 @@
},
{
"cell_type": "markdown",
- "id": "5bcb264f",
+ "id": "b30fdbce",
"metadata": {
"editable": true
},
@@ -2879,7 +2890,7 @@
},
{
"cell_type": "markdown",
- "id": "a1ab6d71",
+ "id": "1d6a73e3",
"metadata": {
"editable": true
},
@@ -2891,7 +2902,7 @@
},
{
"cell_type": "markdown",
- "id": "74ac928f",
+ "id": "ddbfba54",
"metadata": {
"editable": true
},
@@ -2903,20 +2914,20 @@
},
{
"cell_type": "markdown",
- "id": "c1d9a800",
+ "id": "40e6bbd9",
"metadata": {
"editable": true
},
"source": [
- "## Setting up the Back propagation algorithm\n",
+ "## Setting up the back propagation algorithm\n",
"\n",
"The four equations provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.\n",
"\n",
- "First, we set up the input data $\\hat{x}$ and the activations\n",
+ "**First**, we set up the input data $\\hat{x}$ and the activations\n",
"$\\hat{z}_1$ of the input layer and compute the activation function and\n",
"the pertinent outputs $\\hat{a}^1$.\n",
"\n",
- "Secondly, we perform then the feed forward till we reach the output\n",
+ "**Secondly**, we perform then the feed forward till we reach the output\n",
"layer and compute all $\\hat{z}_l$ of the input layer and compute the\n",
"activation function and the pertinent outputs $\\hat{a}^l$ for\n",
"$l=2,3,\\dots,L$."
@@ -2924,19 +2935,19 @@
},
{
"cell_type": "markdown",
- "id": "dd14afca",
+ "id": "c169304e",
"metadata": {
"editable": true
},
"source": [
- "## Setting up the Back propagation algorithm, part 2\n",
+ "## Setting up the back propagation algorithm, part 2\n",
"\n",
"Thereafter we compute the ouput error $\\hat{\\delta}^L$ by computing all"
]
},
{
"cell_type": "markdown",
- "id": "f5753db9",
+ "id": "12ee2fcb",
"metadata": {
"editable": true
},
@@ -2948,7 +2959,7 @@
},
{
"cell_type": "markdown",
- "id": "ad5f3e2c",
+ "id": "ca5a7e18",
"metadata": {
"editable": true
},
@@ -2958,7 +2969,7 @@
},
{
"cell_type": "markdown",
- "id": "dcfa414d",
+ "id": "47aee3e4",
"metadata": {
"editable": true
},
@@ -2970,19 +2981,21 @@
},
{
"cell_type": "markdown",
- "id": "a16b5fcf",
+ "id": "d56cb6ea",
"metadata": {
"editable": true
},
"source": [
"## Setting up the Back propagation algorithm, part 3\n",
"\n",
- "Finally, we update the weights and the biases using gradient descent for each $l=L-1,L-2,\\dots,2$ and update the weights and biases according to the rules"
+ "Finally, we update the weights and the biases using gradient descent\n",
+ "for each $l=L-1,L-2,\\dots,1$ and update the weights and biases\n",
+ "according to the rules"
]
},
{
"cell_type": "markdown",
- "id": "2ca1a8ee",
+ "id": "8849613b",
"metadata": {
"editable": true
},
@@ -2994,7 +3007,7 @@
},
{
"cell_type": "markdown",
- "id": "fe92be6a",
+ "id": "a1b74037",
"metadata": {
"editable": true
},
@@ -3006,87 +3019,51 @@
},
{
"cell_type": "markdown",
- "id": "2a6e532a",
+ "id": "900569e4",
"metadata": {
"editable": true
},
"source": [
- "The parameter $\\eta$ is the learning parameter discussed in connection with the gradient descent methods.\n",
- "Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training."
+ "with $\\eta$ being the learning rate."
]
},
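The two update rules translate directly into array operations; a minimal sketch for a single layer, with an illustrative learning rate and illustrative shapes:

```python
import numpy as np

eta = 0.1                                # learning rate eta (illustrative)
W = np.ones((3, 4))                      # weights w_{jk}^l of layer l
b = np.zeros(3)                          # biases b_j^l
delta = np.array([0.1, -0.2, 0.05])      # errors delta_j^l
a_prev = np.array([0.3, 0.8, 0.1, 0.5])  # outputs a_k^{l-1}

W -= eta * np.outer(delta, a_prev)  # w_{jk}^l <- w_{jk}^l - eta delta_j^l a_k^{l-1}
b -= eta * delta                    # b_j^l <- b_j^l - eta delta_j^l
```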
{
"cell_type": "markdown",
- "id": "4fa8cee9",
- "metadata": {
- "editable": true
- },
- "source": [
- "## Setting up the Back propagation algorithm, final considerations\n",
- "\n",
- "The four equations above provide us with a way of computing the gradient of the cost function. Let us write this out in the form of an algorithm.\n",
- "\n",
- "First, we set up the input data $\\boldsymbol{x}$ and the activations\n",
- "$\\boldsymbol{z}_1$ of the input layer and compute the activation function and\n",
- "the pertinent outputs $\\boldsymbol{a}^1$.\n",
- "\n",
- "Secondly, we perform then the feed forward till we reach the output\n",
- "layer and compute all $\\boldsymbol{z}_l$ of the input layer and compute the\n",
- "activation function and the pertinent outputs $\\boldsymbol{a}^l$ for\n",
- "$l=2,3,\\dots,L$.\n",
- "\n",
- "Thereafter we compute the ouput error $\\boldsymbol{\\delta}^L$ by computing all"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "203c265b",
- "metadata": {
- "editable": true
- },
- "source": [
- "$$\n",
- "\\delta_j^L = f'(z_j^L)\\frac{\\partial {\\cal C}}{\\partial (a_j^L)}.\n",
- "$$"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "2fb48f44",
+ "id": "26dac645",
"metadata": {
"editable": true
},
"source": [
"## Updating the gradients\n",
"\n",
- "Then we compute the back propagate error for each $l=L-1,L-2,\\dots,2$ as"
+ "With the back propagate error for each $l=L-1,L-2,\\dots,1$ as"
]
},
{
"cell_type": "markdown",
- "id": "bc3bc72b",
+ "id": "53c4e37d",
"metadata": {
"editable": true
},
"source": [
"$$\n",
- "\\delta_j^l = \\sum_k \\delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l).\n",
+ "\\delta_j^l = \\sum_k \\delta_k^{l+1}w_{kj}^{l+1}f'(z_j^l),\n",
"$$"
]
},
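The sum over $k$ is a matrix-vector product with the transposed weight matrix of layer $l+1$; a vectorized sketch, again with the sigmoid so that $f'(z_j^l)=a_j^l(1-a_j^l)$ (arrays illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W_next = np.ones((3, 4))                  # weights w_{kj}^{l+1} (illustrative)
delta_next = np.array([0.1, -0.2, 0.05])  # errors delta_k^{l+1}
z = np.array([0.2, -0.1, 0.4, 0.0])       # activations z_j^l

a = sigmoid(z)
# delta_j^l = sum_k delta_k^{l+1} w_{kj}^{l+1} f'(z_j^l)
delta = (W_next.T @ delta_next) * a * (1.0 - a)
```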
{
"cell_type": "markdown",
- "id": "ea6c8276",
+ "id": "dce1efb3",
"metadata": {
"editable": true
},
"source": [
- "Finally, we update the weights and the biases using gradient descent for each $l=L-1,L-2,\\dots,2$ and update the weights and biases according to the rules"
+ "we update the weights and the biases using gradient descent for each $l=L-1,L-2,\\dots,1$ and update the weights and biases according to the rules"
]
},
{
"cell_type": "markdown",
- "id": "65221b07",
+ "id": "f7fd0c67",
"metadata": {
"editable": true
},
@@ -3098,7 +3075,7 @@
},
{
"cell_type": "markdown",
- "id": "01434784",
+ "id": "1ee56f72",
"metadata": {
"editable": true
},
@@ -3107,17 +3084,6 @@
"b_j^l \\leftarrow b_j^l-\\eta \\frac{\\partial {\\cal C}}{\\partial b_j^l}=b_j^l-\\eta \\delta_j^l,\n",
"$$"
]
- },
- {
- "cell_type": "markdown",
- "id": "61c6bb9f",
- "metadata": {
- "editable": true
- },
- "source": [
- "The parameter $\\eta$ is the learning parameter discussed in connection with the gradient descent methods.\n",
- "Here it is convenient to use stochastic gradient descent (see the examples below) with mini-batches with an outer loop that steps through multiple epochs of training."
- ]
}
],
"metadata": {},
diff --git a/doc/pub/week1/pdf/week1.pdf b/doc/pub/week1/pdf/week1.pdf
index a1cfaa57ccdb887a723713927acb43cb1b43a839..79d8103e4358964f1a47f8d3c2d24da8cee2116f 100644
GIT binary patch
delta 78473
z*Uy2oBe7R}=)dVqOE#j8;z;)-F3%u3y1_=Ee&DL=Y4ftQ5IcAJECj-s8TO%6;$J-3
zmlM)yM>x>F+`YJ>pmpn(Eb;Y2geOX?e@{WTsv?qvWo!dnux_l6ISC4~S&}pwzC4#~
zAsIlF4g!;j_WD@)He7W99@Ncms6ecs&!E8fi~-ylsVly`^)-r1LR<0ipR2yGe-q_8GvE
zMFu0v#Y>ECip9{ux!p-D0f?~_N!Y;2;RmJDAE@(^VhD&(c4)U2zTxYEqJY83&0%||
zJV|g;{!CD{wl3F?)2t8HFZM;fe<0cdiSnMaK*H{Sm>khO%&4sko7Eh;Y0Ogi7E1Tr
zuytPxU3Fx`ola+#*WL`&52FV-$vm6{EY;Wy&bpp~UgdrB=)3PsgLLdJfO+o9_Wz|@
z)-?<8f)@Ko`A8YT@YWfa`x=DIP#Y?KkYK!O%`!ljImVjTl~=a)Jtl!6f2fB{I|%PR
zO=03i;F^qm-PtS+zL`{^?mDe%MKf{QM3!MfDe`*DJR+M%5MAVYQ;AH&5JIfUPBOlk!FuJXN3s}BQn!cYW
z@>i~`KTD>98ahx*-Y{?H>d^dl|7XCKKT1gRkRFakP>l#6``S{a=zh=)SsN0P=XFE@
zQese?5p_^JHD)j`d9rNH1#$|;jaiA%`<|=g8l(r6
zbBx-Cc3oVnK8WJz&~IkJZ{gp39!)E_?&=dQ3M$!~voFY;V~@%6Het3z`m;fCaByeB
zL56h$iIg&E?(!X?^2`ow+&XJl{=6!Zd{ooD-j^^`Bxfty{10Cdo%4azfyf`=pYN@^
z5qEJB+O3xfhb=C|_TYjt5I;?;cPN%m2Kp_Wxn%++RKpJXx^2e^kghMjPT`isd>b8CFZ!
zMjaz=WJa~mjAUR5Y@s^Xm+B{!ed9SvmVi?y(Yb-mExNFj7X~IGFi>y`FwoneR=eA(
zc7U}R&CbF#{;kGxLayr48R2mhpSF>~gYsf)3MICmnW+#?2pGaSAdFF}>E52hH2)k0
zf_DJ;kHNEpf52@qZ|$s)X38tIm!3~8W$C)(#e*q@w{1$qOg~yO`?a#w?F)Xde|Z$$0+sqz@j%Fbo>Sxr5`JU&
z?E>yJ(Y*4snj79KW%yD+b!eavtKoAH+Zu7xLp@nUcxK$fbz6i$@#Om&Si6%Hu*b`S
zzb?7I!
z&QYv3e^ZlgM(Y}LQy~kr8e??v(}=ae1>*0ufLIBwMF!u)ANd^vsinC$&XTZ@8o?d5
z*XOz1?t{bQ$vTJ)N!cO8zZK&uV{0s-n&M>JD+%@vn^5Dp5$s-0)fVApm8MNFx-K>0
zY^Ml>4qG?tLzG#k3Nofv0erac=3ZgqpNoL%O-g!^N)i5tm+apt4>7oQMC6i
z{X0k8td~sJB*8>XejUW?YGj?EL@Vr+lYeeqsR?_nb|?f&URG^o+ItqKbxJp_!Q{AA
ze;X8OyN~!Wfm9>JK@iW6JJicc4EexDF3e6T+c}ho8khh9Vnd$+tw)_pDkY5Z)yMB_
z?I#Zkt4=1by+`}R>1tz2nql^7N;p^I+^L&6fX{n1W=`;Von$YHErGbXk`o-8y;r!THu!uSIXpZJ$}rLN-1I{W+hGg|v|{OGQh2H
zFt^+s$}*{Eu?e%LAAx{{{=A0oEp#2j66j4?4{G*8q7MO8QK~_^zkj(WfACJMf2E;K
z(IQ)>H1HyBu7#y|QJb+}_XjFtQ>$jb*_7$WRHoY{F4fhiOm=kLU%(^nb-8t_ih^TU
zsgs4mjk%-{OnjhGaqb~W>Z@Kx@ey0e>HyZi!WYZhrGyR_XGx!wo;I!X`;0!2Spxsa
zgs9FV?><$@)upln!y+`Fj8s-5e~+=(FfY&xFzuC!W8n3!zw%l8dA;~JBMX4aPW-O-gW$hklf{1ZDR3doPd~<~~
zS<1S+9q8y-b=8|=J%adSf_c$g-ALKHYw~tcufyZ07I!ts)?(`7$Kb-2e>At%qiDj@
zC^RK}94slLcOow^e=0(hxohtA90qg-RK8u4#!pcv@xHrg!xViOsOL?2E}YU2>9Qp(
zU>1I(Ag}P_jQjMYR!Dmu3mbf-2=iuv=G&esLF9nOCnd9__^lf9Zp7C
zk)R$~F6B?XS}55O&b0kayB{u>?kV
zZU0Sjdz7k&6S^|+>;O<-bKE2yd{|XTLTOwf{QFxr*$^2f
zdyOb7QI|UY+*cmO=;}4^_VVjjJ}e;nYw+42YZUSJw3uV^YYB{wNtw=%y%SdQnj+7;
zlSJoqhiFOEwuP9kf51B)^#S7ne(Gc0{JEmYJx5Vof?rRpk)Q5+YL-4{kwqPXn^;6*
zbPST_8xP8RsxMq@EvJe5cliyTeEe${zV!YPFfXs{#RI#4f_2gFO+h~je6Q=C6{woz
z{@+$M^~}psLzt;dnsO9u47NcP;WyGWfF$|DZs(`T5Y?9Re>NfBDpegUu3HPV!}&+z
z^c89v^eh7%YWDk^k1Te=iwdhKU+l!#MlC;+B<1QJc2&kCxxu;3P<+$h-lFoHan42Y
zd7;_I9~qbCu;u3itU6w?bPX+>f;F@EL%|U5VgNTl$iLr@nvBu{R*USy^OtN-i!~$H
zyi{_Dt?XgJqO}sW*nbpl(L>#Md@Tkow>6G!9DrRlI5Op5@
zb@KAc3axn2-9G_FQfy{@^aPSS`U&MRWXP_#ukL>q5#endKPR^Ba5>o4+iCEHg5JNMRB)v(Ph~-zTDaSR|&N-umb7k6C
z9!Vn6c!AyDtQM7-u0#MUF*erc!Yf`tD5por5e7t}4
zr4lPkNOa!)r0qcWPv9&lKUY+c^-YD5%||B}-bOB($aD7xRY~O&Dq*QsaN5l!B~?gR5OXDdgj!=_`A_;?D~ODiJQ-1QRX+mRRp)D=_gbl?76QJ0>
z;ztdvS53pgs2NAn)7#E#(w9wGTplUist|*ERYAp~(F?Ud^i6&}u{Y{^j>udBB`>b%
z$gU@ftWE)gPN|7oG9j3U*dA3O(sR@ibARu$(BUXWptr