diff --git a/README.md b/README.md index 723111c..1237e33 100644 --- a/README.md +++ b/README.md @@ -125,7 +125,7 @@ _Detailed notes at the link_ https://compphysics.github.io/MachineLearning/doc/L | Recommended readings | Hastie et al Chapter 3 | | | Lecture material at https://compphysics.github.io/MLErasmus/doc/web/course.html sessions 3 and 4 | | | Video of Lecture at https://youtu.be/iqRKUPJr_bY | -| | Handwritten notes at Handwritten notes at https://github.com/CompPhysics/MLErasmus/blob/master/doc/HandwrittenNotes/2023/NotesOct162023.pdf | +| | Handwritten notes at https://github.com/CompPhysics/MLErasmus/blob/master/doc/HandwrittenNotes/2023/NotesOct162023.pdf | | Monday October 23 | - _Lecture 815am-10am_: Resampling Methods and Bias-Variance tradeoff (MHJ) | | Recommended readings | Hastie et al chapter 7 | | | Lecture material at https://compphysics.github.io/MLErasmus/doc/web/course.html session 4 material | diff --git a/doc/pub/day3/html/day3-bs.html b/doc/pub/day3/html/day3-bs.html index 61f10b7..ab8ceb1 100644 --- a/doc/pub/day3/html/day3-bs.html +++ b/doc/pub/day3/html/day3-bs.html @@ -281,7 +281,11 @@ ('Exercise 2: Expectation values for Ridge regression', 2, None, - 'exercise-2-expectation-values-for-ridge-regression')]} + 'exercise-2-expectation-values-for-ridge-regression'), + ('Exercise 3: Bias-Variance tradeoff', + 2, + None, + 'exercise-3-bias-variance-tradeoff')]} end of tocinfo --> @@ -405,6 +409,7 @@
  • Overarching aims of the exercises this week
  • Exercise 1: Expectation values for ordinary least squares expressions
  • Exercise 2: Expectation values for Ridge regression
  • +
  • Exercise 3: Bias-Variance tradeoff
  • @@ -433,7 +438,7 @@

    Data Analysis and Machine Learning: Ridge and Lasso Regression and Resamplin
    -

    October 15 and 22, 2023

    +

    October 16 and 23, 2023


    @@ -447,8 +452,8 @@

    Plans for Sessions 4-6

  • More on Ridge and Lasso Regression
  • Statistics, probability theory and resampling methods
  • @@ -3571,6 +3576,74 @@

    Exerc

    and it is easy to see that if the parameter \( \lambda \) goes to infinity then the variance of the Ridge parameters \( \boldsymbol{\beta} \) goes to zero.

    + + + +

    Exercise 3: Bias-Variance tradeoff

    + +

The aim of this exercise is to derive the equations for the bias-variance tradeoff, to be used in project 1, and to test them for a simple function using the bootstrap method.

    + +

    Consider a +dataset \( \mathcal{L} \) consisting of the data +\( \mathbf{X}_\mathcal{L}=\{(y_j, \boldsymbol{x}_j), j=0\ldots n-1\} \). +

    + +

    We assume that the true data is generated from a noisy model

    + +$$ +\boldsymbol{y}=f(\boldsymbol{x}) + \boldsymbol{\epsilon}. +$$ + +

Here \( \epsilon \) is normally distributed with mean zero and variance \( \sigma^2 \). +

    + +

    In our derivation of the ordinary least squares method we defined +an approximation to the function \( f \) in terms of the parameters +\( \boldsymbol{\beta} \) and the design matrix \( \boldsymbol{X} \) which embody our model, +that is \( \boldsymbol{\tilde{y}}=\boldsymbol{X}\boldsymbol{\beta} \). +

    + +

    The parameters \( \boldsymbol{\beta} \) are in turn found by optimizing the mean +squared error via the so-called cost function +

    + +$$ +C(\boldsymbol{X},\boldsymbol{\beta}) =\frac{1}{n}\sum_{i=0}^{n-1}(y_i-\tilde{y}_i)^2=\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]. +$$ + +

Here the expected value \( \mathbb{E} \) is computed over the sample, that is, it is the sample mean.

    + +

Show that you can rewrite this as a sum of three terms: a term which contains the variance of the model itself (the so-called variance term), a term which measures the deviation between the true data and the mean value of the model (the bias term), and finally the variance of the noise. That is, show that +

    +$$ +\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]=\mathrm{Bias}[\tilde{y}]+\mathrm{var}[\tilde{y}]+\sigma^2, +$$ + +

    with

    +$$ +\mathrm{Bias}[\tilde{y}]=\mathbb{E}\left[\left(\boldsymbol{y}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right], +$$ + +

    and

    +$$ +\mathrm{var}[\tilde{y}]=\mathbb{E}\left[\left(\tilde{\boldsymbol{y}}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right]=\frac{1}{n}\sum_i(\tilde{y}_i-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right])^2. +$$ + +

Explain what each of these terms means and discuss their interpretation.

    + +

Then perform a bias-variance analysis of a simple one-dimensional function (or another model of your choice) by studying the MSE as a function of the complexity of your model. Use ordinary least squares only. +

    + +

Discuss the bias-variance trade-off as a function of your model complexity (the degree of the polynomial) and the number of data points, and possibly also of the size of your training and test data, using the bootstrap resampling method; a minimal sketch is given below. You can follow the code example in the Jupyter-Book at https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/chapter3.html#the-bias-variance-tradeoff. +
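The exercise refers to an existing bootstrap example in the lecture notes; the following is only a minimal sketch of one possible implementation, not the course's reference code. It assumes NumPy and scikit-learn are available; the test function, the noise level, the train/test split and the number of bootstrap samples are arbitrary illustrative choices.

```python
# Bootstrap bias-variance sketch for OLS polynomial fits (illustrative choices throughout).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures
from sklearn.utils import resample

np.random.seed(2023)
n, n_bootstraps, maxdegree = 100, 100, 12

x = np.linspace(-3, 3, n).reshape(-1, 1)
y = np.exp(-x**2) + 1.5*np.exp(-(x - 2)**2) + np.random.normal(0, 0.1, x.shape)
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)

for degree in range(1, maxdegree + 1):
    poly = PolynomialFeatures(degree=degree)
    X_train = poly.fit_transform(x_train)
    X_test = poly.transform(x_test)
    # Each column of y_pred holds the test predictions from one bootstrap fit
    y_pred = np.empty((y_test.shape[0], n_bootstraps))
    for i in range(n_bootstraps):
        X_, y_ = resample(X_train, y_train)
        y_pred[:, i] = LinearRegression(fit_intercept=False).fit(X_, y_).predict(X_test).ravel()
    error = np.mean(np.mean((y_test - y_pred)**2, axis=1, keepdims=True))
    bias = np.mean((y_test - np.mean(y_pred, axis=1, keepdims=True))**2)
    variance = np.mean(np.var(y_pred, axis=1, keepdims=True))
    print(f"degree={degree:2d}  error={error:.4f}  bias^2={bias:.4f}  variance={variance:.4f}")
```

At low polynomial degrees the bias term should dominate, at high degrees the variance term grows, and the error roughly tracks their sum.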

    + diff --git a/doc/pub/day3/html/day3-reveal.html b/doc/pub/day3/html/day3-reveal.html index 759739b..2f7c403 100644 --- a/doc/pub/day3/html/day3-reveal.html +++ b/doc/pub/day3/html/day3-reveal.html @@ -184,7 +184,7 @@

    Data Analysis and Machine Learning: Ridge and La
    -

    October 15 and 22, 2023

    +

    October 16 and 23, 2023


    @@ -202,9 +202,9 @@

    Plans for Sessions 4-6

  • Statistics, probability theory and resampling methods
  • @@ -3667,6 +3667,84 @@

    Exercise 2: Expectat

    and it is easy to see that if the parameter \( \lambda \) goes to infinity then the variance of the Ridge parameters \( \boldsymbol{\beta} \) goes to zero.

    + + + +

    Exercise 3: Bias-Variance tradeoff

    + +

The aim of this exercise is to derive the equations for the bias-variance tradeoff, to be used in project 1, and to test them for a simple function using the bootstrap method.

    + +

    Consider a +dataset \( \mathcal{L} \) consisting of the data +\( \mathbf{X}_\mathcal{L}=\{(y_j, \boldsymbol{x}_j), j=0\ldots n-1\} \). +

    + +

    We assume that the true data is generated from a noisy model

    + +

     
    +$$ +\boldsymbol{y}=f(\boldsymbol{x}) + \boldsymbol{\epsilon}. +$$ +

     
    + +

Here \( \epsilon \) is normally distributed with mean zero and variance \( \sigma^2 \). +

    + +

    In our derivation of the ordinary least squares method we defined +an approximation to the function \( f \) in terms of the parameters +\( \boldsymbol{\beta} \) and the design matrix \( \boldsymbol{X} \) which embody our model, +that is \( \boldsymbol{\tilde{y}}=\boldsymbol{X}\boldsymbol{\beta} \). +

    + +

    The parameters \( \boldsymbol{\beta} \) are in turn found by optimizing the mean +squared error via the so-called cost function +

    + +

     
    +$$ +C(\boldsymbol{X},\boldsymbol{\beta}) =\frac{1}{n}\sum_{i=0}^{n-1}(y_i-\tilde{y}_i)^2=\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]. +$$ +

     
    + +

Here the expected value \( \mathbb{E} \) is computed over the sample, that is, it is the sample mean.

    + +

Show that you can rewrite this as a sum of three terms: a term which contains the variance of the model itself (the so-called variance term), a term which measures the deviation between the true data and the mean value of the model (the bias term), and finally the variance of the noise. That is, show that +

    +

     
    +$$ +\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]=\mathrm{Bias}[\tilde{y}]+\mathrm{var}[\tilde{y}]+\sigma^2, +$$ +

     
    + +

    with

    +

     
    +$$ +\mathrm{Bias}[\tilde{y}]=\mathbb{E}\left[\left(\boldsymbol{y}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right], +$$ +

     
    + +

    and

    +

     
    +$$ +\mathrm{var}[\tilde{y}]=\mathbb{E}\left[\left(\tilde{\boldsymbol{y}}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right]=\frac{1}{n}\sum_i(\tilde{y}_i-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right])^2. +$$ +

     
    + +

Explain what each of these terms means and discuss their interpretation.

    + +

Then perform a bias-variance analysis of a simple one-dimensional function (or another model of your choice) by studying the MSE as a function of the complexity of your model. Use ordinary least squares only. +

    + +

Discuss the bias-variance trade-off as a function of your model complexity (the degree of the polynomial) and the number of data points, and possibly also of the size of your training and test data, using the bootstrap resampling method. You can follow the code example in the Jupyter-Book at https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/chapter3.html#the-bias-variance-tradeoff. +

    + diff --git a/doc/pub/day3/html/day3-solarized.html b/doc/pub/day3/html/day3-solarized.html index 1fb9c0d..75423f1 100644 --- a/doc/pub/day3/html/day3-solarized.html +++ b/doc/pub/day3/html/day3-solarized.html @@ -308,7 +308,11 @@ ('Exercise 2: Expectation values for Ridge regression', 2, None, - 'exercise-2-expectation-values-for-ridge-regression')]} + 'exercise-2-expectation-values-for-ridge-regression'), + ('Exercise 3: Bias-Variance tradeoff', + 2, + None, + 'exercise-3-bias-variance-tradeoff')]} end of tocinfo --> @@ -346,7 +350,7 @@

    Data Analysis and Machine Learning: Ridge and Lasso Regression and Resamplin
    -

    October 15 and 22, 2023

    +

    October 16 and 23, 2023


    @@ -357,8 +361,8 @@

    Plans for Sessions 4-6

  • More on Ridge and Lasso Regression
  • Statistics, probability theory and resampling methods










  • @@ -3472,6 +3476,74 @@

    Exercise 2: Expectat

    and it is easy to see that if the parameter \( \lambda \) goes to infinity then the variance of the Ridge parameters \( \boldsymbol{\beta} \) goes to zero.

    + + + +

    Exercise 3: Bias-Variance tradeoff

    + +

The aim of this exercise is to derive the equations for the bias-variance tradeoff, to be used in project 1, and to test them for a simple function using the bootstrap method.

    + +

    Consider a +dataset \( \mathcal{L} \) consisting of the data +\( \mathbf{X}_\mathcal{L}=\{(y_j, \boldsymbol{x}_j), j=0\ldots n-1\} \). +

    + +

    We assume that the true data is generated from a noisy model

    + +$$ +\boldsymbol{y}=f(\boldsymbol{x}) + \boldsymbol{\epsilon}. +$$ + +

Here \( \epsilon \) is normally distributed with mean zero and variance \( \sigma^2 \). +

    + +

    In our derivation of the ordinary least squares method we defined +an approximation to the function \( f \) in terms of the parameters +\( \boldsymbol{\beta} \) and the design matrix \( \boldsymbol{X} \) which embody our model, +that is \( \boldsymbol{\tilde{y}}=\boldsymbol{X}\boldsymbol{\beta} \). +

    + +

    The parameters \( \boldsymbol{\beta} \) are in turn found by optimizing the mean +squared error via the so-called cost function +

    + +$$ +C(\boldsymbol{X},\boldsymbol{\beta}) =\frac{1}{n}\sum_{i=0}^{n-1}(y_i-\tilde{y}_i)^2=\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]. +$$ + +

Here the expected value \( \mathbb{E} \) is computed over the sample, that is, it is the sample mean.

    + +

Show that you can rewrite this as a sum of three terms: a term which contains the variance of the model itself (the so-called variance term), a term which measures the deviation between the true data and the mean value of the model (the bias term), and finally the variance of the noise. That is, show that +

    +$$ +\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]=\mathrm{Bias}[\tilde{y}]+\mathrm{var}[\tilde{y}]+\sigma^2, +$$ + +

    with

    +$$ +\mathrm{Bias}[\tilde{y}]=\mathbb{E}\left[\left(\boldsymbol{y}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right], +$$ + +

    and

    +$$ +\mathrm{var}[\tilde{y}]=\mathbb{E}\left[\left(\tilde{\boldsymbol{y}}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right]=\frac{1}{n}\sum_i(\tilde{y}_i-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right])^2. +$$ + +

Explain what each of these terms means and discuss their interpretation.

    + +

Then perform a bias-variance analysis of a simple one-dimensional function (or another model of your choice) by studying the MSE as a function of the complexity of your model. Use ordinary least squares only. +

    + +

Discuss the bias-variance trade-off as a function of your model complexity (the degree of the polynomial) and the number of data points, and possibly also of the size of your training and test data, using the bootstrap resampling method. You can follow the code example in the Jupyter-Book at https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/chapter3.html#the-bias-variance-tradeoff. +

    +
    diff --git a/doc/pub/day3/html/day3.html b/doc/pub/day3/html/day3.html index 94818c6..def74b7 100644 --- a/doc/pub/day3/html/day3.html +++ b/doc/pub/day3/html/day3.html @@ -385,7 +385,11 @@ ('Exercise 2: Expectation values for Ridge regression', 2, None, - 'exercise-2-expectation-values-for-ridge-regression')]} + 'exercise-2-expectation-values-for-ridge-regression'), + ('Exercise 3: Bias-Variance tradeoff', + 2, + None, + 'exercise-3-bias-variance-tradeoff')]} end of tocinfo --> @@ -423,7 +427,7 @@

    Data Analysis and Machine Learning: Ridge and Lasso Regression and Resamplin


    -

    October 15 and 22, 2023

    +

    October 16 and 23, 2023


    @@ -434,8 +438,8 @@

    Plans for Sessions 4-6

  • More on Ridge and Lasso Regression
  • Statistics, probability theory and resampling methods










  • @@ -3549,6 +3553,74 @@

    Exercise 2: Expectat

    and it is easy to see that if the parameter \( \lambda \) goes to infinity then the variance of the Ridge parameters \( \boldsymbol{\beta} \) goes to zero.

    + + + +

    Exercise 3: Bias-Variance tradeoff

    + +

The aim of this exercise is to derive the equations for the bias-variance tradeoff, to be used in project 1, and to test them for a simple function using the bootstrap method.

    + +

    Consider a +dataset \( \mathcal{L} \) consisting of the data +\( \mathbf{X}_\mathcal{L}=\{(y_j, \boldsymbol{x}_j), j=0\ldots n-1\} \). +

    + +

    We assume that the true data is generated from a noisy model

    + +$$ +\boldsymbol{y}=f(\boldsymbol{x}) + \boldsymbol{\epsilon}. +$$ + +

Here \( \epsilon \) is normally distributed with mean zero and variance \( \sigma^2 \). +

    + +

    In our derivation of the ordinary least squares method we defined +an approximation to the function \( f \) in terms of the parameters +\( \boldsymbol{\beta} \) and the design matrix \( \boldsymbol{X} \) which embody our model, +that is \( \boldsymbol{\tilde{y}}=\boldsymbol{X}\boldsymbol{\beta} \). +

    + +

    The parameters \( \boldsymbol{\beta} \) are in turn found by optimizing the mean +squared error via the so-called cost function +

    + +$$ +C(\boldsymbol{X},\boldsymbol{\beta}) =\frac{1}{n}\sum_{i=0}^{n-1}(y_i-\tilde{y}_i)^2=\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]. +$$ + +

Here the expected value \( \mathbb{E} \) is computed over the sample, that is, it is the sample mean.

    + +

Show that you can rewrite this as a sum of three terms: a term which contains the variance of the model itself (the so-called variance term), a term which measures the deviation between the true data and the mean value of the model (the bias term), and finally the variance of the noise. That is, show that +

    +$$ +\mathbb{E}\left[(\boldsymbol{y}-\boldsymbol{\tilde{y}})^2\right]=\mathrm{Bias}[\tilde{y}]+\mathrm{var}[\tilde{y}]+\sigma^2, +$$ + +

    with

    +$$ +\mathrm{Bias}[\tilde{y}]=\mathbb{E}\left[\left(\boldsymbol{y}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right], +$$ + +

    and

    +$$ +\mathrm{var}[\tilde{y}]=\mathbb{E}\left[\left(\tilde{\boldsymbol{y}}-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right]\right)^2\right]=\frac{1}{n}\sum_i(\tilde{y}_i-\mathbb{E}\left[\boldsymbol{\tilde{y}}\right])^2. +$$ + +

Explain what each of these terms means and discuss their interpretation.

    + +

Then perform a bias-variance analysis of a simple one-dimensional function (or another model of your choice) by studying the MSE as a function of the complexity of your model. Use ordinary least squares only. +

    + +

Discuss the bias-variance trade-off as a function of your model complexity (the degree of the polynomial) and the number of data points, and possibly also of the size of your training and test data, using the bootstrap resampling method; a small numerical check of the decomposition is sketched after this exercise. You can follow the code example in the Jupyter-Book at https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/chapter3.html#the-bias-variance-tradeoff. +
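For the derivation part of the exercise, a small Monte-Carlo simulation can make the decomposition concrete. This is only a sketch: it measures the bias against the true function \( f(x) \), and the choices of \( f \), \( \sigma \), the polynomial degree and the number of simulated datasets are arbitrary illustrative assumptions.

```python
# Monte-Carlo check that the average test error of a fixed OLS model
# is approximately bias^2 + variance + sigma^2 (illustrative settings).
import numpy as np

rng = np.random.default_rng(2023)
n, degree, sigma, n_datasets = 50, 3, 0.2, 2000

x = np.linspace(-1, 1, n)
f = np.sin(np.pi * x)              # the "true" function
X = np.vander(x, degree + 1)       # fixed polynomial design matrix

preds = np.empty((n_datasets, n))
for k in range(n_datasets):
    y = f + rng.normal(0, sigma, n)    # a new noisy dataset from the same f
    beta = np.linalg.pinv(X) @ y       # OLS fit
    preds[k] = X @ beta

bias2 = np.mean((f - preds.mean(axis=0))**2)
variance = np.mean(preds.var(axis=0))
# Average squared error against fresh noisy targets, independent of each fit
mse = np.mean([(f + rng.normal(0, sigma, n) - preds[k])**2 for k in range(n_datasets)])

print(f"MSE                     = {mse:.4f}")
print(f"bias^2 + var + sigma^2  = {bias2 + variance + sigma**2:.4f}")
```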

    +
    diff --git a/doc/pub/day3/ipynb/day3.ipynb b/doc/pub/day3/ipynb/day3.ipynb index 8cdade7..96aee94 100644 --- a/doc/pub/day3/ipynb/day3.ipynb +++ b/doc/pub/day3/ipynb/day3.ipynb @@ -2,8 +2,10 @@ "cells": [ { "cell_type": "markdown", - "id": "7fcd4e2f", - "metadata": {}, + "id": "dcfb8aab", + "metadata": { + "editable": true + }, "source": [ "\n", @@ -12,19 +14,23 @@ }, { "cell_type": "markdown", - "id": "4a75243e", - "metadata": {}, + "id": "7befd095", + "metadata": { + "editable": true + }, "source": [ "# Data Analysis and Machine Learning: Ridge and Lasso Regression and Resampling Methods\n", "**Morten Hjorth-Jensen**, Department of Physics and Center for Computing in Science Education, University of Oslo, Norway and Department of Physics and Astronomy and Facility for Rare Isotope Beams and National Superconducting Cyclotron Laboratory, Michigan State University, USA\n", "\n", - "Date: **October 15 and 22, 2023**" + "Date: **October 16 and 23, 2023**" ] }, { "cell_type": "markdown", - "id": "7c319be3", - "metadata": {}, + "id": "c2018078", + "metadata": { + "editable": true + }, "source": [ "## Plans for Sessions 4-6\n", "\n", @@ -32,15 +38,17 @@ "\n", "* Statistics, probability theory and resampling methods\n", "\n", - " * [Video of Lecture October 15 to be added](https://youtu.be/)\n", + " * [Video of Lecture October 16 to be added](https://youtu.be/iqRKUPJr_bY)\n", "\n", - " * [Video of Lecture October 22 to be added](https://youtu.be/)" + " * [Video of Lecture October 23 to be added](https://youtu.be/)" ] }, { "cell_type": "markdown", - "id": "5217a54a", - "metadata": {}, + "id": "db5859b5", + "metadata": { + "editable": true + }, "source": [ "## Ridge and LASSO Regression\n", "\n", @@ -50,8 +58,10 @@ }, { "cell_type": "markdown", - "id": "6b16a092", - "metadata": {}, + "id": "86774668", + "metadata": { + "editable": true + }, "source": [ "$$\n", "{\\displaystyle \\min_{\\boldsymbol{\\beta}\\in {\\mathbb{R}}^{p}}}\\frac{1}{n}\\left\\{\\left(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta}\\right)^T\\left(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta}\\right)\\right\\}.\n", @@ -60,16 +70,20 @@ }, { "cell_type": "markdown", - "id": "9eaad472", - "metadata": {}, + "id": "3c2c57bc", + "metadata": { + "editable": true + }, "source": [ "or we can state it as" ] }, { "cell_type": "markdown", - "id": "72444a78", - "metadata": {}, + "id": "bbb2dfb8", + "metadata": { + "editable": true + }, "source": [ "$$\n", "{\\displaystyle \\min_{\\boldsymbol{\\beta}\\in\n", @@ -79,16 +93,20 @@ }, { "cell_type": "markdown", - "id": "891fdce0", - "metadata": {}, + "id": "148b386e", + "metadata": { + "editable": true + }, "source": [ "where we have used the definition of a norm-2 vector, that is" ] }, { "cell_type": "markdown", - "id": "79e19a07", - "metadata": {}, + "id": "ecb6c1e6", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\vert\\vert \\boldsymbol{x}\\vert\\vert_2 = \\sqrt{\\sum_i x_i^2}.\n", @@ -97,8 +115,10 @@ }, { "cell_type": "markdown", - "id": "33387932", - "metadata": {}, + "id": "550bb691", + "metadata": { + "editable": true + }, "source": [ "## From OLS to Ridge and Lasso\n", "\n", @@ -110,8 +130,10 @@ }, { "cell_type": "markdown", - "id": "216ae801", - "metadata": {}, + "id": "e0166782", + "metadata": { + "editable": true + }, "source": [ "$$\n", "{\\displaystyle \\min_{\\boldsymbol{\\beta}\\in\n", @@ -121,8 +143,10 @@ }, { "cell_type": "markdown", - "id": "a71c6474", - "metadata": {}, + "id": "869ba708", + "metadata": { + "editable": true + }, "source": [ "which 
leads to the Ridge regression minimization problem where we\n", "require that $\\vert\\vert \\boldsymbol{\\beta}\\vert\\vert_2^2\\le t$, where $t$ is\n", @@ -131,8 +155,10 @@ }, { "cell_type": "markdown", - "id": "29bc8526", - "metadata": {}, + "id": "420257c6", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{X},\\boldsymbol{\\beta})=\\frac{1}{n}\\vert\\vert \\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta}\\vert\\vert_2^2+\\lambda\\vert\\vert \\boldsymbol{\\beta}\\vert\\vert_1,\n", @@ -141,16 +167,20 @@ }, { "cell_type": "markdown", - "id": "440d1dcd", - "metadata": {}, + "id": "80f0707b", + "metadata": { + "editable": true + }, "source": [ "we have a new optimization equation" ] }, { "cell_type": "markdown", - "id": "053066f3", - "metadata": {}, + "id": "a3b4e965", + "metadata": { + "editable": true + }, "source": [ "$$\n", "{\\displaystyle \\min_{\\boldsymbol{\\beta}\\in\n", @@ -160,8 +190,10 @@ }, { "cell_type": "markdown", - "id": "ab631a74", - "metadata": {}, + "id": "6b3e8c06", + "metadata": { + "editable": true + }, "source": [ "which leads to Lasso regression. Lasso stands for least absolute shrinkage and selection operator. \n", "\n", @@ -170,8 +202,10 @@ }, { "cell_type": "markdown", - "id": "ee57164b", - "metadata": {}, + "id": "e5301748", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\vert\\vert \\boldsymbol{x}\\vert\\vert_1 = \\sum_i \\vert x_i\\vert.\n", @@ -180,8 +214,10 @@ }, { "cell_type": "markdown", - "id": "72acfab8", - "metadata": {}, + "id": "55d97304", + "metadata": { + "editable": true + }, "source": [ "## Deriving the Ridge Regression Equations\n", "\n", @@ -190,8 +226,10 @@ }, { "cell_type": "markdown", - "id": "e91dae68", - "metadata": {}, + "id": "1d40b7bb", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{X},\\boldsymbol{\\beta})=\\left\\{(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})^T(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\right\\}+\\lambda\\boldsymbol{\\beta}^T\\boldsymbol{\\beta},\n", @@ -200,8 +238,10 @@ }, { "cell_type": "markdown", - "id": "960b85a2", - "metadata": {}, + "id": "154fbe54", + "metadata": { + "editable": true + }, "source": [ "and \n", "taking the derivatives with respect to $\\boldsymbol{\\beta}$ we obtain then\n", @@ -212,8 +252,10 @@ }, { "cell_type": "markdown", - "id": "bec7054f", - "metadata": {}, + "id": "301adcda", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}_{\\mathrm{Ridge}} = \\left(\\boldsymbol{X}^T\\boldsymbol{X}+\\lambda\\boldsymbol{I}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y},\n", @@ -222,16 +264,20 @@ }, { "cell_type": "markdown", - "id": "d9f7bf1e", - "metadata": {}, + "id": "a039a074", + "metadata": { + "editable": true + }, "source": [ "with $\\boldsymbol{I}$ being a $p\\times p$ identity matrix with the constraint that" ] }, { "cell_type": "markdown", - "id": "9eaa21b8", - "metadata": {}, + "id": "2343998c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\sum_{i=0}^{p-1} \\beta_i^2 \\leq t,\n", @@ -240,8 +286,10 @@ }, { "cell_type": "markdown", - "id": "7ae4b47c", - "metadata": {}, + "id": "379978c2", + "metadata": { + "editable": true + }, "source": [ "with $t$ a finite positive number. 
\n", "\n", @@ -250,8 +298,10 @@ }, { "cell_type": "markdown", - "id": "b7761c1f", - "metadata": {}, + "id": "887c145a", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}_{\\mathrm{OLS}} = \\left(\\boldsymbol{X}^T\\boldsymbol{X}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y},\n", @@ -260,8 +310,10 @@ }, { "cell_type": "markdown", - "id": "13159e48", - "metadata": {}, + "id": "7146c525", + "metadata": { + "editable": true + }, "source": [ "which can lead to singular matrices. However, with the SVD, we can always compute the inverse of the matrix $\\boldsymbol{X}^T\\boldsymbol{X}$.\n", "\n", @@ -274,8 +326,10 @@ }, { "cell_type": "markdown", - "id": "199ba41c", - "metadata": {}, + "id": "3449285f", + "metadata": { + "editable": true + }, "source": [ "## SVD analysis\n", "\n", @@ -285,8 +339,10 @@ }, { "cell_type": "markdown", - "id": "a39fd7fb", - "metadata": {}, + "id": "dd7fafee", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\tilde{\\boldsymbol{y}}_{\\mathrm{OLS}}=\\boldsymbol{X}\\boldsymbol{\\beta} =\\boldsymbol{U}\\boldsymbol{U}^T\\boldsymbol{y}.\n", @@ -295,16 +351,20 @@ }, { "cell_type": "markdown", - "id": "8d703fef", - "metadata": {}, + "id": "07394ecb", + "metadata": { + "editable": true + }, "source": [ "For Ridge regression this becomes" ] }, { "cell_type": "markdown", - "id": "8d65d498", - "metadata": {}, + "id": "87d7bca7", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\tilde{\\boldsymbol{y}}_{\\mathrm{Ridge}}=\\boldsymbol{X}\\boldsymbol{\\beta}_{\\mathrm{Ridge}} = \\boldsymbol{U\\Sigma V^T}\\left(\\boldsymbol{V}\\boldsymbol{\\Sigma}^2\\boldsymbol{V}^T+\\lambda\\boldsymbol{I} \\right)^{-1}(\\boldsymbol{U\\Sigma V^T})^T\\boldsymbol{y}=\\sum_{j=0}^{p-1}\\boldsymbol{u}_j\\boldsymbol{u}_j^T\\frac{\\sigma_j^2}{\\sigma_j^2+\\lambda}\\boldsymbol{y},\n", @@ -313,16 +373,20 @@ }, { "cell_type": "markdown", - "id": "b115c851", - "metadata": {}, + "id": "762fe137", + "metadata": { + "editable": true + }, "source": [ "with the vectors $\\boldsymbol{u}_j$ being the columns of $\\boldsymbol{U}$ from the SVD of the matrix $\\boldsymbol{X}$." 
] }, { "cell_type": "markdown", - "id": "795d1be1", - "metadata": {}, + "id": "0a1f2b28", + "metadata": { + "editable": true + }, "source": [ "## Interpreting the Ridge results\n", "\n", @@ -331,8 +395,10 @@ }, { "cell_type": "markdown", - "id": "ead04830", - "metadata": {}, + "id": "01f4c52d", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\frac{\\sigma_j^2}{\\sigma_j^2+\\lambda} \\leq 1.\n", @@ -341,8 +407,10 @@ }, { "cell_type": "markdown", - "id": "cfb9f42f", - "metadata": {}, + "id": "c81e7d01", + "metadata": { + "editable": true + }, "source": [ "Ridge regression finds the coordinates of $\\boldsymbol{y}$ with respect to the\n", "orthonormal basis $\\boldsymbol{U}$, it then shrinks the coordinates by\n", @@ -355,8 +423,10 @@ }, { "cell_type": "markdown", - "id": "0b9fae34", - "metadata": {}, + "id": "b0ebc908", + "metadata": { + "editable": true + }, "source": [ "## More interpretations\n", "\n", @@ -365,8 +435,10 @@ }, { "cell_type": "markdown", - "id": "72dbba94", - "metadata": {}, + "id": "803e29ff", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{X}^T\\boldsymbol{X}=(\\boldsymbol{X}^T\\boldsymbol{X})^{-1} =\\boldsymbol{I}.\n", @@ -375,16 +447,20 @@ }, { "cell_type": "markdown", - "id": "ac72dbfb", - "metadata": {}, + "id": "e67d95f1", + "metadata": { + "editable": true + }, "source": [ "In this case the standard OLS results in" ] }, { "cell_type": "markdown", - "id": "d13f7879", - "metadata": {}, + "id": "477f02f4", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{\\beta}^{\\mathrm{OLS}} = \\boldsymbol{X}^T\\boldsymbol{y}=\\sum_{i=0}^{p-1}\\boldsymbol{u}_j\\boldsymbol{u}_j^T\\boldsymbol{y},\n", @@ -393,16 +469,20 @@ }, { "cell_type": "markdown", - "id": "53f823d6", - "metadata": {}, + "id": "d2c53221", + "metadata": { + "editable": true + }, "source": [ "and" ] }, { "cell_type": "markdown", - "id": "f6500c4a", - "metadata": {}, + "id": "7514e7fa", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{\\beta}^{\\mathrm{Ridge}} = \\left(\\boldsymbol{I}+\\lambda\\boldsymbol{I}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y}=\\left(1+\\lambda\\right)^{-1}\\boldsymbol{\\beta}^{\\mathrm{OLS}},\n", @@ -411,8 +491,10 @@ }, { "cell_type": "markdown", - "id": "5f4aa2e3", - "metadata": {}, + "id": "9f1253ba", + "metadata": { + "editable": true + }, "source": [ "that is the Ridge estimator scales the OLS estimator by the inverse of a factor $1+\\lambda$, and\n", "the Ridge estimator converges to zero when the hyperparameter goes to\n", @@ -426,8 +508,10 @@ }, { "cell_type": "markdown", - "id": "0f0562cd", - "metadata": {}, + "id": "be0831cd", + "metadata": { + "editable": true + }, "source": [ "## Deriving the Lasso Regression Equations\n", "\n", @@ -436,8 +520,10 @@ }, { "cell_type": "markdown", - "id": "5184af64", - "metadata": {}, + "id": "62a26e89", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{X},\\boldsymbol{\\beta})=\\left\\{(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})^T(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\right\\}+\\lambda\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_1,\n", @@ -446,16 +532,20 @@ }, { "cell_type": "markdown", - "id": "82b509c8", - "metadata": {}, + "id": "e3ae918e", + "metadata": { + "editable": true + }, "source": [ "Taking the derivative with respect to $\\boldsymbol{\\beta}$ and recalling that the derivative of the absolute value is (we drop the boldfaced vector symbol for simplicty)" ] }, { "cell_type": "markdown", - "id": 
"cb53949c", - "metadata": {}, + "id": "9ac2c6ad", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\frac{d \\vert \\beta\\vert}{d \\boldsymbol{\\beta}}=\\mathrm{sgn}(\\boldsymbol{\\beta})=\\left\\{\\begin{array}{cc} 1 & \\beta > 0 \\\\-1 & \\beta < 0, \\end{array}\\right.\n", @@ -464,16 +554,20 @@ }, { "cell_type": "markdown", - "id": "98dffbc2", - "metadata": {}, + "id": "5c04e729", + "metadata": { + "editable": true + }, "source": [ "we have that the derivative of the cost function is" ] }, { "cell_type": "markdown", - "id": "f6b8e1f1", - "metadata": {}, + "id": "7f0d97a0", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\frac{\\partial C(\\boldsymbol{X},\\boldsymbol{\\beta})}{\\partial \\boldsymbol{\\beta}}=-2\\boldsymbol{X}^T(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})+\\lambda sgn(\\boldsymbol{\\beta})=0,\n", @@ -482,16 +576,20 @@ }, { "cell_type": "markdown", - "id": "3340097f", - "metadata": {}, + "id": "30af9fc7", + "metadata": { + "editable": true + }, "source": [ "and reordering we have" ] }, { "cell_type": "markdown", - "id": "75b76a69", - "metadata": {}, + "id": "c03f3828", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{X}^T\\boldsymbol{X}\\boldsymbol{\\beta}+\\lambda sgn(\\boldsymbol{\\beta})=2\\boldsymbol{X}^T\\boldsymbol{y}.\n", @@ -500,16 +598,20 @@ }, { "cell_type": "markdown", - "id": "99446adc", - "metadata": {}, + "id": "0ea59c2e", + "metadata": { + "editable": true + }, "source": [ "This equation does not lead to a nice analytical equation as in Ridge regression or ordinary least squares. This equation can however be solved by using standard convex optimization algorithms using for example the Python package [CVXOPT](https://cvxopt.org/). We will discuss this later." 
] }, { "cell_type": "markdown", - "id": "ed2fbc69", - "metadata": {}, + "id": "c5f70b50", + "metadata": { + "editable": true + }, "source": [ "## Simple example to illustrate Ordinary Least Squares, Ridge and Lasso Regression\n", "\n", @@ -521,8 +623,10 @@ }, { "cell_type": "markdown", - "id": "f59ad285", - "metadata": {}, + "id": "98168425", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\sum_{i=0}^{p-1}(y_i-\\beta_i)^2,\n", @@ -531,16 +635,20 @@ }, { "cell_type": "markdown", - "id": "34c46d15", - "metadata": {}, + "id": "132b37ba", + "metadata": { + "editable": true + }, "source": [ "and minimizing we have that" ] }, { "cell_type": "markdown", - "id": "b58a39d0", - "metadata": {}, + "id": "9f5aeaa7", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\beta}_i^{\\mathrm{OLS}} = y_i.\n", @@ -549,8 +657,10 @@ }, { "cell_type": "markdown", - "id": "afbd5077", - "metadata": {}, + "id": "586753f9", + "metadata": { + "editable": true + }, "source": [ "## Ridge Regression\n", "\n", @@ -559,8 +669,10 @@ }, { "cell_type": "markdown", - "id": "22412c03", - "metadata": {}, + "id": "7a9a7e0f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\sum_{i=0}^{p-1}(y_i-\\beta_i)^2+\\lambda\\sum_{i=0}^{p-1}\\beta_i^2,\n", @@ -569,16 +681,20 @@ }, { "cell_type": "markdown", - "id": "8f6f22f0", - "metadata": {}, + "id": "c9937d2c", + "metadata": { + "editable": true + }, "source": [ "and minimizing we have that" ] }, { "cell_type": "markdown", - "id": "2a762fc5", - "metadata": {}, + "id": "7cc6e211", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\beta}_i^{\\mathrm{Ridge}} = \\frac{y_i}{1+\\lambda}.\n", @@ -587,8 +703,10 @@ }, { "cell_type": "markdown", - "id": "f745bcfc", - "metadata": {}, + "id": "abcbbea3", + "metadata": { + "editable": true + }, "source": [ "## Lasso Regression\n", "\n", @@ -597,8 +715,10 @@ }, { "cell_type": "markdown", - "id": "d19b1a18", - "metadata": {}, + "id": "d03cbdc2", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\sum_{i=0}^{p-1}(y_i-\\beta_i)^2+\\lambda\\sum_{i=0}^{p-1}\\vert\\beta_i\\vert=\\sum_{i=0}^{p-1}(y_i-\\beta_i)^2+\\lambda\\sum_{i=0}^{p-1}\\sqrt{\\beta_i^2},\n", @@ -607,16 +727,20 @@ }, { "cell_type": "markdown", - "id": "83d1effc", - "metadata": {}, + "id": "8bca4327", + "metadata": { + "editable": true + }, "source": [ "and minimizing we have that" ] }, { "cell_type": "markdown", - "id": "3d4b494b", - "metadata": {}, + "id": "0da2fc2e", + "metadata": { + "editable": true + }, "source": [ "$$\n", "-2\\sum_{i=0}^{p-1}(y_i-\\beta_i)+\\lambda \\sum_{i=0}^{p-1}\\frac{(\\beta_i)}{\\vert\\beta_i\\vert}=0,\n", @@ -625,16 +749,20 @@ }, { "cell_type": "markdown", - "id": "f256fb7a", - "metadata": {}, + "id": "da0627ac", + "metadata": { + "editable": true + }, "source": [ "which leads to" ] }, { "cell_type": "markdown", - "id": "e4385b4b", - "metadata": {}, + "id": "be38f7a8", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}_i^{\\mathrm{Lasso}} = \\left\\{\\begin{array}{ccc}y_i-\\frac{\\lambda}{2} &\\mathrm{if} & y_i> \\frac{\\lambda}{2}\\\\\n", @@ -645,16 +773,20 @@ }, { "cell_type": "markdown", - "id": "bc046af0", - "metadata": {}, + "id": "51a6bfab", + "metadata": { + "editable": true + }, "source": [ "Plotting these results ([figure in handwritten notes for week 36](https://github.com/CompPhysics/MachineLearning/blob/master/doc/HandWrittenNotes/2021/NotesSeptember9.pdf)) shows clearly 
that Lasso regression suppresses (sets to zero) values of $\\beta_i$ for specific values of $\\lambda$. Ridge regression reduces on the other hand the values of $\\beta_i$ as function of $\\lambda$." ] }, { "cell_type": "markdown", - "id": "ef728521", - "metadata": {}, + "id": "3f05154c", + "metadata": { + "editable": true + }, "source": [ "## Yet another Example\n", "\n", @@ -663,8 +795,10 @@ }, { "cell_type": "markdown", - "id": "511c35c9", - "metadata": {}, + "id": "a3aa94ba", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{y}=\\begin{bmatrix}4 \\\\ 2 \\\\3\\end{bmatrix},\n", @@ -673,16 +807,20 @@ }, { "cell_type": "markdown", - "id": "a4a4d4cd", - "metadata": {}, + "id": "5e96cd27", + "metadata": { + "editable": true + }, "source": [ "and our inputs as a $3\\times 2$ design matrix" ] }, { "cell_type": "markdown", - "id": "c07e4afb", - "metadata": {}, + "id": "6b6056d1", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{X}=\\begin{bmatrix}2 & 0\\\\ 0 & 1 \\\\ 0 & 0\\end{bmatrix},\n", @@ -691,16 +829,20 @@ }, { "cell_type": "markdown", - "id": "58b41942", - "metadata": {}, + "id": "da66af0a", + "metadata": { + "editable": true + }, "source": [ "meaning that we have two features and two unknown parameters $\\beta_0$ and $\\beta_1$ to be determined either by ordinary least squares, Ridge or Lasso regression." ] }, { "cell_type": "markdown", - "id": "377f8025", - "metadata": {}, + "id": "991f82d7", + "metadata": { + "editable": true + }, "source": [ "## The OLS case\n", "\n", @@ -709,8 +851,10 @@ }, { "cell_type": "markdown", - "id": "d7447f1b", - "metadata": {}, + "id": "99f0d5f1", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}^{\\mathrm{OLS}}=\\left( \\boldsymbol{X}^T\\boldsymbol{X}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y}.\n", @@ -719,16 +863,20 @@ }, { "cell_type": "markdown", - "id": "8f4785f7", - "metadata": {}, + "id": "3ace3ef6", + "metadata": { + "editable": true + }, "source": [ "Inserting the above values we obtain that" ] }, { "cell_type": "markdown", - "id": "99c7045b", - "metadata": {}, + "id": "a19a7f99", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}^{\\mathrm{OLS}}=\\begin{bmatrix}2 \\\\ 2\\end{bmatrix},\n", @@ -737,16 +885,20 @@ }, { "cell_type": "markdown", - "id": "059838a5", - "metadata": {}, + "id": "ff0e54e2", + "metadata": { + "editable": true + }, "source": [ "The code which implements this simpler case is presented after the discussion of Ridge and Lasso." 
] }, { "cell_type": "markdown", - "id": "3e012182", - "metadata": {}, + "id": "431feca6", + "metadata": { + "editable": true + }, "source": [ "## The Ridge case\n", "\n", @@ -755,8 +907,10 @@ }, { "cell_type": "markdown", - "id": "b10085df", - "metadata": {}, + "id": "e912f4d2", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}^{\\mathrm{Ridge}}=\\left( \\boldsymbol{X}^T\\boldsymbol{X}+\\lambda\\boldsymbol{I}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y}.\n", @@ -765,16 +919,20 @@ }, { "cell_type": "markdown", - "id": "932e05a5", - "metadata": {}, + "id": "2731c827", + "metadata": { + "editable": true + }, "source": [ "Inserting the above values we obtain that" ] }, { "cell_type": "markdown", - "id": "0eec6072", - "metadata": {}, + "id": "8b6e6b07", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}^{\\mathrm{Ridge}}=\\begin{bmatrix}\\frac{8}{4+\\lambda} \\\\ \\frac{2}{1+\\lambda}\\end{bmatrix},\n", @@ -783,8 +941,10 @@ }, { "cell_type": "markdown", - "id": "b2e38999", - "metadata": {}, + "id": "ea774406", + "metadata": { + "editable": true + }, "source": [ "There is normally a constraint on the value of $\\vert\\vert \\boldsymbol{\\beta}\\vert\\vert_2$ via the parameter $\\lambda$.\n", "Let us for simplicity assume that $\\beta_0^2+\\beta_1^2=1$ as constraint. This will allow us to find an expression for the optimal values of $\\beta$ and $\\lambda$.\n", @@ -794,8 +954,10 @@ }, { "cell_type": "markdown", - "id": "31fe413e", - "metadata": {}, + "id": "7554f52c", + "metadata": { + "editable": true + }, "source": [ "## Writing the Cost Function\n", "\n", @@ -804,8 +966,10 @@ }, { "cell_type": "markdown", - "id": "35b1242e", - "metadata": {}, + "id": "d2a7275f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{X}\\boldsymbol{\\beta}=\\begin{bmatrix} 2\\beta_0 \\\\ \\beta_1 \\\\0 \\end{bmatrix},\n", @@ -814,8 +978,10 @@ }, { "cell_type": "markdown", - "id": "77dbd497", - "metadata": {}, + "id": "e8a2669d", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=(4-2\\beta_0)^2+(2-\\beta_1)^2+\\lambda(\\beta_0^2+\\beta_1^2),\n", @@ -824,16 +990,20 @@ }, { "cell_type": "markdown", - "id": "53267765", - "metadata": {}, + "id": "2d892e13", + "metadata": { + "editable": true + }, "source": [ "and taking the derivative with respect to $\\beta_0$ we get" ] }, { "cell_type": "markdown", - "id": "6a498c50", - "metadata": {}, + "id": "450f74e0", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\beta_0=\\frac{8}{4+\\lambda},\n", @@ -842,16 +1012,20 @@ }, { "cell_type": "markdown", - "id": "80f46853", - "metadata": {}, + "id": "dae76b52", + "metadata": { + "editable": true + }, "source": [ "and for $\\beta_1$ we obtain" ] }, { "cell_type": "markdown", - "id": "0c51bb9b", - "metadata": {}, + "id": "3c4d3b7c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\beta_1=\\frac{2}{1+\\lambda},\n", @@ -860,16 +1034,20 @@ }, { "cell_type": "markdown", - "id": "f78ca610", - "metadata": {}, + "id": "d7a158ed", + "metadata": { + "editable": true + }, "source": [ "Using the constraint for $\\beta_0^2+\\beta_1^2=1$ we can constrain $\\lambda$ by solving" ] }, { "cell_type": "markdown", - "id": "ad42b2e8", - "metadata": {}, + "id": "91ef4b72", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\left(\\frac{8}{4+\\lambda}\\right)^2+\\left(\\frac{2}{1+\\lambda}\\right)^2=1,\n", @@ -878,16 +1056,20 @@ }, { "cell_type": "markdown", - "id": "630f551a", - 
"metadata": {}, + "id": "a5a766b2", + "metadata": { + "editable": true + }, "source": [ "which gives $\\lambda=4.571$ and $\\beta_0=0.933$ and $\\beta_1=0.359$." ] }, { "cell_type": "markdown", - "id": "625f9a79", - "metadata": {}, + "id": "ca4c8552", + "metadata": { + "editable": true + }, "source": [ "## Lasso case\n", "\n", @@ -897,8 +1079,10 @@ }, { "cell_type": "markdown", - "id": "4d3a230c", - "metadata": {}, + "id": "49650ce3", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=(4-2\\beta_0)^2+(2-\\beta_1)^2+\\lambda(\\vert\\beta_0\\vert+\\vert\\beta_1\\vert),\n", @@ -907,8 +1091,10 @@ }, { "cell_type": "markdown", - "id": "28f81c66", - "metadata": {}, + "id": "a3f92b3e", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\frac{\\partial C(\\boldsymbol{\\beta})}{\\partial \\beta_0}=-4(4-2\\beta_0)+\\lambda\\mathrm{sgn}(\\beta_0)=0,\n", @@ -917,16 +1103,20 @@ }, { "cell_type": "markdown", - "id": "240943b2", - "metadata": {}, + "id": "1036291d", + "metadata": { + "editable": true + }, "source": [ "and" ] }, { "cell_type": "markdown", - "id": "69a63eb2", - "metadata": {}, + "id": "75f30dff", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\frac{\\partial C(\\boldsymbol{\\beta})}{\\partial \\beta_1}=-2(2-\\beta_1)+\\lambda\\mathrm{sgn}(\\beta_1)=0.\n", @@ -935,8 +1125,10 @@ }, { "cell_type": "markdown", - "id": "27d7b28a", - "metadata": {}, + "id": "1e1e7506", + "metadata": { + "editable": true + }, "source": [ "We have now four cases to solve besides the trivial cases $\\beta_0$ and/or $\\beta_1$ are zero, namely\n", "1. $\\beta_0 > 0$ and $\\beta_1 > 0$,\n", @@ -950,8 +1142,10 @@ }, { "cell_type": "markdown", - "id": "d1419711", - "metadata": {}, + "id": "38a11a8f", + "metadata": { + "editable": true + }, "source": [ "## The first Case\n", "\n", @@ -960,8 +1154,10 @@ }, { "cell_type": "markdown", - "id": "56fdbb72", - "metadata": {}, + "id": "8346dd0f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "-4(4-2\\beta_0)+\\lambda=0,\n", @@ -970,16 +1166,20 @@ }, { "cell_type": "markdown", - "id": "de38ca69", - "metadata": {}, + "id": "068a7b14", + "metadata": { + "editable": true + }, "source": [ "and" ] }, { "cell_type": "markdown", - "id": "94f54723", - "metadata": {}, + "id": "d74af879", + "metadata": { + "editable": true + }, "source": [ "$$\n", "-2(2-\\beta_1)+\\lambda=0.\n", @@ -988,16 +1188,20 @@ }, { "cell_type": "markdown", - "id": "faffc4f4", - "metadata": {}, + "id": "75eb5818", + "metadata": { + "editable": true + }, "source": [ "which yields" ] }, { "cell_type": "markdown", - "id": "d54aed47", - "metadata": {}, + "id": "82a6ca01", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\beta_0=\\frac{16+\\lambda}{8},\n", @@ -1006,16 +1210,20 @@ }, { "cell_type": "markdown", - "id": "717b2d10", - "metadata": {}, + "id": "3df03602", + "metadata": { + "editable": true + }, "source": [ "and" ] }, { "cell_type": "markdown", - "id": "d0e516c1", - "metadata": {}, + "id": "4dc31587", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\beta_1=\\frac{4+\\lambda}{2}.\n", @@ -1024,16 +1232,20 @@ }, { "cell_type": "markdown", - "id": "45cd8d70", - "metadata": {}, + "id": "39d594b7", + "metadata": { + "editable": true + }, "source": [ "Using the constraint on $\\beta_0$ and $\\beta_1$ we can then find the optimal value of $\\lambda$ for the different cases. We leave this as an exercise to you." 
] }, { "cell_type": "markdown", - "id": "4c701337", - "metadata": {}, + "id": "da4b6d3e", + "metadata": { + "editable": true + }, "source": [ "## Simple code for solving the above problem\n", "\n", @@ -1044,30 +1256,13 @@ }, { "cell_type": "code", - "execution_count": 5, - "id": "1086f536", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[2. 2.]\n", - "Training MSE for OLS\n", - "3.0\n" - ] - }, - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAioAAAGwCAYAAACHJU4LAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAA9hAAAPYQGoP6dpAAA/zUlEQVR4nO3dd3hUZcLG4WcS0hu9SYCArkiV3gULoCAiKrBIR1EWRBBRjK4KLBpFxa4IKKCsIq6g6EdX6b0pKgIWJEs3QBJaQpLz/XE2ZUghCUneKb/7us41M2femXkYIPPknDPvcViWZQkAAMAF+ZgOAAAAkBuKCgAAcFkUFQAA4LIoKgAAwGVRVAAAgMuiqAAAAJdFUQEAAC6rlOkAVyItLU2HDx9WWFiYHA6H6TgAACAfLMtSYmKiqlatKh+fvLeZuHVROXz4sCIjI03HAAAAhRAbG6tq1arlOcati0pYWJgk+w8aHh5uOA0AAMiPhIQERUZGZnyO58Wti0r67p7w8HCKCgAAbiY/h21wMC0AAHBZFBUAAOCyKCoAAMBlufUxKvmVmpqqixcvmo4BL+Pn5ydfX1/TMQDArXl0UbEsS0ePHtXp06dNR4GXKl26tCpXrsw8PwBQSB5dVNJLSsWKFRUcHMyHBUqMZVk6d+6cjh8/LkmqUqWK4UQA4J48tqikpqZmlJRy5cqZjgMvFBQUJEk6fvy4KlasyG4gACgEjz2YNv2YlODgYMNJ4M3S//1xjBQAFI7HFpV07O6BSfz7A4Ar4/FFBQAAuC+KCgAAcFkUFVyRAwcOyOFwaNeuXbmOWbVqlRwOh1t+TXzw4MG68847TccAAK9FUXFBgwcPlsPh0PDhw7PdN2LECDkcDg0ePDhj3fHjx/Xggw+qevXqCggIUOXKldWlSxdt3LgxY0zNmjXlcDiyLS+88EKuOTp27Jgxzt/fX7Vr11Z0dLSSkpIyxkRGRurIkSOqX79+0fzhCym3P1/60rFjx0I97+uvv67Zs2cXaVYABliWlJYmpabaS1bnz0tnz0pnzkiJiVJCghQfL50+ba/L6uRJ6a+/pBMn7OX48czl5EnnscePS0eOZC6HD2cu/5u6IMOxY9KhQ9J//5t9OXLEeezRo1JsbOZy8GDm8t//Zn/erPdnXWJjs+f980/nJS6uQG9zcfDYrye7u8jISM2bN0+vvvpqxtdcL1y4oE8++UTVq1d3Gnv33Xfr4sWLmjNnjmrVqqVjx47pm2++0clL/tNMmjRJw4YNc1p3uVNsDxs2TJMmTVJycrK2bt2qIUOGSJJiYmIkSb6+vqpcufIV/VmLwtatW5X6vx8+GzZs0N133629e/dmnFXb39/fafzFixfl5+d32eeNiIgo+rCAp0hJsT/US5WS0s9gf+aM9O239uX5887LhQtSq1bSHXfYY0+elEaMkJKT7eXixczLlBSpe3fpqafssQkJUpMmdslIScksHOnl4667pJkz7bEXL0qhofZ96UtW3btLixZl3i5d2n7dnNx0k/TNN5m3r75aOnUq57GtWklZfkFU06bZi0O6Bg2kH37IvH3DDdK+fTmPrVVL+u23zNtdu0o7d+Y8tlIlu8iku+cead26nMeGhdnva7qBA6Vly5zH3H+/NGNGzo8vId5ZVM6ezf0+X18pMDB/Y318pP+ViFzHhoQUPJ+kJk2a6Pfff9eCBQvUr18/SdKCBQsUGRmpWrVqZYw7ffq01q1bp1WrVqlDhw6SpBo1aqhFixbZnjMsLKzApSI4ODjjMdWrV9fHH3+s5cuXZxSVAwcOKCoqSjt37tT1118vSVq8eLHGjBmj2NhYtWrVSoMGDcr2vDNmzNCkSZMUFxenLl26qH379po0aZLT7qGvvvpKEyZM0E8//aSqVatq0KBBeuqpp1SqVPZ/thUqVMi4XrZsWUlSxYoVVbp0aUn2t2/effddLVmyRCtXrtS4ceP0zDPP6IEHHtC3336ro0ePqnr16hoxYoRGjx6d8VyDBw/W6dOn9cUXX0iytzI1bNhQgYGBmjlzpvz9/TV8+HBNmDChQO8r4HIsy95CcOiQ/UFXs6Z03XX2fb//Lj3yiF0sTp60P6jj46Vz5+z7n3xSeu45+/rRo1KPHrm/zkMPZRaV5GTp009zH1u3buZ1h8P5w/pSWbd8+PjkXjwk+89aEnx87EWy82e9vHReJT8/e8k6Jt0lv2jJz08KCMj5NS9d7+/v/JmW1aXrcxqbw8/bkmY+gQmhobnf17Wr9H//l3m7YsXM/4yX6tBBWrUq83bNmvYmwayu4D/EkCFDNGvWrIyi8sEHH2jo0KFaleU1Q0NDFRoaqi+++EKtWrVSQG7/eIvA999/r/Xr16tmzZq5jomNjdVdd92l4cOH6x//+Ie2bdumRx991GnM+vXrNXz4cL344ou64447tHLlSj399NNOY5YtW6b+/fvrjTfeUPv27fXbb7/pgQcekCQ9++yzhcr/7LPPKiYmRq+++qp8fX2VlpamatWqaf78+Spfvrw2bNigBx54QFWqVFHv3r1zfZ45c+Zo7Nix2rx5szZu3KjBgwerbdu26tSpU6FyASUmJSXzg+f336UJE+zN++m7GLJ+uP/zn9K//mVfT0113gJxqay/pEVESC1b2r+kBQVJwcH2ZVCQ/SHYvr3z2Ndftz8g/f0zP6zTlxo1MscGB0vr19v5fX3txccn83r6Fh3JXn/wYGZRcDicLy/94I+Ls+/LaUkvGulOnMi8T8peKrL688/c77vUjz/mf+zmzfkfm3Vr0OXk9XdskuXG4uPjLUlWfHx8tvvOnz9v/fzzz9b58+ezP9CuDzkvXbs6jw0Ozn1shw7OY8uXzz6mEAYNGmT16NHDOnHihBUQEGD98ccf1oEDB6zAwEDrxIkTVo8ePaxBgwZljP/Pf/5jlSlTxgoMDLTatGljRUdHW99//73Tc9aoUcPy9/e3QkJCnJbvvvsu1xwdOnSw/Pz8rJCQEMvf39+SZPn4+Fj/+c9/Msb88ccfliRr586dlmVZVnR0tHXddddZaWlpGWPGjx9vSbJOnTplWZZl9enTx+rWr
ZvTa/Xr18+KiIjIuN2+fXvr+eefdxrz0UcfWVWqVLns+/fdd985vZ5lWZYka8yYMZd97IgRI6y7774743b630W6Dh06WO3atXN6TPPmza3x48fn+Hx5/jsEiktSkmVt3WpZM2ZY1sMPW9ZNN1lWhQqW9eyzmWP278/+88rhsKxKlSyrUSPLevXVzLFnzljW9OmW9dlnlvXNN5a1Y4dl/f67Zf31l/1aQAHl9fl9Ke/conLpwVFZXbo57tIDnrK6tGkfOFDoSDkpX768unXrpjlz5siyLHXr1k3ly5fPNu7uu+9Wt27dtHbtWm3cuFFLly7VlClTNHPmTKeDbh977DGn25J01VVX5ZmhX79+euqpp5SQkKAXX3xR4eHhuvvuu3Mdv2fPHrVq1cpporPWrVs7jdm7d6969uzptK5Fixb6+uuvM25v375dW7du1XPpm5NlnxbhwoULOnfuXKFmHG7WrFm2ddOmTdPMmTP1559/6vz580pOTs7YhZWbhg0bOt2uUqVKxjl9AKMOHZJ695a2b5eyHPSeIeuukxo1pOefl6KipMhIqVo1qUqV7FsbJHvryCXHtwElxTuLSkGOGymusfk0dOhQPfTQQ5Kkt99+O9dxgYGB6tSpkzp16qRnnnlG999/v5599lmnYlK+fHldffXVBXr9iIiIjMfMnTtX9erV0/vvv6/77rsvx/FWPnZ1WZaVbcbWSx+XlpamiRMn6q677sr2+MDc9rdeRsglfz/z58/XI488oldeeUWtW7dWWFiYXnrpJW2+zGbVSw/CdTgcSrv0YD2gOCUlSRs22Ac+VqpkHz8iSRUqZJaUsmXtgzkbNJDq17cvsx7z4ecnRUebyQ8UgHcWFTdy6623Kvl/+427dOmS78fVrVs34wDQouLn56cnn3xS0dHR6tu3b45bNXJ63U2bNjndrlOnjrZs2eK0btu2bU63mzRpor179xa4WBXE2rVr1aZNG40YMSJj3W95HawHmJScLK1YIX3yifTll5lbhuvUySwq/v7SggXSNdfY307hFA7wABQVF+fr66s9e/ZkXL9UXFycevXqpaFDh6phw4YKCwvTtm3bNGXKFPW45Mj7xMREHc36tTXZ3+oJz3oQ2mXce++9evLJJ/XOO+9o3Lhx2e4fPny4XnnlFY0dO1YPPvigtm/fnm0eklGjRumGG27Q1KlT1b17d3377bdasmSJ01aWZ555RrfffrsiIyPVq1cv+fj46IcfftDu3bs1efLkfOfNy9VXX60PP/xQy5YtU1RUlD766CNt3bpVUVFRRfL8QJEZP97+6m3WKQcqVZI6d7YXy8osJV27mskIFBMmfHMD4eHhuZaJ0NBQtWzZUq+++qpuuOEG1a9fX08//bSGDRumt956y2nsM888oypVqjgtjz/+eIGy+Pv766GHHtKUKVN0JodjfapXr67PP/9cX331lRo1aqRp06bp+eefdxrTtm1bTZs2TVOnTlWjRo20dOlSPfLII067dLp06aKvv/5aK1asUPPmzdWqVStNnTpVNbJ+C+AKDR8+XHfddZf69Omjli1bKi4uzmnrCmDMpV+tPXXKLimVK0sPP2zv9jl8WPrwQ6l/f7acwKM5rPwcVOCiEhISFBERofj4+Gwf5BcuXNAff/yhqKioQh/TgJIzbNgw/fLLL1q7dq3pKEWKf4cokJMn7a/rTptmT5OQfgD4zz/bM4zecEP2A/4BN5TX5/el2PUDI15++WV16tRJISEhWrJkiebMmaN33nnHdCzAjPPnpTfekF54wZ62XZLmzMksKnXrOh8IC3gRigqM2LJli6ZMmaLExETVqlVLb7zxhu6//37TsYCSlZIizZ5tT7x26JC9rkEDe7K1S77CD3grigqMmD9/vukIgFmWZc9uvWGDfbt6dWnyZOnee9m9A2Th8QfTuvEhOPAA/PtDrhwOqV8/e76TqVOlvXulAQMoKcAlPHaLSvqkXOfOncs4+zBQ0s797zxR+TlTM7zArl328SjpszUPHy716SOVK2c0FuDKPLao+Pr6qnTp0hlTmwcHB2ebDRUoLpZl6dy5czp+/LhKly6d4xw48CKWZR8sO26cPVX97t32yVF9fCgpwGV4bFGRpMqVK0sS52GBMaVLl874dwgvlZwsjRxpT9gmSU2aZJ8nBUCuPLqoOBwOValSRRUrVtTFixdNx4GX8fPzY0uKt/vrL+nuu6U1a+ytJy+9ZE93z9ZdIN88uqik8/X15QMDQMn68Uepe3f7rOrh4fY5epjeHigwrygqAFDixo2zS0qtWtJXXzFhG1BIHv/1ZAAwYv58+7w8W7ZQUoArQFEBgKISF5d5PTzcPm8P3+oBrghFBQCKwk8/SdddJ734oukkgEehqADAldq3T7r5ZunECemzz6SkJNOJAI9BUQGAK3H6tHT77dKxY1KjRtLy5VJAgOlUgMegqABAYaWlSf37S/v32ycVXL7cPncPgCJDUQGAwpo4Ufq//5MCA6WFC6WKFU0nAjwORQUACuOHH6RJk+zr06fbU+MDKHJM+AYAhdGwofTBB9LPP0sDBphOA3gsigoAFNaQIaYTAB6PXT8AUBAffyydPGk6BeA1jBaVxMREjRkzRjVq1FBQUJDatGmjrVu3mowEALlbv97+lk/duvaZkQEUO6NF5f7779eKFSv00Ucfaffu3ercubNuueUWHTp0yGQsAMjuwgXpvvsky5Juu00qX950IsArOCzLsky88Pnz5xUWFqYvv/xS3bp1y1h//fXX6/bbb9fkyZMv+xwJCQmKiIhQfHy8wsPDizMuAG/35JNSTIxUubJ9AG2ZMqYTAW6rIJ/fxg6mTUlJUWpqqgIDA53WBwUFad26dTk+JikpSUlZpqZOSEgo1owAIEnasUOaMsW+/u67lBSgBBnb9RMWFqbWrVvrX//6lw4fPqzU1FTNnTtXmzdv1pEjR3J8TExMjCIiIjKWyMjIEk4NwOtcvCgNHSqlpkp9+kh33mk6EeBVjB6j8tFHH8myLF111VUKCAjQG2+8oXvvvVe+vr45jo+OjlZ8fHzGEhsbW8KJAXidV16Rvv9eKldOevNN02kAr2N0HpXatWtr9erVOnv2rBISElSlShX16dNHUVFROY4PCAhQACf7AlCSBg+WYmOldu2kChVMpwG8jktM+BYSEqKQkBCdOnVKy5Yt05T0fcEAYFrlytLbb5tOAXgto0Vl2bJlsixL1157rX799Vc99thjuvbaazWE2R4BmJaUJLEFFzDO6DEq8fHxGjlypOrUqaOBAweqXbt2Wr58ufz8/EzGAgCpXz+pa1dp3z7TSQCvZmwelaLAPCoAisWmTVLr1pKPj30gbf36phMBHqUgn9+c6wcAsrIsafx4+/qgQZQUwDCKCgBktXixtGaNFBgoTZxoOg3g9SgqAJAuNVV64gn7+sMPS0wqCRhHUQGAdJ99Jv34oz1FfnphAWAURQUA0s2ZY1+OHs35fAAX4RITvgGAS1i4UPr3v6U77jCdBMD/UFQAIF1goHTffaZTAMiCXT8AkJBgH0gLwOVQVADg
0UelOnWkZctMJwFwCXb9APBuR49KH34oJSdLYWGm0wC4BFtUAHi3t9+2S0rr1lKbNqbTALgERQWA9zp7VnrnHfv6uHFmswDIEUUFgPeaPVs6eVKqXVvq0cN0GgA5oKgA8E6WJb3+un197FjJ19dsHgA5oqgA8E4bN0r790uhodLAgabTAMgF3/oB4J1atZJWrZJ++80uKwBcEkUFgHfy8ZE6dLAXAC6LXT8AAMBlUVQAeJ/u3aWHH5YOHzadBMBlUFQAeJc9e6Svv7bnT/HhRyDg6vhfCsC7zJplX3brJlWubDYLgMuiqADwHhcv2uf1kaShQ81mAZAvFBUA3mPJEunYMalSJalrV9NpAOQDRQWA9/jgA/ty4EDJz89sFgD5QlEB4B2OHrUPopWkIUPMZgGQb0z4BsA7pKZKjzwiHTwoXXed6TQA8omiAsA7XHWV9NJLplMAKCB2/QAAAJfFFhUAnu+bb6SUFOmmmziIFnAzbFEB4PkmTpRuvVWaNs10EgAFRFEB4NmOHJHWrbOv9+xpNguAAqOoAPBsCxZIliW1bi1Vq2Y6DYACoqgA8GyffWZf3nOP2RwACoWiAsBzHTsmrVljX7/7brNZABQKRQWA50rf7dOihVSjhuk0AAqBogLAc23caF+y2wdwW8yjAsBzzZkjPfqoVLmy6SQAComiAsBzORxSo0amUwC4Auz6AeCZUlJMJwBQBIwWlZSUFP3zn/9UVFSUgoKCVKtWLU2aNElpaWkmYwFwd/Hx9u6e/v2lpCTTaQBcAaO7fl588UVNmzZNc+bMUb169bRt2zYNGTJEERERGj16tMloANzZihVSXJy0bZsUEGA6DYArYLSobNy4UT169FC3bt0kSTVr1tQnn3yibdu2mYwFwN0tXmxfdu1qNgeAK2Z010+7du30zTffaN++fZKk77//XuvWrVPXXH64JCUlKSEhwWkBACdpadKSJfZ1igrg9oxuURk/frzi4+NVp04d+fr6KjU1Vc8995z69u2b4/iYmBhNnDixhFMCcCu7dklHj0ohIVL79qbTALhCRreofPrpp5o7d64+/vhj7dixQ3PmzNHLL7+sOXPm5Dg+Ojpa8fHxGUtsbGwJJwbg8tK3ptxyC8enAB7A6BaVxx57TE888YT+/ve/S5IaNGigP//8UzExMRo0aFC28QEBAQrgBw+AvHB8CuBRjBaVc+fOycfHeaOOr68vX08GUDiWJXXpIiUnS7fdZjoNgCJgtKh0795dzz33nKpXr6569epp586dmjp1qoYOHWoyFgB35XBIzzxjLwA8gsOyLMvUiycmJurpp5/WwoULdfz4cVWtWlV9+/bVM888I39//8s+PiEhQREREYqPj1d4eHgJJAYAAFeqIJ/fRovKlaKoAMiQliZ99ZV0001SWJjpNADyUJDPb871A8AzbNsm3XmnVKuWXVoAeASKCgDPkP5tn44dJR9+tAGegv/NADwDX0sGPBJFBYD7O3ZM2rrVvn7rrWazAChSFBUA7m/ZMvuySROpShWzWQAUKYoKAPe3YoV9ydYUwONQVAC4N8uSvv3Wvn7zzWazAChyRmemBYAisWqV9M03UuvWppMAKGIUFQDuzeGQrrnGXgB4HHb9AAAAl0VRAeC+0tKkQYOkd96Rzp83nQZAMaCoAHBfu3dLH34oPf64VIo92YAnoqgAcF/p3/a54QbJz89sFgDFgqICwH2lF5WbbjKbA0CxoagAcE8pKdLq1fZ1igrgsSgqANzT9u1SYqJUpozUqJHpNACKCUUFgHtK3+3TsaPk62s0CoDiQ1EB4J6OHrULCrt9AI/msCzLMh2isBISEhQREaH4+HiFh4ebjgOgpCUm2uf64f8/4FYK8vnNxAMA3FdYmOkEAIoZu34AuJ/UVNMJAJQQigoA93PzzVLLltLWraaTAChm7PoB4F7OnpXWr7fnUSlXznQaAMWMLSoA3MvmzXZJqVZNiooynQZAMaOoAHAv69bZl+3bSw6H2SwAih1FBYB7WbvWvmzXzmwOACWCogLAfaSkSBs32tfbtzebBUCJoKgAcB+7dtkH05YuLdWrZzoNgBLAt34AuI9SpaTevaWQEMmH37MAb0BRAeA+rr9e+vRT0ykAlCB+JQEAAC6LogLAPZw6Je3da5+EEIDXoKgAcA8LF0p16kh33GE6CYASRFEB4B7SJ3pr0MBsDgAliqICwD2kFxUmegO8CkUFgOs7elTav9+eMr9NG9NpAJQgigoA17d+vX3ZoIE92RsAr0FRAeD6OL8P4LUoKgBcX9YzJgPwKsxMC8D1TZokrVkj3XCD6SQASpjRLSo1a9aUw+HItowcOdJkLACupmtX6YUXpKpVTScBUMKMblHZunWrUlNTM27/+OOP6tSpk3r16mUwFQAAcBVGi0qFChWcbr/wwguqXbu2OnTokOP4pKQkJSUlZdxOSEgo1nwAXMCcOVKlSvbxKSEhptMAKGEuczBtcnKy5s6dq6FDh8rhcOQ4JiYmRhERERlLZGRkCacEUKJSU6VRo6TbbpN+/dV0GgAGOCzLNc7wNX/+fN177706ePCgquayHzqnLSqRkZGKj49XeHh4SUUFUFJ275YaNpRCQ6XTpyVfX9OJABSBhIQERURE5Ovz22W+9fP+++/rtttuy7WkSFJAQIACAgJKMBUAozZtsi9btKCkAF7KJYrKn3/+qZUrV2rBggWmowBwJRs32petWpnNAcAYlzhGZdasWapYsaK6detmOgoAV5JeVFq3NpsDgDHGi0paWppmzZqlQYMGqVQpl9jAA8AVnDol/fKLfZ0tKoDXMl5UVq5cqYMHD2ro0KGmowBwJVu22JfXXCOVL282CwBjjG/C6Ny5s1zki0cAXMktt9jf+vnrL9NJABhkvKgAQI58faX69U2nAGCY8V0/AAAAuaGoAHA9+/ZJAwZIH3xgOgkAwygqAFzPmjXS3LnShx+aTgLAMIoKANfD/CkA/oeiAsD1pE+dT1EBvB5FBYBrOX1a+vln+zoTvQFej6ICwLVs3mxf1qolVaxoNgsA4ygqAFwLx6cAyIKiAsC1HDsm+fhQVABIkhyWG89fn5CQoIiICMXHxys8PNx0HABFJTFRsiyJ/9eARyrI5zdT6ANwPWFhphMAcBHs+gEAAC6LogLAdYwdK7VpI331lekkAFwEu34AuI5Vq6SdO6WkJNNJALgItqgAcA3nzkk//GBfb9nSbBYALqNARWXKlCk6f/58xu01a9YoKctvPomJiRoxYkTRpQPgPXbskFJTpSpVpGrVTKcB4CIKVFSio6OVmJiYcfv222/XoUOHMm6fO3dO7733XtGlA+A90mekbdlScjjMZgHgMgpUVC6dcsWNp2AB4GrSi0qLFmZzAHApHKMCwDVk3aICAP/Dt34AmJeUJDVuLKWkSM2amU4DwIUUuKjMnDlToaGhkqSUlBTNnj1b5cuXlySn41cAIN8CAqQvvjCdAoALKtC5fmrWrClHPg5y++OPP64oVH5xrh8AANxPsZ3r58CBA1eSCwBydvy4VKEC3/YBkA0H0wI
... (remainder of base64-encoded PNG figure data elided) ...\n",
-       "text/plain": [
-        "
    " - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": 1, + "id": "2e3cf910", + "metadata": { + "collapsed": false, + "editable": true + }, + "outputs": [], "source": [ "%matplotlib inline\n", "\n", @@ -1104,7 +1299,7 @@ "# Decide which values of lambda to use\n", "nlambdas = 100\n", "MSEPredict = np.zeros(nlambdas)\n", - "lambdas = np.logspace(-4, 6, nlambdas)\n", + "lambdas = np.logspace(-4, 4, nlambdas)\n", "for i in range(nlambdas):\n", " lmb = lambdas[i]\n", " Ridgebeta = np.linalg.inv(X.T @ X+lmb*I) @ X.T @ y\n", @@ -1124,246 +1319,33 @@ }, { "cell_type": "markdown", - "id": "d72ebdaa", - "metadata": {}, + "id": "03cc93ef", + "metadata": { + "editable": true + }, "source": [ "We see here that we reach a plateau. What is actually happening?" ] }, { "cell_type": "markdown", - "id": "59d5a66e", - "metadata": {}, + "id": "49b48768", + "metadata": { + "editable": true + }, "source": [ "## With Lasso Regression" ] }, { "cell_type": "code", - "execution_count": 3, - "id": "0477b499", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[2. 2.]\n", - "Training MSE for OLS\n", - "3.0\n", - "[1.99995 1.99980002]\n", - "[1.999925 1.9997 ]\n", - "[1.99993978 1.99975913]\n", - "[1.99990966 1.99963865]\n", - "[1.99992746 1.99970988]\n", - "[1.99989119 1.99956475]\n", - "[1.99991263 1.99965056]\n", - "[1.99986894 1.99947574]\n", - "[1.99989476 1.99957911]\n", - "[1.99984213 1.99936853]\n", - "[1.99987324 1.99949306]\n", - "[1.99980985 1.99923939]\n", - "[1.99984732 1.99938942]\n", - "[1.99977096 1.99908384]\n", - "[1.9998161 1.99926459]\n", - "[1.99972412 1.99889649]\n", - "[1.99977849 1.99911427]\n", - "[1.9996677 1.99867081]\n", - "[1.9997332 1.99893323]\n", - "[1.99959975 1.99839899]\n", - "[1.99967865 1.99871521]\n", - "[1.99951789 1.99807158]\n", - "[1.99961294 1.99845267]\n", - "[1.9994193 1.99767721]\n", - "[1.99953381 1.99813653]\n", - "[1.99930055 1.99720219]\n", - "[1.9994385 1.99775587]\n", - "[1.99915751 1.99663003]\n", - "[1.9993237 1.99729756]\n", - "[1.99898521 1.99594086]\n", - "[1.99918546 1.9967458 ]\n", - "[1.99877769 1.99511075]\n", - "[1.99901896 1.99608161]\n", - "[1.99852772 1.99411088]\n", - "[1.99881845 1.99528218]\n", - "[1.99822663 1.99290653]\n", - "[1.998577 1.9943201]\n", - "[1.99786397 1.99145589]\n", - "[1.99828624 1.99316252]\n", - "[1.99742715 1.98970859]\n", - "[1.99793613 1.99176998]\n", - "[1.99690099 1.98760396]\n", - "[1.99751458 1.99009525]\n", - "[1.99626723 1.98506893]\n", - "[1.99700706 1.98808176]\n", - "[1.99550387 1.98201547]\n", - "[1.9963961 1.98566191]\n", - "[1.99458439 1.97833757]\n", - "[1.99566069 1.98275501]\n", - "[1.99347688 1.97390753]\n", - "[1.9947756 1.97926491]\n", - "[1.99214288 1.96857153]\n", - "[1.99371056 1.97507735]\n", - "[1.99053607 1.96214429]\n", - "[1.99242921 1.97005689]\n", - "[1.98860067 1.95440267]\n", - "[1.99088801 1.9640435 ]\n", - "[1.98626946 1.94507785]\n", - "[1.9890348 1.95684892]\n", - "[1.98346152 1.93384608]\n", - "[1.98680716 1.9482527 ]\n", - "[1.98007934 1.92031737]\n", - "[1.98413059 1.93799826]\n", - "[1.9760055 1.90402199]\n", - "[1.98091621 1.92578916]\n", - "[1.97109854 1.88439414]\n", - "[1.97705827 1.91128596]\n", - "[1.96518808 1.86075233]\n", - "[1.97243128 1.89410423]\n", - "[1.95806892 1.83227569]\n", - "[1.96688672 1.87381451]\n", - "[1.94949387 1.79797548]\n", - "[1.96024953 1.84994524]\n", - "[1.93916519 1.75666075]\n", - "[1.95231424 1.82198978]\n", - "[1.92672425 1.70689701]\n", - "[1.94284104 1.78941903]\n", 
- "[1.9117391 1.64695641]\n", - "[1.93155188 1.75170092]\n", - "[1.89368944 1.57475775]\n", - "[1.91812702 1.70832814]\n", - "[1.87194855 1.48779421]\n", - "[1.90220243 1.65885453]\n", - "[1.84576158 1.38304631]\n", - "[1.88336879 1.60293962]\n", - "[1.81421927 1.25687709]\n", - "[1.86117291 1.54039921]\n", - "[1.77622646 1.10490583]\n", - "[1.83512277 1.47125748]\n", - "[1.73046398 0.9218559 ]\n", - "[1.80469739 1.39579407]\n", - "[1.6753429 0.70137162]\n", - "[1.76936315 1.31457796]\n", - "[1.60894938 0.43579751]\n", - "[1.72859758 1.22847924]\n", - "[1.52897814 0.11591257]\n", - "[1.68192193 1.13865173]\n", - "[1.4326525 0. ]\n", - "[1.62894215 1.04648335]\n", - "[1.31662793 0. ]\n", - "[1.56939714 0.95351665]\n", - "[1.17687593 0. ]\n", - "[1.50321091 0.86134827]\n", - "[1.00854414 0. ]\n", - "[1.43054282 0.77152076]\n", - "[0.8057879 0. ]\n", - "[1.35182854 0.68542204]\n", - "[0.5615673 0. ]\n", - "[1.26780278 0.60420593]\n", - "[0.26740272 0. ]\n", - "[1.17949575 0.52874252]\n", - "[0. 0.]\n", - "[1.0881981 0.45960079]\n", - "[0. 0.]\n", - "[0.99539415 0.39706038]\n", - "[0. 0.]\n", - "[0.90266948 0.34114547]\n", - "[0. 0.]\n", - "[0.81160425 0.29167186]\n", - "[0. 0.]\n", - "[0.7236674 0.24829908]\n", - "[0. 0.]\n", - "[0.64012627 0.21058097]\n", - "[0. 0.]\n", - "[0.56198284 0.17801022]\n", - "[0. 0.]\n", - "[0.48994188 0.15005476]\n", - "[0. 0.]\n", - "[0.42441033 0.12618549]\n", - "[0. 0.]\n", - "[0.3655222 0.10589577]\n", - "[0. 0.]\n", - "[0.31318084 0.08871404]\n", - "[0. 0.]\n", - "[0.26710969 0.07421084]\n", - "[0. 0.]\n", - "[0.22690428 0.06200174]\n", - "[0. 0.]\n", - "[0.19207979 0.0517473 ]\n", - "[0. 0.]\n", - "[0.16211139 0.04315108]\n", - "[0. 0.]\n", - "[0.13646574 0.0359565 ]\n", - "[0. 0.]\n", - "[0.11462415 0.02994311]\n", - "[0. 0.]\n", - "[0.09609807 0.02492265]\n", - "[0. 0.]\n", - "[0.08043851 0.02073509]\n", - "[0. 0.]\n", - "[0.06724062 0.01724499]\n", - "[0. 0.]\n", - "[0.05614483 0.01433809]\n", - "[0. 0.]\n", - "[0.04683565 0.01191824]\n", - "[0. 0.]\n", - "[0.039039 0.00990475]\n", - "[0. 0.]\n", - "[0.03251863 0.00823002]\n", - "[0. 0.]\n", - "[0.02707227 0.00683748]\n", - "[0. 0.]\n", - "[0.02252765 0.0056799 ]\n", - "[0. 0.]\n", - "[0.01873869 0.00471782]\n", - "[0. 0.]\n", - "[0.01558197 0.00391839]\n", - "[0. 0.]\n", - "[0.01295356 0.0032542 ]\n", - "[0. 0.]\n", - "[0.01076611 0.00270244]\n", - "[0. 0.]\n", - "[0.00894639 0.00224413]\n", - "[0. 0.]\n", - "[0.0074331 0.00186347]\n", - "[0. 0.]\n", - "[0.00617499 0.00154733]\n", - "[0. 0.]\n", - "[0.00512927 0.00128479]\n", - "[0. 0.]\n", - "[0.00426027 0.00106677]\n", - "[0. 0.]\n", - "[0.00353823 0.00088573]\n", - "[0. 0.]\n", - "[0.00293838 0.00073541]\n", - "[0. 0.]\n", - "[0.0024401 0.00061058]\n", - "[0. 0.]\n", - "[0.00202624 0.00050694]\n", - "[0. 0.]\n", - "[0.00168251 0.00042089]\n", - "[0. 0.]\n", - "[0.00139705 0.00034944]\n", - "[0. 0.]\n", - "[0.00115999 0.00029012]\n", - "[0. 0.]\n", - "[0.00096314 0.00024087]\n", - "[0. 0.]\n", - "[0.00079968 0.00019998]\n", - "[0. 
0.]\n"
-     ]
-    },
-    {
-     "data": {
-      "image/png": " ... (base64-encoded PNG figure data elided) ...\n",
-      "text/plain": [
-       "
    " - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": 2, + "id": "8401e900", + "metadata": { + "collapsed": false, + "editable": true + }, + "outputs": [], "source": [ "import os\n", "import numpy as np\n", @@ -1408,7 +1390,7 @@ " # and then make the prediction\n", " ypredictRidge = X @ Ridgebeta\n", " MSERidgePredict[i] = MSE(y,ypredictRidge)\n", - " RegLasso = linear_model.Lasso(lmb,fit_intercept=False)\n", + " RegLasso = linear_model.Lasso(lmb)\n", " RegLasso.fit(X,y)\n", " ypredictLasso = RegLasso.predict(X)\n", " print(RegLasso.coef_)\n", @@ -1425,40 +1407,23 @@ }, { "cell_type": "markdown", - "id": "4cb5ab6a", - "metadata": {}, + "id": "32a34305", + "metadata": { + "editable": true + }, "source": [ "## Another Example, now with a polynomial fit" ] }, { "cell_type": "code", - "execution_count": 8, - "id": "53eb2ec4", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[2.0000000e+00 7.1942452e-14 5.0000000e+00]\n", - "Training MSE for OLS\n", - "9.647251180883923e-28\n", - "Test MSE OLS\n", - "1.2167390602128887e-27\n" - ] - }, - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGwCAYAAABVdURTAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAA9hAAAPYQGoP6dpAABuKUlEQVR4nO3dd3gU5drH8e9m0wsJECBBEyAgvYNUUVAhFD2g2BFFJMeooMgBIXoE5KgoryLHglgBRQUVjhUQVJoUKRJACEUMPTHUJJTUnfePSZYsKSRAstnk97muuabds3vPbmRvn3nmGYthGAYiIiIiFYSbsxMQERERuZJU3IiIiEiFouJGREREKhQVNyIiIlKhqLgRERGRCkXFjYiIiFQoKm5ERESkQnF3dgJlzWazceTIEQICArBYLM5OR0RERIrBMAxSU1OpXbs2bm5Ft81UuuLmyJEjhIWFOTsNERERuQQHDx7k6quvLjKm0hU3AQEBgPnhVKlSxcnZiIiISHGkpKQQFhZm/x0vSqUrbnIvRVWpUkXFjYiIiIspTpcSdSgWERGRCkXFjYiIiFQoKm5ERESkQql0fW6KKzs7m8zMTGenIZWIh4cHVqvV2WmIiLg8FTcXMAyDxMRETp065exUpBIKCgoiJCREYzCJiFwGFTcXyC1satasia+vr35kpEwYhsHZs2dJSkoCIDQ01MkZiYi4LhU3eWRnZ9sLm+rVqzs7HalkfHx8AEhKSqJmzZq6RCUiconUoTiP3D42vr6+Ts5EKqvcvz319xIRuXQqbgqgS1HiLPrbExG5fCpuREREpEJRcSMiIiIVioobKXP79u3DYrEQGxtbaMzy5cuxWCwud0t+9+7dGTlypLPTEBGp1FTcVBBDhgzBYrEQHR2db99jjz2GxWJhyJAh9m1JSUk88sgjhIeH4+XlRUhICJGRkaxdu9YeU7duXSwWS77p5ZdfLjSP7t272+M8PT2pX78+MTExpKen22PCwsJISEigefPmV+bkL8OQIUMYMGDAFXu9BQsW8J///OeKvZ6IiEsxDDi4ATLTnJqGbgWvQMLCwpg7dy6vv/66/bbitLQ0Pv/8c8LDwx1iBw4cSGZmJrNnzyYiIoK///6bn3/+mRMnTjjETZo0iaioKIdtF3vcfFRUFJMmTSIjI4MNGzbw0EMPATB58mQArFYrISEhl3WuZS0zMxMPD4+LxlWrVq0MshERKaeSD8GHN4OHL4zdB+5eTklDLTcXYRgGZzOynDIZhlGiXNu2bUt4eDgLFiywb1uwYAFhYWG0adPGvu3UqVP8+uuvvPLKK/To0YM6derQoUMHYmJi6Nevn8NrBgQEEBIS4jD5+fkVmYevry8hISGEh4czcOBAevbsyZIlS+z7C7ostXDhQho2bIiPjw89evRg3759+V73/fffJywsDF9fX2677TamTp1KUFCQQ8x3331Hu3bt8Pb2JiIigueff56srKwC85w4cSKzZ8/mm2++sbc2LV++3J7fF198Qffu3fH29mbOnDkcP36ce++9l6uvvhpfX19atGjB559/7vCaF16Wqlu3Li+99BJDhw4lICCA8PBw3nvvvSI/PxERl3XwN3Neo5HTChtQy81FncvMpun4H53y3jsmReLrWbKv6KGHHmLmzJkMGjQIgI8++oihQ4eyfPlye4y/vz/+/v58/fXXdOrUCS+v0vsD3LJlC6tXr6Zu3bqFxhw8eJDbb7+d6OhoHn30UTZu3Mi//vUvh5jVq1cTHR3NK6+8wj/+8Q9++uknnnvuOYeYH3/8kfvvv5833niDbt26sXfvXv75z38CMGHChHzvO3r0aOLi4khJSWHmzJmA2fJy5MgRAMaOHctrr73GzJkz8fLyIi0tjXbt2jF27FiqVKnCDz/8wODBg4mIiKBjx46Fnt9rr73Gf/7zH5555hm++uorHn30Ua6//noaN25crM9QRMRlHNpgzsMK/zexLKjlpoIZPHgwv/76K/v27WP//v2sXr2a+++/3yHG3d2dWbNmMXv2bIKCgujatSvPPPMMW7duzfd6Y8eOtRdDuVPeQqkg06dPx9/fHy8vL1q3bs3Ro0cZM2ZMofHvvPMOERERvP766zRq1IhBgwY59A8CePPNN+nTpw+jR4+mYcOGPPbYY/Tp08ch5sUXX2TcuHE8+OCDRERE0LNnT/7zn//w7rvvFvi+/v7++Pj42PschYSE4Onpad8/cuRIbr/9durVq0ft2rW56qqrGD16NK1btyYiIoIRI0YQGRnJl19+WeTn0bdvXx577DEaNGjA2LFjCQ4OvuhnKCLiknJbbq6+1qlpqOXmInw8rOyYFOm09y6p4OBg+vXrx+zZszEMg379+hEcHJwvbuDAgfTr149Vq1
axdu1aFi9ezJQpU/jggw8cCosxY8bkKzSuuuqqInMYNGgQzz77LCkpKbzyyitUqVKFgQMHFhofFxdHp06dHAaw69y5s0PMrl27uO222xy2dejQge+//96+vmnTJjZs2MCLL75o35adnU1aWhpnz54t8cjT7du3d1jPzs7m5ZdfZt68eRw+fJj09HTS09MvepmuZcuW9mWLxUJISIj9GVIiIhVGxllI3GYuO7nlRsXNRVgslhJfGnK2oUOHMnz4cADefvvtQuO8vb3p2bMnPXv2ZPz48QwbNowJEyY4FDPBwcE0aNCgRO8fGBhoP2bOnDk0a9aMDz/8kIcffrjA+OL0LTIMI9/ovRceZ7PZeP7557n99tvzHe/t7V3c9O0uLFpee+01Xn/9daZNm0aLFi3w8/Nj5MiRZGRkFPk6F3ZEtlgs2Gy2EucjIlKuHdkMtiwICIXAq52aimv9akux9O7d2/6DGxlZ/Fanpk2b8vXXX1/RXDw8PHjmmWeIiYnh3nvvLbD1pKD3XbduncN648aNWb9+vcO2jRs3Oqy3bduWXbt2lagY8/T0JDs7u1ixq1aton///vbLfDabjT179tCkSZNiv5+ISIV1KOff6LAO4ORHyajPTQVktVqJi4sjLi6uwCdLHz9+nBtvvJE5c+awdetW4uPj+fLLL5kyZQr9+/d3iE1NTSUxMdFhSklJKVE+9913HxaLhenTpxe4Pzo6mr179zJq1Ch27drFZ599xqxZsxxiRowYwcKFC5k6dSp79uzh3XffZdGiRQ6tOePHj+fjjz9m4sSJbN++nbi4OObNm8e///3vQnOrW7cuW7duZdeuXRw7dqzIB1Y2aNCApUuXsmbNGuLi4njkkUdITEws0WchIlJhHcwpbq7u4Nw8UHFTYVWpUoUqVaoUuM/f35+OHTvy+uuvc/3119O8eXOee+45oqKieOuttxxix48fT2hoqMP09NNPlygXT09Phg8fzpQpUzh9+nS+/eHh4cyfP5/vvvuOVq1aMWPGDF566SWHmK5duzJjxgymTp1Kq1atWLx4MU899ZTD5abIyEi+//57li5dyrXXXkunTp2YOnUqderUKTS3qKgoGjVqRPv27alRowarV68uNPa5556jbdu2REZG0r17d0JCQq7oAIAiIi7LMM53JnZyfxsAi1HSwVRcXEpKCoGBgSQnJ+f78U9LSyM+Pp569epdUh8NKVtRUVHs3LmTVatWOTuVK0Z/gyLiko7vhTfbgtULYg6Wyhg3Rf1+X0h9bsRlvPrqq/Ts2RM/Pz8WLVrE7NmzC73UJSIiZSj3klTt1k4dvC+XihtxGevXr2fKlCmkpqYSERHBG2+8wbBhw5ydloiI5O1MXA6ouBGX8cUXXzg7BRERKUg56kwM6lAsIiIilyMtBZJ2mMvlpOVGxY2IiIhcusObwLBBUB0ICHF2NoCKGxEREbkc9odllo9WG1BxIyIiIpejHI1vk0vFjYiIiFwamw0O5rTcOPlJ4HmpuJEyt2/fPiwWC7GxsYXGLF++HIvFwqlTp8osLxERKaFjuyE9GTx8oVZzZ2djp+KmghgyZAgWi4Xo6Oh8+x577DEsFovD076TkpJ45JFHCA8Px8vLi5CQECIjI1m7dq09pm7dulgslnzTyy+/XGge3bt3t8d5enpSv359YmJiSE9Pt8eEhYWRkJBA8+bO/Q+hsPPLnbp3737Jr929e3dGjhx5xXIVESmXci9JXdUOrOVndJnyk4lctrCwMObOncvrr7+Oj48PYA7n//nnnxMeHu4QO3DgQDIzM5k9ezYRERH8/fff/Pzzz5w4ccIhbtKkSURFRTlsCwgIKDKPqKgoJk2aREZGBhs2bOChhx4CYPLkyYD5YM+QEOf3qN+wYYP9ieBr1qxh4MCB7Nq1yz6st6enpzPTExEp/8rZ4H251HJTgbRt25bw8HAWLFhg37ZgwQLCwsJo06aNfdupU6f49ddfeeWVV+jRowd16tShQ4cOxMTE0K9fP4fXDAgIICQkxGHy8/MrMg9fX19CQkIIDw9n4MCB9OzZkyVLltj3F3RZauHChTRs2BAfHx969OjBvn378r3u+++/T1hYGL6+vtx2221MnTqVoKAgh5jvvvuOdu3a4e3tTUREBM8//zxZWVkF5lmjRg37OVWrVg2AmjVr2rft3LmT66+/Hh8fH8LCwnjiiSc4c+aM/fjp06dzzTXX4O3tTa1atbjjjjsAsxVtxYoV/Pe//7W3AhV0PiIiLi8+59l+YZ2cm8cFVNwU15kzhU9pacWPPXeueLGX6KGHHmLmzJn29Y8++oihQ4c6xPj7++Pv78/XX3/tcLmoNGzZsoXVq1fj4eFRaMzBgwe5/fbb6du3L7GxsQwbNoxx48Y5xKxevZro6GiefPJJYmNj6dmzJy+++KJDzI8//sj999/PE088wY4dO3j33XeZNWtWvrji2LZtG5GRkdx+++1s3bqVefPm8euvvzJ8+HAANm7cyBNPPMGkSZPYtWsXixcv5vrrrwfgv//9L507dyYqKoqEhAQSEhIICwsrcQ4iIuXaib/g1H5w84A6XZydjSOjkklOTjYAIzk5Od++c+fOGTt27DDOnTuX/0Dzge4FT337Osb6+hYee8MNjrHBwQXHldCDDz5o9O/f3zh69Kjh5eVlxMfHG/v27TO8vb2No0ePGv379zcefPBBe/xXX31lVK1a1fD29ja6dOlixMTEGFu2bHF4zTp16hienp6Gn5+fw7Rs2bJC87jhhhsMDw8Pw8/Pz/D09DQAw83Nzfjqq6/sMfHx8QZgbN682TAMw4iJiTGaNGli2Gw2e8zYsWMNwDh58qRhGIZx9913G/369XN4r0GDBhmBgYH29W7duhkvvfSSQ8wnn3xihIaGXvTzW7ZsmcP7DR482PjnP//pELNq1SrDzc3NOHfunDF//nyjSpUqRkpKSqGfw5NPPnnR971QkX+DIiLlyfoPDGNCFcP4qO/FY6+Aon6/L6Q+NxVMcHAw/fr1Y/bs2RiGQb9+/QgODs4XN3DgQPr168eqVatYu3YtixcvZsqUKXzwwQcOHY/HjBnjsA5w1VVXFZnDoEGDePbZZ0lJSeGVV16hSpUqDBw4sND4uLg4OnXqhMVisW/r3LmzQ8yuXbu47bbbHLZ16NCB77//3r6+adMmNmzY4NBSk52dTVpaGmfPnsXX17fIvPPatGkTf/75J59++ql9m2EY2Gw24uPj6dmzJ3Xq1CEiIoLevXvTu3dvbrvtthK9h4iIS9v7izmv392paRRExU1xnT5d+D6r1XE9KanwWLcLrgSWQl+MoUOH2i+fvP3224XGeXt707NnT3r27Mn48eMZNmwYEyZMcChmgoODadCgQYnePzAw0H7MnDlzaNasGR9++CEPP/xwgfGGYVz0NQ3DcCh+CjrOZrPx/PPPc/vtt+c73tvbu7jp21/rkUce4Yknnsi3Lzw8HE9PT37//XeWL1/OkiVLGD9+PBMnTmTDhg35+gGJiFQ42Vnn+9vUv9G5uRRAxU1xXaQTbZnEFlPv3r3Jy
MgAIDIystjHNW3alK+//vqK5uLh4cEzzzxDTEwM9957b4EtGwW977p16xzWGzduzPr16x22bdy40WG9bdu27Nq1q8TFWEHatm3L9u3bi3wtd3d3br75Zm6++WYmTJhAUFAQv/zyC7fffjuenp72O7FERCqcI5vN8W28gyC0tbOzyUcdiisgq9VKXFwccXFxWC9sVQKOHz/OjTfeyJw5c9i6dSvx8fF8+eWXTJkyhf79+zvEpqamkpiY6DClpKSUKJ/77rsPi8XC9OnTC9wfHR3N3r17GTVqFLt27eKzzz5j1qxZDjEjRoxg4cKFTJ06lT179vDuu++yaNEih9ac8ePH8/HHHzNx4kS2b99OXFwc8+bN49///neJ8gUYO3Ysa9eu5fHHHyc2NpY9e/bw7bffMmLECAC+//573njjDWJjY9m/fz8ff/wxNpuNRo0aAeYYOr/99hv79u3j2LFj2Gy2EucgIlJu5V6SirgB3PL/zjibipsKqkqVKvbxWi7k7+9Px44def3117n++utp3rw5zz33HFFRUbz11lsOsePHjyc0NNRhevrpp0uUi6enJ8OHD2fKlCmcLuDyXnh4OPPnz+e7776jVatWzJgxg5deeskhpmvXrsyYMYOpU6fSqlUrFi9ezFNPPeVwuSkyMpLvv/+epUuXcu2119KpUyemTp1KnTp1SpQvQMuWLVmxYgV79uyhW7dutGnThueee47Q0FAAgoKCWLBgATfeeCNNmjRhxowZfP755zRr1gyA0aNHY7Vaadq0KTVq1ODAgQMlzkFEpNz6a5k5L4eXpAAsRnE6PFQgKSkpBAYGkpycnO/HPy0tjfj4eOrVq1fiPhpS9qKioti5cyerVq1ydipXjP4GRaTcS0uBV+qCkQ1PboWqJf8fyEtR1O/3hdTnRlzGq6++Ss+ePfHz82PRokXMnj270EtdIiJSSvb9ahY21SLKrLApKRU34jLWr1/PlClTSE1NJSIigjfeeINhw4Y5Oy0RkcqlnF+SAhU34kK++OILZ6cgIiL2zsQ9nJtHEdShWERERIrn1EE4/idYrFCvm7OzKZSKGxERESme3EtSV7cH70Dn5lIEpxY3kydP5tprryUgIICaNWsyYMAAdu3aVeQxy5cvtz9pOe+0c+fOMspaRESkknKBS1Lg5OJmxYoVPP7446xbt46lS5eSlZVFr169OFOMp2Lv2rXL/sTlhIQErrnmmjLIWEREpJKyZcNfy83l+uW7uHFqh+LFixc7rM+cOZOaNWuyadMmrr/++iKPrVmzpp7hIyIiUlYStsC5k+BVBa5q5+xsilSu+twkJycDUK1atYvGtmnThtDQUG666SaWLVtWaFx6ejopKSkOk4iIiJTQzh/MecQNYPVwbi4XUW6KG8MwGDVqFNdddx3NmzcvNC40NJT33nuP+fPns2DBAho1asRNN93EypUrC4yfPHkygYGB9iksLKy0TkEquYkTJ9K6dWtnpyEicuUZBuz4xlxuOqDwuD174P/+D7ZtK5O0ClNuipvhw4ezdetWPv/88yLjGjVqRFRUFG3btqVz585Mnz6dfv368eqrrxYYHxMTQ3Jysn06ePBgaaTvdEOGDMFisRAdHZ1v32OPPYbFYmHIkCH2bUlJSTzyyCOEh4fj5eVFSEgIkZGRrF271h5Tt27dAjtvv/zyy4Xm0b17d0aOHHklT63UzJo164pe2hw9ejQ///zzFXs9EZFy4+hOOL4HrJ5wTa/C4+rWhc6dITa2rDIrULkYxG/EiBF8++23rFy5kquvvrrEx3fq1Ik5c+YUuM/LywsvL6/LTdElhIWFMXfuXF5//XV8fHwA81lFn3/+OeHh4Q6xAwcOJDMzk9mzZxMREcHff//Nzz//zIkTJxziJk2aRFRUlMO2gICA0j2RciYjIwNPT8+Lxvn7++Pv718GGYmIlLHcVpv6N4F3Ec918vCA664zJydyasuNYRgMHz6cBQsW8Msvv1CvXr1Lep3Nmzfbn9ZcmbVt25bw8HAWLFhg37ZgwQLCwsJo06aNfdupU6f49ddfeeWVV+jRowd16tShQ4cOxMTE0K9fP4fXDAgIICQkxGHy8/O75BzHjh1Lw4YN8fX1JSIigueee47MzEz7/i1bttCjRw8CAgKoUqUK7dq1Y+PGjQDs37+fW2+9lapVq+Ln50ezZs1YuHCh/dgVK1bQoUMHvLy8CA0NZdy4cWRlZRWYx/Lly3nooYdITk62t0hNnDgRMFusXnjhBYYMGUJgYKC9uLtY7hdelhoyZAgDBgzg1VdfJTQ0lOrVq/P44487HCMi4hLsl6T+4dw8ismpLTePP/44n332Gd988w0BAQEkJiYCEBgYaG95iImJ4fDhw3z88ccATJs2jbp169KsWTMyMjKYM2cO8+fPZ/78+aWTpGFA5tnSee2L8fAFi6VEhzz00EPMnDmTQYMGAfDRRx8xdOhQli9fbo/JbWH4+uuv6dSpU5m2bAUEBDBr1ixq167Ntm3biIqKIiAggKeffhqAQYMG0aZNG9555x2sViuxsbF4eJgd1x5//HEyMjJYuXIlfn5+7Nixw95ScvjwYfr27cuQIUP4+OOP2blzJ1FRUXh7e9uLlry6dOnCtGnTGD9+vH1spbytLv/3f//Hc889x7///e9i516QZcuWERoayrJly/jzzz+5++67ad26db7WMBGRcuvYHkjaAW7u0KhP4XFr1sCcOTBgAPQq4tJVGXBqcfPOO+8AZj+NvGbOnGnvH5KQkMCBAwfs+zIyMhg9ejSHDx/Gx8eHZs2a8cMPP9C3b9/SSTLzLLxUu3Re+2KeOQKeJWslGTx4MDExMezbtw+LxcLq1auZO3euQ3Hj7u7OrFmziIqKYsaMGbRt25YbbriBe+65h5YtWzq83tixYx1+4AG+//77fN9ZceV9rbp16/Kvf/2LefPm2QuEAwcOMGbMGBo3bgzgMH7RgQMHGDhwIC1atAAgIiLCvm/69OmEhYXx1ltvYbFYaNy4MUeOHGHs2LGMHz8eNzfHRkpPT08CAwOxWCyEhITky/PGG29k9OjRJcq9IFWrVuWtt97CarXSuHFj+vXrx88//6ziRkRcR26rTb0bwKdq4XFffw3vvANnz1bu4sYwjIvGzJo1y2H96aefLvLHpLILDg6mX79+zJ49G8Mw6NevH8HBwfniBg4cSL9+/Vi1ahVr165l8eLFTJkyhQ8++MCh4/GYMWMc1gGuuuqqS87vq6++Ytq0afz555+cPn2arKwsqlQ5f/121KhRDBs2jE8++YSbb76ZO++8k/r16wPwxBNP8Oijj7JkyRJuvvlmBg4caC/G4uLi6Ny5M5Y8LV1du3bl9OnTHDp0KF+fo4tp3759iXMvSLNmzbBarfb10NBQtjn5LgIRkRKJ+9acN+1fdNzSpea8Z8/SzacYykWH4nLNw9dsQXHWe1+CoUOHMnz4cADefvvtQuO8vb3p2bMnPXv2ZPz48QwbNowJEyY4FDPBwcE0aNDgkvK40Lp167jnnnt4/vnniYyMJDAwkLlz5/Laa6/ZYyZO
[... base64-encoded PNG data of the notebook figure output deleted in this hunk omitted ...]
    " - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "execution_count": 3, + "id": "de2c4052", + "metadata": { + "collapsed": false, + "editable": true + }, + "outputs": [], "source": [ "import os\n", "import numpy as np\n", @@ -1479,7 +1444,7 @@ "np.random.seed(3155)\n", "\n", "x = np.random.rand(100)\n", - "y = 2.0+5*x*x#+0.1*np.random.randn(100)\n", + "y = 2.0+5*x*x+0.1*np.random.randn(100)\n", "\n", "# number of features p (here degree of polynomial\n", "p = 3\n", @@ -1510,12 +1475,12 @@ "MSETrain = np.zeros(nlambdas)\n", "MSELassoPredict = np.zeros(nlambdas)\n", "MSELassoTrain = np.zeros(nlambdas)\n", - "lambdas = np.logspace(-4, 0, nlambdas)\n", + "lambdas = np.logspace(-4, 4, nlambdas)\n", "for i in range(nlambdas):\n", " lmb = lambdas[i]\n", " Ridgebeta = np.linalg.inv(X_train.T @ X_train+lmb*I) @ X_train.T @ y_train\n", " # include lasso using Scikit-Learn\n", - " RegLasso = linear_model.Lasso(lmb,fit_intercept=False)\n", + " RegLasso = linear_model.Lasso(lmb)\n", " RegLasso.fit(X_train,y_train)\n", " # and then make the prediction\n", " ytildeRidge = X_train @ Ridgebeta\n", @@ -1542,8 +1507,10 @@ }, { "cell_type": "markdown", - "id": "b9ef9e8f", - "metadata": {}, + "id": "9cce830e", + "metadata": { + "editable": true + }, "source": [ "## Linking the regression analysis with a statistical interpretation\n", "\n", @@ -1569,8 +1536,10 @@ }, { "cell_type": "markdown", - "id": "ecc76b71", - "metadata": {}, + "id": "5161a229", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\begin{align*} \n", @@ -1583,8 +1552,10 @@ }, { "cell_type": "markdown", - "id": "55711f96", - "metadata": {}, + "id": "7bba88f0", + "metadata": { + "editable": true + }, "source": [ "The randomness of $\\varepsilon_i$ implies that\n", "$\\mathbf{y}_i$ is also a random variable. 
In particular,\n", @@ -1600,8 +1571,10 @@ }, { "cell_type": "markdown", - "id": "c308d8ab", - "metadata": {}, + "id": "2a2a2293", + "metadata": { + "editable": true + }, "source": [ "## Assumptions made\n", "\n", @@ -1612,8 +1585,10 @@ }, { "cell_type": "markdown", - "id": "8947c746", - "metadata": {}, + "id": "16cbf75c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{y} = f(\\boldsymbol{x})+\\boldsymbol{\\varepsilon}\n", @@ -1622,8 +1597,10 @@ }, { "cell_type": "markdown", - "id": "f8b2eeba", - "metadata": {}, + "id": "175c5b85", + "metadata": { + "editable": true + }, "source": [ "We approximate this function with our model from the solution of the linear regression equations, that is our\n", "function $f$ is approximated by $\\boldsymbol{\\tilde{y}}$ where we want to minimize $(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2$, our MSE, with" @@ -1631,8 +1608,10 @@ }, { "cell_type": "markdown", - "id": "ef7bd7b3", - "metadata": {}, + "id": "c6f44f70", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{\\tilde{y}} = \\boldsymbol{X}\\boldsymbol{\\beta}.\n", @@ -1641,8 +1620,10 @@ }, { "cell_type": "markdown", - "id": "45571ed5", - "metadata": {}, + "id": "7b55c677", + "metadata": { + "editable": true + }, "source": [ "## Expectation value and variance\n", "\n", @@ -1651,8 +1632,10 @@ }, { "cell_type": "markdown", - "id": "50d372c4", - "metadata": {}, + "id": "e6fc4a97", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\begin{align*} \n", @@ -1665,8 +1648,10 @@ }, { "cell_type": "markdown", - "id": "1e17772d", - "metadata": {}, + "id": "829c23cf", + "metadata": { + "editable": true + }, "source": [ "while\n", "its variance is" @@ -1674,8 +1659,10 @@ }, { "cell_type": "markdown", - "id": "54d7619a", - "metadata": {}, + "id": "b76ecfb6", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\begin{align*} \\mbox{Var}(y_i) & = \\mathbb{E} \\{ [y_i\n", @@ -1695,8 +1682,10 @@ }, { "cell_type": "markdown", - "id": "510d4042", - "metadata": {}, + "id": "7d7a418d", + "metadata": { + "editable": true + }, "source": [ "Hence, $y_i \\sim \\mathcal{N}( \\mathbf{X}_{i, \\ast} \\, \\boldsymbol{\\beta}, \\sigma^2)$, that is $\\boldsymbol{y}$ follows a normal distribution with \n", "mean value $\\boldsymbol{X}\\boldsymbol{\\beta}$ and variance $\\sigma^2$ (not be confused with the singular values of the SVD)." 
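As a quick numerical cross-check of the two results above (an editorial sketch, not part of the original notebook), the snippet below simulates many realizations of $\boldsymbol{y}=\boldsymbol{X}\boldsymbol{\beta}+\boldsymbol{\varepsilon}$ for a made-up design matrix, parameter vector and noise level, and compares the sample mean and variance of one component $y_i$ with $\mathbf{X}_{i,*}\boldsymbol{\beta}$ and $\sigma^2$.

```python
import numpy as np

np.random.seed(3155)

# made-up design matrix, parameters and noise level (illustration only)
n, p = 5, 3
X = np.random.randn(n, p)
beta = np.array([1.0, -2.0, 0.5])
sigma = 0.3

# many independent realizations of y = X beta + eps
n_samples = 100000
eps = sigma * np.random.randn(n_samples, n)
y = X @ beta + eps                      # shape (n_samples, n)

i = 0                                   # inspect a single component y_i
print("E[y_i]   theory :", (X @ beta)[i])
print("E[y_i]   sample :", np.mean(y[:, i]))
print("Var(y_i) theory :", sigma**2)
print("Var(y_i) sample :", np.var(y[:, i]))
```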
@@ -1704,8 +1693,10 @@ }, { "cell_type": "markdown", - "id": "c1b2cec7", - "metadata": {}, + "id": "c773b5a1", + "metadata": { + "editable": true + }, "source": [ "## Expectation value and variance for $\\boldsymbol{\\beta}$\n", "\n", @@ -1714,8 +1705,10 @@ }, { "cell_type": "markdown", - "id": "5229aab4", - "metadata": {}, + "id": "83da9dfe", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}(\\boldsymbol{\\beta}) = \\mathbb{E}[ (\\mathbf{X}^{\\top} \\mathbf{X})^{-1}\\mathbf{X}^{T} \\mathbf{Y}]=(\\mathbf{X}^{T} \\mathbf{X})^{-1}\\mathbf{X}^{T} \\mathbb{E}[ \\mathbf{Y}]=(\\mathbf{X}^{T} \\mathbf{X})^{-1} \\mathbf{X}^{T}\\mathbf{X}\\boldsymbol{\\beta}=\\boldsymbol{\\beta}.\n", @@ -1724,8 +1717,10 @@ }, { "cell_type": "markdown", - "id": "d64c421e", - "metadata": {}, + "id": "c7133c36", + "metadata": { + "editable": true + }, "source": [ "This means that the estimator of the regression parameters is unbiased.\n", "\n", @@ -1736,8 +1731,10 @@ }, { "cell_type": "markdown", - "id": "fb80e64f", - "metadata": {}, + "id": "5b471cd7", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\begin{eqnarray*}\n", @@ -1765,8 +1762,10 @@ }, { "cell_type": "markdown", - "id": "3883988a", - "metadata": {}, + "id": "b1c59e2f", + "metadata": { + "editable": true + }, "source": [ "where we have used that $\\mathbb{E} (\\mathbf{Y} \\mathbf{Y}^{T}) =\n", "\\mathbf{X} \\, \\boldsymbol{\\beta} \\, \\boldsymbol{\\beta}^{T} \\, \\mathbf{X}^{T} +\n", @@ -1785,8 +1784,10 @@ }, { "cell_type": "markdown", - "id": "d3b9fe91", - "metadata": {}, + "id": "0bc822be", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E} \\big[ \\boldsymbol{\\beta}^{\\mathrm{Ridge}} \\big]=(\\mathbf{X}^{T} \\mathbf{X} + \\lambda \\mathbf{I}_{pp})^{-1} (\\mathbf{X}^{\\top} \\mathbf{X})\\boldsymbol{\\beta}^{\\mathrm{OLS}}.\n", @@ -1795,8 +1796,10 @@ }, { "cell_type": "markdown", - "id": "1cc565ef", - "metadata": {}, + "id": "78f3bc26", + "metadata": { + "editable": true + }, "source": [ "We see clearly that \n", "$\\mathbb{E} \\big[ \\boldsymbol{\\beta}^{\\mathrm{Ridge}} \\big] \\not= \\boldsymbol{\\beta}^{\\mathrm{OLS}}$ for any $\\lambda > 0$. We say then that the ridge estimator is biased.\n", @@ -1806,8 +1809,10 @@ }, { "cell_type": "markdown", - "id": "d45602c7", - "metadata": {}, + "id": "d0cf1c6e", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mbox{Var}[\\boldsymbol{\\beta}^{\\mathrm{Ridge}}]=\\sigma^2[ \\mathbf{X}^{T} \\mathbf{X} + \\lambda \\mathbf{I} ]^{-1} \\mathbf{X}^{T} \\mathbf{X} \\{ [ \\mathbf{X}^{\\top} \\mathbf{X} + \\lambda \\mathbf{I} ]^{-1}\\}^{T},\n", @@ -1816,8 +1821,10 @@ }, { "cell_type": "markdown", - "id": "18b63cb2", - "metadata": {}, + "id": "8e6e0610", + "metadata": { + "editable": true + }, "source": [ "and it is easy to see that if the parameter $\\lambda$ goes to infinity then the variance of Ridge parameters $\\boldsymbol{\\beta}$ goes to zero. 
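A minimal sketch (toy design matrix and noise variance chosen purely for illustration, not part of the original notebook) that evaluates the two covariance expressions above and prints their traces for increasing $\lambda$, making the shrinking variance of the Ridge estimator explicit.

```python
import numpy as np

np.random.seed(3155)

n, p = 100, 5
X = np.random.randn(n, p)       # toy design matrix (illustration only)
sigma2 = 1.0                    # assumed noise variance
XtX = X.T @ X
I = np.eye(p)

# Var[beta_OLS] = sigma^2 (X^T X)^{-1}
VarOLS = sigma2 * np.linalg.inv(XtX)
print(f"trace Var[beta_OLS]                  = {np.trace(VarOLS):.5f}")

# Var[beta_Ridge] = sigma^2 (X^T X + lambda I)^{-1} X^T X [(X^T X + lambda I)^{-1}]^T
for lmb in [0.1, 1.0, 10.0, 100.0, 1000.0]:
    A = np.linalg.inv(XtX + lmb * I)
    VarRidge = sigma2 * A @ XtX @ A.T
    print(f"lambda = {lmb:8.1f}  trace Var[beta_Ridge] = {np.trace(VarRidge):.5f}")
```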
\n", "\n", @@ -1826,8 +1833,10 @@ }, { "cell_type": "markdown", - "id": "a5fcd345", - "metadata": {}, + "id": "ba0e0b34", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mbox{Var}[\\boldsymbol{\\beta}^{\\mathrm{OLS}}]-\\mbox{Var}(\\boldsymbol{\\beta}^{\\mathrm{Ridge}})=\\sigma^2 [ \\mathbf{X}^{T} \\mathbf{X} + \\lambda \\mathbf{I} ]^{-1}[ 2\\lambda\\mathbf{I} + \\lambda^2 (\\mathbf{X}^{T} \\mathbf{X})^{-1} ] \\{ [ \\mathbf{X}^{T} \\mathbf{X} + \\lambda \\mathbf{I} ]^{-1}\\}^{T}.\n", @@ -1836,8 +1845,10 @@ }, { "cell_type": "markdown", - "id": "1820cc18", - "metadata": {}, + "id": "55705cba", + "metadata": { + "editable": true + }, "source": [ "The difference is non-negative definite since each component of the\n", "matrix product is non-negative definite. \n", @@ -1846,8 +1857,10 @@ }, { "cell_type": "markdown", - "id": "b9568e86", - "metadata": {}, + "id": "abf4bfdf", + "metadata": { + "editable": true + }, "source": [ "## Deriving OLS from a probability distribution\n", "\n", @@ -1867,8 +1880,10 @@ }, { "cell_type": "markdown", - "id": "5b51dd32", - "metadata": {}, + "id": "e77e798c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "y_i\\sim \\mathcal{N}(\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta}, \\sigma^2)=\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}.\n", @@ -1877,8 +1892,10 @@ }, { "cell_type": "markdown", - "id": "446c1afd", - "metadata": {}, + "id": "26d4690d", + "metadata": { + "editable": true + }, "source": [ "## Independent and Identically Distrubuted (iid)\n", "\n", @@ -1888,8 +1905,10 @@ }, { "cell_type": "markdown", - "id": "8965e119", - "metadata": {}, + "id": "22462772", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(y_i, \\boldsymbol{X}\\vert\\boldsymbol{\\beta})=\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]},\n", @@ -1898,8 +1917,10 @@ }, { "cell_type": "markdown", - "id": "c49af092", - "metadata": {}, + "id": "905d65a6", + "metadata": { + "editable": true + }, "source": [ "which reads as finding the likelihood of an event $y_i$ with the input variables $\\boldsymbol{X}$ given the parameters (to be determined) $\\boldsymbol{\\beta}$.\n", "\n", @@ -1908,8 +1929,10 @@ }, { "cell_type": "markdown", - "id": "ccd43fe6", - "metadata": {}, + "id": "4cf9d56d", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{y},\\boldsymbol{X}\\vert\\boldsymbol{\\beta})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}=\\prod_{i=0}^{n-1}p(y_i,\\boldsymbol{X}\\vert\\boldsymbol{\\beta}).\n", @@ -1918,8 +1941,10 @@ }, { "cell_type": "markdown", - "id": "047a9630", - "metadata": {}, + "id": "c100c74c", + "metadata": { + "editable": true + }, "source": [ "We will write this in a more compact form reserving $\\boldsymbol{D}$ for the domain of events, including the ouputs (targets) and the inputs. 
That is\n", "in case we have a simple one-dimensional input and output case" @@ -1927,8 +1952,10 @@ }, { "cell_type": "markdown", - "id": "a5e47ffd", - "metadata": {}, + "id": "36f8ccf2", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{D}=[(x_0,y_0), (x_1,y_1),\\dots, (x_{n-1},y_{n-1})].\n", @@ -1937,8 +1964,10 @@ }, { "cell_type": "markdown", - "id": "aea090b7", - "metadata": {}, + "id": "60b59fd2", + "metadata": { + "editable": true + }, "source": [ "In the more general case the various inputs should be replaced by the possible features represented by the input data set $\\boldsymbol{X}$. \n", "We can now rewrite the above probability as" @@ -1946,8 +1975,10 @@ }, { "cell_type": "markdown", - "id": "8b499343", - "metadata": {}, + "id": "ea63b747", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}.\n", @@ -1956,16 +1987,20 @@ }, { "cell_type": "markdown", - "id": "91844422", - "metadata": {}, + "id": "5ca95669", + "metadata": { + "editable": true + }, "source": [ "It is a conditional probability (see below) and reads as the likelihood of a domain of events $\\boldsymbol{D}$ given a set of parameters $\\boldsymbol{\\beta}$." ] }, { "cell_type": "markdown", - "id": "2eab09fd", - "metadata": {}, + "id": "4712ec27", + "metadata": { + "editable": true + }, "source": [ "## Maximum Likelihood Estimation (MLE)\n", "\n", @@ -1993,8 +2028,10 @@ }, { "cell_type": "markdown", - "id": "e8d5d443", - "metadata": {}, + "id": "7d52ddc4", + "metadata": { + "editable": true + }, "source": [ "## A new Cost Function\n", "\n", @@ -2003,8 +2040,10 @@ }, { "cell_type": "markdown", - "id": "c8b99cce", - "metadata": {}, + "id": "1c32ecc2", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=-\\log{\\prod_{i=0}^{n-1}p(y_i,\\boldsymbol{X}\\vert\\boldsymbol{\\beta})}=-\\sum_{i=0}^{n-1}\\log{p(y_i,\\boldsymbol{X}\\vert\\boldsymbol{\\beta})},\n", @@ -2013,16 +2052,20 @@ }, { "cell_type": "markdown", - "id": "12497fc6", - "metadata": {}, + "id": "5249db96", + "metadata": { + "editable": true + }, "source": [ "which becomes" ] }, { "cell_type": "markdown", - "id": "6cf6ea28", - "metadata": {}, + "id": "0712fbb0", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=\\frac{n}{2}\\log{2\\pi\\sigma^2}+\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}.\n", @@ -2031,16 +2074,20 @@ }, { "cell_type": "markdown", - "id": "b2cb5aa1", - "metadata": {}, + "id": "5e4c844b", + "metadata": { + "editable": true + }, "source": [ "Taking the derivative of the *new* cost function with respect to the parameters $\\beta$ we recognize our familiar OLS equation, namely" ] }, { "cell_type": "markdown", - "id": "ebf349ab", - "metadata": {}, + "id": "5b56b5bb", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{X}^T\\left(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta}\\right) =0,\n", @@ -2049,16 +2096,20 @@ }, { "cell_type": "markdown", - "id": "6f257fe9", - "metadata": {}, + "id": "e7c21df5", + "metadata": { + "editable": true + }, "source": [ "which leads to the well-known OLS equation for the optimal paramters $\\beta$" ] }, { "cell_type": "markdown", - "id": "145222d0", - "metadata": {}, + "id": "7f35e661", + "metadata": { + "editable": true + }, "source": [ "$$\n", 
"\\hat{\\boldsymbol{\\beta}}^{\\mathrm{OLS}}=\\left(\\boldsymbol{X}^T\\boldsymbol{X}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y}!\n", @@ -2067,16 +2118,20 @@ }, { "cell_type": "markdown", - "id": "62ff20c8", - "metadata": {}, + "id": "72dc84b0", + "metadata": { + "editable": true + }, "source": [ "Before we make a similar analysis for Ridge and Lasso regression, we need a short reminder on statistics." ] }, { "cell_type": "markdown", - "id": "d0c36c09", - "metadata": {}, + "id": "0071468f", + "metadata": { + "editable": true + }, "source": [ "## More basic Statistics and Bayes' theorem\n", "\n", @@ -2093,8 +2148,10 @@ }, { "cell_type": "markdown", - "id": "fbe75b0a", - "metadata": {}, + "id": "6f471644", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X \\cup Y)= p(X)+p(Y)-p(X \\cap Y).\n", @@ -2103,16 +2160,20 @@ }, { "cell_type": "markdown", - "id": "1631de7c", - "metadata": {}, + "id": "4a31022d", + "metadata": { + "editable": true + }, "source": [ "**The product rule (aka joint probability) is given by.**" ] }, { "cell_type": "markdown", - "id": "c94e2fdd", - "metadata": {}, + "id": "6b7d8e8f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X \\cup Y)= p(X,Y)= p(X\\vert Y)p(Y)=p(Y\\vert X)p(X),\n", @@ -2121,8 +2182,10 @@ }, { "cell_type": "markdown", - "id": "1444e890", - "metadata": {}, + "id": "9b67f32e", + "metadata": { + "editable": true + }, "source": [ "where we read $p(X\\vert Y)$ as the likelihood of obtaining $X$ given $Y$.\n", "\n", @@ -2131,8 +2194,10 @@ }, { "cell_type": "markdown", - "id": "13cccaf3", - "metadata": {}, + "id": "1deaa595", + "metadata": { + "editable": true + }, "source": [ "## Marginal Probability\n", "\n", @@ -2141,8 +2206,10 @@ }, { "cell_type": "markdown", - "id": "b0ea5ddc", - "metadata": {}, + "id": "67ae1f0a", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X)=\\sum_{i=0}^{n-1}p(X,Y=y_i)=\\sum_{i=0}^{n-1}p(X\\vert Y=y_i)p(Y=y_i)=\\sum_{i=0}^{n-1}p(X\\vert y_i)p(y_i).\n", @@ -2151,8 +2218,10 @@ }, { "cell_type": "markdown", - "id": "0d807fa1", - "metadata": {}, + "id": "645b3b06", + "metadata": { + "editable": true + }, "source": [ "## Conditional Probability\n", "\n", @@ -2161,8 +2230,10 @@ }, { "cell_type": "markdown", - "id": "5204e804", - "metadata": {}, + "id": "899643ce", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X\\vert Y)= \\frac{p(X,Y)}{p(Y)}=\\frac{p(X,Y)}{\\sum_{i=0}^{n-1}p(Y\\vert X=x_i)p(x_i)}.\n", @@ -2171,8 +2242,10 @@ }, { "cell_type": "markdown", - "id": "4b4e8c55", - "metadata": {}, + "id": "4ef190c8", + "metadata": { + "editable": true + }, "source": [ "## Bayes' Theorem\n", "\n", @@ -2181,8 +2254,10 @@ }, { "cell_type": "markdown", - "id": "b60be092", - "metadata": {}, + "id": "ea14b40c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X\\vert Y)= \\frac{p(X,Y)}{p(Y)},\n", @@ -2191,16 +2266,20 @@ }, { "cell_type": "markdown", - "id": "9d9acb12", - "metadata": {}, + "id": "17d256d9", + "metadata": { + "editable": true + }, "source": [ "which we can rewrite as" ] }, { "cell_type": "markdown", - "id": "318ad87c", - "metadata": {}, + "id": "07483acd", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X\\vert Y)= \\frac{p(X,Y)}{\\sum_{i=0}^{n-1}p(Y\\vert X=x_i)p(x_i)}=\\frac{p(Y\\vert X)p(X)}{\\sum_{i=0}^{n-1}p(Y\\vert X=x_i)p(x_i)},\n", @@ -2209,16 +2288,20 @@ }, { "cell_type": "markdown", - "id": "cdf2921d", - "metadata": {}, + "id": "b5a3ee7e", + "metadata": { + "editable": true + }, "source": [ "which is Bayes' theorem. 
It allows us to evaluate the uncertainty in in $X$ after we have observed $Y$. We can easily interchange $X$ with $Y$." ] }, { "cell_type": "markdown", - "id": "83bd31f6", - "metadata": {}, + "id": "1116169a", + "metadata": { + "editable": true + }, "source": [ "## Interpretations of Bayes' Theorem\n", "\n", @@ -2234,8 +2317,10 @@ }, { "cell_type": "markdown", - "id": "48d7057c", - "metadata": {}, + "id": "88861a9a", + "metadata": { + "editable": true + }, "source": [ "## Example of Usage of Bayes' theorem\n", "\n", @@ -2253,8 +2338,10 @@ }, { "cell_type": "markdown", - "id": "2b19e099", - "metadata": {}, + "id": "469f8261", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X=1\\vert Y=1) =0.8.\n", @@ -2263,8 +2350,10 @@ }, { "cell_type": "markdown", - "id": "ed04f0b1", - "metadata": {}, + "id": "829b1728", + "metadata": { + "editable": true + }, "source": [ "This obviously sounds scary since many would conclude that if the test is positive, there is a likelihood of $80\\%$ for having cancer.\n", "It is however not correct, as the following Bayesian analysis shows." @@ -2272,8 +2361,10 @@ }, { "cell_type": "markdown", - "id": "2260fa5e", - "metadata": {}, + "id": "4d99133c", + "metadata": { + "editable": true + }, "source": [ "## Doing it correctly\n", "\n", @@ -2283,8 +2374,10 @@ }, { "cell_type": "markdown", - "id": "c9c48a17", - "metadata": {}, + "id": "fb649782", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(Y=1) =0.004.\n", @@ -2293,16 +2386,20 @@ }, { "cell_type": "markdown", - "id": "f49df4a7", - "metadata": {}, + "id": "8079bdf4", + "metadata": { + "editable": true + }, "source": [ "We need also to account for the fact that the test may produce a false positive result (false alarm). Let us here assume that we have" ] }, { "cell_type": "markdown", - "id": "f34c4bc8", - "metadata": {}, + "id": "6b8d6139", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X=1\\vert Y=0) =0.1.\n", @@ -2311,16 +2408,20 @@ }, { "cell_type": "markdown", - "id": "44347f5c", - "metadata": {}, + "id": "7fea340f", + "metadata": { + "editable": true + }, "source": [ "Using Bayes' theorem we can then find the posterior probability that the person has breast cancer in case of a positive test, that is we can compute" ] }, { "cell_type": "markdown", - "id": "64a31efc", - "metadata": {}, + "id": "30609da9", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(Y=1\\vert X=1)=\\frac{p(X=1\\vert Y=1)p(Y=1)}{p(X=1\\vert Y=1)p(Y=1)+p(X=1\\vert Y=0)p(Y=0)}=\\frac{0.8\\times 0.004}{0.8\\times 0.004+0.1\\times 0.996}=0.031.\n", @@ -2329,16 +2430,20 @@ }, { "cell_type": "markdown", - "id": "d2a13b09", - "metadata": {}, + "id": "9f9352c3", + "metadata": { + "editable": true + }, "source": [ "That is, in case of a positive test, there is only a $3\\%$ chance of having breast cancer!" 
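The arithmetic of this example is easy to reproduce; the few lines below (an editorial addition, using the same numbers as in the text) evaluate Bayes' theorem directly.

```python
# Y = 1: has the disease, X = 1: the test is positive
p_Y1 = 0.004           # prior probability of having the disease
p_X1_given_Y1 = 0.8    # probability of a positive test given the disease
p_X1_given_Y0 = 0.1    # false-positive rate

posterior = p_X1_given_Y1 * p_Y1 / (
    p_X1_given_Y1 * p_Y1 + p_X1_given_Y0 * (1.0 - p_Y1))
print(f"p(Y=1 | X=1) = {posterior:.3f}")   # about 0.031
```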
] }, { "cell_type": "markdown", - "id": "0f1aec4a", - "metadata": {}, + "id": "b9a9af20", + "metadata": { + "editable": true + }, "source": [ "## Bayes' Theorem and Ridge and Lasso Regression\n", "\n", @@ -2352,8 +2457,10 @@ }, { "cell_type": "markdown", - "id": "1c278acf", - "metadata": {}, + "id": "c17cc808", + "metadata": { + "editable": true + }, "source": [ "## Test Function for what happens with OLS, Ridge and Lasso\n", "\n", @@ -2369,8 +2476,11 @@ { "cell_type": "code", "execution_count": 4, - "id": "692eede0", - "metadata": {}, + "id": "da52ef4c", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import numpy as np\n", @@ -2440,16 +2550,20 @@ }, { "cell_type": "markdown", - "id": "f7333256", - "metadata": {}, + "id": "baf1d84c", + "metadata": { + "editable": true + }, "source": [ "How can we understand this?" ] }, { "cell_type": "markdown", - "id": "d1b2354d", - "metadata": {}, + "id": "e4fd5c75", + "metadata": { + "editable": true + }, "source": [ "## Invoking Bayes' theorem\n", "\n", @@ -2460,8 +2574,10 @@ }, { "cell_type": "markdown", - "id": "03e1739d", - "metadata": {}, + "id": "908fb3e8", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{D}=[(x_0,y_0), (x_1,y_1),\\dots, (x_{n-1},y_{n-1})],\n", @@ -2470,16 +2586,20 @@ }, { "cell_type": "markdown", - "id": "b6a41543", - "metadata": {}, + "id": "42a4fecc", + "metadata": { + "editable": true + }, "source": [ "is given by" ] }, { "cell_type": "markdown", - "id": "9bf01ae9", - "metadata": {}, + "id": "1838ea96", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}.\n", @@ -2488,16 +2608,20 @@ }, { "cell_type": "markdown", - "id": "b8a5b2b2", - "metadata": {}, + "id": "4a0ea7f0", + "metadata": { + "editable": true + }, "source": [ "In Bayes' theorem this function plays the role of the so-called likelihood. We could now ask the question what is the posterior probability of a parameter set $\\boldsymbol{\\beta}$ given a domain of events $\\boldsymbol{D}$? That is, how can we define the posterior probability" ] }, { "cell_type": "markdown", - "id": "e217276b", - "metadata": {}, + "id": "4cb46e3c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta}\\vert\\boldsymbol{D}).\n", @@ -2506,16 +2630,20 @@ }, { "cell_type": "markdown", - "id": "26a0e9e6", - "metadata": {}, + "id": "db9d993c", + "metadata": { + "editable": true + }, "source": [ "Bayes' theorem comes to our rescue here since (omitting the normalization constant)" ] }, { "cell_type": "markdown", - "id": "48f76275", - "metadata": {}, + "id": "f6efeb5c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta}\\vert\\boldsymbol{D})\\propto p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})p(\\boldsymbol{\\beta}).\n", @@ -2524,16 +2652,20 @@ }, { "cell_type": "markdown", - "id": "89908237", - "metadata": {}, + "id": "acaf9f2d", + "metadata": { + "editable": true + }, "source": [ "We have a model for $p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})$ but need one for the **prior** $p(\\boldsymbol{\\beta}$!" 
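Before a prior is specified, it can be reassuring to check numerically that maximizing the likelihood alone, i.e. minimizing the cost function derived earlier, reproduces the closed-form OLS solution. The sketch below is an editorial addition; it reuses the notebook's test function $y=2+5x^2+0.1\varepsilon$, and the assumed noise variance and the choice of scipy's general-purpose minimizer are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

np.random.seed(3155)

# same test function as in the notebook's examples
n = 100
x = np.random.rand(n)
y = 2.0 + 5 * x * x + 0.1 * np.random.randn(n)
X = np.column_stack([np.ones(n), x, x**2])
sigma2 = 0.1**2                     # assumed noise variance

# negative log-likelihood, up to the constant (n/2) log(2 pi sigma^2)
def neg_log_likelihood(beta):
    r = y - X @ beta
    return 0.5 * (r @ r) / sigma2

beta_mle = minimize(neg_log_likelihood, x0=np.zeros(X.shape[1])).x
beta_ols = np.linalg.inv(X.T @ X) @ X.T @ y

print("maximum likelihood :", beta_mle)
print("OLS normal equation:", beta_ols)
```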
] }, { "cell_type": "markdown", - "id": "488d1d81", - "metadata": {}, + "id": "a73cf73a", + "metadata": { + "editable": true + }, "source": [ "## Ridge and Bayes\n", "\n", @@ -2546,8 +2678,10 @@ }, { "cell_type": "markdown", - "id": "26057db7", - "metadata": {}, + "id": "6bdfe707", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta})=\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\beta_j^2}{2\\tau^2}\\right)}.\n", @@ -2556,16 +2690,20 @@ }, { "cell_type": "markdown", - "id": "b532ac2d", - "metadata": {}, + "id": "6a71e2d7", + "metadata": { + "editable": true + }, "source": [ "Our posterior probability becomes then (omitting the normalization factor which is just a constant)" ] }, { "cell_type": "markdown", - "id": "f2ea1634", - "metadata": {}, + "id": "ae0f27b9", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta\\vert\\boldsymbol{D})}=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\beta_j^2}{2\\tau^2}\\right)}.\n", @@ -2574,8 +2712,10 @@ }, { "cell_type": "markdown", - "id": "66de1d0b", - "metadata": {}, + "id": "1dd65d98", + "metadata": { + "editable": true + }, "source": [ "We can now optimize this quantity with respect to $\\boldsymbol{\\beta}$. As we\n", "did for OLS, this is most conveniently done by taking the negative\n", @@ -2585,8 +2725,10 @@ }, { "cell_type": "markdown", - "id": "2b931fbc", - "metadata": {}, + "id": "8a4d9e5b", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\frac{1}{2\\tau^2}\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_2^2,\n", @@ -2595,16 +2737,20 @@ }, { "cell_type": "markdown", - "id": "ddec27fc", - "metadata": {}, + "id": "61ba5bcd", + "metadata": { + "editable": true + }, "source": [ "and replacing $1/2\\tau^2$ with $\\lambda$ we have" ] }, { "cell_type": "markdown", - "id": "1ddbe4ec", - "metadata": {}, + "id": "23b69c58", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\lambda\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_2^2,\n", @@ -2613,16 +2759,20 @@ }, { "cell_type": "markdown", - "id": "341422d4", - "metadata": {}, + "id": "899f2235", + "metadata": { + "editable": true + }, "source": [ "which is our Ridge cost function! Nice, isn't it?" 
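As a numerical sanity check (an editorial sketch with toy data and an arbitrary prior width $\tau$, not part of the original notebook), minimizing the negative log-posterior above with a general-purpose optimizer should reproduce the Ridge normal-equation solution once the penalty in the standard form $(\boldsymbol{X}^T\boldsymbol{X}+\lambda\boldsymbol{I})^{-1}\boldsymbol{X}^T\boldsymbol{y}$ is identified as $\sigma^2/\tau^2$.

```python
import numpy as np
from scipy.optimize import minimize

np.random.seed(3155)

# same test function as in the notebook's examples
n = 100
x = np.random.rand(n)
y = 2.0 + 5 * x * x + 0.1 * np.random.randn(n)
X = np.column_stack([np.ones(n), x, x**2])

sigma2 = 0.1**2     # assumed noise variance
tau2 = 0.05         # assumed variance of the Gaussian prior on each beta_j

# negative log-posterior = neg. log-likelihood + neg. log-prior (up to constants)
def neg_log_posterior(beta):
    r = y - X @ beta
    return 0.5 * (r @ r) / sigma2 + 0.5 * (beta @ beta) / tau2

beta_map = minimize(neg_log_posterior, x0=np.zeros(X.shape[1])).x

# Ridge normal equation with effective penalty lambda = sigma^2 / tau^2
lmb = sigma2 / tau2
beta_ridge = np.linalg.inv(X.T @ X + lmb * np.eye(X.shape[1])) @ X.T @ y

print("MAP with Gaussian prior      :", beta_map)
print("Ridge, lambda = sigma^2/tau^2:", beta_ridge)
```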
] }, { "cell_type": "markdown", - "id": "388e8810", - "metadata": {}, + "id": "dbfeab4a", + "metadata": { + "editable": true + }, "source": [ "## Lasso and Bayes\n", "\n", @@ -2631,8 +2781,10 @@ }, { "cell_type": "markdown", - "id": "93fe5a38", - "metadata": {}, + "id": "7242122e", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta})=\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\vert\\beta_j\\vert}{\\tau}\\right)}.\n", @@ -2641,16 +2793,20 @@ }, { "cell_type": "markdown", - "id": "71aa863c", - "metadata": {}, + "id": "5d8302f0", + "metadata": { + "editable": true + }, "source": [ "Our posterior probability becomes then (omitting the normalization factor which is just a constant)" ] }, { "cell_type": "markdown", - "id": "a8c9f8be", - "metadata": {}, + "id": "0ab7dd23", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta}\\vert\\boldsymbol{D})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\vert\\beta_j\\vert}{\\tau}\\right)}.\n", @@ -2659,8 +2815,10 @@ }, { "cell_type": "markdown", - "id": "44397b9e", - "metadata": {}, + "id": "319acfe8", + "metadata": { + "editable": true + }, "source": [ "Taking the negative\n", "logarithm of the posterior probability and leaving out the\n", @@ -2669,8 +2827,10 @@ }, { "cell_type": "markdown", - "id": "328f04d9", - "metadata": {}, + "id": "f22117ce", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\frac{1}{\\tau}\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_1,\n", @@ -2679,16 +2839,20 @@ }, { "cell_type": "markdown", - "id": "a84bc3c7", - "metadata": {}, + "id": "54f622e6", + "metadata": { + "editable": true + }, "source": [ "and replacing $1/\\tau$ with $\\lambda$ we have" ] }, { "cell_type": "markdown", - "id": "56346230", - "metadata": {}, + "id": "0adbb936", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\lambda\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_1,\n", @@ -2697,16 +2861,20 @@ }, { "cell_type": "markdown", - "id": "d1b52def", - "metadata": {}, + "id": "6c067f7f", + "metadata": { + "editable": true + }, "source": [ "which is our Lasso cost function!" 
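A corresponding sketch for the Laplace prior (again an editorial addition with toy data and an arbitrary prior scale $\tau$): scikit-learn's Lasso minimizes $\frac{1}{2n}\vert\vert\boldsymbol{y}-\boldsymbol{X}\boldsymbol{\beta}\vert\vert_2^2+\alpha\vert\vert\boldsymbol{\beta}\vert\vert_1$, which is proportional to the negative log-posterior above when $\alpha=\sigma^2/(n\tau)$ and no separate intercept is fitted, so its solution can be read as the MAP estimate under this prior.

```python
import numpy as np
from sklearn import linear_model

np.random.seed(3155)

# same test function as in the notebook's examples
n = 100
x = np.random.rand(n)
y = 2.0 + 5 * x * x + 0.1 * np.random.randn(n)
X = np.column_stack([np.ones(n), x, x**2])

sigma2 = 0.1**2     # assumed noise variance
tau = 0.05          # assumed scale of the Laplace prior

# negative log-posterior with a Laplace prior (up to additive constants)
def neg_log_posterior(beta):
    r = y - X @ beta
    return 0.5 * (r @ r) / sigma2 + np.sum(np.abs(beta)) / tau

# scikit-learn's Lasso minimizes (1/2n)||y - X beta||_2^2 + alpha ||beta||_1,
# i.e. (sigma^2/n) times the objective above when alpha = sigma^2 / (n tau)
alpha = sigma2 / (n * tau)
lasso = linear_model.Lasso(alpha=alpha, fit_intercept=False, max_iter=100000)
lasso.fit(X, y)
beta_lasso = lasso.coef_
beta_ols = np.linalg.inv(X.T @ X) @ X.T @ y

print("Lasso / MAP coefficients :", beta_lasso)
print("OLS coefficients         :", beta_ols)
print("neg. log-posterior, Lasso:", neg_log_posterior(beta_lasso))
print("neg. log-posterior, OLS  :", neg_log_posterior(beta_ols))
```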
] }, { "cell_type": "markdown", - "id": "66cdc40f", - "metadata": {}, + "id": "3960583d", + "metadata": { + "editable": true + }, "source": [ "## Deriving OLS from a probability distribution\n", "\n", @@ -2726,8 +2894,10 @@ }, { "cell_type": "markdown", - "id": "bf51888d", - "metadata": {}, + "id": "b5519ef3", + "metadata": { + "editable": true + }, "source": [ "$$\n", "y_i\\sim \\mathcal{N}(\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta}, \\sigma^2)=\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}.\n", @@ -2736,8 +2906,10 @@ }, { "cell_type": "markdown", - "id": "a855c7a1", - "metadata": {}, + "id": "ad965079", + "metadata": { + "editable": true + }, "source": [ "## Independent and Identically Distrubuted (iid)\n", "\n", @@ -2747,8 +2919,10 @@ }, { "cell_type": "markdown", - "id": "3664ccdb", - "metadata": {}, + "id": "e786100f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(y_i, \\boldsymbol{X}\\vert\\boldsymbol{\\beta})=\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]},\n", @@ -2757,8 +2931,10 @@ }, { "cell_type": "markdown", - "id": "7893fe17", - "metadata": {}, + "id": "c620ff98", + "metadata": { + "editable": true + }, "source": [ "which reads as finding the likelihood of an event $y_i$ with the input variables $\\boldsymbol{X}$ given the parameters (to be determined) $\\boldsymbol{\\beta}$.\n", "\n", @@ -2767,8 +2943,10 @@ }, { "cell_type": "markdown", - "id": "4179b956", - "metadata": {}, + "id": "b8636581", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{y},\\boldsymbol{X}\\vert\\boldsymbol{\\beta})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}=\\prod_{i=0}^{n-1}p(y_i,\\boldsymbol{X}\\vert\\boldsymbol{\\beta}).\n", @@ -2777,8 +2955,10 @@ }, { "cell_type": "markdown", - "id": "1e53d72b", - "metadata": {}, + "id": "c95c3402", + "metadata": { + "editable": true + }, "source": [ "We will write this in a more compact form reserving $\\boldsymbol{D}$ for the domain of events, including the ouputs (targets) and the inputs. That is\n", "in case we have a simple one-dimensional input and output case" @@ -2786,8 +2966,10 @@ }, { "cell_type": "markdown", - "id": "c54d0c81", - "metadata": {}, + "id": "8a1f4160", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{D}=[(x_0,y_0), (x_1,y_1),\\dots, (x_{n-1},y_{n-1})].\n", @@ -2796,8 +2978,10 @@ }, { "cell_type": "markdown", - "id": "fbecd0ad", - "metadata": {}, + "id": "85444cf8", + "metadata": { + "editable": true + }, "source": [ "In the more general case the various inputs should be replaced by the possible features represented by the input data set $\\boldsymbol{X}$. 
\n", "We can now rewrite the above probability as" @@ -2805,8 +2989,10 @@ }, { "cell_type": "markdown", - "id": "0abd00aa", - "metadata": {}, + "id": "a51ad674", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}.\n", @@ -2815,16 +3001,20 @@ }, { "cell_type": "markdown", - "id": "71478f61", - "metadata": {}, + "id": "27de9788", + "metadata": { + "editable": true + }, "source": [ "It is a conditional probability (see below) and reads as the likelihood of a domain of events $\\boldsymbol{D}$ given a set of parameters $\\boldsymbol{\\beta}$." ] }, { "cell_type": "markdown", - "id": "6352fa5a", - "metadata": {}, + "id": "fec2d139", + "metadata": { + "editable": true + }, "source": [ "## Maximum Likelihood Estimation (MLE)\n", "\n", @@ -2852,8 +3042,10 @@ }, { "cell_type": "markdown", - "id": "51af15f4", - "metadata": {}, + "id": "d28cabe5", + "metadata": { + "editable": true + }, "source": [ "## A new Cost Function\n", "\n", @@ -2862,8 +3054,10 @@ }, { "cell_type": "markdown", - "id": "cebbeb23", - "metadata": {}, + "id": "853b3797", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=-\\log{\\prod_{i=0}^{n-1}p(y_i,\\boldsymbol{X}\\vert\\boldsymbol{\\beta})}=-\\sum_{i=0}^{n-1}\\log{p(y_i,\\boldsymbol{X}\\vert\\boldsymbol{\\beta})},\n", @@ -2872,16 +3066,20 @@ }, { "cell_type": "markdown", - "id": "fb31577d", - "metadata": {}, + "id": "42952aa8", + "metadata": { + "editable": true + }, "source": [ "which becomes" ] }, { "cell_type": "markdown", - "id": "1e57fdc1", - "metadata": {}, + "id": "6f9b202f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=\\frac{n}{2}\\log{2\\pi\\sigma^2}+\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}.\n", @@ -2890,16 +3088,20 @@ }, { "cell_type": "markdown", - "id": "d47da897", - "metadata": {}, + "id": "c531bc45", + "metadata": { + "editable": true + }, "source": [ "Taking the derivative of the *new* cost function with respect to the parameters $\\beta$ we recognize our familiar OLS equation, namely" ] }, { "cell_type": "markdown", - "id": "cafa1620", - "metadata": {}, + "id": "83d5af93", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{X}^T\\left(\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta}\\right) =0,\n", @@ -2908,16 +3110,20 @@ }, { "cell_type": "markdown", - "id": "1d0a4b40", - "metadata": {}, + "id": "10a67fb3", + "metadata": { + "editable": true + }, "source": [ "which leads to the well-known OLS equation for the optimal paramters $\\beta$" ] }, { "cell_type": "markdown", - "id": "f19df38b", - "metadata": {}, + "id": "8eaa8af1", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\hat{\\boldsymbol{\\beta}}^{\\mathrm{OLS}}=\\left(\\boldsymbol{X}^T\\boldsymbol{X}\\right)^{-1}\\boldsymbol{X}^T\\boldsymbol{y}!\n", @@ -2926,8 +3132,10 @@ }, { "cell_type": "markdown", - "id": "7e66279f", - "metadata": {}, + "id": "191ae18b", + "metadata": { + "editable": true + }, "source": [ "## Bayes' Theorem\n", "\n", @@ -2936,8 +3144,10 @@ }, { "cell_type": "markdown", - "id": "dd2a6dce", - "metadata": {}, + "id": "9f9aa209", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X\\vert Y)= \\frac{p(X,Y)}{p(Y)},\n", @@ -2946,16 +3156,20 @@ }, { "cell_type": "markdown", - "id": "5553ddfd", - "metadata": {}, + 
"id": "ec245678", + "metadata": { + "editable": true + }, "source": [ "which we can rewrite as" ] }, { "cell_type": "markdown", - "id": "bbc1dc01", - "metadata": {}, + "id": "3cb237fb", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(X\\vert Y)= \\frac{p(X,Y)}{\\sum_{i=0}^{n-1}p(Y\\vert X=x_i)p(x_i)}=\\frac{p(Y\\vert X)p(X)}{\\sum_{i=0}^{n-1}p(Y\\vert X=x_i)p(x_i)},\n", @@ -2964,16 +3178,20 @@ }, { "cell_type": "markdown", - "id": "1d934976", - "metadata": {}, + "id": "91a6a9a2", + "metadata": { + "editable": true + }, "source": [ "which is Bayes' theorem. It allows us to evaluate the uncertainty in in $X$ after we have observed $Y$. We can easily interchange $X$ with $Y$." ] }, { "cell_type": "markdown", - "id": "af91acff", - "metadata": {}, + "id": "a194c6bd", + "metadata": { + "editable": true + }, "source": [ "## Interpretations of Bayes' Theorem\n", "\n", @@ -2987,8 +3205,10 @@ }, { "cell_type": "markdown", - "id": "6cd0aef9", - "metadata": {}, + "id": "f6da60ea", + "metadata": { + "editable": true + }, "source": [ "## Test Function for what happens with OLS, Ridge and Lasso\n", "\n", @@ -3004,8 +3224,11 @@ { "cell_type": "code", "execution_count": 5, - "id": "138aefde", - "metadata": {}, + "id": "9f0dc5f6", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import numpy as np\n", @@ -3075,16 +3298,20 @@ }, { "cell_type": "markdown", - "id": "f2f0e4e2", - "metadata": {}, + "id": "e51aac1d", + "metadata": { + "editable": true + }, "source": [ "How can we understand this?" ] }, { "cell_type": "markdown", - "id": "70a740db", - "metadata": {}, + "id": "24853f82", + "metadata": { + "editable": true + }, "source": [ "## Rerunning the above code\n", "\n", @@ -3111,8 +3338,11 @@ { "cell_type": "code", "execution_count": 6, - "id": "d78d3856", - "metadata": {}, + "id": "d75c01b1", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import numpy as np\n", @@ -3157,8 +3387,10 @@ }, { "cell_type": "markdown", - "id": "56b5bd43", - "metadata": {}, + "id": "ba2e83ae", + "metadata": { + "editable": true + }, "source": [ "## Invoking Bayes' theorem\n", "\n", @@ -3169,8 +3401,10 @@ }, { "cell_type": "markdown", - "id": "6323ead1", - "metadata": {}, + "id": "831bdde3", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{D}=[(x_0,y_0), (x_1,y_1),\\dots, (x_{n-1},y_{n-1})],\n", @@ -3179,16 +3413,20 @@ }, { "cell_type": "markdown", - "id": "6c5a9d05", - "metadata": {}, + "id": "de3da0a6", + "metadata": { + "editable": true + }, "source": [ "is given by" ] }, { "cell_type": "markdown", - "id": "c44868f3", - "metadata": {}, + "id": "091f76a6", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}.\n", @@ -3197,16 +3435,20 @@ }, { "cell_type": "markdown", - "id": "32c5a08a", - "metadata": {}, + "id": "763e3b26", + "metadata": { + "editable": true + }, "source": [ "In Bayes' theorem this function plays the role of the so-called likelihood. We could now ask the question what is the posterior probability of a parameter set $\\boldsymbol{\\beta}$ given a domain of events $\\boldsymbol{D}$? 
That is, how can we define the posterior probability" ] }, { "cell_type": "markdown", - "id": "6aee2553", - "metadata": {}, + "id": "5d412a98", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta}\\vert\\boldsymbol{D}).\n", @@ -3215,16 +3457,20 @@ }, { "cell_type": "markdown", - "id": "ae17b4da", - "metadata": {}, + "id": "0ba3e207", + "metadata": { + "editable": true + }, "source": [ "Bayes' theorem comes to our rescue here since (omitting the normalization constant)" ] }, { "cell_type": "markdown", - "id": "07edfc9d", - "metadata": {}, + "id": "75c3c8dd", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta}\\vert\\boldsymbol{D})\\propto p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})p(\\boldsymbol{\\beta}).\n", @@ -3233,16 +3479,20 @@ }, { "cell_type": "markdown", - "id": "168fa7f1", - "metadata": {}, + "id": "8c04ebfb", + "metadata": { + "editable": true + }, "source": [ "We have a model for $p(\\boldsymbol{D}\\vert\\boldsymbol{\\beta})$ but need one for the **prior** $p(\\boldsymbol{\\beta}$!" ] }, { "cell_type": "markdown", - "id": "ba943592", - "metadata": {}, + "id": "a1971089", + "metadata": { + "editable": true + }, "source": [ "## Ridge and Bayes\n", "\n", @@ -3255,8 +3505,10 @@ }, { "cell_type": "markdown", - "id": "2febbf21", - "metadata": {}, + "id": "9a5608dc", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta})=\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\beta_j^2}{2\\tau^2}\\right)}.\n", @@ -3265,16 +3517,20 @@ }, { "cell_type": "markdown", - "id": "e72040ed", - "metadata": {}, + "id": "03ea7796", + "metadata": { + "editable": true + }, "source": [ "Our posterior probability becomes then (omitting the normalization factor which is just a constant)" ] }, { "cell_type": "markdown", - "id": "d9c1b70a", - "metadata": {}, + "id": "e4261565", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta\\vert\\boldsymbol{D})}=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\beta_j^2}{2\\tau^2}\\right)}.\n", @@ -3283,8 +3539,10 @@ }, { "cell_type": "markdown", - "id": "c241b580", - "metadata": {}, + "id": "e10cbbc6", + "metadata": { + "editable": true + }, "source": [ "We can now optimize this quantity with respect to $\\boldsymbol{\\beta}$. 
As we\n", "did for OLS, this is most conveniently done by taking the negative\n", @@ -3294,8 +3552,10 @@ }, { "cell_type": "markdown", - "id": "04e7c00b", - "metadata": {}, + "id": "c5daa671", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\frac{1}{2\\tau^2}\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_2^2,\n", @@ -3304,16 +3564,20 @@ }, { "cell_type": "markdown", - "id": "6bc23829", - "metadata": {}, + "id": "ec6d384a", + "metadata": { + "editable": true + }, "source": [ "and replacing $1/2\\tau^2$ with $\\lambda$ we have" ] }, { "cell_type": "markdown", - "id": "b4110e51", - "metadata": {}, + "id": "f810360e", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta})=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\lambda\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_2^2,\n", @@ -3322,16 +3586,20 @@ }, { "cell_type": "markdown", - "id": "c9945279", - "metadata": {}, + "id": "694d4a3b", + "metadata": { + "editable": true + }, "source": [ "which is our Ridge cost function! Nice, isn't it?" ] }, { "cell_type": "markdown", - "id": "03964466", - "metadata": {}, + "id": "84fa0f29", + "metadata": { + "editable": true + }, "source": [ "## Lasso and Bayes\n", "\n", @@ -3340,8 +3608,10 @@ }, { "cell_type": "markdown", - "id": "4ab9227c", - "metadata": {}, + "id": "0a9bc6c3", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta})=\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\vert\\beta_j\\vert}{\\tau}\\right)}.\n", @@ -3350,16 +3620,20 @@ }, { "cell_type": "markdown", - "id": "d1677169", - "metadata": {}, + "id": "426a688b", + "metadata": { + "editable": true + }, "source": [ "Our posterior probability becomes then (omitting the normalization factor which is just a constant)" ] }, { "cell_type": "markdown", - "id": "7a3792ad", - "metadata": {}, + "id": "4f32b202", + "metadata": { + "editable": true + }, "source": [ "$$\n", "p(\\boldsymbol{\\beta}\\vert\\boldsymbol{D})=\\prod_{i=0}^{n-1}\\frac{1}{\\sqrt{2\\pi\\sigma^2}}\\exp{\\left[-\\frac{(y_i-\\boldsymbol{X}_{i,*}\\boldsymbol{\\beta})^2}{2\\sigma^2}\\right]}\\prod_{j=0}^{p-1}\\exp{\\left(-\\frac{\\vert\\beta_j\\vert}{\\tau}\\right)}.\n", @@ -3368,8 +3642,10 @@ }, { "cell_type": "markdown", - "id": "f187678a", - "metadata": {}, + "id": "faea2c8f", + "metadata": { + "editable": true + }, "source": [ "Taking the negative\n", "logarithm of the posterior probability and leaving out the\n", @@ -3378,8 +3654,10 @@ }, { "cell_type": "markdown", - "id": "681825df", - "metadata": {}, + "id": "c7f21d34", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\frac{1}{\\tau}\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_1,\n", @@ -3388,16 +3666,20 @@ }, { "cell_type": "markdown", - "id": "bedf4fc9", - "metadata": {}, + "id": "ab07e217", + "metadata": { + "editable": true + }, "source": [ "and replacing $1/\\tau$ with $\\lambda$ we have" ] }, { "cell_type": "markdown", - "id": "edd7db8f", - "metadata": {}, + "id": "63f80d0b", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{\\beta}=\\frac{\\vert\\vert (\\boldsymbol{y}-\\boldsymbol{X}\\boldsymbol{\\beta})\\vert\\vert_2^2}{2\\sigma^2}+\\lambda\\vert\\vert\\boldsymbol{\\beta}\\vert\\vert_1,\n", @@ -3406,16 +3688,20 
@@ }, { "cell_type": "markdown", - "id": "6617f9bb", - "metadata": {}, + "id": "a2fba07c", + "metadata": { + "editable": true + }, "source": [ "which is our Lasso cost function!" ] }, { "cell_type": "markdown", - "id": "77e9e54e", - "metadata": {}, + "id": "43fdf691", + "metadata": { + "editable": true + }, "source": [ "## Why resampling methods\n", "\n", @@ -3431,8 +3717,10 @@ }, { "cell_type": "markdown", - "id": "935d4fe8", - "metadata": {}, + "id": "087509a5", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods\n", "Resampling methods are an indispensable tool in modern\n", @@ -3457,8 +3745,10 @@ }, { "cell_type": "markdown", - "id": "508f611a", - "metadata": {}, + "id": "0df9fa6e", + "metadata": { + "editable": true + }, "source": [ "## Resampling approaches can be computationally expensive\n", "\n", @@ -3481,8 +3771,10 @@ }, { "cell_type": "markdown", - "id": "fce867a0", - "metadata": {}, + "id": "f7b93afa", + "metadata": { + "editable": true + }, "source": [ "## Why resampling methods ?\n", "**Statistical analysis.**\n", @@ -3496,8 +3788,10 @@ }, { "cell_type": "markdown", - "id": "0701c4d6", - "metadata": {}, + "id": "a14b4f27", + "metadata": { + "editable": true + }, "source": [ "## Statistical analysis\n", "\n", @@ -3514,8 +3808,10 @@ }, { "cell_type": "markdown", - "id": "7db598fa", - "metadata": {}, + "id": "eb00cc6a", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods\n", "\n", @@ -3541,8 +3837,10 @@ }, { "cell_type": "markdown", - "id": "0fcb4a86", - "metadata": {}, + "id": "2e9ced9e", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: Jackknife and Bootstrap\n", "\n", @@ -3564,8 +3862,10 @@ }, { "cell_type": "markdown", - "id": "a29c2835", - "metadata": {}, + "id": "d0696d2a", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: Jackknife\n", "\n", @@ -3576,8 +3876,10 @@ }, { "cell_type": "markdown", - "id": "9b3e5c68", - "metadata": {}, + "id": "31962112", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{x}_i = (x_1,x_2,\\cdots,x_{i-1},x_{i+1},\\cdots,x_n),\n", @@ -3586,8 +3888,10 @@ }, { "cell_type": "markdown", - "id": "c17dce67", - "metadata": {}, + "id": "dc02cc7f", + "metadata": { + "editable": true + }, "source": [ "which equals the vector $\\boldsymbol{x}$ with the exception that observation\n", "number $i$ is left out. 
Using this notation, define\n", @@ -3597,8 +3901,10 @@ }, { "cell_type": "markdown", - "id": "388f2cab", - "metadata": {}, + "id": "0faa5a99", + "metadata": { + "editable": true + }, "source": [ "## Jackknife code example" ] @@ -3606,8 +3912,11 @@ { "cell_type": "code", "execution_count": 7, - "id": "2fd22f69", - "metadata": {}, + "id": "9bc3ec1f", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "from numpy import *\n", @@ -3642,8 +3951,10 @@ }, { "cell_type": "markdown", - "id": "bca52c67", - "metadata": {}, + "id": "4be7d2c2", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: Bootstrap\n", "Bootstrapping is a non-parametric approach to statistical inference\n", @@ -3665,8 +3976,10 @@ }, { "cell_type": "markdown", - "id": "d56da2f1", - "metadata": {}, + "id": "c1e61d5b", + "metadata": { + "editable": true + }, "source": [ "## The Central Limit Theorem\n", "\n", @@ -3683,8 +3996,10 @@ }, { "cell_type": "markdown", - "id": "b3237b5a", - "metadata": {}, + "id": "25922aa6", + "metadata": { + "editable": true + }, "source": [ "$$\n", "z=\\frac{x_1+x_2+\\dots+x_m}{m},\n", @@ -3693,16 +4008,20 @@ }, { "cell_type": "markdown", - "id": "105a5395", - "metadata": {}, + "id": "cfad9318", + "metadata": { + "editable": true + }, "source": [ "the question we pose is which is the PDF of the new variable $z$." ] }, { "cell_type": "markdown", - "id": "3fe83209", - "metadata": {}, + "id": "eff78e0a", + "metadata": { + "editable": true + }, "source": [ "## Finding the Limit\n", "\n", @@ -3714,8 +4033,10 @@ }, { "cell_type": "markdown", - "id": "c39470af", - "metadata": {}, + "id": "b06c20d2", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\tilde{p}(z)=\\int dx_1p(x_1)\\int dx_2p(x_2)\\dots\\int dx_mp(x_m)\n", @@ -3725,8 +4046,10 @@ }, { "cell_type": "markdown", - "id": "5dfcba40", - "metadata": {}, + "id": "1d6f749b", + "metadata": { + "editable": true + }, "source": [ "where the $\\delta$-function enbodies the constraint that the mean is $z$.\n", "All measurements that lead to each individual $x_i$ are expected to\n", @@ -3736,8 +4059,10 @@ }, { "cell_type": "markdown", - "id": "d0bf9a86", - "metadata": {}, + "id": "4e969b8d", + "metadata": { + "editable": true + }, "source": [ "## Rewriting the $\\delta$-function\n", "\n", @@ -3746,8 +4071,10 @@ }, { "cell_type": "markdown", - "id": "e28b1469", - "metadata": {}, + "id": "8bde7f7f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\delta(z-\\frac{x_1+x_2+\\dots+x_m}{m})=\\frac{1}{2\\pi}\\int_{-\\infty}^{\\infty}\n", @@ -3757,8 +4084,10 @@ }, { "cell_type": "markdown", - "id": "66bf195d", - "metadata": {}, + "id": "3754744e", + "metadata": { + "editable": true + }, "source": [ "and inserting $e^{i\\mu q-i\\mu q}$ where $\\mu$ is the mean value\n", "we arrive at" @@ -3766,8 +4095,10 @@ }, { "cell_type": "markdown", - "id": "a0c7411a", - "metadata": {}, + "id": "e2d07bda", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\tilde{p}(z)=\\frac{1}{2\\pi}\\int_{-\\infty}^{\\infty}\n", @@ -3778,16 +4109,20 @@ }, { "cell_type": "markdown", - "id": "7a60c876", - "metadata": {}, + "id": "7e7f68ca", + "metadata": { + "editable": true + }, "source": [ "with the integral over $x$ resulting in" ] }, { "cell_type": "markdown", - "id": "f8d92005", - "metadata": {}, + "id": "7094c944", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\int_{-\\infty}^{\\infty}dxp(x)\\exp{\\left(iq(\\mu-x)/m\\right)}=\n", @@ -3798,8 +4133,10 @@ }, { "cell_type": "markdown", - 
"id": "de3350f6", - "metadata": {}, + "id": "3f94fc13", + "metadata": { + "editable": true + }, "source": [ "## Identifying Terms\n", "\n", @@ -3809,8 +4146,10 @@ }, { "cell_type": "markdown", - "id": "23267d7e", - "metadata": {}, + "id": "9437e2c7", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\int_{-\\infty}^{\\infty}dxp(x)e^{\\left(iq(\\mu-x)/m\\right)}=\n", @@ -3820,16 +4159,20 @@ }, { "cell_type": "markdown", - "id": "226eb6f6", - "metadata": {}, + "id": "bb47dde2", + "metadata": { + "editable": true + }, "source": [ "resulting in" ] }, { "cell_type": "markdown", - "id": "5137d145", - "metadata": {}, + "id": "e4831f7c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\left[\\int_{-\\infty}^{\\infty}dxp(x)\\exp{\\left(iq(\\mu-x)/m\\right)}\\right]^m\\approx\n", @@ -3839,16 +4182,20 @@ }, { "cell_type": "markdown", - "id": "e6ff93f6", - "metadata": {}, + "id": "f99710c8", + "metadata": { + "editable": true + }, "source": [ "and in the limit $m\\rightarrow \\infty$ we obtain" ] }, { "cell_type": "markdown", - "id": "da8b2b1a", - "metadata": {}, + "id": "c28fdadf", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\tilde{p}(z)=\\frac{1}{\\sqrt{2\\pi}(\\sigma/\\sqrt{m})}\n", @@ -3858,8 +4205,10 @@ }, { "cell_type": "markdown", - "id": "b3c5c051", - "metadata": {}, + "id": "5912be3b", + "metadata": { + "editable": true + }, "source": [ "which is the normal distribution with variance\n", "$\\sigma^2_m=\\sigma^2/m$, where $\\sigma$ is the variance of the PDF $p(x)$\n", @@ -3868,8 +4217,10 @@ }, { "cell_type": "markdown", - "id": "06dd6575", - "metadata": {}, + "id": "719d7288", + "metadata": { + "editable": true + }, "source": [ "## Wrapping it up\n", "\n", @@ -3885,8 +4236,10 @@ }, { "cell_type": "markdown", - "id": "bac1f242", - "metadata": {}, + "id": "3de2aead", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\sigma_m=\n", @@ -3896,8 +4249,10 @@ }, { "cell_type": "markdown", - "id": "96b515b5", - "metadata": {}, + "id": "2e6028b6", + "metadata": { + "editable": true + }, "source": [ "The latter is true only if the average value is known exactly. This is obtained in the limit\n", "$m\\rightarrow \\infty$ only. Because the mean and the variance are measured quantities we obtain \n", @@ -3906,8 +4261,10 @@ }, { "cell_type": "markdown", - "id": "3e0528cc", - "metadata": {}, + "id": "91809444", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\sigma_m\\approx \n", @@ -3917,8 +4274,10 @@ }, { "cell_type": "markdown", - "id": "8b1ceed8", - "metadata": {}, + "id": "ebc4cced", + "metadata": { + "editable": true + }, "source": [ "In many cases however the above estimate for the standard deviation,\n", "in particular if correlations are strong, may be too simplistic. 
Keep\n", @@ -3935,8 +4294,10 @@ }, { "cell_type": "markdown", - "id": "267c90d6", - "metadata": {}, + "id": "927fa786", + "metadata": { + "editable": true + }, "source": [ "## Confidence Intervals\n", "\n", @@ -3956,8 +4317,10 @@ }, { "cell_type": "markdown", - "id": "3a9aef51", - "metadata": {}, + "id": "e165b45a", + "metadata": { + "editable": true + }, "source": [ "## Standard Approach based on the Normal Distribution\n", "\n", @@ -3969,8 +4332,10 @@ }, { "cell_type": "markdown", - "id": "bd85a32f", - "metadata": {}, + "id": "a2d17c0f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\left(\\mu_{\\beta}\\pm \\frac{z\\sigma_{\\beta}}{\\sqrt{n}}\\right),\n", @@ -3979,8 +4344,10 @@ }, { "cell_type": "markdown", - "id": "5cafffd4", - "metadata": {}, + "id": "b1f0a225", + "metadata": { + "editable": true + }, "source": [ "where $z$ defines the level of certainty (or confidence). For a normal\n", "distribution typical parameters are $z=2.576$ which corresponds to a\n", @@ -3997,8 +4364,10 @@ }, { "cell_type": "markdown", - "id": "3547420c", - "metadata": {}, + "id": "1670f8f5", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: Bootstrap background\n", "\n", @@ -4015,8 +4384,10 @@ }, { "cell_type": "markdown", - "id": "57d59b35", - "metadata": {}, + "id": "a23b34bd", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: More Bootstrap background\n", "\n", @@ -4037,8 +4408,10 @@ }, { "cell_type": "markdown", - "id": "2009b293", - "metadata": {}, + "id": "b60714c3", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: Bootstrap approach\n", "\n", @@ -4056,8 +4429,10 @@ }, { "cell_type": "markdown", - "id": "3b3f7c23", - "metadata": {}, + "id": "2797c438", + "metadata": { + "editable": true + }, "source": [ "## Resampling methods: Bootstrap steps\n", "\n", @@ -4084,8 +4459,10 @@ }, { "cell_type": "markdown", - "id": "e315f571", - "metadata": {}, + "id": "8aa53514", + "metadata": { + "editable": true + }, "source": [ "## Code example for the Bootstrap method\n", "\n", @@ -4106,8 +4483,11 @@ { "cell_type": "code", "execution_count": 8, - "id": "e97da267", - "metadata": {}, + "id": "7e395f53", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import numpy as np\n", @@ -4140,16 +4520,20 @@ }, { "cell_type": "markdown", - "id": "7bfbf631", - "metadata": {}, + "id": "5ef4e750", + "metadata": { + "editable": true + }, "source": [ "We see that our new variance and from that the standard deviation, agrees with the central limit theorem." 
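A natural follow-up, not part of the lecture code above, is to turn the bootstrapped means into a confidence interval and compare it with the normal-approximation interval $\mu\pm z\sigma/\sqrt{n}$ discussed earlier. The sketch below is a minimal, self-contained example; the exponential toy data, the sample size and the number of bootstrap replicas are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(2023)
x = rng.exponential(scale=1.0, size=500)   # toy data set (assumption)
n_boot = 10000                             # number of bootstrap replicas (assumption)

# bootstrap distribution of the sample mean
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(n_boot)])

# 95% percentile interval directly from the bootstrap distribution
lower, upper = np.percentile(boot_means, [2.5, 97.5])

# normal-approximation interval mu +/- z*sigma/sqrt(n) with z = 1.96
mu, sigma = x.mean(), x.std(ddof=1)
z = 1.96
print(f"Bootstrap 95% CI:      [{lower:.4f}, {upper:.4f}]")
print(f"Normal approx. 95% CI: [{mu - z*sigma/np.sqrt(x.size):.4f}, "
      f"{mu + z*sigma/np.sqrt(x.size):.4f}]")
```

For well-behaved means the two intervals should agree closely, in line with the central limit theorem.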
] }, { "cell_type": "markdown", - "id": "5a1c8639", - "metadata": {}, + "id": "f7e08e78", + "metadata": { + "editable": true + }, "source": [ "## Plotting the Histogram" ] @@ -4157,8 +4541,11 @@ { "cell_type": "code", "execution_count": 9, - "id": "a86d297a", - "metadata": {}, + "id": "9b80e633", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "# the histogram of the bootstrapped data (normalized data if density = True)\n", @@ -4174,8 +4561,10 @@ }, { "cell_type": "markdown", - "id": "4c024047", - "metadata": {}, + "id": "9a8b0f53", + "metadata": { + "editable": true + }, "source": [ "## The bias-variance tradeoff\n", "\n", @@ -4190,8 +4579,10 @@ }, { "cell_type": "markdown", - "id": "a7a56e18", - "metadata": {}, + "id": "8196e56f", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{y}=f(\\boldsymbol{x}) + \\boldsymbol{\\epsilon}\n", @@ -4200,8 +4591,10 @@ }, { "cell_type": "markdown", - "id": "3e18c5de", - "metadata": {}, + "id": "73f47045", + "metadata": { + "editable": true + }, "source": [ "where $\\epsilon$ is normally distributed with mean zero and standard deviation $\\sigma^2$.\n", "\n", @@ -4215,8 +4608,10 @@ }, { "cell_type": "markdown", - "id": "bcce7420", - "metadata": {}, + "id": "6270c25d", + "metadata": { + "editable": true + }, "source": [ "$$\n", "C(\\boldsymbol{X},\\boldsymbol{\\beta}) =\\frac{1}{n}\\sum_{i=0}^{n-1}(y_i-\\tilde{y}_i)^2=\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right].\n", @@ -4225,16 +4620,20 @@ }, { "cell_type": "markdown", - "id": "121433f0", - "metadata": {}, + "id": "38f50b48", + "metadata": { + "editable": true + }, "source": [ "We can rewrite this as" ] }, { "cell_type": "markdown", - "id": "94d035af", - "metadata": {}, + "id": "ea869e50", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right]=\\frac{1}{n}\\sum_i(f_i-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right])^2+\\frac{1}{n}\\sum_i(\\tilde{y}_i-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right])^2+\\sigma^2.\n", @@ -4243,8 +4642,10 @@ }, { "cell_type": "markdown", - "id": "b6502ac4", - "metadata": {}, + "id": "779e7714", + "metadata": { + "editable": true + }, "source": [ "The three terms represent the square of the bias of the learning\n", "method, which can be thought of as the error caused by the simplifying\n", @@ -4258,8 +4659,10 @@ }, { "cell_type": "markdown", - "id": "66910ad4", - "metadata": {}, + "id": "0e3bc25d", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right]=\\mathbb{E}\\left[(\\boldsymbol{f}+\\boldsymbol{\\epsilon}-\\boldsymbol{\\tilde{y}})^2\\right],\n", @@ -4268,16 +4671,20 @@ }, { "cell_type": "markdown", - "id": "af07a8d0", - "metadata": {}, + "id": "dd6346d4", + "metadata": { + "editable": true + }, "source": [ "and adding and subtracting $\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right]$ we get" ] }, { "cell_type": "markdown", - "id": "56e8518d", - "metadata": {}, + "id": "a85c7fe2", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right]=\\mathbb{E}\\left[(\\boldsymbol{f}+\\boldsymbol{\\epsilon}-\\boldsymbol{\\tilde{y}}+\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right]-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right])^2\\right],\n", @@ -4286,16 +4693,20 @@ }, { "cell_type": "markdown", - "id": "259c8f56", - "metadata": {}, + "id": "a474a09e", + 
"metadata": { + "editable": true + }, "source": [ "which, using the abovementioned expectation values can be rewritten as" ] }, { "cell_type": "markdown", - "id": "96fcb975", - "metadata": {}, + "id": "f8392d4c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right]=\\mathbb{E}\\left[(\\boldsymbol{y}-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right])^2\\right]+\\mathrm{Var}\\left[\\boldsymbol{\\tilde{y}}\\right]+\\sigma^2,\n", @@ -4304,16 +4715,20 @@ }, { "cell_type": "markdown", - "id": "ba665645", - "metadata": {}, + "id": "88d30a5d", + "metadata": { + "editable": true + }, "source": [ "that is the rewriting in terms of the so-called bias, the variance of the model $\\boldsymbol{\\tilde{y}}$ and the variance of $\\boldsymbol{\\epsilon}$." ] }, { "cell_type": "markdown", - "id": "0d692042", - "metadata": {}, + "id": "9aa0a436", + "metadata": { + "editable": true + }, "source": [ "## A way to Read the Bias-Variance Tradeoff\n", "\n", @@ -4326,8 +4741,10 @@ }, { "cell_type": "markdown", - "id": "149e2573", - "metadata": {}, + "id": "23654b68", + "metadata": { + "editable": true + }, "source": [ "## Example code for Bias-Variance tradeoff" ] @@ -4335,8 +4752,11 @@ { "cell_type": "code", "execution_count": 10, - "id": "49178e41", - "metadata": {}, + "id": "fcb04b3d", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import matplotlib.pyplot as plt\n", @@ -4397,8 +4817,10 @@ }, { "cell_type": "markdown", - "id": "7572388e", - "metadata": {}, + "id": "c4f13eaa", + "metadata": { + "editable": true + }, "source": [ "## Understanding what happens" ] @@ -4406,8 +4828,11 @@ { "cell_type": "code", "execution_count": 11, - "id": "27bde540", - "metadata": {}, + "id": "b4212a78", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import matplotlib.pyplot as plt\n", @@ -4460,8 +4885,10 @@ }, { "cell_type": "markdown", - "id": "23ae6594", - "metadata": {}, + "id": "15e01f77", + "metadata": { + "editable": true + }, "source": [ "## Summing up\n", "\n", @@ -4496,8 +4923,10 @@ }, { "cell_type": "markdown", - "id": "168bc9ba", - "metadata": {}, + "id": "eec42478", + "metadata": { + "editable": true + }, "source": [ "## Another Example from Scikit-Learn's Repository" ] @@ -4505,8 +4934,11 @@ { "cell_type": "code", "execution_count": 12, - "id": "0b5340dc", - "metadata": {}, + "id": "8fcab853", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "\"\"\"\n", @@ -4584,8 +5016,10 @@ }, { "cell_type": "markdown", - "id": "837fbf8e", - "metadata": {}, + "id": "49f69f02", + "metadata": { + "editable": true + }, "source": [ "## Various steps in cross-validation\n", "\n", @@ -4607,8 +5041,10 @@ }, { "cell_type": "markdown", - "id": "b3ea4b1f", - "metadata": {}, + "id": "2a43c500", + "metadata": { + "editable": true + }, "source": [ "## How to set up the cross-validation for Ridge and/or Lasso\n", "\n", @@ -4621,8 +5057,10 @@ }, { "cell_type": "markdown", - "id": "cce015ce", - "metadata": {}, + "id": "3bf76366", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\begin{align*}\n", @@ -4635,8 +5073,10 @@ }, { "cell_type": "markdown", - "id": "61403da6", - "metadata": {}, + "id": "2865226e", + "metadata": { + "editable": true + }, "source": [ "* Evaluate the prediction performance of these models on the test set by $[y_i, \\boldsymbol{X}_{i, \\ast}; \\boldsymbol{\\beta}_{-i}(\\lambda), 
\\boldsymbol{\\sigma}_{-i}^2(\\lambda)]$. Or, by the prediction error $|y_i - \\boldsymbol{X}_{i, \\ast} \\boldsymbol{\\beta}_{-i}(\\lambda)|$, the relative error, the error squared or the R2 score function.\n", "\n", @@ -4647,8 +5087,10 @@ }, { "cell_type": "markdown", - "id": "bbd26e8b", - "metadata": {}, + "id": "c28a4bda", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\begin{align*}\n", @@ -4659,8 +5101,10 @@ }, { "cell_type": "markdown", - "id": "56cf97f5", - "metadata": {}, + "id": "4328f2a6", + "metadata": { + "editable": true + }, "source": [ "## Cross-validation in brief\n", "\n", @@ -4685,8 +5129,10 @@ }, { "cell_type": "markdown", - "id": "d2161943", - "metadata": {}, + "id": "f8ab4371", + "metadata": { + "editable": true + }, "source": [ "## Code Example for Cross-validation and $k$-fold Cross-validation\n", "\n", @@ -4696,8 +5142,11 @@ { "cell_type": "code", "execution_count": 13, - "id": "280eff45", - "metadata": {}, + "id": "8a7f8126", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "import numpy as np\n", @@ -4793,8 +5242,10 @@ }, { "cell_type": "markdown", - "id": "6c93e890", - "metadata": {}, + "id": "e5d7cb66", + "metadata": { + "editable": true + }, "source": [ "## More examples on bootstrap and cross-validation and errors" ] @@ -4802,8 +5253,11 @@ { "cell_type": "code", "execution_count": 14, - "id": "8ac1b77b", - "metadata": {}, + "id": "03d23637", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "# Common imports\n", @@ -4888,16 +5342,20 @@ }, { "cell_type": "markdown", - "id": "501003fd", - "metadata": {}, + "id": "211fa442", + "metadata": { + "editable": true + }, "source": [ "Note that we kept the intercept column in the fitting here. This means that we need to set the **intercept** in the call to the **Scikit-Learn** function as **False**. Alternatively, we could have set up the design matrix $X$ without the first column of ones." 
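To make the intercept remark above concrete, here is a minimal sketch (the data, noise level and polynomial degree are illustrative assumptions) showing that the two options give the same fit: keep the column of ones and call **LinearRegression** with **fit_intercept=False**, or drop the column and let **fit_intercept=True** handle the intercept.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 2.0 + 3.0*x - 1.5*x**2 + 0.1*rng.standard_normal(50)

# design matrix with an explicit column of ones
X_with_ones = np.column_stack([np.ones_like(x), x, x**2])

# option 1: keep the ones column, switch off the built-in intercept
ols1 = LinearRegression(fit_intercept=False).fit(X_with_ones, y)

# option 2: drop the ones column, let Scikit-Learn fit the intercept
ols2 = LinearRegression(fit_intercept=True).fit(X_with_ones[:, 1:], y)

print(ols1.coef_)                    # [intercept, beta_1, beta_2]
print(ols2.intercept_, ols2.coef_)   # intercept separately, then [beta_1, beta_2]
print(np.allclose(ols1.predict(X_with_ones), ols2.predict(X_with_ones[:, 1:])))
```

Both options minimize the same least-squares problem, so the predictions coincide; only the bookkeeping of the intercept differs.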
] }, { "cell_type": "markdown", - "id": "fe6913fe", - "metadata": {}, + "id": "e42d4d92", + "metadata": { + "editable": true + }, "source": [ "## The same example but now with cross-validation\n", "\n", @@ -4907,8 +5365,11 @@ { "cell_type": "code", "execution_count": 15, - "id": "cddd5a5c", - "metadata": {}, + "id": "9cba801d", + "metadata": { + "collapsed": false, + "editable": true + }, "outputs": [], "source": [ "# Common imports\n", @@ -4982,8 +5443,10 @@ }, { "cell_type": "markdown", - "id": "00dbf855", - "metadata": {}, + "id": "c0142fc4", + "metadata": { + "editable": true + }, "source": [ "## Overarching aims of the exercises this week\n", "\n", @@ -5005,8 +5468,10 @@ }, { "cell_type": "markdown", - "id": "96a5f8dd", - "metadata": {}, + "id": "80c71c8c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{y} = f(\\boldsymbol{x})+\\boldsymbol{\\varepsilon}\n", @@ -5015,8 +5480,10 @@ }, { "cell_type": "markdown", - "id": "77a692f8", - "metadata": {}, + "id": "960da1af", + "metadata": { + "editable": true + }, "source": [ "We then approximate this function $f(\\boldsymbol{x})$ with our model $\\boldsymbol{\\tilde{y}}$ from the solution of the linear regression equations (ordinary least squares OLS), that is our\n", "function $f$ is approximated by $\\boldsymbol{\\tilde{y}}$ where we minimized $(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2$, with" @@ -5024,8 +5491,10 @@ }, { "cell_type": "markdown", - "id": "bbbedeba", - "metadata": {}, + "id": "3cc81e1c", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\boldsymbol{\\tilde{y}} = \\boldsymbol{X}\\boldsymbol{\\beta}.\n", @@ -5034,16 +5503,20 @@ }, { "cell_type": "markdown", - "id": "3a58e150", - "metadata": {}, + "id": "e09d6ec4", + "metadata": { + "editable": true + }, "source": [ "The matrix $\\boldsymbol{X}$ is the so-called design or feature matrix." 
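As a small illustration of how such a design matrix can be set up in practice (the test function, the number of data points and the polynomial degree below are assumptions made for the sake of the example), one may build $X$ column by column and obtain $\boldsymbol{\tilde{y}}=\boldsymbol{X}\boldsymbol{\beta}$ from the pseudoinverse.

```python
import numpy as np

rng = np.random.default_rng(42)
n, degree = 100, 5                      # illustrative choices
x = np.linspace(0, 1, n)
y = np.exp(-x**2) + 1.5*np.exp(-(x - 2)**2) + 0.1*rng.standard_normal(n)

# design (feature) matrix with columns 1, x, x^2, ..., x^degree
X = np.vander(x, N=degree + 1, increasing=True)

# OLS parameters, beta = (X^T X)^{-1} X^T y, via the pseudoinverse
beta = np.linalg.pinv(X.T @ X) @ X.T @ y
ytilde = X @ beta                       # the model prediction

print("MSE:", np.mean((y - ytilde)**2))
```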
] }, { "cell_type": "markdown", - "id": "792c0c41", - "metadata": {}, + "id": "b4c428ce", + "metadata": { + "editable": true + }, "source": [ "## Exercise 1: Expectation values for ordinary least squares expressions\n", "\n", @@ -5052,8 +5525,10 @@ }, { "cell_type": "markdown", - "id": "be05a547", - "metadata": {}, + "id": "76925bba", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}(y_i) =\\sum_{j}x_{ij} \\beta_j=\\mathbf{X}_{i, \\ast} \\, \\boldsymbol{\\beta},\n", @@ -5062,8 +5537,10 @@ }, { "cell_type": "markdown", - "id": "a3362a89", - "metadata": {}, + "id": "661303fc", + "metadata": { + "editable": true + }, "source": [ "and that\n", "its variance is" @@ -5071,8 +5548,10 @@ }, { "cell_type": "markdown", - "id": "6c741120", - "metadata": {}, + "id": "4dfa5687", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mbox{Var}(y_i) = \\sigma^2.\n", @@ -5081,8 +5560,10 @@ }, { "cell_type": "markdown", - "id": "261662ab", - "metadata": {}, + "id": "9773149b", + "metadata": { + "editable": true + }, "source": [ "Hence, $y_i \\sim N( \\mathbf{X}_{i, \\ast} \\, \\boldsymbol{\\beta}, \\sigma^2)$, that is $\\boldsymbol{y}$ follows a normal distribution with \n", "mean value $\\boldsymbol{X}\\boldsymbol{\\beta}$ and variance $\\sigma^2$.\n", @@ -5092,8 +5573,10 @@ }, { "cell_type": "markdown", - "id": "98f68e9b", - "metadata": {}, + "id": "48801ec9", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E}(\\boldsymbol{\\hat{\\beta}}) = \\boldsymbol{\\beta}.\n", @@ -5102,16 +5585,20 @@ }, { "cell_type": "markdown", - "id": "62d36b90", - "metadata": {}, + "id": "a57119a4", + "metadata": { + "editable": true + }, "source": [ "Show finally that the variance of $\\boldsymbol{\\boldsymbol{\\beta}}$ is" ] }, { "cell_type": "markdown", - "id": "cdd7640b", - "metadata": {}, + "id": "98bd6f19", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mbox{Var}(\\boldsymbol{\\hat{\\beta}}) = \\sigma^2 \\, (\\mathbf{X}^{T} \\mathbf{X})^{-1}.\n", @@ -5120,8 +5607,10 @@ }, { "cell_type": "markdown", - "id": "85f6cbfd", - "metadata": {}, + "id": "bd33f144", + "metadata": { + "editable": true + }, "source": [ "We can use the last expression when we define a [so-called confidence interval](https://en.wikipedia.org/wiki/Confidence_interval) for the parameters $\\beta$. \n", "A given parameter $\\beta_j$ is given by the diagonal matrix element of the above matrix." 
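The expressions above can also be checked numerically. The following sketch is not a substitute for the analytical derivation asked for in the exercise; it simply simulates many noisy data sets from an assumed quadratic model and compares the empirical mean and variance of $\hat{\boldsymbol{\beta}}$ with $\boldsymbol{\beta}$ and the diagonal of $\sigma^2(\mathbf{X}^T\mathbf{X})^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 200, 0.5                          # assumptions for the check
x = np.linspace(0, 1, n)
X = np.vander(x, N=3, increasing=True)       # columns 1, x, x^2
beta_true = np.array([1.0, -2.0, 3.0])

n_trials = 5000
betas = np.empty((n_trials, 3))
for k in range(n_trials):
    y = X @ beta_true + sigma*rng.standard_normal(n)
    betas[k] = np.linalg.pinv(X.T @ X) @ X.T @ y   # OLS estimate for this data set

print("mean of beta-hat:", betas.mean(axis=0))                       # close to beta_true
print("analytic var:    ", np.diag(sigma**2 * np.linalg.inv(X.T @ X)))
print("empirical var:   ", betas.var(axis=0))
```

The empirical variances of the individual parameters should approach the diagonal elements of $\sigma^2(\mathbf{X}^T\mathbf{X})^{-1}$, which is what one uses to build confidence intervals for the $\beta_j$.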
@@ -5129,8 +5618,10 @@ }, { "cell_type": "markdown", - "id": "9ddba3ca", - "metadata": {}, + "id": "b35f3e96", + "metadata": { + "editable": true + }, "source": [ "## Exercise 2: Expectation values for Ridge regression\n", "\n", @@ -5139,8 +5630,10 @@ }, { "cell_type": "markdown", - "id": "45aa8dd2", - "metadata": {}, + "id": "d3c4c45a", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mathbb{E} \\big[ \\hat{\\boldsymbol{\\beta}}^{\\mathrm{Ridge}} \\big]=(\\mathbf{X}^{T} \\mathbf{X} + \\lambda \\mathbf{I}_{pp})^{-1} (\\mathbf{X}^{\\top} \\mathbf{X})\\boldsymbol{\\beta}.\n", @@ -5149,8 +5642,10 @@ }, { "cell_type": "markdown", - "id": "32a9f4d1", - "metadata": {}, + "id": "9c302b69", + "metadata": { + "editable": true + }, "source": [ "We see clearly that\n", "$\\mathbb{E} \\big[ \\hat{\\boldsymbol{\\beta}}^{\\mathrm{Ridge}} \\big] \\not= \\mathbb{E} \\big[\\hat{\\boldsymbol{\\beta}}^{\\mathrm{OLS}}\\big ]$ for any $\\lambda > 0$.\n", @@ -5160,8 +5655,10 @@ }, { "cell_type": "markdown", - "id": "5a3613f1", - "metadata": {}, + "id": "bbcfa633", + "metadata": { + "editable": true + }, "source": [ "$$\n", "\\mbox{Var}[\\hat{\\boldsymbol{\\beta}}^{\\mathrm{Ridge}}]=\\sigma^2[ \\mathbf{X}^{T} \\mathbf{X} + \\lambda \\mathbf{I} ]^{-1} \\mathbf{X}^{T}\\mathbf{X} \\{ [ \\mathbf{X}^{\\top} \\mathbf{X} + \\lambda \\mathbf{I} ]^{-1}\\}^{T},\n", @@ -5170,32 +5667,165 @@ }, { "cell_type": "markdown", - "id": "6760b10e", - "metadata": {}, + "id": "af584cdb", + "metadata": { + "editable": true + }, "source": [ "and it is easy to see that if the parameter $\\lambda$ goes to infinity then the variance of the Ridge parameters $\\boldsymbol{\\beta}$ goes to zero." ] + }, + { + "cell_type": "markdown", + "id": "3c9ef45f", + "metadata": { + "editable": true + }, + "source": [ + "## Exercise 3: Bias-Variance tradeoff\n", + "\n", + "The aim of the exercises is to derive the equations for the bias-variance tradeoff to be used in project 1 as well as testing this for a simpler function using the bootstrap method. 
\n", + "\n", + "Consider a\n", + "dataset $\\mathcal{L}$ consisting of the data\n", + "$\\mathbf{X}_\\mathcal{L}=\\{(y_j, \\boldsymbol{x}_j), j=0\\ldots n-1\\}$.\n", + "\n", + "We assume that the true data is generated from a noisy model" + ] + }, + { + "cell_type": "markdown", + "id": "3459f52e", + "metadata": { + "editable": true + }, + "source": [ + "$$\n", + "\\boldsymbol{y}=f(\\boldsymbol{x}) + \\boldsymbol{\\epsilon}.\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "id": "061f9980", + "metadata": { + "editable": true + }, + "source": [ + "Here $\\epsilon$ is normally distributed with mean zero and standard\n", + "deviation $\\sigma^2$.\n", + "\n", + "In our derivation of the ordinary least squares method we defined \n", + "an approximation to the function $f$ in terms of the parameters\n", + "$\\boldsymbol{\\beta}$ and the design matrix $\\boldsymbol{X}$ which embody our model,\n", + "that is $\\boldsymbol{\\tilde{y}}=\\boldsymbol{X}\\boldsymbol{\\beta}$.\n", + "\n", + "The parameters $\\boldsymbol{\\beta}$ are in turn found by optimizing the mean\n", + "squared error via the so-called cost function" + ] + }, + { + "cell_type": "markdown", + "id": "65fe1ada", + "metadata": { + "editable": true + }, + "source": [ + "$$\n", + "C(\\boldsymbol{X},\\boldsymbol{\\beta}) =\\frac{1}{n}\\sum_{i=0}^{n-1}(y_i-\\tilde{y}_i)^2=\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right].\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "id": "cbea2b54", + "metadata": { + "editable": true + }, + "source": [ + "Here the expected value $\\mathbb{E}$ is the sample value. \n", + "\n", + "Show that you can rewrite this in terms of a term which contains the variance of the model itself (the so-called variance term), a\n", + "term which measures the deviation from the true data and the mean value of the model (the bias term) and finally the variance of the noise.\n", + "That is, show that" + ] + }, + { + "cell_type": "markdown", + "id": "b1d514f5", + "metadata": { + "editable": true + }, + "source": [ + "$$\n", + "\\mathbb{E}\\left[(\\boldsymbol{y}-\\boldsymbol{\\tilde{y}})^2\\right]=\\mathrm{Bias}[\\tilde{y}]+\\mathrm{var}[\\tilde{y}]+\\sigma^2,\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "id": "e14d012c", + "metadata": { + "editable": true + }, + "source": [ + "with" + ] + }, + { + "cell_type": "markdown", + "id": "88c42b43", + "metadata": { + "editable": true + }, + "source": [ + "$$\n", + "\\mathrm{Bias}[\\tilde{y}]=\\mathbb{E}\\left[\\left(\\boldsymbol{y}-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right]\\right)^2\\right],\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "id": "035dd127", + "metadata": { + "editable": true + }, + "source": [ + "and" + ] + }, + { + "cell_type": "markdown", + "id": "e594908a", + "metadata": { + "editable": true + }, + "source": [ + "$$\n", + "\\mathrm{var}[\\tilde{y}]=\\mathbb{E}\\left[\\left(\\tilde{\\boldsymbol{y}}-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right]\\right)^2\\right]=\\frac{1}{n}\\sum_i(\\tilde{y}_i-\\mathbb{E}\\left[\\boldsymbol{\\tilde{y}}\\right])^2.\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "id": "0e6d9419", + "metadata": { + "editable": true + }, + "source": [ + "Explain what the terms mean and discuss their interpretations.\n", + "\n", + "Perform then a bias-variance analysis of a simple one-dimensional (or other models of your choice) function by\n", + "studying the MSE value as function of the complexity of your model. 
Use ordinary least squares only.\n", + "\n", + "Discuss the bias and variance trade-off as function\n", + "of your model complexity (the degree of the polynomial) and the number\n", + "of data points, and possibly also your training and test data using the **bootstrap** resampling method.\n", + "You can follow the code example in the jupyter-book at ." + ] } ], - "metadata": { - "kernelspec": { - "display_name": "Python 3 (ipykernel)", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.9.10" - } - }, + "metadata": {}, "nbformat": 4, "nbformat_minor": 5 } diff --git a/doc/pub/day3/ipynb/ipynb-day3-src.tar.gz b/doc/pub/day3/ipynb/ipynb-day3-src.tar.gz index ef39b82..c27df2b 100644 Binary files a/doc/pub/day3/ipynb/ipynb-day3-src.tar.gz and b/doc/pub/day3/ipynb/ipynb-day3-src.tar.gz differ diff --git a/doc/pub/day3/pdf/day3.pdf b/doc/pub/day3/pdf/day3.pdf index fa1bf66..13be624 100644 Binary files a/doc/pub/day3/pdf/day3.pdf and b/doc/pub/day3/pdf/day3.pdf differ diff --git a/doc/src/Day3/Day3.do.txt b/doc/src/Day3/Day3.do.txt index 9f56e1d..60d8068 100644 --- a/doc/src/Day3/Day3.do.txt +++ b/doc/src/Day3/Day3.do.txt @@ -1,14 +1,14 @@ TITLE: Data Analysis and Machine Learning: Ridge and Lasso Regression and Resampling Methods AUTHOR: Morten Hjorth-Jensen {copyright, 1999-present|CC BY-NC} at Department of Physics and Center for Computing in Science Education, University of Oslo, Norway & Department of Physics and Astronomy and Facility for Rare Isotope Beams and National Superconducting Cyclotron Laboratory, Michigan State University, USA -DATE: October 15 and 22, 2023 +DATE: October 16 and 23, 2023 !split ===== Plans for Sessions 4-6 ===== * More on Ridge and Lasso Regression * Statistics, probability theory and resampling methods - * "Video of Lecture October 15 to be added":"https://youtu.be/" - * "Video of Lecture October 22 to be added":"https://youtu.be/" + * "Video of Lecture October 16 to be added":"https://youtu.be/iqRKUPJr_bY" + * "Video of Lecture October 23 to be added":"https://youtu.be/" !split @@ -2972,4 +2972,73 @@ and it is easy to see that if the parameter $\lambda$ goes to infinity then the +===== Exercise: Bias-Variance tradeoff ===== + +The aim of the exercises is to derive the equations for the bias-variance tradeoff to be used in project 1 as well as testing this for a simpler function using the bootstrap method. + +Consider a +dataset $\mathcal{L}$ consisting of the data +$\mathbf{X}_\mathcal{L}=\{(y_j, \boldsymbol{x}_j), j=0\ldots n-1\}$. + +We assume that the true data is generated from a noisy model + +!bt +\[ +\bm{y}=f(\boldsymbol{x}) + \bm{\epsilon}. +\] +!et + +Here $\epsilon$ is normally distributed with mean zero and standard +deviation $\sigma^2$. + +In our derivation of the ordinary least squares method we defined +an approximation to the function $f$ in terms of the parameters +$\bm{\beta}$ and the design matrix $\bm{X}$ which embody our model, +that is $\bm{\tilde{y}}=\bm{X}\bm{\beta}$. + +The parameters $\bm{\beta}$ are in turn found by optimizing the mean +squared error via the so-called cost function + +!bt +\[ +C(\bm{X},\bm{\beta}) =\frac{1}{n}\sum_{i=0}^{n-1}(y_i-\tilde{y}_i)^2=\mathbb{E}\left[(\bm{y}-\bm{\tilde{y}})^2\right]. +\] +!et +Here the expected value $\mathbb{E}$ is the sample value. 
+ +Show that you can rewrite this in terms of a term which contains the variance of the model itself (the so-called variance term), a +term which measures the deviation from the true data and the mean value of the model (the bias term) and finally the variance of the noise. +That is, show that +!bt +\[ +\mathbb{E}\left[(\bm{y}-\bm{\tilde{y}})^2\right]=\mathrm{Bias}[\tilde{y}]+\mathrm{var}[\tilde{y}]+\sigma^2, +\] +!et +with +!bt +\[ +\mathrm{Bias}[\tilde{y}]=\mathbb{E}\left[\left(\bm{y}-\mathbb{E}\left[\bm{\tilde{y}}\right]\right)^2\right], +\] +!et +and +!bt +\[ +\mathrm{var}[\tilde{y}]=\mathbb{E}\left[\left(\tilde{\bm{y}}-\mathbb{E}\left[\bm{\tilde{y}}\right]\right)^2\right]=\frac{1}{n}\sum_i(\tilde{y}_i-\mathbb{E}\left[\bm{\tilde{y}}\right])^2. +\] +!et + + + +Explain what the terms mean and discuss their interpretations. + +Perform then a bias-variance analysis of a simple one-dimensional (or other models of your choice) function by +studying the MSE value as function of the complexity of your model. Use ordinary least squares only. + +Discuss the bias and variance trade-off as function +of your model complexity (the degree of the polynomial) and the number +of data points, and possibly also your training and test data using the _bootstrap_ resampling method. +You can follow the code example in the jupyter-book at URL:"https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/chapter3.html#the-bias-variance-tradeoff". + + +
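A possible starting point for the numerical part is sketched below. It mirrors the structure of the jupyter-book example linked above; the test function, the number of data points, the number of bootstraps and the maximum polynomial degree are all illustrative choices that should be adapted to your own study.

!bc pycod
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.utils import resample

rng = np.random.default_rng(2023)
n, n_boot, maxdegree = 100, 200, 10          # illustrative choices
x = np.linspace(-1, 1, n).reshape(-1, 1)
y = np.exp(-x**2) + 1.5*np.exp(-(x - 2)**2) + 0.1*rng.standard_normal((n, 1))

x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)

for degree in range(1, maxdegree + 1):
    model = make_pipeline(PolynomialFeatures(degree),
                          LinearRegression(fit_intercept=False))
    y_pred = np.empty((y_test.shape[0], n_boot))
    # refit the model on n_boot bootstrap resamples of the training data
    for b in range(n_boot):
        x_, y_ = resample(x_train, y_train)
        y_pred[:, b] = model.fit(x_, y_).predict(x_test).ravel()
    # decompose the test error into (squared) bias, variance and noise
    error = np.mean((y_test - y_pred)**2)
    bias2 = np.mean((y_test - np.mean(y_pred, axis=1, keepdims=True))**2)
    variance = np.mean(np.var(y_pred, axis=1, keepdims=True))
    print(f"degree {degree:2d}: error={error:.4f} bias^2={bias2:.4f} variance={variance:.4f}")
!ec

As the polynomial degree grows, the bias term should decrease while the variance term increases, and the test error typically goes through a minimum at an intermediate model complexity.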