correcting typos
mhjensen committed Jan 15, 2024
1 parent 58dd43f commit 2d6d397
Showing 8 changed files with 1,836 additions and 131 deletions.
298 changes: 294 additions & 4 deletions doc/pub/week1/html/week1-bs.html
@@ -212,7 +212,30 @@
('Mathematics of deep learning and neural networks',
2,
None,
'mathematics-of-deep-learning-and-neural-networks')]}
'mathematics-of-deep-learning-and-neural-networks'),
('Vectors', 2, None, 'vectors'),
('Outer products', 2, None, 'outer-products'),
('Basic Matrix Features', 2, None, 'basic-matrix-features'),
('Basic Matrix Features', 2, None, 'basic-matrix-features'),
('Basic Matrix Features', 2, None, 'basic-matrix-features'),
('Some famous Matrices', 2, None, 'some-famous-matrices'),
('Basic Matrix Features', 2, None, 'basic-matrix-features'),
('Important Mathematical Operations',
2,
None,
'important-mathematical-operations'),
('Important Mathematical Operations',
2,
None,
'important-mathematical-operations'),
('Important Mathematical Operations',
2,
None,
'important-mathematical-operations'),
('Important Mathematical Operations',
2,
None,
'important-mathematical-operations')]}
end of tocinfo -->

<body>
@@ -297,6 +320,17 @@
<!-- navigation toc: --> <li><a href="#bayes-theorem" style="font-size: 80%;">Bayes' Theorem</a></li>
<!-- navigation toc: --> <li><a href="#quantified-limits-of-the-nuclear-landscape-https-journals-aps-org-prc-abstract-10-1103-physrevc-101-044307" style="font-size: 80%;">"Quantified limits of the nuclear landscape":"https://journals.aps.org/prc/abstract/10.1103/PhysRevC.101.044307"</a></li>
<!-- navigation toc: --> <li><a href="#mathematics-of-deep-learning-and-neural-networks" style="font-size: 80%;">Mathematics of deep learning and neural networks</a></li>
<!-- navigation toc: --> <li><a href="#vectors" style="font-size: 80%;">Vectors</a></li>
<!-- navigation toc: --> <li><a href="#outer-products" style="font-size: 80%;">Outer products</a></li>
<!-- navigation toc: --> <li><a href="#basic-matrix-features" style="font-size: 80%;">Basic Matrix Features</a></li>
<!-- navigation toc: --> <li><a href="#basic-matrix-features" style="font-size: 80%;">Basic Matrix Features</a></li>
<!-- navigation toc: --> <li><a href="#basic-matrix-features" style="font-size: 80%;">Basic Matrix Features</a></li>
<!-- navigation toc: --> <li><a href="#some-famous-matrices" style="font-size: 80%;">Some famous Matrices</a></li>
<!-- navigation toc: --> <li><a href="#basic-matrix-features" style="font-size: 80%;">Basic Matrix Features</a></li>
<!-- navigation toc: --> <li><a href="#important-mathematical-operations" style="font-size: 80%;">Important Mathematical Operations</a></li>
<!-- navigation toc: --> <li><a href="#important-mathematical-operations" style="font-size: 80%;">Important Mathematical Operations</a></li>
<!-- navigation toc: --> <li><a href="#important-mathematical-operations" style="font-size: 80%;">Important Mathematical Operations</a></li>
<!-- navigation toc: --> <li><a href="#important-mathematical-operations" style="font-size: 80%;">Important Mathematical Operations</a></li>

</ul>
</li>
@@ -339,7 +373,7 @@ <h2 id="overview-of-first-week-january-15-19-2024" class="anchor">Overview of fi
<div class="panel-body">
<!-- subsequent paragraphs come in larger fonts, so start with a paragraph -->
<ol>
<li> Presentation of course with general overview of methods</li>
<li> Presentation of course</li>
<li> Discussion of possible projects and presentation of participants</li>
<li> Deep learning methods, review of neural networks</li>
</ol>
@@ -376,7 +410,7 @@ <h2 id="deep-learning-methods-covered-tentative" class="anchor">Deep learning me
<li> Generative Adversarial Networks (GANs)</li>
<li> Autoregressive methods (tentative)</li>
</ol>
<li> <b>Physical Sciences informed machine learning</b></li>
<li> <b>Physical Sciences (often just called Physics informed) informed machine learning</b></li>
</ol>
<!-- !split -->
<h2 id="additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html" class="anchor"><a href="https://jenfb.github.io/bkmr/overview.html" target="_self">Additional topics: Kernel regression (Gaussian processes) and Bayesian statistics</a> </h2>
@@ -1118,7 +1152,263 @@ <h2 id="quantified-limits-of-the-nuclear-landscape-https-journals-aps-org-prc-ab
<!-- !split -->
<h2 id="mathematics-of-deep-learning-and-neural-networks" class="anchor">Mathematics of deep learning and neural networks </h2>

<p>Material to be added</p>
<p>Throughout this course we will use the following notation: vectors,
matrices, and higher-order tensors are always boldfaced, with vectors
written as lowercase letters.
</p>

<p>Unless otherwise stated, the elements \( x_i \) of a vector \( \boldsymbol{x} \) are assumed to be real. That is, a vector of length \( n \) is defined as
\( \boldsymbol{x}\in \mathbb{R}^{n} \), while a complex vector is written as \( \boldsymbol{x}\in \mathbb{C}^{n} \).
</p>
</p>

<p>Similarly, for a matrix of dimension \( n\times n \) we write
\( \boldsymbol{A}\in \mathbb{R}^{n\times n} \).
</p>

<!-- !split -->
<h2 id="vectors" class="anchor">Vectors </h2>

<p>We start by defining a vector \( \boldsymbol{x} \) with \( n \) components, with \( x_0 \) as our first element, as</p>

$$
\boldsymbol{x} = \begin{bmatrix} x_0\\ x_1 \\ x_2 \\ \dots \\ \dots \\ x_{n-1} \end{bmatrix}.
$$

<p>Its transpose is</p>
$$
\boldsymbol{x}^{T} = \begin{bmatrix} x_0 & x_1 & x_2 & \dots & \dots & x_{n-1} \end{bmatrix},
$$

<p>In case we have a complex vector, we define the Hermitian conjugate</p>
$$
\boldsymbol{x}^{\dagger} = \begin{bmatrix} x_0^* & x_1^* & x_2^* & \dots & \dots & x_{n-1}^* \end{bmatrix},
$$

<p>With a given vector \( \boldsymbol{x} \), we define the inner product as</p>
$$
\boldsymbol{x}^T \boldsymbol{x} = \sum_{i=0}^{n-1} x_ix_i=x_0^2+x_1^2+\dots + x_{n-1}^2.
$$
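<p>As a minimal NumPy sketch (the library and the sample vectors are illustrative choices, not part of the notes above), the inner product, and its Hermitian counterpart for complex vectors, can be computed as follows.</p>

```python
import numpy as np

# a real vector with n = 4 components, indexed from 0
x = np.array([1.0, 2.0, 3.0, 4.0])

# inner product x^T x = sum_i x_i x_i
inner = x @ x
print(inner)  # 1 + 4 + 9 + 16 = 30.0

# for a complex vector the Hermitian conjugate enters instead;
# np.vdot conjugates its first argument
z = np.array([1 + 1j, 2 - 1j])
norm2 = np.vdot(z, z)
print(norm2.real)  # |1+1j|^2 + |2-1j|^2 = 2 + 5 = 7.0
```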


<!-- !split -->
<h2 id="outer-products" class="anchor">Outer products </h2>

<p>In addition to inner products between vectors/states, the outer
product plays a central role in many applications. It is
defined as
</p>
$$
\boldsymbol{x}\boldsymbol{y}^T = \begin{bmatrix}
x_0y_0 & x_0y_1 & x_0y_2 & \dots & \dots & x_0y_{n-2} & x_0y_{n-1} \\
x_1y_0 & x_1y_1 & x_1y_2 & \dots & \dots & x_1y_{n-2} & x_1y_{n-1} \\
x_2y_0 & x_2y_1 & x_2y_2 & \dots & \dots & x_2y_{n-2} & x_2y_{n-1} \\
\dots & \dots & \dots & \dots & \dots & \dots & \dots \\
\dots & \dots & \dots & \dots & \dots & \dots & \dots \\
x_{n-2}y_0 & x_{n-2}y_1 & x_{n-2}y_2 & \dots & \dots & x_{n-2}y_{n-2} & x_{n-2}y_{n-1} \\
x_{n-1}y_0 & x_{n-1}y_1 & x_{n-1}y_2 & \dots & \dots & x_{n-1}y_{n-2} & x_{n-1}y_{n-1} \end{bmatrix}
$$
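<p>A short sketch of the outer product with NumPy (illustrative vectors chosen here): the result is a matrix whose element \( (i,j) \) equals \( x_iy_j \).</p>

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# outer product x y^T: an n x n matrix with elements x_i y_j
A = np.outer(x, y)
print(A)
# [[ 4.  5.  6.]
#  [ 8. 10. 12.]
#  [12. 15. 18.]]
```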


<!-- !split -->
<h2 id="basic-matrix-features" class="anchor">Basic Matrix Features </h2>

<div class="panel panel-default">
<div class="panel-body">
<!-- subsequent paragraphs come in larger fonts, so start with a paragraph -->
$$
\mathbf{A} =
\begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\
a_{21} & a_{22} & a_{23} & a_{24} \\
a_{31} & a_{32} & a_{33} & a_{34} \\
a_{41} & a_{42} & a_{43} & a_{44}
\end{bmatrix}\qquad
\mathbf{I} =
\begin{bmatrix} 1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
</div>
</div>

<!-- !split -->
<h2 id="basic-matrix-features" class="anchor">Basic Matrix Features </h2>
<div class="panel panel-default">
<div class="panel-body">
<!-- subsequent paragraphs come in larger fonts, so start with a paragraph -->
<p>The inverse of a matrix is defined by</p>

$$
\mathbf{A}^{-1} \cdot \mathbf{A} = \mathbf{I}
$$
</div>
</div>
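<p>Numerically, the defining relation can be checked with a small NumPy sketch (the example matrix is an arbitrary invertible one chosen here): multiplying a matrix by its inverse reproduces the identity up to floating-point round-off.</p>

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

Ainv = np.linalg.inv(A)

# A^{-1} A should reproduce the identity matrix, up to round-off
print(np.allclose(Ainv @ A, np.eye(2)))  # True
```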


<!-- !split -->
<h2 id="basic-matrix-features" class="anchor">Basic Matrix Features </h2>

<div class="panel panel-default">
<div class="panel-body">
<!-- subsequent paragraphs come in larger fonts, so start with a paragraph -->

<div class="row">
<div class="col-xs-12">
<table class="dotable table-striped table-hover table-condensed">
<thead>
<tr><td align="center"><b> Relations </b></td> <td align="center"><b> Name </b></td> <td align="center"><b> matrix elements </b></td> </tr>
</thead>
<tbody>
<tr><td align="center"> \( A = A^{T} \) </td> <td align="center"> symmetric </td> <td align="center"> \( a_{ij} = a_{ji} \) </td> </tr>
<tr><td align="center"> \( A = \left (A^{T} \right )^{-1} \) </td> <td align="center"> real orthogonal </td> <td align="center"> \( \sum_k a_{ik} a_{jk} = \sum_k a_{ki} a_{kj} = \delta_{ij} \) </td> </tr>
<tr><td align="center"> \( A = A^{ * } \) </td> <td align="center"> real matrix </td> <td align="center"> \( a_{ij} = a_{ij}^{ * } \) </td> </tr>
<tr><td align="center"> \( A = A^{\dagger} \) </td> <td align="center"> hermitian </td> <td align="center"> \( a_{ij} = a_{ji}^{ * } \) </td> </tr>
<tr><td align="center"> \( A = \left (A^{\dagger} \right )^{-1} \) </td> <td align="center"> unitary </td> <td align="center"> \( \sum_k a_{ik} a_{jk}^{ * } = \sum_k a_{ki}^{ * } a_{kj} = \delta_{ij} \) </td> </tr>
</tbody>
</table>
</div> <!-- col-xs-12 -->
</div> <!-- cell row -->
</div>
</div>
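<p>The defining relations in the table can be verified directly for small examples; the matrices below are illustrative choices, assumed here for the sake of the sketch.</p>

```python
import numpy as np

# a real symmetric matrix: a_ij = a_ji
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
assert np.allclose(S, S.T)

# a real orthogonal matrix (a rotation): A = (A^T)^{-1}, i.e. A^T A = I
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(R.T @ R, np.eye(2))

# a Hermitian matrix: a_ij = conj(a_ji)
H = np.array([[1.0, 1 - 2j],
              [1 + 2j, 3.0]])
assert np.allclose(H, H.conj().T)
```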


<!-- !split -->
<h2 id="some-famous-matrices" class="anchor">Some famous Matrices </h2>

<ul>
<li> Diagonal if \( a_{ij}=0 \) for \( i\ne j \)</li>
<li> Upper triangular if \( a_{ij}=0 \) for \( i > j \)</li>
<li> Lower triangular if \( a_{ij}=0 \) for \( i < j \)</li>
<li> Upper Hessenberg if \( a_{ij}=0 \) for \( i > j+1 \)</li>
<li> Lower Hessenberg if \( a_{ij}=0 \) for \( i < j-1 \)</li>
<li> Tridiagonal if \( a_{ij}=0 \) for \( |i -j| > 1 \)</li>
<li> Lower banded with bandwidth \( p \): \( a_{ij}=0 \) for \( i > j+p \)</li>
<li> Upper banded with bandwidth \( p \): \( a_{ij}=0 \) for \( i < j-p \)</li>
<li> Banded, block upper triangular, block lower triangular....</li>
</ul>
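<p>Several of these structures can be extracted from a full matrix with NumPy (a sketch with an arbitrary \( 4\times 4 \) example matrix, assumed here for illustration).</p>

```python
import numpy as np

A = np.arange(1.0, 17.0).reshape(4, 4)

D = np.diag(np.diag(A))          # diagonal: a_ij = 0 for i != j
U = np.triu(A)                   # upper triangular: a_ij = 0 for i > j
L = np.tril(A)                   # lower triangular: a_ij = 0 for i < j
T = np.triu(np.tril(A, 1), -1)   # tridiagonal: a_ij = 0 for |i - j| > 1

print(T)
# [[ 1.  2.  0.  0.]
#  [ 5.  6.  7.  0.]
#  [ 0. 10. 11. 12.]
#  [ 0.  0. 15. 16.]]
```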
<!-- !split -->
<h2 id="basic-matrix-features" class="anchor">Basic Matrix Features </h2>

<div class="panel panel-default">
<div class="panel-body">
<!-- subsequent paragraphs come in larger fonts, so start with a paragraph -->
<p>For an \( N\times N \) matrix \( \mathbf{A} \) the following properties are all equivalent</p>

<ul>
<li> The inverse of \( \mathbf{A} \) exists, that is, \( \mathbf{A} \) is nonsingular.</li>
<li> The equation \( \mathbf{Ax}=0 \) implies \( \mathbf{x}=0 \).</li>
<li> The rows of \( \mathbf{A} \) form a basis of \( \mathbb{R}^N \).</li>
<li> The columns of \( \mathbf{A} \) form a basis of \( \mathbb{R}^N \).</li>
<li> \( \mathbf{A} \) is a product of elementary matrices.</li>
<li> \( 0 \) is not an eigenvalue of \( \mathbf{A} \).</li>
</ul>
</div>
</div>
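<p>These equivalences can be probed numerically; the sketch below (with an arbitrary nonsingular example matrix chosen here) checks that the determinant is nonzero, that no eigenvalue vanishes, and that \( \mathbf{Ax}=0 \) has only the trivial solution.</p>

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# nonsingular: determinant is nonzero and 0 is not an eigenvalue
det = np.linalg.det(A)
eigvals = np.linalg.eigvals(A)
print(det)                              # nonzero (equals 3 analytically)
print(np.all(np.abs(eigvals) > 1e-12))  # True: eigenvalues are 1 and 3

# hence Ax = 0 has only the trivial solution x = 0
x = np.linalg.solve(A, np.zeros(2))
print(x)  # [0. 0.]
```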


<!-- !split -->
<h2 id="important-mathematical-operations" class="anchor">Important Mathematical Operations </h2>

<p>The basic matrix operations that we will deal with are addition and subtraction</p>

$$
\begin{equation}
\mathbf{A}= \mathbf{B}\pm\mathbf{C} \Longrightarrow a_{ij} = b_{ij}\pm c_{ij},
\label{eq:mtxadd}
\end{equation}
$$

<p>scalar-matrix multiplication</p>

$$
\begin{equation}
\mathbf{A}= \gamma\mathbf{B} \Longrightarrow a_{ij} = \gamma b_{ij},
\label{_auto1}
\end{equation}
$$

<p>vector-matrix multiplication</p>

<!-- !split -->
<h2 id="important-mathematical-operations" class="anchor">Important Mathematical Operations </h2>
$$
\begin{equation}
\mathbf{y}=\mathbf{Ax} \Longrightarrow y_{i} = \sum_{j=1}^{n} a_{ij}x_j,
\label{eq:vecmtx}
\end{equation}
$$

<p>matrix-matrix multiplication</p>

$$
\begin{equation}
\mathbf{A}=\mathbf{BC} \Longrightarrow a_{ij} = \sum_{k=1}^{n} b_{ik}c_{kj},
\label{eq:mtxmtx}
\end{equation}
$$

<p>and transposition</p>

$$
\begin{equation}
\mathbf{A}=\mathbf{B}^T \Longrightarrow a_{ij} = b_{ji}
\label{_auto2}
\end{equation}
$$
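<p>All of the matrix operations above map directly onto NumPy; the following sketch (with small illustrative matrices assumed here) mirrors the definitions element by element.</p>

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
C = np.array([[5.0, 6.0],
              [7.0, 8.0]])
x = np.array([1.0, -1.0])

A_add  = B + C     # addition: a_ij = b_ij + c_ij
A_scal = 3.0 * B   # scalar-matrix multiplication: a_ij = 3 b_ij
y      = B @ x     # vector-matrix multiplication: y_i = sum_j b_ij x_j
A_mat  = B @ C     # matrix-matrix multiplication: a_ij = sum_k b_ik c_kj
A_T    = B.T       # transposition: a_ij = b_ji

print(y)      # [-1. -1.]
print(A_mat)  # [[19. 22.]
              #  [43. 50.]]
```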


<!-- !split -->
<h2 id="important-mathematical-operations" class="anchor">Important Mathematical Operations </h2>

<p>Similarly, important vector operations that we will deal with are addition and subtraction</p>

$$
\begin{equation}
\mathbf{x}= \mathbf{y}\pm\mathbf{z} \Longrightarrow x_{i} = y_{i}\pm z_{i},
\label{_auto3}
\end{equation}
$$

<p>scalar-vector multiplication</p>

$$
\begin{equation}
\mathbf{x}= \gamma\mathbf{y} \Longrightarrow x_{i} = \gamma y_{i},
\label{_auto4}
\end{equation}
$$

<p>element-wise vector-vector multiplication (the so-called Hadamard product)</p>

<!-- !split -->
<h2 id="important-mathematical-operations" class="anchor">Important Mathematical Operations </h2>
$$
\begin{equation}
\mathbf{x}=\mathbf{yz} \Longrightarrow x_{i} = y_{i}z_i,
\label{_auto5}
\end{equation}
$$

<p>the inner or so-called dot product, which yields a scalar,</p>

$$
\begin{equation}
x=\mathbf{y}^T\mathbf{z} \Longrightarrow x = \sum_{j=1}^{n} y_{j}z_{j},
\label{eq:innerprod}
\end{equation}
$$

<p>and the outer product, which yields a matrix,</p>

$$
\begin{equation}
\mathbf{A}= \mathbf{yz}^T \Longrightarrow a_{ij} = y_{i}z_{j},
\label{eq:outerprod}
\end{equation}
$$
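<p>The vector operations above can likewise be sketched with NumPy (the vectors are illustrative choices made here): note the distinction between the element-wise product, the inner product, and the outer product.</p>

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0])
z = np.array([4.0, 5.0, 6.0])

x_add  = y + z          # addition: x_i = y_i + z_i
x_scal = 2.0 * y        # scalar-vector multiplication: x_i = 2 y_i
x_had  = y * z          # element-wise (Hadamard) product: x_i = y_i z_i
inner  = y @ z          # inner product: a scalar, sum_j y_j z_j
outer  = np.outer(y, z) # outer product: a matrix with a_ij = y_i z_j

print(x_had)  # [ 4. 10. 18.]
print(inner)  # 32.0
```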


<!-- ------------------- end of main content --------------- -->
</div> <!-- end container -->
<!-- include javascript, jQuery *first* -->