
Commit

update first week
mhjensen committed Dec 17, 2024
1 parent f4b21e3 commit d0d1465
Showing 8 changed files with 919 additions and 899 deletions.
254 changes: 130 additions & 124 deletions doc/pub/week1/html/week1-bs.html

Large diffs are not rendered by default.

206 changes: 103 additions & 103 deletions doc/pub/week1/html/week1-reveal.html
@@ -212,9 +212,9 @@ <h2 id="practicalities">Practicalities </h2>

<ol>
<p><li> Lectures Thursdays 12:15pm-2pm, room F&#216;434, Department of Physics</li>
<p><li> Lab and exercise sessions Thursdays 2:15pm-4pm, room F&#216;434, Department of Physics</li>
<p><li> We plan to work on two projects which will define the content of the course, the format can be agreed upon by the participants</li>
<p><li> No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade</li>
<p><li> All info at the GitHub address <a href="https://github.com/CompPhysics/AdvancedMachineLearning" target="_blank"><tt>https://github.com/CompPhysics/AdvancedMachineLearning</tt></a></li>
</ol>
</section>
@@ -223,7 +223,7 @@ <h2 id="practicalities">Practicalities </h2>
<h2 id="deep-learning-methods-covered-tentative">Deep learning methods covered, tentative </h2>

<ol>
<p><li> <b>Deep learning</b>
<ol type="a"></li>
<p><li> Feed-forward neural networks (NNs) and their mathematics</li>
<p><li> Convolutional neural networks (CNNs)</li>
@@ -244,7 +244,7 @@ <h2 id="deep-learning-methods-covered-tentative">Deep learning methods covered, tentative </h2>
<p><li> Autoregressive methods (tentative)</li>
</ol>
<p>
<p><li> <b>Physical-sciences-informed machine learning (often just called physics-informed neural networks, PINNs)</b></li>
</ol>
</section>

@@ -258,28 +258,11 @@ <h2 id="additional-topics-kernel-regression-gaussian-processes-and-bayesian-stat
variable).
</p>

<p>We have not made plans for Reinforcement learning.</p>
</section>

<section>
<h2 id="project-paths">Project paths </h2>
<h2 id="project-paths-overarching-view">Project paths, overarching view </h2>

<p>The course can also be used as a self-study course and besides the
lectures, many of you may wish to independently work on your own
@@ -296,6 +279,101 @@ <h2 id="project-paths">Project paths </h2>
</ol>
</section>

<section>
<h2 id="possible-paths-for-the-projects">Possible paths for the projects </h2>

<p>The differential equation path: Here we propose a set of differential
equations (ordinary and/or partial) to be solved first using neural
networks (using either your own code or TensorFlow/PyTorch or similar
libraries). Thereafter we can extend the set of methods for
solving these equations to recurrent neural networks, autoencoders
(AEs) and/or Generative Adversarial Networks (GANs). All these
approaches can be expanded into one large project. This project can
also be extended to include <a href="https://github.com/maziarraissi/PINNs" target="_blank">Physics informed machine
learning</a>. Here we can discuss
neural networks that are trained to solve supervised learning tasks
while respecting any given law of physics described by general
nonlinear partial differential equations.
</p>

<p>For those interested in the mathematical aspects of deep learning, this topic could also be included.</p>
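<p>As a concrete illustration of this path, here is a minimal sketch, not taken from the course material, of a physics-informed network solving the toy ODE du/dx = -u with u(0) = 1 (exact solution exp(-x)) in PyTorch; the architecture, sampling strategy and training settings are arbitrary illustrative choices.</p>

<pre><code>
import torch
import torch.nn as nn

# Physics-informed sketch for du/dx = -u, u(0) = 1; exact solution exp(-x)
torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(64, 1, requires_grad=True)      # collocation points in [0, 1]
    u = net(x)
    # du/dx by automatic differentiation
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    residual = du_dx + u                           # the ODE residual to be driven to zero
    loss = (residual**2).mean() + (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Compare the trained network to the exact solution at x = 0.5
print(net(torch.tensor([[0.5]])).item(), torch.exp(torch.tensor(-0.5)).item())
</code></pre>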
</section>

<section>
<h2 id="the-generative-models">The generative models </h2>

<p>This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. It comprises two projects that follow
the lectures to a large extent. Topics for data sets will be discussed.
</p>
</section>

<section>
<h2 id="paths-for-projects-writing-own-codes">Paths for projects, writing own codes </h2>

<p>The computational path: Here we propose a path where you develop your
own code for a convolutional or, alternatively, a recurrent neural network
and apply it to data sets of your own choosing. The code should
be object oriented and flexible, allowing for later extensions such as
the inclusion of different loss/cost functions and other
functionalities. Feel free to select data sets from those suggested
below. This code can also be extended by adding, for example,
autoencoders. You can compare your own code with implementations
using TensorFlow(Keras)/PyTorch or other libraries.
</p>
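<p>A minimal sketch of what such an object-oriented design could look like is shown below; all class and method names are illustrative choices rather than a prescribed interface, and the cost function is swapped simply by passing a different loss object.</p>

<pre><code>
import numpy as np

class MSELoss:
    """Interchangeable cost function; other losses follow the same interface."""
    def value(self, y, t):
        return 0.5 * np.mean((y - t) ** 2)
    def gradient(self, y, t):
        return (y - t) / y.size

class DenseLayer:
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(0.0, np.sqrt(1.0 / n_in), (n_in, n_out))
        self.b = np.zeros(n_out)
    def forward(self, x):
        self.x = x
        self.out = np.tanh(x @ self.W + self.b)
        return self.out
    def backward(self, grad, lr):
        z_grad = grad * (1.0 - self.out ** 2)   # tanh derivative
        grad_input = z_grad @ self.W.T          # propagate before updating weights
        self.W -= lr * self.x.T @ z_grad
        self.b -= lr * z_grad.sum(axis=0)
        return grad_input

class Network:
    def __init__(self, layers, loss):
        self.layers, self.loss = layers, loss
    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x
    def train_step(self, x, t, lr):
        y = self.forward(x)
        grad = self.loss.gradient(y, t)
        for layer in reversed(self.layers):
            grad = layer.backward(grad, lr)
        return self.loss.value(y, t)

# Toy usage: fit sin(pi*x) on [-1, 1]
rng = np.random.default_rng(0)
net = Network([DenseLayer(1, 16, rng), DenseLayer(16, 1, rng)], MSELoss())
x = rng.uniform(-1, 1, (128, 1))
t = np.sin(np.pi * x)
for epoch in range(2000):
    loss = net.train_step(x, t, lr=0.05)
print(loss)
</code></pre>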
</section>

<section>
<h2 id="the-application-path-own-data">The application path/own data </h2>

<p>The application path: Here you can use the most relevant method(s)
(say convolutional neural networks for images) and apply it (them)
to data sets relevant for your own research.
</p>
</section>

<section>
<h2 id="gaussian-processes-and-bayesian-analysis">Gaussian processes and Bayesian analysis </h2>

<p>The Gaussian processes/Bayesian statistics path: <a href="https://jenfb.github.io/bkmr/overview.html" target="_blank">Kernel regression
(Gaussian processes) and Bayesian
statistics</a> are popular
tools in the machine learning literature. The main idea behind these
approaches is to flexibly model the relationship between a large
number of variables and a particular outcome (dependent
variable). This can form a second part of a project where, for example,
standard kernel regression methods are used on a specific data
set. Alternatively, participants can opt to work on a large project
relevant for their own research using Gaussian processes and/or
Bayesian machine learning.
</p>
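<p>A minimal sketch of this path, using scikit-learn's Gaussian process regressor on an assumed toy data set; the kernel choice and hyperparameters below are arbitrary starting points, which fit() then tunes by maximizing the log marginal likelihood:</p>

<pre><code>
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy data: noisy observations of a smooth function
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (30, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=30)

# RBF kernel plus a white-noise term to model observation noise
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predictive mean and pointwise uncertainty at new inputs
X_new = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
print(np.c_[X_new.ravel(), mean, std])
</code></pre>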
</section>

<section>
<h2 id="hpc-path">HPC path </h2>

<p>Another alternative is to study high-performance computing aspects in
designing ML codes. This can also be linked with an exploration of
mathematical aspects of deep learning methods.
</p>
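<p>One small, concrete example of such an HPC aspect is the choice between vectorized and looped linear algebra; the sketch below, with illustrative sizes only, compares a BLAS-backed dense-layer forward pass with an explicit Python loop:</p>

<pre><code>
import time
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 512))   # batch of inputs
W = rng.normal(size=(512, 256))    # layer weights

t0 = time.perf_counter()
out_loop = np.empty((2000, 256))
for i in range(2000):              # row-by-row Python loop
    out_loop[i] = X[i] @ W
t1 = time.perf_counter()

out_vec = X @ W                    # single vectorized matrix product
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.3f} s, vectorized: {t2 - t1:.3f} s")
print("same result:", np.allclose(out_loop, out_vec))
</code></pre>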
</section>

<section>
<h2 id="good-books-with-hands-on-material-and-codes">Good books with hands-on material and codes </h2>
<div class="alert alert-block alert-block alert-text-normal">
<b></b>
<p>
<ul>
<p><li> <a href="https://sebastianraschka.com/blog/2022/ml-pytorch-book.html" target="_blank">Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch</a></li>
<p><li> <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">David Foster, Generative Deep Learning with TensorFlow</a></li>
<p><li> <a href="https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2" target="_blank">Bali and Gavras, Generative AI with Python and TensorFlow 2</a></li>
</ul>
</div>

<p>All three books have GitHub repositories from which one can download all the codes. We will borrow most of the material from these three texts, as well as
from Goodfellow, Bengio and Courville's text <a href="https://www.deeplearningbook.org/" target="_blank">Deep Learning</a>.
</p>
</section>

<section>
<h2 id="types-of-machine-learning">Types of machine learning </h2>

@@ -363,7 +441,7 @@ <h2 id="what-is-generative-modeling">What Is Generative Modeling? </h2>
</section>

<section>
<h2 id="example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html">Example of generative modeling, <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">taken from Generative Deeep Learning by David Foster</a> </h2>
<h2 id="example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html">Example of generative modeling, <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">taken from Generative Deep Learning by David Foster</a> </h2>

<br/><br/>
<center>
Expand Down Expand Up @@ -442,85 +520,7 @@ <h2 id="taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning
</section>

<section>
<h2 id="what-are-the-basic-machine-learning-ingredients">What are the basic Machine Learning ingredients? </h2>
<h2 id="reminder-on-the-basic-machine-learning-ingredients">Reminder on the basic Machine Learning ingredients </h2>
<div class="alert alert-block alert-block alert-text-normal">
<b></b>
<p>
