diff --git a/doc/pub/week1/html/week1-bs.html b/doc/pub/week1/html/week1-bs.html index af9779a..ee922fb 100644 --- a/doc/pub/week1/html/week1-bs.html +++ b/doc/pub/week1/html/week1-bs.html @@ -51,11 +51,32 @@ 2, None, 'additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html'), + ('Project paths, overarching view', + 2, + None, + 'project-paths-overarching-view'), + ('Possible paths for the projects', + 2, + None, + 'possible-paths-for-the-projects'), + ('The generative models', 2, None, 'the-generative-models'), + ('Paths for projects, writing own codes', + 2, + None, + 'paths-for-projects-writing-own-codes'), + ('The application path/own data', + 2, + None, + 'the-application-path-own-data'), + ('Gaussian processes and Bayesian analysis', + 2, + None, + 'gaussian-processes-and-bayesian-analysis'), + ('HPC path', 2, None, 'hpc-path'), ('Good books with hands-on material and codes', 2, None, 'good-books-with-hands-on-material-and-codes'), - ('Project paths', 2, None, 'project-paths'), ('Types of machine learning', 2, None, @@ -69,12 +90,12 @@ 2, None, 'what-is-generative-modeling'), - ('Example of generative modeling, "taken from Generative Deeep ' + ('Example of generative modeling, "taken from Generative Deep ' 'Learning by David ' 'Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"', 2, None, - 'example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), + 'example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), ('Generative Modeling', 2, None, 'generative-modeling'), ('Generative Versus Discriminative Modeling', 2, @@ -93,25 +114,10 @@ 2, None, 'taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), - ('Possible paths for the projects', + ('Reminder on the basic Machine Learning ingredients', 2, None, - 'possible-paths-for-the-projects'), - ('The generative models', 2, None, 'the-generative-models'), - ('Paths for projects, writing own codes', - 2, - None, - 'paths-for-projects-writing-own-codes'), - ('The application path', 2, None, 'the-application-path'), - ('Gaussian processes and Bayesian analysis', - 2, - None, - 'gaussian-processes-and-bayesian-analysis'), - ('HPC path', 2, None, 'hpc-path'), - ('What are the basic Machine Learning ingredients?', - 2, - None, - 'what-are-the-basic-machine-learning-ingredients'), + 'reminder-on-the-basic-machine-learning-ingredients'), ('Low-level machine learning, the family of ordinary least ' 'squares methods', 2, @@ -342,25 +348,25 @@
The course can also be used as a self-study course and, besides the +lectures, many of you may wish to independently work on your own +projects related to, for example, your thesis or research. In general, +in addition to the lectures, we have often followed five main paths: +
+ +The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AE) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +
+ +For those interested in mathematical aspects of deep learning, this could also be included.
+ + +This path brings us from discriminative models (like the standard applications of NNs, CNNs, etc.) to generative models. Two projects that, to a large extent, follow +the lectures. Topics for data sets will be discussed. +
+ + +The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for later extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +
+ + +The application path: Here you can use the most relevant method(s) +(say, convolutional neural networks for images) and apply it (or them) +to data sets relevant for your own research. +
+ + +The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +
+ + +Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +
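A minimal sketch of the physics-informed idea behind the differential equation path above, assuming PyTorch: the network, the toy ODE y' = -y with y(0) = 1 (exact solution exp(-x)), and all hyperparameters are illustrative choices, not part of the course material.

```python
# Minimal physics-informed sketch (assumed setup): train a small network
# y_theta(x) so that the ODE residual y' + y and the boundary condition
# y(0) = 1 are both driven to zero. Exact solution: y = exp(-x).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Collocation points where the differential equation is enforced
x = torch.linspace(0.0, 2.0, 100).reshape(-1, 1).requires_grad_(True)

for step in range(2000):
    optimizer.zero_grad()
    y = model(x)
    # dy/dx by automatic differentiation; create_graph so we can
    # backpropagate through the derivative itself
    dydx = torch.autograd.grad(y, x, grad_outputs=torch.ones_like(y),
                               create_graph=True)[0]
    ode_loss = ((dydx + y) ** 2).mean()                # residual of y' = -y
    bc_loss = (model(torch.zeros(1, 1)) - 1.0).pow(2).squeeze()
    loss = ode_loss + bc_loss
    loss.backward()
    optimizer.step()

print(model(torch.tensor([[1.0]])).item())             # close to exp(-1) ~ 0.368
```

The same residual-plus-boundary loss pattern extends to partial differential equations, with autograd supplying the required partial derivatives.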
The course can also be used as a self-study course and besides the -lectures, many of you may wish to independently work on your own -projects related to for example your thesis or research. In general, -in addition to the lectures, we have often followed five main paths: -
- -The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -
- -For those interested in mathematical aspects of deep learning, this could also be included.
- - -This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -
- - -The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -
- - -The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -
- - -The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -
- - -Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -
- - --
-
-All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as -from Goodfellow, Bengio and Courville's text Deep Learning -
- - -The course can also be used as a self-study course and besides the lectures, many of you may wish to independently work on your own @@ -296,6 +279,101 @@
The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AE) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +
+ +For those interested in mathematical aspects of deep learning, this could also be included.
+This path brings us from discriminative models (like the standard applications of NNs, CNNs, etc.) to generative models. Two projects that, to a large extent, follow +the lectures. Topics for data sets will be discussed. +
+The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for later extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +
+The application path: Here you can use the most relevant method(s) +(say, convolutional neural networks for images) and apply it (or them) +to data sets relevant for your own research. +
+The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +
+Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +
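A full convolutional network is too long for a sketch, but the design the computational path above asks for, object-oriented code with the cost function injected rather than hard-coded, can be illustrated on a tiny dense network; the class names, toy data, and hyperparameters here are hypothetical, NumPy only.

```python
# Hypothetical skeleton: a one-hidden-layer network whose cost function is a
# pluggable object, trained by batch gradient descent on a toy regression task.
import numpy as np

class MSECost:
    """Mean squared error; any object with value/grad methods can be swapped in."""
    def value(self, y, t): return 0.5 * np.mean((y - t) ** 2)
    def grad(self, y, t):  return (y - t) / y.shape[0]

class TinyNetwork:
    def __init__(self, n_in, n_hidden, cost, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.cost = cost

    def forward(self, X):
        self.a1 = 1.0 / (1.0 + np.exp(-(X @ self.W1 + self.b1)))  # sigmoid
        return self.a1 @ self.W2 + self.b2

    def train_step(self, X, t, lr=0.5):
        y = self.forward(X)
        dy = self.cost.grad(y, t)                # dC/dy comes from the cost object
        da1 = dy @ self.W2.T * self.a1 * (1.0 - self.a1)
        self.W2 -= lr * (self.a1.T @ dy); self.b2 -= lr * dy.sum(axis=0)
        self.W1 -= lr * (X.T @ da1);      self.b1 -= lr * da1.sum(axis=0)
        return self.cost.value(y, t)

X = np.linspace(-1, 1, 50).reshape(-1, 1)
t = X ** 2                                       # toy target
net = TinyNetwork(1, 8, MSECost())
for epoch in range(5000):
    loss = net.train_step(X, t)
print(f"final MSE: {loss:.5f}")
```

Swapping in a different cost is then a one-line change (pass another object with value and grad methods), which is the kind of flexibility the path description asks for.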
++
+All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as +from Goodfellow, Bengio and Courville's text Deep Learning +
+The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -
- -For those interested in mathematical aspects of deep learning, this could also be included.
-This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -
-The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -
-The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -
-The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -
-Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -
-diff --git a/doc/pub/week1/html/week1-solarized.html b/doc/pub/week1/html/week1-solarized.html index c9bf836..9f39a8c 100644 --- a/doc/pub/week1/html/week1-solarized.html +++ b/doc/pub/week1/html/week1-solarized.html @@ -78,11 +78,32 @@ 2, None, 'additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html'), + ('Project paths, overarching view', + 2, + None, + 'project-paths-overarching-view'), + ('Possible paths for the projects', + 2, + None, + 'possible-paths-for-the-projects'), + ('The generative models', 2, None, 'the-generative-models'), + ('Paths for projects, writing own codes', + 2, + None, + 'paths-for-projects-writing-own-codes'), + ('The application path/own data', + 2, + None, + 'the-application-path-own-data'), + ('Gaussian processes and Bayesian analysis', + 2, + None, + 'gaussian-processes-and-bayesian-analysis'), + ('HPC path', 2, None, 'hpc-path'), ('Good books with hands-on material and codes', 2, None, 'good-books-with-hands-on-material-and-codes'), - ('Project paths', 2, None, 'project-paths'), ('Types of machine learning', 2, None, @@ -96,12 +117,12 @@ 2, None, 'what-is-generative-modeling'), - ('Example of generative modeling, "taken from Generative Deeep ' + ('Example of generative modeling, "taken from Generative Deep ' 'Learning by David ' 'Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"', 2, None, - 'example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), + 'example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), ('Generative Modeling', 2, None, 'generative-modeling'), ('Generative Versus Discriminative Modeling', 2, @@ -120,25 +141,10 @@ 2, None, 'taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), - ('Possible paths for the projects', + ('Reminder on the basic Machine Learning ingredients', 2, None, - 'possible-paths-for-the-projects'), - ('The generative models', 2, None, 'the-generative-models'), - ('Paths for projects, writing own codes', - 2, - None, - 'paths-for-projects-writing-own-codes'), - ('The application path', 2, None, 'the-application-path'), - ('Gaussian processes and Bayesian analysis', - 2, - None, - 'gaussian-processes-and-bayesian-analysis'), - ('HPC path', 2, None, 'hpc-path'), - ('What are the basic Machine Learning ingredients?', - 2, - None, - 'what-are-the-basic-machine-learning-ingredients'), + 'reminder-on-the-basic-machine-learning-ingredients'), ('Low-level machine learning, the family of ordinary least ' 'squares methods', 2, @@ -390,16 +396,16 @@
The course can also be used as a self-study course and, besides the +lectures, many of you may wish to independently work on your own +projects related to, for example, your thesis or research. In general, +in addition to the lectures, we have often followed five main paths: +
+ +The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AE) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +
+ +For those interested in mathematical aspects of deep learning, this could also be included.
+ +This path brings us from discriminative models (like the standard applications of NNs, CNNs, etc.) to generative models. Two projects that, to a large extent, follow +the lectures. Topics for data sets will be discussed. +
+ +The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for later extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +
+ +The application path: Here you can use the most relevant method(s) +(say, convolutional neural networks for images) and apply it (or them) +to data sets relevant for your own research. +
+ +The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +
+ +Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +
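A minimal sketch of the kernel-regression/Gaussian-process machinery named in the path above, following the standard Cholesky-based posterior (Rasmussen and Williams, Algorithm 2.1); the RBF kernel, noise level, and toy data are illustrative assumptions, NumPy only.

```python
# Gaussian-process regression from first principles: posterior mean and
# variance of f(x) given noisy observations, with an RBF (squared-exponential)
# kernel. All hyperparameters are illustrative.
import numpy as np

def rbf(X1, X2, length=0.3, sigma=1.0):
    """Squared-exponential kernel matrix k(x, x') between two 1D point sets."""
    return sigma**2 * np.exp(-0.5 * (X1[:, None] - X2[None, :])**2 / length**2)

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, 8)
y_train = np.sin(2 * np.pi * X_train) + 0.1 * rng.normal(size=8)
X_test = np.linspace(0.0, 1.0, 100)

noise = 0.1**2
K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
K_s = rbf(X_train, X_test)

L = np.linalg.cholesky(K)                      # K = L L^T
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

mean = K_s.T @ alpha                           # posterior mean at X_test
v = np.linalg.solve(L, K_s)
var = np.diag(rbf(X_test, X_test)) - np.sum(v**2, axis=0)  # posterior variance

print(mean[:3], np.sqrt(var[:3]))              # predictions with uncertainties
```

The posterior mean plus or minus a few standard deviations gives the Bayesian credibility bands that make these methods attractive for the project work described above.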
The course can also be used as a self-study course and besides the -lectures, many of you may wish to independently work on your own -projects related to for example your thesis or research. In general, -in addition to the lectures, we have often followed five main paths: -
- -The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -
- -For those interested in mathematical aspects of deep learning, this could also be included.
- -This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -
- -The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -
- -The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -
- -The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -
- -Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -
- -diff --git a/doc/pub/week1/html/week1.html b/doc/pub/week1/html/week1.html index 276b14e..1fadfc7 100644 --- a/doc/pub/week1/html/week1.html +++ b/doc/pub/week1/html/week1.html @@ -155,11 +155,32 @@ 2, None, 'additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html'), + ('Project paths, overarching view', + 2, + None, + 'project-paths-overarching-view'), + ('Possible paths for the projects', + 2, + None, + 'possible-paths-for-the-projects'), + ('The generative models', 2, None, 'the-generative-models'), + ('Paths for projects, writing own codes', + 2, + None, + 'paths-for-projects-writing-own-codes'), + ('The application path/own data', + 2, + None, + 'the-application-path-own-data'), + ('Gaussian processes and Bayesian analysis', + 2, + None, + 'gaussian-processes-and-bayesian-analysis'), + ('HPC path', 2, None, 'hpc-path'), ('Good books with hands-on material and codes', 2, None, 'good-books-with-hands-on-material-and-codes'), - ('Project paths', 2, None, 'project-paths'), ('Types of machine learning', 2, None, @@ -173,12 +194,12 @@ 2, None, 'what-is-generative-modeling'), - ('Example of generative modeling, "taken from Generative Deeep ' + ('Example of generative modeling, "taken from Generative Deep ' 'Learning by David ' 'Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"', 2, None, - 'example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), + 'example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), ('Generative Modeling', 2, None, 'generative-modeling'), ('Generative Versus Discriminative Modeling', 2, @@ -197,25 +218,10 @@ 2, None, 'taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), - ('Possible paths for the projects', + ('Reminder on the basic Machine Learning ingredients', 2, None, - 'possible-paths-for-the-projects'), - ('The generative models', 2, None, 'the-generative-models'), - ('Paths for projects, writing own codes', - 2, - None, - 'paths-for-projects-writing-own-codes'), - ('The application path', 2, None, 'the-application-path'), - ('Gaussian processes and Bayesian analysis', - 2, - None, - 'gaussian-processes-and-bayesian-analysis'), - ('HPC path', 2, None, 'hpc-path'), - ('What are the basic Machine Learning ingredients?', - 2, - None, - 'what-are-the-basic-machine-learning-ingredients'), + 'reminder-on-the-basic-machine-learning-ingredients'), ('Low-level machine learning, the family of ordinary least ' 'squares methods', 2, @@ -467,16 +473,16 @@
The course can also be used as a self-study course and, besides the +lectures, many of you may wish to independently work on your own +projects related to, for example, your thesis or research. In general, +in addition to the lectures, we have often followed five main paths: +
+ +The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AE) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +
+ +For those interested in mathematical aspects of deep learning, this could also be included.
+ +This path brings us from discriminative models (like the standard applications of NNs, CNNs, etc.) to generative models. Two projects that, to a large extent, follow +the lectures. Topics for data sets will be discussed. +
+ +The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for later extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +
+ +The application path: Here you can use the most relevant method(s) +(say, convolutional neural networks for images) and apply it (or them) +to data sets relevant for your own research. +
+ +The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +
+ +Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +
The course can also be used as a self-study course and besides the -lectures, many of you may wish to independently work on your own -projects related to for example your thesis or research. In general, -in addition to the lectures, we have often followed five main paths: -
- -The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -
- -For those interested in mathematical aspects of deep learning, this could also be included.
- -This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -
- -The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -
- -The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -
- -The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -
- -Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -
- -
diff --git a/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz b/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz
index 39a8ea8..07310da 100644
Binary files a/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz and b/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz differ
diff --git a/doc/pub/week1/ipynb/week1.ipynb b/doc/pub/week1/ipynb/week1.ipynb
index e63d572..17ade54 100644
--- a/doc/pub/week1/ipynb/week1.ipynb
+++ b/doc/pub/week1/ipynb/week1.ipynb
@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "markdown",
- "id": "35a2aa16",
+ "id": "4dfdb2fb",
"metadata": {
"editable": true
},
@@ -14,7 +14,7 @@
},
{
"cell_type": "markdown",
- "id": "41f19827",
+ "id": "bd264ebc",
"metadata": {
"editable": true
},
@@ -27,7 +27,7 @@
},
{
"cell_type": "markdown",
- "id": "53d99299",
+ "id": "94146d6f",
"metadata": {
"editable": true
},
@@ -47,7 +47,7 @@
},
{
"cell_type": "markdown",
- "id": "377ad32f",
+ "id": "92cf2427",
"metadata": {
"editable": true
},
@@ -56,25 +56,25 @@
"\n",
"1. Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics\n",
"\n",
- "2. Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics \n",
+ "2. Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics \n",
"\n",
"3. We plan to work on two projects which will define the content of the course, the format can be agreed upon by the participants\n",
"\n",
- "4. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.\n",
+ "4. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively, one long project which counts 100% of the final grade\n",
"\n",
"5. All info at the GitHub address