diff --git a/doc/pub/week1/html/week1-bs.html b/doc/pub/week1/html/week1-bs.html index af9779a..ee922fb 100644 --- a/doc/pub/week1/html/week1-bs.html +++ b/doc/pub/week1/html/week1-bs.html @@ -51,11 +51,32 @@ 2, None, 'additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html'), + ('Project paths, overarching view', + 2, + None, + 'project-paths-overarching-view'), + ('Possible paths for the projects', + 2, + None, + 'possible-paths-for-the-projects'), + ('The generative models', 2, None, 'the-generative-models'), + ('Paths for projects, writing own codes', + 2, + None, + 'paths-for-projects-writing-own-codes'), + ('The application path/own data', + 2, + None, + 'the-application-path-own-data'), + ('Gaussian processes and Bayesian analysis', + 2, + None, + 'gaussian-processes-and-bayesian-analysis'), + ('HPC path', 2, None, 'hpc-path'), ('Good books with hands-on material and codes', 2, None, 'good-books-with-hands-on-material-and-codes'), - ('Project paths', 2, None, 'project-paths'), ('Types of machine learning', 2, None, @@ -69,12 +90,12 @@ 2, None, 'what-is-generative-modeling'), - ('Example of generative modeling, "taken from Generative Deeep ' + ('Example of generative modeling, "taken from Generative Deep ' 'Learning by David ' 'Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"', 2, None, - 'example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), + 'example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), ('Generative Modeling', 2, None, 'generative-modeling'), ('Generative Versus Discriminative Modeling', 2, @@ -93,25 +114,10 @@ 2, None, 
'taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), - ('Possible paths for the projects', + ('Reminder on the basic Machine Learning ingredients', 2, None, - 'possible-paths-for-the-projects'), - ('The generative models', 2, None, 'the-generative-models'), - ('Paths for projects, writing own codes', - 2, - None, - 'paths-for-projects-writing-own-codes'), - ('The application path', 2, None, 'the-application-path'), - ('Gaussian processes and Bayesian analysis', - 2, - None, - 'gaussian-processes-and-bayesian-analysis'), - ('HPC path', 2, None, 'hpc-path'), - ('What are the basic Machine Learning ingredients?', - 2, - None, - 'what-are-the-basic-machine-learning-ingredients'), + 'reminder-on-the-basic-machine-learning-ingredients'), ('Low-level machine learning, the family of ordinary least ' 'squares methods', 2, @@ -342,25 +348,25 @@
  • Practicalities
  • Deep learning methods covered, tentative
  • "Additional topics: Kernel regression (Gaussian processes) and Bayesian statistics":"https://jenfb.github.io/bkmr/overview.html"
  • +
  • Project paths, overarching view
  • +
  • Possible paths for the projects
  • +
  • The generative models
  • +
  • Paths for projects, writing own codes
  • +
  • The application path/own data
  • +
  • Gaussian processes and Bayesian analysis
  • +
  • HPC path
  • Good books with hands-on material and codes
  • -
  • Project paths
  • Types of machine learning
  • Main categories
  • The plethora of machine learning algorithms/methods
  • What Is Generative Modeling?
  • -
  • Example of generative modeling, "taken from Generative Deeep Learning by David Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"
  • +
  • Example of generative modeling, "taken from Generative Deep Learning by David Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"
  • Generative Modeling
  • Generative Versus Discriminative Modeling
  • Example of discriminative modeling, "taken from Generative Deep Learning by David Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"
  • Discriminative Modeling
  • Taxonomy of generative deep learning, "taken from Generative Deep Learning by David Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"
  • -
  • Possible paths for the projects
  • -
  • The generative models
  • -
  • Paths for projects, writing own codes
  • -
  • The application path
  • -
  • Gaussian processes and Bayesian analysis
  • -
  • HPC path
  • -
  • What are the basic Machine Learning ingredients?
  • +
  • Reminder on the basic Machine Learning ingredients
  • Low-level machine learning, the family of ordinary least squares methods
  • Setting up the equations
  • The objective/cost/loss function
  • @@ -478,16 +484,16 @@

    Practicalities

    1. Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics
    2. -
    3. Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics
    4. +
    5. Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics
    6. We plan to work on two projects which will define the content of the course; the format can be agreed upon by the participants
    7. -
    8. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.
    9. +
    10. No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade
    11. All info at the GitHub address https://github.com/CompPhysics/AdvancedMachineLearning

    Deep learning methods covered, tentative

      -
    1. Deep learning, classics +
    2. Deep learning
      1. Feed forward neural networks and their mathematics (NNs)
      2. Convolutional neural networks (CNNs)
      3. @@ -505,7 +511,7 @@

        Deep learning me
      4. Generative Adversarial Networks (GANs)
      5. Autoregressive methods (tentative)
      -
    3. Physical Sciences (often just called Physics informed) informed machine learning
    4. +
    5. Physics-informed machine learning for the physical sciences (often realized through physics-informed neural networks, PINNs)

    Additional topics: Kernel regression (Gaussian processes) and Bayesian statistics

    @@ -517,7 +523,95 @@

    Project paths, overarching view

    + +

    The course can also be used as a self-study course and, besides the +lectures, many of you may wish to work independently on your own +projects related to, for example, your thesis or research. In general, +in addition to the lectures, we have often followed five main paths: +

    + +
      +
    1. The coding path. This often leads to a single project where one focuses on coding, for example, CNNs, RNNs, or parts of LLMs from scratch.
    2. +
    3. The physics-informed neural network (PINN) path. Here we define some basic PDEs which are solved using PINNs. We normally start with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or autoencoders before moving over to PINNs.
    4. +
    5. Implementing generative methods
    6. +
    7. The own data path. Some of you may have data you wish to analyze with different deep learning methods
    8. +
    9. The Bayesian ML path is not covered by the present lecture material and normally leads to independent self-study work.
    10. +
    + +

    Possible paths for the projects

    + +

    The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AEs) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +

    + +

    For those interested in mathematical aspects of deep learning, this could also be included.
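The first step of this path, solving a differential equation with a neural network, can be sketched in a few lines. The snippet below is only an illustration (not course code, and all names are our own choices): it fits a tiny numpy network to the physics residual of u'(x) = -u(x) with u(0) = 1, whose exact solution is exp(-x), using the trial-solution trick u(x) = 1 + x N(x) to enforce the initial condition exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

H = 8  # hidden units; the parameter vector holds w1 (H), b1 (H), w2 (H), b2 (1)
theta = rng.normal(scale=0.5, size=3 * H + 1)

def net(theta, x):
    """Tiny one-hidden-layer tanh network N(x)."""
    w1, b1, w2, b2 = theta[:H], theta[H:2 * H], theta[2 * H:3 * H], theta[-1]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def trial(theta, x):
    """Trial solution u(x) = 1 + x N(x) satisfies u(0) = 1 by construction."""
    return 1.0 + x * net(theta, x)

def residual_loss(theta, x, eps=1e-4):
    """Mean squared physics residual of u'(x) + u(x) = 0, u' via central differences."""
    du = (trial(theta, x + eps) - trial(theta, x - eps)) / (2 * eps)
    return np.mean((du + trial(theta, x)) ** 2)

x = np.linspace(0.0, 1.0, 20)  # collocation points

# Gradient descent with numerical gradients and backtracking (fine for 25 parameters)
lr, cur = 0.1, residual_loss(theta, x)
for step in range(3000):
    grad = np.zeros_like(theta)
    for j in range(theta.size):
        e = np.zeros_like(theta)
        e[j] = 1e-5
        grad[j] = (residual_loss(theta + e, x) - residual_loss(theta - e, x)) / 2e-5
    cand = theta - lr * grad
    new = residual_loss(cand, x)
    if new < cur:            # accept the step and grow the step size a little
        theta, cur, lr = cand, new, lr * 1.1
    else:                    # reject the step and shrink the step size
        lr *= 0.5

print(cur, np.max(np.abs(trial(theta, x) - np.exp(-x))))
```

In a project one would of course replace the numerical gradients with automatic differentiation (TensorFlow/PyTorch) and the residual with the PDE of interest; the structure of the loss stays the same.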

    + + +

    The generative models

    + +

    This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. Two projects would follow the lectures +to a large extent. Topics for data sets will be discussed. +

    + + +

    Paths for projects, writing own codes

    + +

    The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly a recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for future extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +
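The object-oriented, pluggable-loss design suggested above can be sketched as follows. This is only an illustrative skeleton under our own naming conventions, not a reference implementation; a real project would add convolutional layers, mini-batching, and further loss choices behind the same interface.

```python
import numpy as np

class MSELoss:
    """One interchangeable loss; others (cross-entropy, ...) follow the same interface."""
    def value(self, pred, target):
        return np.mean((pred - target) ** 2)
    def grad(self, pred, target):
        return 2.0 * (pred - target) / pred.size

class DenseLayer:
    """Fully connected layer with tanh activation and manual backpropagation."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_in, n_out))
        self.b = np.zeros(n_out)
    def forward(self, x):
        self.x = x
        self.a = np.tanh(x @ self.W + self.b)
        return self.a
    def backward(self, grad_out, lr):
        grad_z = grad_out * (1.0 - self.a ** 2)   # tanh'(z) = 1 - tanh(z)^2
        grad_x = grad_z @ self.W.T                # input gradient, using the old weights
        self.W -= lr * self.x.T @ grad_z
        self.b -= lr * grad_z.sum(axis=0)
        return grad_x

class Network:
    """The loss object is injected, so it can be swapped without touching the layers."""
    def __init__(self, layers, loss):
        self.layers, self.loss = layers, loss
    def predict(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x
    def train_step(self, x, y, lr=0.2):
        pred = self.predict(x)
        grad = self.loss.grad(pred, y)
        for layer in reversed(self.layers):
            grad = layer.backward(grad, lr)
        return self.loss.value(pred, y)

# Usage: regress y = 0.8 sin(pi x) on [-1, 1] with a 1-8-1 network
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 32).reshape(-1, 1)
y = 0.8 * np.sin(np.pi * x)
model = Network([DenseLayer(1, 8, rng), DenseLayer(8, 1, rng)], MSELoss())
for _ in range(5000):
    final = model.train_step(x, y)
print(final)
```

Because the loss is a separate object, swapping in a different cost function is a one-line change in the constructor call, which is exactly the kind of flexibility asked for above.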

    + + +

    The application path/own data

    + +

    The application path: Here you can use the most relevant method(s) +(say convolutional neural networks for images) and apply it (them) +to data sets relevant for your own research. +

    + + +

    Gaussian processes and Bayesian analysis

    + +

    The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +
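As a small taste of this path, Gaussian-process (kernel) regression fits in a few lines of numpy. The snippet below is a bare-bones sketch with an RBF kernel and fixed, hand-picked hyperparameters (length scale and noise level are our own choices); a real project would tune them, for example by maximizing the marginal likelihood.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=0.2):
    """Radial basis function (squared exponential) kernel matrix."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """Posterior mean and covariance of a zero-mean GP with an RBF kernel."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(x_train.size)
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    K_inv_Ks = np.linalg.solve(K, K_s)       # K^{-1} K_*
    mean = K_inv_Ks.T @ y_train              # posterior mean at the test points
    cov = K_ss - K_s.T @ K_inv_Ks            # posterior covariance
    return mean, cov

# Usage: interpolate a slightly noisy sine
rng = np.random.default_rng(2)
x_train = np.linspace(0.0, 1.0, 20)
y_train = np.sin(2 * np.pi * x_train) + 0.01 * rng.normal(size=20)
x_test = np.linspace(0.05, 0.95, 50)
mean, cov = gp_posterior(x_train, y_train, x_test)
print(np.max(np.abs(mean - np.sin(2 * np.pi * x_test))))
```

The diagonal of the posterior covariance gives pointwise uncertainty bands, which is the main selling point of the Bayesian view over plain kernel regression.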

    + + +

    HPC path

    + +

    Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +

    Good books with hands-on material and codes

    @@ -537,22 +631,6 @@

    Good books w from Goodfellow, Bengio and Courville's text Deep Learning

    - -

    Project paths

    - -

    The course can also be used as a self-study course and besides the -lectures, many of you may wish to independently work on your own -projects related to for example your thesis or research. In general, -in addition to the lectures, we have often followed five main paths: -

    - -
      -
    1. The coding path. This leads often to a single project only where one focuses on coding for example CNNs or RNNs or parts of LLMs from scratch.
    2. -
    3. The Physics Informed neural network path (PINNs). Here we define some basic PDEs which are solved by using PINNs. We start normally with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or Autoencoders before moving over to PINNs.
    4. -
    5. Implementing generative methods
    6. -
    7. The own data path. Some of you may have data you wish to analyze with different deep learning methods
    8. -
    9. The Bayesian ML path is not covered by the present lecture material and leads normally to independent self-study work.
    10. -

    Types of machine learning

    @@ -619,7 +697,7 @@

    What Is Generative Modeling?

    -

    Example of generative modeling, taken from Generative Deeep Learning by David Foster

    +

    Example of generative modeling, taken from Generative Deep Learning by David Foster



    @@ -692,79 +770,7 @@

    Possible paths for the projects

    - -

    The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -

    - -

    For those interested in mathematical aspects of deep learning, this could also be included.

    - - -

    The generative models

    - -

    This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -

    - - -

    Paths for projects, writing own codes

    - -

    The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -

    - - -

    The application path

    - -

    The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -

    - - -

    Gaussian processes and Bayesian analysis

    - -

    The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -

    - - -

    HPC path

    - -

    Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -

    - - -

    What are the basic Machine Learning ingredients?

    +

    Reminder on the basic Machine Learning ingredients

    diff --git a/doc/pub/week1/html/week1-reveal.html b/doc/pub/week1/html/week1-reveal.html index 57b621b..0be7873 100644 --- a/doc/pub/week1/html/week1-reveal.html +++ b/doc/pub/week1/html/week1-reveal.html @@ -212,9 +212,9 @@

    Practicalities

    1. Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics
    2. -

    3. Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics
    4. +

    5. Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics
    6. We plan to work on two projects which will define the content of the course; the format can be agreed upon by the participants
    7. -

    8. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.
    9. +

    10. No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade
    11. All info at the GitHub address https://github.com/CompPhysics/AdvancedMachineLearning
    @@ -223,7 +223,7 @@

    Practicalities

    Deep learning methods covered, tentative

      -

    1. Deep learning, classics +

    2. Deep learning

      1. Feed forward neural networks and their mathematics (NNs)
      2. Convolutional neural networks (CNNs)
      3. @@ -244,7 +244,7 @@

        Deep learning methods covered,

      4. Autoregressive methods (tentative)

      -

    3. Physical Sciences (often just called Physics informed) informed machine learning
    4. +

    5. Physics-informed machine learning for the physical sciences (often realized through physics-informed neural networks, PINNs)
    @@ -258,28 +258,11 @@

    Good books with hands-on material and codes

    - - -

    All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as -from Goodfellow, Bengio and Courville's text Deep Learning -

    - - -
    -

    Project paths

    +

    Project paths, overarching view

    The course can also be used as a self-study course and besides the lectures, many of you may wish to independently work on your own @@ -296,6 +279,101 @@

    Project paths

    +
    +

    Possible paths for the projects

    + +

    The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AEs) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +

    + +

    For those interested in mathematical aspects of deep learning, this could also be included.

    +
    + +
    +

    The generative models

    + +

    This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. Two projects would follow the lectures +to a large extent. Topics for data sets will be discussed. +

    +
    + +
    +

    Paths for projects, writing own codes

    + +

    The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly a recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for future extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +

    +
    + +
    +

    The application path/own data

    + +

    The application path: Here you can use the most relevant method(s) +(say convolutional neural networks for images) and apply it (them) +to data sets relevant for your own research. +

    +
    + +
    +

    Gaussian processes and Bayesian analysis

    + +

    The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +

    +
    + +
    +

    HPC path

    + +

    Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +

    +
    + +
    +

    Good books with hands-on material and codes

    + + +

    All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as +from Goodfellow, Bengio and Courville's text Deep Learning +

    +
    +

    Types of machine learning

    @@ -363,7 +441,7 @@

    What Is Generative Modeling?

    -

    Example of generative modeling, taken from Generative Deeep Learning by David Foster

    +

    Example of generative modeling, taken from Generative Deep Learning by David Foster



    @@ -442,85 +520,7 @@

    Possible paths for the projects

    - -

    The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -

    - -

    For those interested in mathematical aspects of deep learning, this could also be included.

    -
    - -
    -

    The generative models

    - -

    This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -

    -
    - -
    -

    Paths for projects, writing own codes

    - -

    The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -

    -
    - -
    -

    The application path

    - -

    The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -

    -
    - -
    -

    Gaussian processes and Bayesian analysis

    - -

    The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -

    -
    - -
    -

    HPC path

    - -

    Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -

    -
    - -
    -

    What are the basic Machine Learning ingredients?

    +

    Reminder on the basic Machine Learning ingredients

    diff --git a/doc/pub/week1/html/week1-solarized.html b/doc/pub/week1/html/week1-solarized.html index c9bf836..9f39a8c 100644 --- a/doc/pub/week1/html/week1-solarized.html +++ b/doc/pub/week1/html/week1-solarized.html @@ -78,11 +78,32 @@ 2, None, 'additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html'), + ('Project paths, overarching view', + 2, + None, + 'project-paths-overarching-view'), + ('Possible paths for the projects', + 2, + None, + 'possible-paths-for-the-projects'), + ('The generative models', 2, None, 'the-generative-models'), + ('Paths for projects, writing own codes', + 2, + None, + 'paths-for-projects-writing-own-codes'), + ('The application path/own data', + 2, + None, + 'the-application-path-own-data'), + ('Gaussian processes and Bayesian analysis', + 2, + None, + 'gaussian-processes-and-bayesian-analysis'), + ('HPC path', 2, None, 'hpc-path'), ('Good books with hands-on material and codes', 2, None, 'good-books-with-hands-on-material-and-codes'), - ('Project paths', 2, None, 'project-paths'), ('Types of machine learning', 2, None, @@ -96,12 +117,12 @@ 2, None, 'what-is-generative-modeling'), - ('Example of generative modeling, "taken from Generative Deeep ' + ('Example of generative modeling, "taken from Generative Deep ' 'Learning by David ' 'Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"', 2, None, - 'example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), + 'example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), ('Generative Modeling', 2, None, 'generative-modeling'), ('Generative Versus Discriminative Modeling', 2, @@ -120,25 +141,10 @@ 2, None, 
'taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), - ('Possible paths for the projects', + ('Reminder on the basic Machine Learning ingredients', 2, None, - 'possible-paths-for-the-projects'), - ('The generative models', 2, None, 'the-generative-models'), - ('Paths for projects, writing own codes', - 2, - None, - 'paths-for-projects-writing-own-codes'), - ('The application path', 2, None, 'the-application-path'), - ('Gaussian processes and Bayesian analysis', - 2, - None, - 'gaussian-processes-and-bayesian-analysis'), - ('HPC path', 2, None, 'hpc-path'), - ('What are the basic Machine Learning ingredients?', - 2, - None, - 'what-are-the-basic-machine-learning-ingredients'), + 'reminder-on-the-basic-machine-learning-ingredients'), ('Low-level machine learning, the family of ordinary least ' 'squares methods', 2, @@ -390,16 +396,16 @@

    Practicalities

    1. Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics
    2. -
    3. Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics
    4. +
    5. Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics
    6. We plan to work on two projects which will define the content of the course; the format can be agreed upon by the participants
    7. -
    8. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.
    9. +
    10. No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade
    11. All info at the GitHub address https://github.com/CompPhysics/AdvancedMachineLearning










    Deep learning methods covered, tentative

      -
    1. Deep learning, classics +
    2. Deep learning
      1. Feed forward neural networks and their mathematics (NNs)
      2. Convolutional neural networks (CNNs)
      3. @@ -417,7 +423,7 @@

        Deep learning methods covered,
      4. Generative Adversarial Networks (GANs)
      5. Autoregressive methods (tentative)
      -
    3. Physical Sciences (often just called Physics informed) informed machine learning
    4. +
    5. Physics-informed machine learning for the physical sciences (often realized through physics-informed neural networks, PINNs)










    Additional topics: Kernel regression (Gaussian processes) and Bayesian statistics

    @@ -429,7 +435,95 @@

    Project paths, overarching view

    + +

    The course can also be used as a self-study course and, besides the +lectures, many of you may wish to work independently on your own +projects related to, for example, your thesis or research. In general, +in addition to the lectures, we have often followed five main paths: +

    + +
      +
    1. The coding path. This often leads to a single project where one focuses on coding, for example, CNNs, RNNs, or parts of LLMs from scratch.
    2. +
    3. The physics-informed neural network (PINN) path. Here we define some basic PDEs which are solved using PINNs. We normally start with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or autoencoders before moving over to PINNs.
    4. +
    5. Implementing generative methods
    6. +
    7. The own data path. Some of you may have data you wish to analyze with different deep learning methods
    8. +
    9. The Bayesian ML path is not covered by the present lecture material and normally leads to independent self-study work.
    10. +
    +









    +

    Possible paths for the projects

    + +

    The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AEs) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include physics-informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +

    + +

    For those interested in mathematical aspects of deep learning, this could also be included.

    + +









    +

    The generative models

    + +

    This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. Two projects would follow the lectures +to a large extent. Topics for data sets will be discussed. +

    + +









    +

    Paths for projects, writing own codes

    + +

    The computational path: Here we propose a path where you develop your +own code for a convolutional or possibly a recurrent neural network +and apply this to data sets of your own selection. The code should +be object oriented and flexible, allowing for future extensions by +including different loss/cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +

    + +









    +

    The application path/own data

    + +

    The application path: Here you can use the most relevant method(s) +(say convolutional neural networks for images) and apply it (them) +to data sets relevant for your own research. +

    + +









    +

    Gaussian processes and Bayesian analysis

    + +

    The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +

    + +









    +

    HPC path

    + +

    Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +
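As a first taste of the HPC aspects, one can compare a Python-level loop with a vectorized NumPy call for the same reduction; absolute timings vary by machine, so only the relative speed is meaningful:

```python
import time
import numpy as np

def dot_loop(a, b):
    # Naive interpreted loop over the elements.
    s = 0.0
    for x, y in zip(a, b):
        s += x * y
    return s

n = 200_000
a = np.random.default_rng(1).random(n)
b = np.random.default_rng(2).random(n)

t0 = time.perf_counter()
s_loop = dot_loop(a, b)
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
s_vec = float(a @ b)  # dispatches to compiled BLAS code
t_vec = time.perf_counter() - t0
```

Same arithmetic, very different cost: memory layout, vectorization, and the use of tuned libraries are exactly the aspects this path would study.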











    Good books with hands-on material and codes

    @@ -448,22 +542,6 @@

    Good books with hands-on ma from Goodfellow, Bengio and Courville's text Deep Learning

    -









    -

    Project paths

    - -

    The course can also be used as a self-study course and besides the -lectures, many of you may wish to independently work on your own -projects related to for example your thesis or research. In general, -in addition to the lectures, we have often followed five main paths: -

    - -
      -
    1. The coding path. This leads often to a single project only where one focuses on coding for example CNNs or RNNs or parts of LLMs from scratch.
    2. -
    3. The Physics Informed neural network path (PINNs). Here we define some basic PDEs which are solved by using PINNs. We start normally with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or Autoencoders before moving over to PINNs.
    4. -
    5. Implementing generative methods
    6. -
    7. The own data path. Some of you may have data you wish to analyze with different deep learning methods
    8. -
    9. The Bayesian ML path is not covered by the present lecture material and leads normally to independent self-study work.
    10. -










    Types of machine learning

    @@ -528,7 +606,7 @@

    What Is Generative Modeling?











    -

    Example of generative modeling, taken from Generative Deeep Learning by David Foster

    +

    Example of generative modeling, taken from Generative Deep Learning by David Foster



    @@ -601,79 +679,7 @@

    Possible paths for the projects

    - -

    The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -

    - -

    For those interested in mathematical aspects of deep learning, this could also be included.

    - -









    -

    The generative models

    - -

    This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -

    - -









    -

    Paths for projects, writing own codes

    - -

    The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -

    - -









    -

    The application path

    - -

    The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -

    - -









    -

    Gaussian processes and Bayesian analysis

    - -

    The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -

    - -









    -

    HPC path

    - -

    Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -

    - -









    -

    What are the basic Machine Learning ingredients?

    +

    Reminder on the basic Machine Learning ingredients

    diff --git a/doc/pub/week1/html/week1.html b/doc/pub/week1/html/week1.html index 276b14e..1fadfc7 100644 --- a/doc/pub/week1/html/week1.html +++ b/doc/pub/week1/html/week1.html @@ -155,11 +155,32 @@ 2, None, 'additional-topics-kernel-regression-gaussian-processes-and-bayesian-statistics-https-jenfb-github-io-bkmr-overview-html'), + ('Project paths, overarching view', + 2, + None, + 'project-paths-overarching-view'), + ('Possible paths for the projects', + 2, + None, + 'possible-paths-for-the-projects'), + ('The generative models', 2, None, 'the-generative-models'), + ('Paths for projects, writing own codes', + 2, + None, + 'paths-for-projects-writing-own-codes'), + ('The application path/own data', + 2, + None, + 'the-application-path-own-data'), + ('Gaussian processes and Bayesian analysis', + 2, + None, + 'gaussian-processes-and-bayesian-analysis'), + ('HPC path', 2, None, 'hpc-path'), ('Good books with hands-on material and codes', 2, None, 'good-books-with-hands-on-material-and-codes'), - ('Project paths', 2, None, 'project-paths'), ('Types of machine learning', 2, None, @@ -173,12 +194,12 @@ 2, None, 'what-is-generative-modeling'), - ('Example of generative modeling, "taken from Generative Deeep ' + ('Example of generative modeling, "taken from Generative Deep ' 'Learning by David ' 'Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"', 2, None, - 'example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), + 'example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), ('Generative Modeling', 2, None, 'generative-modeling'), ('Generative Versus Discriminative Modeling', 2, @@ -197,25 +218,10 @@ 2, None, 
'taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html'), - ('Possible paths for the projects', + ('Reminder on the basic Machine Learning ingredients', 2, None, - 'possible-paths-for-the-projects'), - ('The generative models', 2, None, 'the-generative-models'), - ('Paths for projects, writing own codes', - 2, - None, - 'paths-for-projects-writing-own-codes'), - ('The application path', 2, None, 'the-application-path'), - ('Gaussian processes and Bayesian analysis', - 2, - None, - 'gaussian-processes-and-bayesian-analysis'), - ('HPC path', 2, None, 'hpc-path'), - ('What are the basic Machine Learning ingredients?', - 2, - None, - 'what-are-the-basic-machine-learning-ingredients'), + 'reminder-on-the-basic-machine-learning-ingredients'), ('Low-level machine learning, the family of ordinary least ' 'squares methods', 2, @@ -467,16 +473,16 @@

    Practicalities

    1. Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics
    2. -
    3. Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics
    4. +
    5. Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics
    6. We plan to work on two projects which will define the content of the course, the format can be agreed upon by the participants
    7. -
    8. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.
    9. +
    10. No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade
    11. All info at the GitHub address https://github.com/CompPhysics/AdvancedMachineLearning










    Deep learning methods covered, tentative

      -
    1. Deep learning, classics +
    2. Deep learning
      1. Feed forward neural networks and its mathematics (NNs)
      2. Convolutional neural networks (CNNs)
      3. @@ -494,7 +500,7 @@

        Deep learning methods covered,
      4. Generative Adversarial Networks (GANs)
      5. Autoregressive methods (tentative)
      -
    3. Physical Sciences (often just called Physics informed) informed machine learning
    4. +
    5. Physical Sciences informed machine learning (often just called physics-informed neural networks, PINNs)










    Additional topics: Kernel regression (Gaussian processes) and Bayesian statistics

    @@ -506,7 +512,95 @@

    Project paths, overarching view

    + +

    The course can also be used as a self-study course, and besides the +lectures, many of you may wish to work independently on your own +projects related to, for example, your thesis or research. In general, +in addition to the lectures, we have often followed five main paths: +

    + +
      +
    1. The coding path. This often leads to a single project where one focuses on coding, for example, CNNs, RNNs, or parts of LLMs from scratch.
    2. +
    3. The physics-informed neural network (PINN) path. Here we define some basic PDEs which are solved using PINNs. We normally start with studies of selected differential equations using NNs, RNNs, GNNs, and/or autoencoders before moving on to PINNs.
    4. +
    5. Implementing generative methods
    6. +
    7. The own-data path. Some of you may have data you wish to analyze with different deep learning methods.
    8. +
    9. The Bayesian ML path is not covered by the present lecture material and normally leads to independent self-study work.
    10. +
    +









    +

    Possible paths for the projects

    + +

    The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks, autoencoders +(AEs), and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include Physics informed machine +learning. Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. +

    + +

    For those interested in mathematical aspects of deep learning, this could also be included.
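As a deliberately tiny preview of this path, one can solve y' = -y, y(0) = 1 on [0, 1] with the trial solution y(x) = 1 + x·N(x), where N is a small network, so the initial condition holds by construction. A real project would use TensorFlow/PyTorch automatic differentiation; here derivatives and gradients are taken by finite differences only to keep the sketch self-contained, and all sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8                                      # hidden width of the tiny 1-H-1 net
p = 0.5 * rng.standard_normal(3 * H + 1)   # all parameters in one vector

def net(x, p):
    W1, b1, W2, b2 = p[:H], p[H:2*H], p[2*H:3*H], p[3*H]
    return np.tanh(np.outer(x, W1) + b1) @ W2 + b2

def trial(x, p):
    # y(0) = 1 is enforced by the ansatz itself.
    return 1.0 + x * net(x, p)

xc = np.linspace(0.0, 1.0, 25)             # collocation points

def loss(p, h=1e-4):
    # Mean squared residual of the ODE y' + y = 0.
    dy = (trial(xc + h, p) - trial(xc - h, p)) / (2 * h)
    return np.mean((dy + trial(xc, p)) ** 2)

def grad(p, eps=1e-6):
    # Finite-difference gradient; autodiff would replace this in practice.
    g = np.empty_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = eps
        g[i] = (loss(p + e) - loss(p - e)) / (2 * eps)
    return g

loss0 = loss(p)
for _ in range(200):                       # gradient descent with backtracking
    g = grad(p)
    step = 0.2
    while step > 1e-6 and loss(p - step * g) >= loss(p):
        step *= 0.5
    p = p - step * g
```

After training, `trial(x, p)` should approach exp(-x); replacing the residual with that of a PDE on a grid of collocation points gives the standard PINN setup.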

    + +









    +

    The generative models

    + +

    This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. It comprises two projects that follow +the lectures to a large extent. Topics for data sets will be discussed. +

    + +









    +

    Paths for projects, writing own codes

    + +

    The computational path: Here we propose a path where you develop your +own code for a convolutional or, eventually, a recurrent neural network +and apply it to data sets of your own selection. The code should +be object-oriented and flexible, allowing for extensions such as +different loss/cost functions and other +functionality. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. +

    + +









    +

    The application path/own data

    + +

    The application path: Here you can use the most relevant method(s) +(say, convolutional neural networks for images) and apply it (them) +to data sets relevant for your own research. +

    + +









    +

    Gaussian processes and Bayesian analysis

    + +

    The Gaussian processes/Bayesian statistics path: Kernel regression +(Gaussian processes) and Bayesian +statistics are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form the second part of a project where, for example, +standard kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. +

    + +









    +

    HPC path

    + +

    Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. +











    Good books with hands-on material and codes

    @@ -525,22 +619,6 @@

    Good books with hands-on ma from Goodfellow, Bengio and Courville's text Deep Learning

    -









    -

    Project paths

    - -

    The course can also be used as a self-study course and besides the -lectures, many of you may wish to independently work on your own -projects related to for example your thesis or research. In general, -in addition to the lectures, we have often followed five main paths: -

    - -
      -
    1. The coding path. This leads often to a single project only where one focuses on coding for example CNNs or RNNs or parts of LLMs from scratch.
    2. -
    3. The Physics Informed neural network path (PINNs). Here we define some basic PDEs which are solved by using PINNs. We start normally with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or Autoencoders before moving over to PINNs.
    4. -
    5. Implementing generative methods
    6. -
    7. The own data path. Some of you may have data you wish to analyze with different deep learning methods
    8. -
    9. The Bayesian ML path is not covered by the present lecture material and leads normally to independent self-study work.
    10. -










    Types of machine learning

    @@ -605,7 +683,7 @@

    What Is Generative Modeling?











    -

    Example of generative modeling, taken from Generative Deeep Learning by David Foster

    +

    Example of generative modeling, taken from Generative Deep Learning by David Foster



    @@ -678,79 +756,7 @@

    Possible paths for the projects

    - -

    The differential equation path: Here we propose a set of differential -equations (ordinary and/or partial) to be solved first using neural -networks (using either your own code or TensorFlow/Pytorch or similar -libraries). Thereafter we can extend the set of methods for -solving these equations to recurrent neural networks and autoencoders -(AE) and/or Generalized Adversarial Networks (GANs). All these -approaches can be expanded into one large project. This project can -also be extended into including Physics informed machine -learning. Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. -

    - -

    For those interested in mathematical aspects of deep learning, this could also be included.

    - -









    -

    The generative models

    - -

    This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. -

    - -









    -

    Paths for projects, writing own codes

    - -

    The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. -

    - -









    -

    The application path

    - -

    The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. -

    - -









    -

    Gaussian processes and Bayesian analysis

    - -

    The Gaussian processes/Bayesian statistics path: Kernel regression -(Gaussian processes) and Bayesian -statistics are popular -tools in the machine learning literature. The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. -

    - -









    -

    HPC path

    - -

    Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. -

    - -









    -

    What are the basic Machine Learning ingredients?

    +

    Reminder on the basic Machine Learning ingredients

    diff --git a/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz b/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz index 39a8ea8..07310da 100644 Binary files a/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz and b/doc/pub/week1/ipynb/ipynb-week1-src.tar.gz differ diff --git a/doc/pub/week1/ipynb/week1.ipynb b/doc/pub/week1/ipynb/week1.ipynb index e63d572..17ade54 100644 --- a/doc/pub/week1/ipynb/week1.ipynb +++ b/doc/pub/week1/ipynb/week1.ipynb @@ -2,7 +2,7 @@ "cells": [ { "cell_type": "markdown", - "id": "35a2aa16", + "id": "4dfdb2fb", "metadata": { "editable": true }, @@ -14,7 +14,7 @@ }, { "cell_type": "markdown", - "id": "41f19827", + "id": "bd264ebc", "metadata": { "editable": true }, @@ -27,7 +27,7 @@ }, { "cell_type": "markdown", - "id": "53d99299", + "id": "94146d6f", "metadata": { "editable": true }, @@ -47,7 +47,7 @@ }, { "cell_type": "markdown", - "id": "377ad32f", + "id": "92cf2427", "metadata": { "editable": true }, @@ -56,25 +56,25 @@ "\n", "1. Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics\n", "\n", - "2. Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics \n", + "2. Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics \n", "\n", "3. We plan to work on two projects which will define the content of the course, the format can be agreed upon by the participants\n", "\n", - "4. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.\n", + "4. No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively, one long project which counts 100% of the final grade\n", "\n", "5. All info at the GitHub address " ] }, { "cell_type": "markdown", - "id": "b3d101e3", + "id": "6ac71ef7", "metadata": { "editable": true }, "source": [ "## Deep learning methods covered, tentative\n", "\n", - "1. **Deep learning, classics**\n", + "1. **Deep learning**\n", "\n", "a. 
Feed forward neural networks and its mathematics (NNs)\n", "\n", @@ -102,12 +102,12 @@ "\n", "f. Autoregressive methods (tentative)\n", "\n", - "7. **Physical Sciences (often just called Physics informed) informed machine learning**" + "7. **Physical Sciences (often just called Physics informed neural networks, PINNs) informed machine learning**" ] }, { "cell_type": "markdown", - "id": "7feacd31", + "id": "b48670dd", "metadata": { "editable": true }, @@ -120,55 +120,163 @@ "large number of variables and a particular outcome (dependent\n", "variable).\n", "\n", - "We have not made plans for Reinforcement learning, but this can be another option." + "We have not made plans for Reinforcement learning." ] }, { "cell_type": "markdown", - "id": "2b4d7869", + "id": "31f5ea12", "metadata": { "editable": true }, "source": [ - "## Good books with hands-on material and codes\n", - "* [Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch](https://sebastianraschka.com/blog/2022/ml-pytorch-book.html)\n", + "## Project paths, overarching view\n", "\n", - "* [David Foster, Generative Deep Learning with TensorFlow](https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html)\n", + "The course can also be used as a self-study course and besides the\n", + "lectures, many of you may wish to independently work on your own\n", + "projects related to for example your thesis or research. In general,\n", + "in addition to the lectures, we have often followed five main paths:\n", "\n", - "* [Bali and Gavras, Generative AI with Python and TensorFlow 2](https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2)\n", + "1. The coding path. This leads often to a single project only where one focuses on coding for example CNNs or RNNs or parts of LLMs from scratch.\n", "\n", - "All three books have GitHub addresses from where one can download all codes. 
We will borrow most of the material from these three texts as well as \n", - "from Goodfellow, Bengio and Courville's text [Deep Learning](https://www.deeplearningbook.org/)" + "2. The Physics Informed neural network path (PINNs). Here we define some basic PDEs which are solved by using PINNs. We start normally with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or Autoencoders before moving over to PINNs.\n", + "\n", + "3. Implementing generative methods\n", + "\n", + "4. The own data path. Some of you may have data you wish to analyze with different deep learning methods\n", + "\n", + "5. The Bayesian ML path is not covered by the present lecture material and leads normally to independent self-study work." ] }, { "cell_type": "markdown", - "id": "3f64deee", + "id": "1b143726", "metadata": { "editable": true }, "source": [ - "## Project paths\n", + "## Possible paths for the projects\n", "\n", - "The course can also be used as a self-study course and besides the\n", - "lectures, many of you may wish to independently work on your own\n", - "projects related to for example your thesis or research. In general,\n", - "in addition to the lectures, we have often followed five main paths:\n", + "The differential equation path: Here we propose a set of differential\n", + "equations (ordinary and/or partial) to be solved first using neural\n", + "networks (using either your own code or TensorFlow/Pytorch or similar\n", + "libraries). Thereafter we can extend the set of methods for\n", + "solving these equations to recurrent neural networks and autoencoders\n", + "(AE) and/or Generalized Adversarial Networks (GANs). All these\n", + "approaches can be expanded into one large project. This project can\n", + "also be extended into including [Physics informed machine\n", + "learning](https://github.com/maziarraissi/PINNs). 
Here we can discuss\n", + "neural networks that are trained to solve supervised learning tasks\n", + "while respecting any given law of physics described by general\n", + "nonlinear partial differential equations.\n", "\n", - "1. The coding path. This leads often to a single project only where one focuses on coding for example CNNs or RNNs or parts of LLMs from scratch.\n", + "For those interested in mathematical aspects of deep learning, this could also be included." + ] + }, + { + "cell_type": "markdown", + "id": "46bb0b04", + "metadata": { + "editable": true + }, + "source": [ + "## The generative models\n", "\n", - "2. The Physics Informed neural network path (PINNs). Here we define some basic PDEs which are solved by using PINNs. We start normally with studies of selected differential equations using NNs, and/or RNNs, and/or GNNs or Autoencoders before moving over to PINNs.\n", + "This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent\n", + "the lectures. Topics for data sets will be discussed." + ] + }, + { + "cell_type": "markdown", + "id": "f715b938", + "metadata": { + "editable": true + }, + "source": [ + "## Paths for projects, writing own codes\n", "\n", - "3. Implementing generative methods\n", + "The computational path: Here we propose a path where you develop your\n", + "own code for a convolutional or eventually recurrent neural network\n", + "and apply this to data selects of your own selection. The code should\n", + "be object oriented and flexible allowing for eventual extensions by\n", + "including different Loss/Cost functions and other\n", + "functionalities. Feel free to select data sets from those suggested\n", + "below here. This code can also be extended upon by adding for example\n", + "autoencoders. You can compare your own codes with implementations\n", + "using TensorFlow(Keras)/PyTorch or other libraries." 
+ ] + }, + { + "cell_type": "markdown", + "id": "7a7fd976", + "metadata": { + "editable": true + }, + "source": [ + "## The application path/own data\n", "\n", - "4. The own data path. Some of you may have data you wish to analyze with different deep learning methods\n", + "The application path: Here you can use the most relevant method(s)\n", + "(say convolutional neural networks for images) and apply this(these)\n", + "to data sets relevant for your own research." + ] + }, + { + "cell_type": "markdown", + "id": "e01b248d", + "metadata": { + "editable": true + }, + "source": [ + "## Gaussian processes and Bayesian analysis\n", "\n", - "5. The Bayesian ML path is not covered by the present lecture material and leads normally to independent self-study work." + "The Gaussian processes/Bayesian statistics path: [Kernel regression\n", + "(Gaussian processes) and Bayesian\n", + "statistics](https://jenfb.github.io/bkmr/overview.html) are popular\n", + "tools in the machine learning literature. The main idea behind these\n", + "approaches is to flexibly model the relationship between a large\n", + "number of variables and a particular outcome (dependent\n", + "variable). This can form a second part of a project where for example\n", + "standard Kernel regression methods are used on a specific data\n", + "set. Alternatively, participants can opt to work on a large project\n", + "relevant for their own research using gaussian processes and/or\n", + "Bayesian machine Learning." + ] + }, + { + "cell_type": "markdown", + "id": "25e91e6f", + "metadata": { + "editable": true + }, + "source": [ + "## HPC path\n", + "\n", + "Another alternative is to study high-performance computing aspects in\n", + "designing ML codes. This can also be linked with an exploration of\n", + "mathematical aspects of deep learning methods." 
+ ] + }, + { + "cell_type": "markdown", + "id": "ea8c8e03", + "metadata": { + "editable": true + }, + "source": [ + "## Good books with hands-on material and codes\n", + "* [Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch](https://sebastianraschka.com/blog/2022/ml-pytorch-book.html)\n", + "\n", + "* [David Foster, Generative Deep Learning with TensorFlow](https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html)\n", + "\n", + "* [Bali and Gavras, Generative AI with Python and TensorFlow 2](https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2)\n", + "\n", + "All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as \n", + "from Goodfellow, Bengio and Courville's text [Deep Learning](https://www.deeplearningbook.org/)" ] }, { "cell_type": "markdown", - "id": "9d95acb0", + "id": "cf44a2b7", "metadata": { "editable": true }, @@ -185,7 +293,7 @@ }, { "cell_type": "markdown", - "id": "5ce32c14", + "id": "72c49d2b", "metadata": { "editable": true }, @@ -204,7 +312,7 @@ }, { "cell_type": "markdown", - "id": "19bf3058", + "id": "40b22809", "metadata": { "editable": true }, @@ -226,7 +334,7 @@ }, { "cell_type": "markdown", - "id": "c0c7a8c3", + "id": "f3907af6", "metadata": { "editable": true }, @@ -249,12 +357,12 @@ }, { "cell_type": "markdown", - "id": "214a436d", + "id": "29627d3a", "metadata": { "editable": true }, "source": [ - "## Example of generative modeling, [taken from Generative Deeep Learning by David Foster](https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html)\n", + "## Example of generative modeling, [taken from Generative Deep Learning by David Foster](https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html)\n", "\n", "\n", "\n", @@ -265,7 +373,7 @@ }, { "cell_type": "markdown", - "id": "50175294", + "id": "6207eb1d", 
"metadata": { "editable": true }, @@ -291,7 +399,7 @@ }, { "cell_type": "markdown", - "id": "f38079fd", + "id": "3926710a", "metadata": { "editable": true }, @@ -307,7 +415,7 @@ }, { "cell_type": "markdown", - "id": "5831d8e0", + "id": "cd93995e", "metadata": { "editable": true }, @@ -323,7 +431,7 @@ }, { "cell_type": "markdown", - "id": "102b8710", + "id": "b4887da0", "metadata": { "editable": true }, @@ -343,7 +451,7 @@ }, { "cell_type": "markdown", - "id": "6adc6f10", + "id": "1a9f2d39", "metadata": { "editable": true }, @@ -359,120 +467,12 @@ }, { "cell_type": "markdown", - "id": "5f7cd9b2", - "metadata": { - "editable": true - }, - "source": [ - "## Possible paths for the projects\n", - "\n", - "The differential equation path: Here we propose a set of differential\n", - "equations (ordinary and/or partial) to be solved first using neural\n", - "networks (using either your own code or TensorFlow/Pytorch or similar\n", - "libraries). Thereafter we can extend the set of methods for\n", - "solving these equations to recurrent neural networks and autoencoders\n", - "(AE) and/or Generalized Adversarial Networks (GANs). All these\n", - "approaches can be expanded into one large project. This project can\n", - "also be extended into including [Physics informed machine\n", - "learning](https://github.com/maziarraissi/PINNs). Here we can discuss\n", - "neural networks that are trained to solve supervised learning tasks\n", - "while respecting any given law of physics described by general\n", - "nonlinear partial differential equations.\n", - "\n", - "For those interested in mathematical aspects of deep learning, this could also be included." - ] - }, - { - "cell_type": "markdown", - "id": "3978ce4d", - "metadata": { - "editable": true - }, - "source": [ - "## The generative models\n", - "\n", - "This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. 
Two projects that follow to a large extent\n", - "the lectures. Topics for data sets will be discussed during the lab sessions." - ] - }, - { - "cell_type": "markdown", - "id": "a743d21b", - "metadata": { - "editable": true - }, - "source": [ - "## Paths for projects, writing own codes\n", - "\n", - "The computational path: Here we propose a path where you develop your\n", - "own code for a convolutional or eventually recurrent neural network\n", - "and apply this to data selects of your own selection. The code should\n", - "be object oriented and flexible allowing for eventual extensions by\n", - "including different Loss/Cost functions and other\n", - "functionalities. Feel free to select data sets from those suggested\n", - "below here. This code can also be extended upon by adding for example\n", - "autoencoders. You can compare your own codes with implementations\n", - "using TensorFlow(Keras)/PyTorch or other libraries." - ] - }, - { - "cell_type": "markdown", - "id": "5e0882e0", - "metadata": { - "editable": true - }, - "source": [ - "## The application path\n", - "\n", - "The application path: Here you can use the most relevant method(s)\n", - "(say convolutional neural networks for images) and apply this(these)\n", - "to data sets relevant for your own research." - ] - }, - { - "cell_type": "markdown", - "id": "1c59878d", - "metadata": { - "editable": true - }, - "source": [ - "## Gaussian processes and Bayesian analysis\n", - "\n", - "The Gaussian processes/Bayesian statistics path: [Kernel regression\n", - "(Gaussian processes) and Bayesian\n", - "statistics](https://jenfb.github.io/bkmr/overview.html) are popular\n", - "tools in the machine learning literature. The main idea behind these\n", - "approaches is to flexibly model the relationship between a large\n", - "number of variables and a particular outcome (dependent\n", - "variable). 
This can form a second part of a project where for example\n", - "standard Kernel regression methods are used on a specific data\n", - "set. Alternatively, participants can opt to work on a large project\n", - "relevant for their own research using gaussian processes and/or\n", - "Bayesian machine Learning." - ] - }, - { - "cell_type": "markdown", - "id": "3410a9f1", - "metadata": { - "editable": true - }, - "source": [ - "## HPC path\n", - "\n", - "Another alternative is to study high-performance computing aspects in\n", - "designing ML codes. This can also be linked with an exploration of\n", - "mathematical aspects of deep learning methods." - ] - }, - { - "cell_type": "markdown", - "id": "5d4a8730", + "id": "b64ee9bb", "metadata": { "editable": true }, "source": [ - "## What are the basic Machine Learning ingredients?\n", + "## Reminder on the basic Machine Learning ingredients\n", "Almost every problem in ML and data science starts with the same ingredients:\n", "* The dataset $\\boldsymbol{x}$ (could be some observable quantity of the system we are studying)\n", "\n", @@ -485,7 +485,7 @@ }, { "cell_type": "markdown", - "id": "716ce7e8", + "id": "29dcd5f3", "metadata": { "editable": true }, @@ -501,7 +501,7 @@ }, { "cell_type": "markdown", - "id": "27f027db", + "id": "0cfeb173", "metadata": { "editable": true }, @@ -513,7 +513,7 @@ }, { "cell_type": "markdown", - "id": "dbb15a86", + "id": "a7544885", "metadata": { "editable": true }, @@ -531,7 +531,7 @@ }, { "cell_type": "markdown", - "id": "413520c7", + "id": "3b0b5d75", "metadata": { "editable": true }, @@ -543,7 +543,7 @@ }, { "cell_type": "markdown", - "id": "435a658e", + "id": "9f923ae6", "metadata": { "editable": true }, @@ -555,7 +555,7 @@ }, { "cell_type": "markdown", - "id": "3b80ca92", + "id": "ebb80924", "metadata": { "editable": true }, @@ -567,7 +567,7 @@ }, { "cell_type": "markdown", - "id": "00b46f1a", + "id": "87b424b3", "metadata": { "editable": true }, @@ -577,7 +577,7 @@ }, { "cell_type": 
"markdown", - "id": "5367534b", + "id": "6ed02b88", "metadata": { "editable": true }, @@ -589,7 +589,7 @@ }, { "cell_type": "markdown", - "id": "23c73785", + "id": "f83aa963", "metadata": { "editable": true }, @@ -599,7 +599,7 @@ }, { "cell_type": "markdown", - "id": "c8e9053f", + "id": "f23fcbd8", "metadata": { "editable": true }, @@ -611,7 +611,7 @@ }, { "cell_type": "markdown", - "id": "eb37f6a4", + "id": "843db825", "metadata": { "editable": true }, @@ -623,7 +623,7 @@ }, { "cell_type": "markdown", - "id": "7181571b", + "id": "de919013", "metadata": { "editable": true }, @@ -633,7 +633,7 @@ }, { "cell_type": "markdown", - "id": "0132397d", + "id": "7a277936", "metadata": { "editable": true }, @@ -645,7 +645,7 @@ }, { "cell_type": "markdown", - "id": "0e8158e2", + "id": "dfd81ea3", "metadata": { "editable": true }, @@ -655,7 +655,7 @@ }, { "cell_type": "markdown", - "id": "b65715a8", + "id": "14a066ea", "metadata": { "editable": true }, @@ -667,7 +667,7 @@ }, { "cell_type": "markdown", - "id": "82a710e7", + "id": "9d1cdf74", "metadata": { "editable": true }, @@ -679,7 +679,7 @@ }, { "cell_type": "markdown", - "id": "7d0b3842", + "id": "a411b7a9", "metadata": { "editable": true }, @@ -689,7 +689,7 @@ }, { "cell_type": "markdown", - "id": "40abfdcd", + "id": "26022ac3", "metadata": { "editable": true }, @@ -702,7 +702,7 @@ }, { "cell_type": "markdown", - "id": "55c44fdf", + "id": "8c97f768", "metadata": { "editable": true }, @@ -712,7 +712,7 @@ }, { "cell_type": "markdown", - "id": "f1294bcf", + "id": "d3d8f152", "metadata": { "editable": true }, @@ -724,7 +724,7 @@ }, { "cell_type": "markdown", - "id": "ee47e84d", + "id": "1a014d0e", "metadata": { "editable": true }, @@ -739,7 +739,7 @@ }, { "cell_type": "markdown", - "id": "898a5b59", + "id": "35b8c643", "metadata": { "editable": true }, @@ -752,7 +752,7 @@ }, { "cell_type": "markdown", - "id": "70970d12", + "id": "c4af4ffb", "metadata": { "editable": true }, @@ -764,7 +764,7 @@ }, { "cell_type": "markdown", - 
"id": "566e03af", + "id": "42451e95", "metadata": { "editable": true }, @@ -776,7 +776,7 @@ }, { "cell_type": "markdown", - "id": "457a4e36", + "id": "0ae6abcf", "metadata": { "editable": true }, @@ -788,7 +788,7 @@ }, { "cell_type": "markdown", - "id": "d11bb124", + "id": "63448ab1", "metadata": { "editable": true }, @@ -798,7 +798,7 @@ }, { "cell_type": "markdown", - "id": "e11d31f7", + "id": "e6af7dfb", "metadata": { "editable": true }, @@ -811,7 +811,7 @@ }, { "cell_type": "markdown", - "id": "77171dac", + "id": "5e735518", "metadata": { "editable": true }, @@ -822,7 +822,7 @@ }, { "cell_type": "markdown", - "id": "a6abe6e0", + "id": "6a33db8e", "metadata": { "editable": true }, @@ -834,7 +834,7 @@ }, { "cell_type": "markdown", - "id": "e40d9e85", + "id": "d194719f", "metadata": { "editable": true }, @@ -845,7 +845,7 @@ }, { "cell_type": "markdown", - "id": "53f2e976", + "id": "f1ba5ef0", "metadata": { "editable": true }, @@ -859,7 +859,7 @@ }, { "cell_type": "markdown", - "id": "b65287b1", + "id": "418a5ed9", "metadata": { "editable": true }, @@ -869,7 +869,7 @@ }, { "cell_type": "markdown", - "id": "42895bb8", + "id": "3ea79bd7", "metadata": { "editable": true }, @@ -881,7 +881,7 @@ }, { "cell_type": "markdown", - "id": "575c91fe", + "id": "a9423235", "metadata": { "editable": true }, @@ -891,7 +891,7 @@ }, { "cell_type": "markdown", - "id": "ff9de4da", + "id": "39309503", "metadata": { "editable": true }, @@ -905,7 +905,7 @@ }, { "cell_type": "markdown", - "id": "18b6baee", + "id": "5a74f803", "metadata": { "editable": true }, @@ -917,7 +917,7 @@ }, { "cell_type": "markdown", - "id": "0e3cb40d", + "id": "2217d3d0", "metadata": { "editable": true }, @@ -928,7 +928,7 @@ }, { "cell_type": "markdown", - "id": "460e9dc9", + "id": "cb9e4fef", "metadata": { "editable": true }, @@ -942,7 +942,7 @@ }, { "cell_type": "markdown", - "id": "79714009", + "id": "0d08a186", "metadata": { "editable": true }, @@ -953,7 +953,7 @@ }, { "cell_type": "markdown", - "id": 
"9099b9e6", + "id": "97b4504a", "metadata": { "editable": true }, @@ -965,7 +965,7 @@ }, { "cell_type": "markdown", - "id": "f62d85e8", + "id": "fa03aebd", "metadata": { "editable": true }, @@ -975,7 +975,7 @@ }, { "cell_type": "markdown", - "id": "083870b3", + "id": "51693a86", "metadata": { "editable": true }, @@ -987,7 +987,7 @@ }, { "cell_type": "markdown", - "id": "f7dbe771", + "id": "7517b452", "metadata": { "editable": true }, @@ -997,7 +997,7 @@ }, { "cell_type": "markdown", - "id": "1521def5", + "id": "cb03bf6b", "metadata": { "editable": true }, @@ -1012,7 +1012,7 @@ }, { "cell_type": "markdown", - "id": "94c5cac6", + "id": "fcef09bc", "metadata": { "editable": true }, @@ -1024,7 +1024,7 @@ }, { "cell_type": "markdown", - "id": "b6c8888d", + "id": "24d062f9", "metadata": { "editable": true }, @@ -1035,7 +1035,7 @@ }, { "cell_type": "markdown", - "id": "989f68ed", + "id": "1b9eb817", "metadata": { "editable": true }, @@ -1047,7 +1047,7 @@ }, { "cell_type": "markdown", - "id": "df06c46a", + "id": "6bdd47f9", "metadata": { "editable": true }, @@ -1058,7 +1058,7 @@ }, { "cell_type": "markdown", - "id": "e579cc48", + "id": "3c24ee93", "metadata": { "editable": true }, @@ -1070,7 +1070,7 @@ }, { "cell_type": "markdown", - "id": "b4b35959", + "id": "81a1996c", "metadata": { "editable": true }, @@ -1080,7 +1080,7 @@ }, { "cell_type": "markdown", - "id": "4d11297b", + "id": "64071d26", "metadata": { "editable": true }, @@ -1092,7 +1092,7 @@ }, { "cell_type": "markdown", - "id": "e1a915af", + "id": "ed893fc2", "metadata": { "editable": true }, @@ -1102,7 +1102,7 @@ }, { "cell_type": "markdown", - "id": "f57c68c2", + "id": "206da3b1", "metadata": { "editable": true }, @@ -1114,7 +1114,7 @@ }, { "cell_type": "markdown", - "id": "dd9c4c0b", + "id": "eaa4f630", "metadata": { "editable": true }, @@ -1124,7 +1124,7 @@ }, { "cell_type": "markdown", - "id": "fcf86c8b", + "id": "9a3b722f", "metadata": { "editable": true }, @@ -1141,7 +1141,7 @@ }, { "cell_type": "markdown", 
- "id": "38c3be34", + "id": "494d75ea", "metadata": { "editable": true }, @@ -1157,7 +1157,7 @@ }, { "cell_type": "markdown", - "id": "29a3351c", + "id": "a0b5ce27", "metadata": { "editable": true }, @@ -1173,7 +1173,7 @@ }, { "cell_type": "markdown", - "id": "7fd03ede", + "id": "5257a3ee", "metadata": { "editable": true }, @@ -1197,7 +1197,7 @@ }, { "cell_type": "markdown", - "id": "c2dc7dd1", + "id": "50fc6f0d", "metadata": { "editable": true }, @@ -1213,7 +1213,7 @@ }, { "cell_type": "markdown", - "id": "894ff6d3", + "id": "cb1cc6c0", "metadata": { "editable": true }, @@ -1228,7 +1228,7 @@ }, { "cell_type": "markdown", - "id": "62d5f483", + "id": "704acee9", "metadata": { "editable": true }, @@ -1252,7 +1252,7 @@ }, { "cell_type": "markdown", - "id": "139cc5f0", + "id": "b7ea77c8", "metadata": { "editable": true }, @@ -1263,7 +1263,7 @@ }, { "cell_type": "markdown", - "id": "6ca3e375", + "id": "39d6fefa", "metadata": { "editable": true }, @@ -1275,7 +1275,7 @@ }, { "cell_type": "markdown", - "id": "5c865846", + "id": "81bbe6ad", "metadata": { "editable": true }, @@ -1285,7 +1285,7 @@ }, { "cell_type": "markdown", - "id": "ae762ebe", + "id": "62ce94f3", "metadata": { "editable": true }, @@ -1297,7 +1297,7 @@ }, { "cell_type": "markdown", - "id": "1126e139", + "id": "7a34282f", "metadata": { "editable": true }, @@ -1307,7 +1307,7 @@ }, { "cell_type": "markdown", - "id": "592fa452", + "id": "522f80ba", "metadata": { "editable": true }, @@ -1324,7 +1324,7 @@ }, { "cell_type": "markdown", - "id": "689f9ffd", + "id": "db8755cf", "metadata": { "editable": true }, @@ -1340,7 +1340,7 @@ }, { "cell_type": "markdown", - "id": "197bc3e1", + "id": "972cac82", "metadata": { "editable": true }, @@ -1352,7 +1352,7 @@ }, { "cell_type": "markdown", - "id": "e7ce9275", + "id": "06f4d2cf", "metadata": { "editable": true }, @@ -1362,7 +1362,7 @@ }, { "cell_type": "markdown", - "id": "f52af1de", + "id": "56600d5c", "metadata": { "editable": true }, @@ -1376,7 +1376,7 @@ }, { 
"cell_type": "markdown", - "id": "f97fd295", + "id": "ec029770", "metadata": { "editable": true }, @@ -1388,7 +1388,7 @@ }, { "cell_type": "markdown", - "id": "7cff5293", + "id": "929f1262", "metadata": { "editable": true }, @@ -1402,7 +1402,7 @@ }, { "cell_type": "markdown", - "id": "c99aa27b", + "id": "50e597c1", "metadata": { "editable": true }, @@ -1414,7 +1414,7 @@ }, { "cell_type": "markdown", - "id": "d40db488", + "id": "97fdf0be", "metadata": { "editable": true }, @@ -1424,7 +1424,7 @@ }, { "cell_type": "markdown", - "id": "f9391369", + "id": "3c5f16ae", "metadata": { "editable": true }, @@ -1436,7 +1436,7 @@ }, { "cell_type": "markdown", - "id": "8c0f2148", + "id": "0c67df3a", "metadata": { "editable": true }, @@ -1446,7 +1446,7 @@ }, { "cell_type": "markdown", - "id": "bf80200e", + "id": "eb065eb4", "metadata": { "editable": true }, @@ -1458,7 +1458,7 @@ }, { "cell_type": "markdown", - "id": "6e1b6f1e", + "id": "d3e2e715", "metadata": { "editable": true }, @@ -1477,7 +1477,7 @@ }, { "cell_type": "markdown", - "id": "533de579", + "id": "c6b84221", "metadata": { "editable": true }, @@ -1490,7 +1490,7 @@ }, { "cell_type": "markdown", - "id": "c26cb04f", + "id": "5d6de6a8", "metadata": { "editable": true }, @@ -1500,7 +1500,7 @@ }, { "cell_type": "markdown", - "id": "52a96687", + "id": "884f06e5", "metadata": { "editable": true }, @@ -1512,7 +1512,7 @@ }, { "cell_type": "markdown", - "id": "22ed5532", + "id": "6cb0b614", "metadata": { "editable": true }, @@ -1531,7 +1531,7 @@ }, { "cell_type": "markdown", - "id": "59eb29ee", + "id": "47a6fa2c", "metadata": { "editable": true }, @@ -1553,7 +1553,7 @@ }, { "cell_type": "markdown", - "id": "80be220c", + "id": "3e652cff", "metadata": { "editable": true }, @@ -1565,7 +1565,7 @@ }, { "cell_type": "markdown", - "id": "b0e77b3a", + "id": "fc7c0b8c", "metadata": { "editable": true }, @@ -1577,7 +1577,7 @@ }, { "cell_type": "markdown", - "id": "9702e0a5", + "id": "34ec95d3", "metadata": { "editable": true }, @@ -1592,7 
+1592,7 @@ }, { "cell_type": "markdown", - "id": "3e210930", + "id": "cead29bf", "metadata": { "editable": true }, @@ -1610,7 +1610,7 @@ }, { "cell_type": "markdown", - "id": "0537cb26", + "id": "7d9cc184", "metadata": { "editable": true }, @@ -1630,7 +1630,7 @@ }, { "cell_type": "markdown", - "id": "640e7a49", + "id": "f2e10b39", "metadata": { "editable": true }, @@ -1651,7 +1651,7 @@ }, { "cell_type": "markdown", - "id": "e188958c", + "id": "e0f7c5b2", "metadata": { "editable": true }, @@ -1663,7 +1663,7 @@ }, { "cell_type": "markdown", - "id": "3a1de38f", + "id": "9c495913", "metadata": { "editable": true }, @@ -1675,7 +1675,7 @@ }, { "cell_type": "markdown", - "id": "ce5c22b6", + "id": "ec2f2aea", "metadata": { "editable": true }, @@ -1685,7 +1685,7 @@ }, { "cell_type": "markdown", - "id": "a8d72e5e", + "id": "bfb380a2", "metadata": { "editable": true }, @@ -1697,7 +1697,7 @@ }, { "cell_type": "markdown", - "id": "497eb132", + "id": "c0c28844", "metadata": { "editable": true }, @@ -1707,7 +1707,7 @@ }, { "cell_type": "markdown", - "id": "3b13bf50", + "id": "5d07baf0", "metadata": { "editable": true }, @@ -1719,7 +1719,7 @@ }, { "cell_type": "markdown", - "id": "8b2275c7", + "id": "ce27147c", "metadata": { "editable": true }, @@ -1729,7 +1729,7 @@ }, { "cell_type": "markdown", - "id": "d69f29da", + "id": "f6ffb646", "metadata": { "editable": true }, @@ -1741,7 +1741,7 @@ }, { "cell_type": "markdown", - "id": "d9d62fdf", + "id": "0758d584", "metadata": { "editable": true }, @@ -1755,7 +1755,7 @@ }, { "cell_type": "markdown", - "id": "81e67bf9", + "id": "d0bab345", "metadata": { "editable": true }, @@ -1774,7 +1774,7 @@ }, { "cell_type": "markdown", - "id": "3ffa4fab", + "id": "382b36a0", "metadata": { "editable": true }, @@ -1784,7 +1784,7 @@ }, { "cell_type": "markdown", - "id": "6c63148b", + "id": "ce31f5d6", "metadata": { "editable": true }, @@ -1796,7 +1796,7 @@ }, { "cell_type": "markdown", - "id": "0885af80", + "id": "e28bdeb0", "metadata": { "editable": 
true }, @@ -1815,7 +1815,7 @@ }, { "cell_type": "markdown", - "id": "13eca820", + "id": "14a118ed", "metadata": { "editable": true }, @@ -1825,7 +1825,7 @@ }, { "cell_type": "markdown", - "id": "81fe3854", + "id": "75bc955e", "metadata": { "editable": true }, @@ -1838,7 +1838,7 @@ }, { "cell_type": "markdown", - "id": "ca3bfee7", + "id": "c55fa6d8", "metadata": { "editable": true }, @@ -1848,7 +1848,7 @@ }, { "cell_type": "markdown", - "id": "30da8e40", + "id": "7a115a08", "metadata": { "editable": true }, @@ -1859,7 +1859,7 @@ }, { "cell_type": "markdown", - "id": "f633fb21", + "id": "8a73fa5f", "metadata": { "editable": true }, @@ -1871,7 +1871,7 @@ }, { "cell_type": "markdown", - "id": "39403aa2", + "id": "0c54bac6", "metadata": { "editable": true }, @@ -1881,7 +1881,7 @@ }, { "cell_type": "markdown", - "id": "0cffc47a", + "id": "2ba00df8", "metadata": { "editable": true }, @@ -1906,7 +1906,7 @@ }, { "cell_type": "markdown", - "id": "1d256940", + "id": "847b049e", "metadata": { "editable": true }, @@ -1934,7 +1934,7 @@ }, { "cell_type": "markdown", - "id": "97acfaee", + "id": "07bc8dfd", "metadata": { "editable": true }, @@ -1960,7 +1960,7 @@ }, { "cell_type": "markdown", - "id": "88b2cd39", + "id": "d0948364", "metadata": { "editable": true }, @@ -1972,7 +1972,7 @@ }, { "cell_type": "markdown", - "id": "2579370c", + "id": "86e80bae", "metadata": { "editable": true }, @@ -1984,7 +1984,7 @@ }, { "cell_type": "markdown", - "id": "17dd7f8c", + "id": "2f1caf78", "metadata": { "editable": true }, @@ -1994,7 +1994,7 @@ }, { "cell_type": "markdown", - "id": "7f9e08ea", + "id": "7bb01594", "metadata": { "editable": true }, @@ -2006,7 +2006,7 @@ }, { "cell_type": "markdown", - "id": "1a98428c", + "id": "ce3f872f", "metadata": { "editable": true }, @@ -2018,7 +2018,7 @@ }, { "cell_type": "markdown", - "id": "460f0d43", + "id": "c025b821", "metadata": { "editable": true }, @@ -2030,7 +2030,7 @@ }, { "cell_type": "markdown", - "id": "b25f2a9e", + "id": "7e82ea13", 
"metadata": { "editable": true }, @@ -2040,7 +2040,7 @@ }, { "cell_type": "markdown", - "id": "e8b701a2", + "id": "f65abf0f", "metadata": { "editable": true }, @@ -2052,7 +2052,7 @@ }, { "cell_type": "markdown", - "id": "c4cf2770", + "id": "bc12a4cb", "metadata": { "editable": true }, @@ -2062,7 +2062,7 @@ }, { "cell_type": "markdown", - "id": "44a9ee86", + "id": "0ccc6d31", "metadata": { "editable": true }, @@ -2074,7 +2074,7 @@ }, { "cell_type": "markdown", - "id": "a35436e7", + "id": "ee9d6ebf", "metadata": { "editable": true }, @@ -2086,7 +2086,7 @@ }, { "cell_type": "markdown", - "id": "c7563ad7", + "id": "a240e4f6", "metadata": { "editable": true }, @@ -2098,7 +2098,7 @@ }, { "cell_type": "markdown", - "id": "67c803c7", + "id": "d24054ff", "metadata": { "editable": true }, @@ -2108,7 +2108,7 @@ }, { "cell_type": "markdown", - "id": "46be8e16", + "id": "49e6bb24", "metadata": { "editable": true }, @@ -2120,7 +2120,7 @@ }, { "cell_type": "markdown", - "id": "29a3af6b", + "id": "4cb81ee6", "metadata": { "editable": true }, @@ -2131,7 +2131,7 @@ }, { "cell_type": "markdown", - "id": "791c7ee1", + "id": "168d234e", "metadata": { "editable": true }, @@ -2143,7 +2143,7 @@ }, { "cell_type": "markdown", - "id": "5d86ec49", + "id": "48cc4d89", "metadata": { "editable": true }, @@ -2153,7 +2153,7 @@ }, { "cell_type": "markdown", - "id": "f6f5b5c4", + "id": "8c3d01d0", "metadata": { "editable": true }, @@ -2165,7 +2165,7 @@ }, { "cell_type": "markdown", - "id": "3120fb40", + "id": "07d56919", "metadata": { "editable": true }, @@ -2175,7 +2175,7 @@ }, { "cell_type": "markdown", - "id": "5dcb2d7a", + "id": "b8b951c3", "metadata": { "editable": true }, @@ -2187,7 +2187,7 @@ }, { "cell_type": "markdown", - "id": "b05f3e39", + "id": "bcc5c3c5", "metadata": { "editable": true }, @@ -2209,7 +2209,7 @@ }, { "cell_type": "markdown", - "id": "8a8605e1", + "id": "2004a45c", "metadata": { "editable": true }, @@ -2236,7 +2236,7 @@ }, { "cell_type": "markdown", - "id": "06b2e0b3", + 
"id": "aaf0e252", "metadata": { "editable": true }, @@ -2253,7 +2253,7 @@ }, { "cell_type": "markdown", - "id": "9c0ab15c", + "id": "b211a5e9", "metadata": { "editable": true }, @@ -2265,7 +2265,7 @@ }, { "cell_type": "markdown", - "id": "4f35e91f", + "id": "475e7f01", "metadata": { "editable": true }, @@ -2285,7 +2285,7 @@ }, { "cell_type": "markdown", - "id": "7955334a", + "id": "5f894e44", "metadata": { "editable": true }, @@ -2314,7 +2314,7 @@ }, { "cell_type": "markdown", - "id": "5e647027", + "id": "17509188", "metadata": { "editable": true }, @@ -2328,7 +2328,7 @@ }, { "cell_type": "markdown", - "id": "3773a1a5", + "id": "06c212bd", "metadata": { "editable": true }, @@ -2345,7 +2345,7 @@ }, { "cell_type": "markdown", - "id": "8aea687b", + "id": "6eca6c03", "metadata": { "editable": true }, @@ -2357,7 +2357,7 @@ }, { "cell_type": "markdown", - "id": "2e1251e5", + "id": "535143a4", "metadata": { "editable": true }, @@ -2369,7 +2369,7 @@ }, { "cell_type": "markdown", - "id": "88596be5", + "id": "fbdc270a", "metadata": { "editable": true }, @@ -2386,7 +2386,7 @@ }, { "cell_type": "markdown", - "id": "6610526e", + "id": "d82e0ed4", "metadata": { "editable": true }, @@ -2398,7 +2398,7 @@ }, { "cell_type": "markdown", - "id": "003e8528", + "id": "3ac2a03e", "metadata": { "editable": true }, @@ -2411,7 +2411,7 @@ }, { "cell_type": "markdown", - "id": "f1c7a080", + "id": "36c61468", "metadata": { "editable": true }, @@ -2423,7 +2423,7 @@ }, { "cell_type": "markdown", - "id": "973d2978", + "id": "f06cc3f3", "metadata": { "editable": true }, @@ -2439,7 +2439,7 @@ }, { "cell_type": "markdown", - "id": "e83ed461", + "id": "4972952e", "metadata": { "editable": true }, @@ -2451,7 +2451,7 @@ }, { "cell_type": "markdown", - "id": "e5be1631", + "id": "57ba633f", "metadata": { "editable": true }, @@ -2463,7 +2463,7 @@ }, { "cell_type": "markdown", - "id": "bc4aeb11", + "id": "8aee9a12", "metadata": { "editable": true }, @@ -2475,7 +2475,7 @@ }, { "cell_type": "markdown", - 
"id": "18408f4b", + "id": "22507011", "metadata": { "editable": true }, @@ -2485,7 +2485,7 @@ }, { "cell_type": "markdown", - "id": "5a3eacfe", + "id": "3ff110e2", "metadata": { "editable": true }, @@ -2497,7 +2497,7 @@ }, { "cell_type": "markdown", - "id": "3fc53083", + "id": "f0599b98", "metadata": { "editable": true }, @@ -2507,7 +2507,7 @@ }, { "cell_type": "markdown", - "id": "9efb4490", + "id": "e6078e70", "metadata": { "editable": true }, @@ -2519,7 +2519,7 @@ }, { "cell_type": "markdown", - "id": "bcc24129", + "id": "8e767777", "metadata": { "editable": true }, @@ -2533,7 +2533,7 @@ }, { "cell_type": "markdown", - "id": "53f545e1", + "id": "4eb7a37f", "metadata": { "editable": true }, @@ -2545,7 +2545,7 @@ }, { "cell_type": "markdown", - "id": "c596eb20", + "id": "bf68e99e", "metadata": { "editable": true }, @@ -2555,7 +2555,7 @@ }, { "cell_type": "markdown", - "id": "082f02b5", + "id": "a1290070", "metadata": { "editable": true }, @@ -2567,7 +2567,7 @@ }, { "cell_type": "markdown", - "id": "e7e622e7", + "id": "841a2f3e", "metadata": { "editable": true }, @@ -2577,7 +2577,7 @@ }, { "cell_type": "markdown", - "id": "3ed43a59", + "id": "82905702", "metadata": { "editable": true }, @@ -2589,7 +2589,7 @@ }, { "cell_type": "markdown", - "id": "b5092530", + "id": "39ad3127", "metadata": { "editable": true }, @@ -2601,7 +2601,7 @@ }, { "cell_type": "markdown", - "id": "71afc34e", + "id": "0b699aa7", "metadata": { "editable": true }, @@ -2613,7 +2613,7 @@ }, { "cell_type": "markdown", - "id": "4e06088e", + "id": "d53c758f", "metadata": { "editable": true }, @@ -2623,7 +2623,7 @@ }, { "cell_type": "markdown", - "id": "67b572ff", + "id": "1febcd97", "metadata": { "editable": true }, @@ -2635,7 +2635,7 @@ }, { "cell_type": "markdown", - "id": "7f2cb56d", + "id": "ac3e8457", "metadata": { "editable": true }, @@ -2645,7 +2645,7 @@ }, { "cell_type": "markdown", - "id": "30befc68", + "id": "bf2247be", "metadata": { "editable": true }, @@ -2657,7 +2657,7 @@ }, { 
"cell_type": "markdown", - "id": "e9ed2418", + "id": "d9b9fff7", "metadata": { "editable": true }, @@ -2675,7 +2675,7 @@ }, { "cell_type": "markdown", - "id": "3fd46823", + "id": "49888af9", "metadata": { "editable": true }, @@ -2693,7 +2693,7 @@ }, { "cell_type": "markdown", - "id": "32890c10", + "id": "b69de3e4", "metadata": { "editable": true }, @@ -2705,7 +2705,7 @@ }, { "cell_type": "markdown", - "id": "7b90b75f", + "id": "de8ed7bf", "metadata": { "editable": true }, @@ -2715,7 +2715,7 @@ }, { "cell_type": "markdown", - "id": "57702a47", + "id": "ce7032a7", "metadata": { "editable": true }, @@ -2727,7 +2727,7 @@ }, { "cell_type": "markdown", - "id": "5c1faccd", + "id": "dbd0c464", "metadata": { "editable": true }, @@ -2739,7 +2739,7 @@ }, { "cell_type": "markdown", - "id": "6f8a97c2", + "id": "3014ec3d", "metadata": { "editable": true }, @@ -2751,7 +2751,7 @@ }, { "cell_type": "markdown", - "id": "60dbaf5f", + "id": "c5a5ace6", "metadata": { "editable": true }, @@ -2761,7 +2761,7 @@ }, { "cell_type": "markdown", - "id": "1bcfd8b5", + "id": "a2f371f0", "metadata": { "editable": true }, @@ -2773,7 +2773,7 @@ }, { "cell_type": "markdown", - "id": "d5c2881f", + "id": "e28583bc", "metadata": { "editable": true }, @@ -2783,7 +2783,7 @@ }, { "cell_type": "markdown", - "id": "4a4a1ee2", + "id": "d9a81fc4", "metadata": { "editable": true }, @@ -2795,7 +2795,7 @@ }, { "cell_type": "markdown", - "id": "f9de3e2e", + "id": "19e66bb9", "metadata": { "editable": true }, @@ -2813,7 +2813,7 @@ }, { "cell_type": "markdown", - "id": "7cfbf867", + "id": "fe9aa626", "metadata": { "editable": true }, @@ -2823,7 +2823,7 @@ }, { "cell_type": "markdown", - "id": "59feaf54", + "id": "0ea4c73d", "metadata": { "editable": true }, @@ -2841,7 +2841,7 @@ }, { "cell_type": "markdown", - "id": "1dfe0f09", + "id": "9667f728", "metadata": { "editable": true }, @@ -2851,7 +2851,7 @@ }, { "cell_type": "markdown", - "id": "afa7cd6f", + "id": "2247f7e7", "metadata": { "editable": true }, @@ -2869,7 
+2869,7 @@ }, { "cell_type": "markdown", - "id": "4dc55224", + "id": "3ae364dc", "metadata": { "editable": true }, @@ -2881,7 +2881,7 @@ }, { "cell_type": "markdown", - "id": "fee85101", + "id": "786ceac4", "metadata": { "editable": true }, @@ -2893,7 +2893,7 @@ }, { "cell_type": "markdown", - "id": "72650035", + "id": "c77a8676", "metadata": { "editable": true }, @@ -2903,7 +2903,7 @@ }, { "cell_type": "markdown", - "id": "39acbc95", + "id": "8adb5956", "metadata": { "editable": true }, @@ -2915,7 +2915,7 @@ }, { "cell_type": "markdown", - "id": "6a4d6203", + "id": "f99747ff", "metadata": { "editable": true }, @@ -2927,7 +2927,7 @@ }, { "cell_type": "markdown", - "id": "f77d408e", + "id": "7428b34b", "metadata": { "editable": true }, @@ -2937,7 +2937,7 @@ }, { "cell_type": "markdown", - "id": "f1086f33", + "id": "b8a1fd30", "metadata": { "editable": true }, @@ -2949,7 +2949,7 @@ }, { "cell_type": "markdown", - "id": "6b1b39f0", + "id": "8aef0917", "metadata": { "editable": true }, @@ -2959,7 +2959,7 @@ }, { "cell_type": "markdown", - "id": "45b0215a", + "id": "1b1c0bf0", "metadata": { "editable": true }, @@ -2971,7 +2971,7 @@ }, { "cell_type": "markdown", - "id": "db70cea8", + "id": "0ba9ebec", "metadata": { "editable": true }, @@ -2983,7 +2983,7 @@ }, { "cell_type": "markdown", - "id": "a1985c70", + "id": "00f467a8", "metadata": { "editable": true }, @@ -3004,7 +3004,7 @@ }, { "cell_type": "markdown", - "id": "5cf1ad9c", + "id": "6f104a37", "metadata": { "editable": true }, @@ -3016,7 +3016,7 @@ }, { "cell_type": "markdown", - "id": "fcccdb63", + "id": "d15fe0e6", "metadata": { "editable": true }, @@ -3028,7 +3028,7 @@ }, { "cell_type": "markdown", - "id": "61896755", + "id": "ef8e62a5", "metadata": { "editable": true }, @@ -3038,7 +3038,7 @@ }, { "cell_type": "markdown", - "id": "41774453", + "id": "01df9b83", "metadata": { "editable": true }, @@ -3050,7 +3050,7 @@ }, { "cell_type": "markdown", - "id": "8d11de07", + "id": "b6d6dd53", "metadata": { "editable": 
true }, @@ -3064,7 +3064,7 @@ }, { "cell_type": "markdown", - "id": "1f4be3ad", + "id": "ad97fad3", "metadata": { "editable": true }, @@ -3076,7 +3076,7 @@ }, { "cell_type": "markdown", - "id": "225d9d1a", + "id": "807ff58e", "metadata": { "editable": true }, @@ -3088,7 +3088,7 @@ }, { "cell_type": "markdown", - "id": "9f984415", + "id": "78ce49fc", "metadata": { "editable": true }, @@ -3098,7 +3098,7 @@ }, { "cell_type": "markdown", - "id": "a1371ae2", + "id": "3f9285b5", "metadata": { "editable": true }, @@ -3110,7 +3110,7 @@ }, { "cell_type": "markdown", - "id": "502224fb", + "id": "d26df366", "metadata": { "editable": true }, @@ -3122,7 +3122,7 @@ }, { "cell_type": "markdown", - "id": "48aaf38a", + "id": "5c04c58b", "metadata": { "editable": true }, @@ -3132,7 +3132,7 @@ }, { "cell_type": "markdown", - "id": "e205aa5d", + "id": "079f45d8", "metadata": { "editable": true }, @@ -3144,7 +3144,7 @@ }, { "cell_type": "markdown", - "id": "21ed57c1", + "id": "fda3a547", "metadata": { "editable": true }, diff --git a/doc/pub/week1/pdf/week1.pdf b/doc/pub/week1/pdf/week1.pdf index 04aefae..f8c11c5 100644 Binary files a/doc/pub/week1/pdf/week1.pdf and b/doc/pub/week1/pdf/week1.pdf differ diff --git a/doc/src/week1/week1.do.txt b/doc/src/week1/week1.do.txt index 1169b4c..db355f5 100644 --- a/doc/src/week1/week1.do.txt +++ b/doc/src/week1/week1.do.txt @@ -18,15 +18,15 @@ DATE: January 23, 2025 ===== Practicalities ===== o Lectures Thursdays 1215pm-2pm, room FØ434, Department of Physics -o Lab and exercise sessions Thursdays 215pm-4pm, , room FØ434, Department of Physics +o Lab and exercise sessions Thursdays 215pm-4pm, room FØ434, Department of Physics o We plan to work on two projects which will define the content of the course, the format can be agreed upon by the participants -o No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project. +o No exam, only two projects. Each project counts 1/2 of the final grade. 
Alternatively, one long project which counts 100% of the final grade o All info at the GitHub address URL:"https://github.com/CompPhysics/AdvancedMachineLearning" !split ===== Deep learning methods covered, tentative ===== -o _Deep learning, classics_ +o _Deep learning_ o Feed forward neural networks and its mathematics (NNs) o Convolutional neural networks (CNNs) o Recurrent neural networks (RNNs) @@ -40,7 +40,7 @@ o _Deep learning, generative methods_ o Variational autoencoders (VAEe) o Generative Adversarial Networks (GANs) o Autoregressive methods (tentative) -o _Physical Sciences (often just called Physics informed) informed machine learning_ +o _Physical sciences informed machine learning (often just called physics-informed neural networks, PINNs)_ @@ -53,22 +53,11 @@ main idea behind KMR is to flexibly model the relationship between a large number of variables and a particular outcome (dependent variable). -We have not made plans for Reinforcement learning, but this can be another option. +We have not made plans for Reinforcement learning. !split -===== Good books with hands-on material and codes ===== -!bblock -* "Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch":"https://sebastianraschka.com/blog/2022/ml-pytorch-book.html" -* "David Foster, Generative Deep Learning with TensorFlow":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" -* "Bali and Gavras, Generative AI with Python and TensorFlow 2":"https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2" -!eblock - -All three books have GitHub addresses from where one can download all codes. 
We will borrow most of the material from these three texts as well as -from Goodfellow, Bengio and Courville's text "Deep Learning":"https://www.deeplearningbook.org/" - -!split -===== Project paths ===== +===== Project paths, overarching view ===== The course can also be used as a self-study course and besides the lectures, many of you may wish to independently work on your own @@ -86,6 +75,86 @@ o The own data path. Some of you may have data you wish to analyze with differen o The Bayesian ML path is not covered by the present lecture material and leads normally to independent self-study work. +!split +===== Possible paths for the projects ===== + +The differential equation path: Here we propose a set of differential +equations (ordinary and/or partial) to be solved first using neural +networks (using either your own code or TensorFlow/PyTorch or similar +libraries). Thereafter we can extend the set of methods for +solving these equations to recurrent neural networks and autoencoders +(AE) and/or Generative Adversarial Networks (GANs). All these +approaches can be expanded into one large project. This project can +also be extended to include "Physics informed machine +learning":"https://github.com/maziarraissi/PINNs". Here we can discuss +neural networks that are trained to solve supervised learning tasks +while respecting any given law of physics described by general +nonlinear partial differential equations. + +For those interested in mathematical aspects of deep learning, this could also be included. + +!split +===== The generative models ===== + +This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. The two projects will follow the lectures to a +large extent. Topics for data sets will be discussed. 
+ +!split +===== Paths for projects, writing own codes ===== + +The computational path: Here we propose a path where you develop your +own code for a convolutional or eventually recurrent neural network +and apply this to data sets of your own selection. The code should +be object-oriented and flexible, allowing for eventual extensions by +including different Loss/Cost functions and other +functionalities. Feel free to select data sets from those suggested +below. This code can also be extended by adding, for example, +autoencoders. You can compare your own codes with implementations +using TensorFlow(Keras)/PyTorch or other libraries. + +!split +===== The application path/own data ===== + +The application path: Here you can use the most relevant method(s) +(say convolutional neural networks for images) and apply this (these) +to data sets relevant for your own research. + +!split +===== Gaussian processes and Bayesian analysis ===== + +The Gaussian processes/Bayesian statistics path: "Kernel regression +(Gaussian processes) and Bayesian +statistics":"https://jenfb.github.io/bkmr/overview.html" are popular +tools in the machine learning literature. The main idea behind these +approaches is to flexibly model the relationship between a large +number of variables and a particular outcome (dependent +variable). This can form a second part of a project where for example +standard Kernel regression methods are used on a specific data +set. Alternatively, participants can opt to work on a large project +relevant for their own research using Gaussian processes and/or +Bayesian machine learning. + +!split +===== HPC path ===== + +Another alternative is to study high-performance computing aspects in +designing ML codes. This can also be linked with an exploration of +mathematical aspects of deep learning methods. 
+
+
+
+
+!split
+===== Good books with hands-on material and codes =====
+!bblock
+* "Sebastian Raschka et al., Machine Learning with PyTorch and Scikit-Learn":"https://sebastianraschka.com/blog/2022/ml-pytorch-book.html"
+* "David Foster, Generative Deep Learning with TensorFlow":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html"
+* "Bali and Gavras, Generative AI with Python and TensorFlow 2":"https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2"
+!eblock
+
+All three books have GitHub repositories from which one can download all codes. We will borrow most of the material from these three texts as well as
+from Goodfellow, Bengio and Courville's text "Deep Learning":"https://www.deeplearningbook.org/"
+
!split
===== Types of machine learning =====

@@ -145,7 +214,7 @@ novel, realistic images of horses that did not exist in the original
dataset.

!split
-===== Example of generative modeling, "taken from Generative Deeep Learning by David Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" =====
+===== Example of generative modeling, "taken from Generative Deep Learning by David Foster":"https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" =====

FIGURE: [figures/generativelearning.png, width=900 frac=1.0]

@@ -211,77 +280,10 @@ data (for example an image), rather than trying to predict a label for say a gi
FIGURE: [figures/generativemodels.png, width=900 frac=1.0]

-!split
-===== Possible paths for the projects =====
-
-The differential equation path: Here we propose a set of differential
-equations (ordinary and/or partial) to be solved first using neural
-networks (using either your own code or TensorFlow/Pytorch or similar
-libraries). Thereafter we can extend the set of methods for
-solving these equations to recurrent neural networks and autoencoders
-(AE) and/or Generalized Adversarial Networks (GANs). 
All these -approaches can be expanded into one large project. This project can -also be extended into including "Physics informed machine -learning":"https://github.com/maziarraissi/PINNs". Here we can discuss -neural networks that are trained to solve supervised learning tasks -while respecting any given law of physics described by general -nonlinear partial differential equations. - -For those interested in mathematical aspects of deep learning, this could also be included. - -!split -===== The generative models ===== - -This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent -the lectures. Topics for data sets will be discussed during the lab sessions. - -!split -===== Paths for projects, writing own codes ===== - -The computational path: Here we propose a path where you develop your -own code for a convolutional or eventually recurrent neural network -and apply this to data selects of your own selection. The code should -be object oriented and flexible allowing for eventual extensions by -including different Loss/Cost functions and other -functionalities. Feel free to select data sets from those suggested -below here. This code can also be extended upon by adding for example -autoencoders. You can compare your own codes with implementations -using TensorFlow(Keras)/PyTorch or other libraries. - -!split -===== The application path ===== - -The application path: Here you can use the most relevant method(s) -(say convolutional neural networks for images) and apply this(these) -to data sets relevant for your own research. - -!split -===== Gaussian processes and Bayesian analysis ===== - -The Gaussian processes/Bayesian statistics path: "Kernel regression -(Gaussian processes) and Bayesian -statistics":"https://jenfb.github.io/bkmr/overview.html" are popular -tools in the machine learning literature. 
The main idea behind these -approaches is to flexibly model the relationship between a large -number of variables and a particular outcome (dependent -variable). This can form a second part of a project where for example -standard Kernel regression methods are used on a specific data -set. Alternatively, participants can opt to work on a large project -relevant for their own research using gaussian processes and/or -Bayesian machine Learning. - -!split -===== HPC path ===== - -Another alternative is to study high-performance computing aspects in -designing ML codes. This can also be linked with an exploration of -mathematical aspects of deep learning methods. - - !split -===== What are the basic Machine Learning ingredients? ===== +===== Reminder on the basic Machine Learning ingredients ===== !bblock Almost every problem in ML and data science starts with the same ingredients: * The dataset $\bm{x}$ (could be some observable quantity of the system we are studying)