
Expectation Maximisation for a Gaussian Mixture Model

Implementation of the expectation maximisation algorithm for Gaussian Mixture Models in C++, based on the EM algorithm implementation for Poisson Mixture Models in Python from this article [1].
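
Each EM iteration alternates an E-step, which computes the responsibility of every Gaussian component for every data point under the current parameters, and an M-step, which re-estimates the mixture weights, means and standard deviations from those responsibilities. The sketch below illustrates one such iteration for a one-dimensional, K-component GMM; it is only a sketch, and the identifiers (`normal_pdf`, `em_step`, `gamma`, `weight`) are hypothetical rather than taken from EM_GMM.cpp.

```cpp
// Illustrative sketch of a single EM iteration for a 1-D, K-component GMM.
// Names such as normal_pdf, em_step and gamma are hypothetical and are not
// taken from EM_GMM.cpp.
#include <cmath>
#include <cstddef>
#include <vector>

// Density of a univariate Gaussian N(mu, sigma^2) evaluated at x.
double normal_pdf(double x, double mu, double sigma) {
    const double pi = 3.14159265358979323846;
    const double z = (x - mu) / sigma;
    return std::exp(-0.5 * z * z) / (sigma * std::sqrt(2.0 * pi));
}

// One EM update: E-step (responsibilities), then M-step (parameter re-estimation).
void em_step(const std::vector<double>& x,
             std::vector<double>& mu,
             std::vector<double>& sigma,
             std::vector<double>& weight) {
    const std::size_t n = x.size(), K = mu.size();
    std::vector<std::vector<double>> gamma(n, std::vector<double>(K));

    // E-step: gamma[i][k] = P(component k | x[i]) under the current parameters.
    for (std::size_t i = 0; i < n; ++i) {
        double total = 0.0;
        for (std::size_t k = 0; k < K; ++k) {
            gamma[i][k] = weight[k] * normal_pdf(x[i], mu[k], sigma[k]);
            total += gamma[i][k];
        }
        for (std::size_t k = 0; k < K; ++k)
            gamma[i][k] /= total;
    }

    // M-step: re-estimate each component from the responsibility-weighted data.
    for (std::size_t k = 0; k < K; ++k) {
        double Nk = 0.0, mean = 0.0;
        for (std::size_t i = 0; i < n; ++i) {
            Nk += gamma[i][k];
            mean += gamma[i][k] * x[i];
        }
        mean /= Nk;
        double var = 0.0;
        for (std::size_t i = 0; i < n; ++i)
            var += gamma[i][k] * (x[i] - mean) * (x[i] - mean);
        mu[k] = mean;
        sigma[k] = std::sqrt(var / Nk);
        weight[k] = Nk / static_cast<double>(n);
    }
}
```

In practice these two steps are repeated until the log-likelihood of the data stops improving noticeably.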

After the predicted parameters of the GMM are computed, they are stored in a JSON file for easy export and plotting in Python.
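
The exact JSON layout is not documented here; as a rough illustration, assuming the fitted means, standard deviations and mixture weights are held in `std::vector<double>`s, they could be written out with `std::ofstream` so the notebook can read them back with `json.load`. The file name and the field names `mu`, `sigma` and `pi` below are assumptions, not necessarily what EM_GMM.cpp produces.

```cpp
// Illustrative only: writes fitted GMM parameters to a JSON file so they can be
// loaded in Python (e.g. with json.load) for plotting. The field names and the
// file path are assumptions and may differ from what EM_GMM.cpp actually writes.
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

void write_params_json(const std::vector<double>& mu,
                       const std::vector<double>& sigma,
                       const std::vector<double>& weight,
                       const std::string& path) {
    std::ofstream out(path);
    // Small helper that prints a named JSON array of doubles.
    auto write_array = [&out](const char* name, const std::vector<double>& v) {
        out << "  \"" << name << "\": [";
        for (std::size_t i = 0; i < v.size(); ++i)
            out << v[i] << (i + 1 < v.size() ? ", " : "");
        out << "]";
    };
    out << "{\n";
    write_array("mu", mu);       out << ",\n";
    write_array("sigma", sigma); out << ",\n";
    write_array("pi", weight);   out << "\n";
    out << "}\n";
}
```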

Setup and run:

Assuming g++ is the installed compiler,

git clone https://github.com/Linus-J/EM
cd EM
g++ EM_GMM.cpp -o EM_GMM
./EM_GMM

Make plots:

Use the Make_Figures.ipynb notebook to plot both the true and the predicted distributions of your GMM.

Example output for a GMM with two Gaussian components:

For the distribution $Z_1$, $\mu_1 = 0, \sigma_1 = 1.0, \pi_1 = 0.2.$

For $Z_2$, $\mu_2 = 4, \sigma_2 = 2.0, \pi_2 = 0.8.$
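
With these parameters, the density the algorithm should recover is the standard two-component mixture, where $\mathcal{N}(x \mid \mu, \sigma^2)$ denotes the univariate Gaussian density:

$$p(x) = \sum_{k=1}^{2} \pi_k\,\mathcal{N}(x \mid \mu_k, \sigma_k^2) = 0.2\,\mathcal{N}(x \mid 0, 1.0^2) + 0.8\,\mathcal{N}(x \mid 4, 2.0^2)$$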

Figures: True_Distribution and Predicted_Distribution, showing the true and the EM-predicted densities for this example.

References

[1] Gerry Christian Ongko, Implementing Expectation-Maximisation Algorithm from Scratch with Python, Towards Data Science.
