Tags: machine-learning, computer-vision, gaussian, monte-carlo, mixture-model

How do I evaluate a sample in a weighted Gaussian Mixture Model?


Short version:

I have a MoG (mixture of Gaussians) model with n components, each with its own weight w_i, and a sample s. I want to calculate the probability that this sample was drawn from the MoG. I can evaluate the individual Gaussians easily, but I don't know how to take their weights into account or how to aggregate their scores.

Longer Version:

I am using a MoG model in MATLAB for a machine learning algorithm. I am sampling Monte Carlo style, so I need to perform importance re-weighting, which involves evaluating the likelihood of drawing a specific sample from the MoG model. I can easily evaluate a single Gaussian, but I am unsure how to do this for the entire MoG model, taking all the components and their weights into account.


Solution

  • I guess the mathematical answer would be:

    y = p(x \mid M) = \sum_i w_i \, p(x \mid N_i)
    

    where p(x | M) is the probability of x being sampled from the mixture M. It is the weighted sum of the probabilities of x being sampled from each of the Gaussians N_i, with each term weighted by the prior probability w_i of drawing from that component (a weight obtained during training). A short MATLAB sketch is given after the link below.

    Here is a detailed document on how to train a GMM and sample from it:

    http://guneykayim-msc.googlecode.com/svn-history/r20/trunk/doc/common/GMM.pdf
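
    In practice this is just a weighted sum of the per-component densities. Below is a minimal MATLAB sketch; it assumes the Statistics Toolbox function mvnpdf is available, and the variable names (mus, sigmas, w) are placeholders for your trained component means, covariances, and weights:

        function p = mog_pdf(x, mus, sigmas, w)
            % x      : 1-by-d sample
            % mus    : n-by-d matrix of component means
            % sigmas : d-by-d-by-n array of component covariance matrices
            % w      : n-by-1 vector of mixture weights (should sum to 1)
            p = 0;
            for i = 1:numel(w)
                % weighted sum of each component's density evaluated at x
                p = p + w(i) * mvnpdf(x, mus(i, :), sigmas(:, :, i));
            end
        end

    For example, p = mog_pdf(s, mus, sigmas, w) gives the likelihood of the sample s under the mixture, which is the quantity needed in the importance re-weighting step.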