I am doing some work with Gaussian mixture models, and we want to find the standard deviation of samples drawn from the model. Our current methodology is to run a Monte Carlo simulation: draw a large number of samples and compute the std of those samples. This method is time-consuming and rather crude.
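For concreteness, our current Monte Carlo approach looks roughly like this (parameter values are arbitrary placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

# Example mixture parameters (arbitrary, for illustration).
pis = np.array([0.3, 0.5, 0.2])     # mixture weights, sum to 1
mus = np.array([-2.0, 0.5, 4.0])    # component means
sigmas = np.array([1.0, 0.7, 2.0])  # component stds

# Draw n samples: pick a component index for each sample,
# then sample from that component's Gaussian.
n = 1_000_000
idx = rng.choice(len(pis), size=n, p=pis)
samples = rng.normal(mus[idx], sigmas[idx])

print(samples.std())
```

It works, but we'd like to avoid generating a million samples just to get one scalar.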
I am curious if anyone knows of a closed-form solution?
I started making some headway by computing the variance of a Gaussian about a shifted mean, i.e. E[(x - c)^2] for x ~ N(mu, sigma^2), which works out to sigma^2 + (mu - c)^2. Since the mixture density is sum_i pi_i * N_i, expectations are linear in the components, so Var(sample_gmm) = sum_i pi_i * E[(x - c)^2 | x ~ N_i] with c = sum_i pi_i * mu_i (note the weights enter linearly, not as pi_i^2 as I first tried — that version produced a very ugly formula). I haven't been able to find any online discussion of this problem to check the derivation against.
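Putting the derivation above into code and checking it against the Monte Carlo estimate (same arbitrary placeholder parameters as before), the two agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example mixture parameters (arbitrary, for illustration).
pis = np.array([0.3, 0.5, 0.2])     # mixture weights, sum to 1
mus = np.array([-2.0, 0.5, 4.0])    # component means
sigmas = np.array([1.0, 0.7, 2.0])  # component stds

# Closed form: c = sum_i pi_i * mu_i,
# Var = sum_i pi_i * (sigma_i^2 + (mu_i - c)^2),
# using E[(x - c)^2] = sigma^2 + (mu - c)^2 for each component.
c = np.sum(pis * mus)
var_closed = np.sum(pis * (sigmas**2 + (mus - c)**2))
std_closed = np.sqrt(var_closed)

# Monte Carlo check: sample component indices, then each Gaussian.
n = 1_000_000
idx = rng.choice(len(pis), size=n, p=pis)
samples = rng.normal(mus[idx], sigmas[idx])
std_mc = samples.std()

print(std_closed, std_mc)  # should agree to a few decimal places
```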