When working with (Gaussian) mixture models fitted by EM, I always took it as a mathematical fact that the marginal likelihood increases with every iteration. Whenever it did not, that always pointed to an error in the code or some other technical problem.
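For reference, this is the standard EM guarantee I have in mind, with $Q$ the expected complete-data log-likelihood:

$$
Q(\theta \mid \theta^{(t)}) \;=\; \mathbb{E}_{z \sim p(z \mid x,\, \theta^{(t)})}\!\big[\log p(x, z \mid \theta)\big],
\qquad
\theta^{(t+1)} = \arg\max_{\theta} Q(\theta \mid \theta^{(t)})
\;\Longrightarrow\;
\log p(x \mid \theta^{(t+1)}) \;\ge\; \log p(x \mid \theta^{(t)}).
$$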
I am now experimenting with Dirichlet mixture models, where the maximization step has to be done numerically. The optimizer is then not guaranteed to find the global maximum of the expected complete-data log-likelihood, so intuitively a non-monotonic behaviour of the marginal likelihood does not seem to be ruled out. And I do see occasional decreases in my simulations.
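For concreteness, here is a minimal, self-contained sketch of the kind of loop I mean (not my actual DMM code): a two-component Dirichlet mixture fitted by EM on synthetic data, with the M-step for the concentration parameters done by a numerical optimizer (Nelder-Mead over $\log\alpha$ is just an illustrative choice) and the marginal log-likelihood monitored after every iteration so that any decrease is flagged.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Synthetic data: two Dirichlet components on the 3-simplex (purely illustrative).
X = np.vstack([
    rng.dirichlet([8.0, 2.0, 2.0], size=150),
    rng.dirichlet([2.0, 2.0, 8.0], size=150),
])
n, d = X.shape
K = 2

def dirichlet_logpdf(X, alpha):
    """Row-wise log-density of Dirichlet(alpha)."""
    return (gammaln(alpha.sum()) - gammaln(alpha).sum()
            + ((alpha - 1.0) * np.log(X)).sum(axis=1))

def marginal_loglik(X, weights, alphas):
    """log p(X) under the mixture, via a log-sum-exp over components."""
    log_comp = np.stack([np.log(weights[k]) + dirichlet_logpdf(X, alphas[k])
                         for k in range(K)], axis=1)
    m = log_comp.max(axis=1, keepdims=True)
    return float((m[:, 0] + np.log(np.exp(log_comp - m).sum(axis=1))).sum())

# Crude initialisation; any reasonable one would do for the point being made.
weights = np.full(K, 1.0 / K)
alphas = rng.uniform(1.0, 5.0, size=(K, d))

loglik_trace = []
for it in range(50):
    # E-step: responsibilities, computed stably in log space.
    log_comp = np.stack([np.log(weights[k]) + dirichlet_logpdf(X, alphas[k])
                         for k in range(K)], axis=1)
    log_comp -= log_comp.max(axis=1, keepdims=True)
    resp = np.exp(log_comp)
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: the mixture weights have a closed form ...
    weights = resp.mean(axis=0)
    # ... but the concentration parameters are updated by a numerical optimizer,
    # so the expected complete-data log-likelihood is maximised only approximately.
    for k in range(K):
        def neg_q(log_alpha, k=k):
            return -(resp[:, k] * dirichlet_logpdf(X, np.exp(log_alpha))).sum()
        result = minimize(neg_q, np.log(alphas[k]), method="Nelder-Mead")
        alphas[k] = np.exp(result.x)

    # Monitor the marginal log-likelihood and flag any decrease.
    ll = marginal_loglik(X, weights, alphas)
    if loglik_trace and ll < loglik_trace[-1]:
        print(f"iteration {it}: marginal log-likelihood decreased by {loglik_trace[-1] - ll:.3g}")
    loglik_trace.append(ll)
```

Optimising over $\log\alpha$ just keeps the concentrations positive without an explicit constraint; an inner optimisation that stops short of the maximum is exactly the situation where my intuition says monotonicity might fail.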
Is this known behavior of EM with an inexact maximization step? Or are there mathematical results showing that the likelihood should still increase monotonically?
I would appreciate help from people with experience in this area.
Remark (two weeks on): I did find an error in my DMM code, but I think the question still stands.