
I have been going through the minimization behind variational inference and have a good understanding of all the steps taken: [image: derivation of the variational inference objective]

However, there is a step that relies on $\mathrm{KL} \ge 0$: [image: the step in question]
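For context, the step in question is presumably the standard decomposition of the log evidence, where the non-negativity of the KL term is exactly what makes the ELBO a lower bound:

$$\log p(x) \;=\; \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z)}{q(z)}\right]}_{\text{ELBO}} \;+\; \underbrace{D\!\left(q(z)\,\|\,p(z \mid x)\right)}_{\ge\, 0},$$

so $\log p(x) \ge \mathrm{ELBO}$, with equality exactly when $q(z) = p(z \mid x)$.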

I have derived the proof that $D(P\|Q) \ge 0$, getting the same result as in Why KL divergence is non-negative?. However, I don't see how this result applies to $D(q(z)\,\|\,p(z \mid x)) \ge 0$, and I am having a hard time adapting the proof to this case. Is it possible to derive $\mathrm{KL} \ge 0$ here as well?
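For concreteness, here is a sketch of how the standard argument carries over, assuming $q(z)$ and $p(z \mid x)$ are distributions over the same discrete support and applying Jensen's inequality to the concave $\log$:

$$-D\!\left(q(z)\,\|\,p(z \mid x)\right) \;=\; \sum_z q(z) \log \frac{p(z \mid x)}{q(z)} \;\le\; \log \sum_z q(z)\,\frac{p(z \mid x)}{q(z)} \;=\; \log \sum_z p(z \mid x) \;=\; \log 1 \;=\; 0,$$

so $D\!\left(q(z)\,\|\,p(z \mid x)\right) \ge 0$, with equality iff $q(z) = p(z \mid x)$. The only property of $p(z \mid x)$ used is that, for fixed $x$, it is a normalized distribution over $z$.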

Frank
  • 1) Nothing you've written so far relies on KL divergence being non-negative. But, non-negativity may be important in future steps (e.g. to show a lower bound on the log evidence). 2) KL divergence is always non-negative. There's nothing special about $q(z)$ and $p(z \mid x)$ compared to other distributions. Since you've already proved that KL divergence is non-negative, what are your concerns about this particular case? – user20160 Dec 06 '21 at 15:49
  • In this particular case, the KL divergence is $D\!\left(q(z)\,\|\,p(z \mid x)\right)$, for which I end up with $\log \sum_z p(z \mid x)$ at the final step. Is it valid to assume that this is still $\log(1) = 0$? Thanks – Frank Dec 06 '21 at 15:54
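A short sketch of the normalization step asked about in the last comment: for a fixed $x$, $p(z \mid x)$ is a normalized conditional distribution, so

$$\sum_z p(z \mid x) \;=\; \sum_z \frac{p(z, x)}{p(x)} \;=\; \frac{p(x)}{p(x)} \;=\; 1,$$

and the final step $\log \sum_z p(z \mid x) = \log 1 = 0$ goes through exactly as in the generic case.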

0 Answers