
I want to calculate the Kullback-Leibler divergence of two multivariate Gaussians, as in

KL divergence between two multivariate Gaussians.

At one point, one has to evaluate the following expression (which also appears in equation 380 of the Matrix Cookbook):

$$ E_q[(x-\mu_1)^T\Sigma_2^{-1}(\mu_1-\mu_2)]$$

where the expectation is taken with respect to the multivariate Gaussian $q(x) = N(\mu_1, \Sigma_1)$.

This expression should evaluate to zero, but I don't understand how or why.

guest1

1 Answer


Note that $(x - \mu_1)^T\Sigma_2^{-1}(\mu_1 - \mu_2)$ is just a linear function of $x$. You can write it as $(x-b)^T A$, where $b = \mu_1$ and $A = \Sigma_2^{-1}(\mu_1 - \mu_2)$ are constant vectors.

By linearity of expectation, $E_q[(x-b)^T A] = (E_q[x] - b)^T A$. Since $E_q[x] = b = \mu_1$, the whole expression is zero.
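If you want a quick numerical sanity check, here is a minimal Monte Carlo sketch (using NumPy, with arbitrary example values for $\mu_1$, $\mu_2$, $\Sigma_1$, $\Sigma_2$ chosen just for illustration) that samples from $q$ and confirms the expectation is approximately zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example parameters (arbitrary, just for illustration).
mu1 = np.array([1.0, -2.0])
mu2 = np.array([0.5, 3.0])
Sigma1 = np.array([[2.0, 0.3],
                   [0.3, 1.0]])
Sigma2 = np.array([[1.5, -0.2],
                   [-0.2, 0.8]])

# Draw samples x ~ q(x) = N(mu1, Sigma1).
x = rng.multivariate_normal(mu1, Sigma1, size=1_000_000)

# Constant vector A = Sigma2^{-1} (mu1 - mu2).
A = np.linalg.solve(Sigma2, mu1 - mu2)

# Monte Carlo estimate of E_q[(x - mu1)^T Sigma2^{-1} (mu1 - mu2)].
values = (x - mu1) @ A
print(values.mean())  # ~0, up to sampling noise
```

The printed mean should be very close to zero, shrinking toward it as the sample size grows.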

Ankitp