
I want to multiply two Normal probability density functions,

$$ f_{\mathbf {X}}(x_{1},\ldots ,x_{k})={\frac {\exp \left(-{\frac {1}{2}}({\mathbf {x}}-{\boldsymbol {\mu }})^{\mathrm {T} }{\boldsymbol {\Sigma }}^{-1}({\mathbf {x}}-{\boldsymbol {\mu }})\right)}{\sqrt {(2\pi )^{k}|{\boldsymbol {\Sigma }}|}}}$$

They are each bi-variate, with correlated components, so their covariance matrices $\Sigma_A$ and $\Sigma_B$ look something like this:

$$\Sigma_A=\begin{pmatrix} \sigma_x^2 & \sigma_x\sigma_y\rho \\ \sigma_x\sigma_y\rho & \sigma_y^2 \end{pmatrix}$$

In the uni-variate case, variances are summed ($\sigma^2=\sigma_A^2+\sigma_B^2$), which is simple Gaussian error propagation. This also makes intuitive sense, because the product of the PDFs implies that the second-degree polynomials in the exponent, $-\frac{(x-\mu)^2}{2\sigma^2}$, need to be summed.

In the bi-variate case, I believe the variances also need to be summed, and I suspect the same holds for the covariance terms: again, the terms in the exponent are second-degree polynomials, and they need to be summed. Is it correct that

$$\Sigma=\Sigma_A + \Sigma_B$$

But this paper suggests (eq. 28):

$$\Sigma^{-1}=\Sigma_A^{-1} + \Sigma_B^{-1}$$

Perhaps this is related to the law of total covariance, but I don't see the explicit connection.


To be clear, I am not asking for the PDF of the product of two normal random variables (see here).

Questions that are potentially related, but where I do not see the answer to my question clearly:

j13r

1 Answer


If two Gaussian random variables A and B are added, C = A + B, then the variance of C is the sum of the variances of A and B. This corresponds to a convolution of the two Gaussian PDFs, with each measurement contributing some blurring.

But the situation that corresponds to multiplying two Gaussian PDFs is different: it corresponds to having two measurements of the same random variable. Then the better measurement should dominate the total variance, i.e., the inverse variances must be added.
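A quick numerical sketch of the uni-variate case (the measurement standard deviations are hypothetical example values): the product of the two Gaussian PDFs has inverse variance equal to the sum of the inverse variances, so the combined standard deviation stays close to that of the better measurement.

```python
import math

# Two hypothetical measurements of the same quantity, with different precisions
sigma_a, sigma_b = 1.0, 3.0

# Precision (inverse variance) of the product of the two Gaussian PDFs
inv_var = 1.0 / sigma_a**2 + 1.0 / sigma_b**2
sigma_c = math.sqrt(1.0 / inv_var)

print(sigma_c)  # ≈ 0.9487 — close to the better measurement's sigma_a = 1.0
```

Note that $\sigma_c$ is smaller than either input standard deviation: a second, even noisy, measurement always sharpens the combined estimate.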

Simple Derivation:

As the Gaussian formula makes clear, multiplying Gaussian PDFs means summing the exponents, and

$$\left(-{\frac {1}{2}}({\mathbf {x} }-{\boldsymbol {\mu }})^{\mathrm {T} }{\boldsymbol {\Sigma }_A}^{-1}({\mathbf {x} }-{\boldsymbol {\mu }})\right) + \left(-{\frac {1}{2}}({\mathbf {x} }-{\boldsymbol {\mu }})^{\mathrm {T} }{\boldsymbol {\Sigma }_B}^{-1}({\mathbf {x} }-{\boldsymbol {\mu }})\right) = $$ $$ -{\frac {1}{2}}({\mathbf {x} }-{\boldsymbol {\mu }})^{\mathrm {T} }\left({\boldsymbol {\Sigma }_A}^{-1} + {\boldsymbol {\Sigma }_B}^{-1}\right)({\mathbf {x} }-{\boldsymbol {\mu }})$$

by distributivity of matrix multiplication. (This assumes both PDFs share the same mean $\boldsymbol{\mu}$; with different means $\boldsymbol{\mu}_A$ and $\boldsymbol{\mu}_B$, the product is still proportional to a Gaussian with the same combined precision matrix, and mean ${\boldsymbol{\Sigma}}\left({\boldsymbol{\Sigma}_A}^{-1}{\boldsymbol{\mu}}_A + {\boldsymbol{\Sigma}_B}^{-1}{\boldsymbol{\mu}}_B\right)$.)

Thus, indeed, the inverse covariance matrices (precision matrices) need to be summed to obtain the total precision matrix, whose inverse is the combined covariance:

$${\boldsymbol {\Sigma }}^{-1} = {\boldsymbol {\Sigma }_A}^{-1} + {\boldsymbol {\Sigma }_B}^{-1}$$
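As a sanity check, here is a minimal numerical sketch (the covariance matrices are arbitrary example values, with a shared mean): multiplying the two bivariate PDFs pointwise and dividing by the PDF built from the summed precision matrices should give the same constant ratio at every test point, confirming the product is proportional to that combined Gaussian.

```python
import numpy as np

def gauss_pdf(x, mu, cov):
    """Multivariate normal density, evaluated directly from the formula."""
    d = x - mu
    norm = np.sqrt((2 * np.pi) ** len(mu) * np.linalg.det(cov))
    return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm

# Arbitrary example covariances (positive correlation in A, negative in B)
mu = np.array([0.0, 0.0])
cov_a = np.array([[1.0, 0.5], [0.5, 2.0]])
cov_b = np.array([[2.0, -0.6], [-0.6, 1.0]])

# Combined covariance: invert the sum of the precision matrices
cov_c = np.linalg.inv(np.linalg.inv(cov_a) + np.linalg.inv(cov_b))

# Ratio (product of PDFs) / (combined PDF) should be the same at every point
points = [np.array([0.3, -0.7]), np.array([1.2, 0.4]), np.array([-0.5, 0.9])]
ratios = [gauss_pdf(p, mu, cov_a) * gauss_pdf(p, mu, cov_b) / gauss_pdf(p, mu, cov_c)
          for p in points]
print(np.allclose(ratios, ratios[0]))  # True: product ∝ Gaussian with summed precisions
```

The constant ratio is just the mismatch of normalization factors, which is why the product of two PDFs is only *proportional to*, not equal to, a Gaussian PDF.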

j13r