
The following identity seems to hold for $X$ distributed as a zero-centered normal with covariance $\Sigma$; any suggestions on how to show this analytically?

$$E[\|X\|^4] = \operatorname{tr}(\Sigma)^2+2\operatorname{tr}(\Sigma^2)$$
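For what it's worth, the identity can be checked numerically before attempting a proof. A quick Monte Carlo sketch (the example covariance, seed, and sample size below are arbitrary choices, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary symmetric positive-definite example covariance.
A = rng.standard_normal((3, 3))
Sigma = A @ A.T

# Sample X ~ N(0, Sigma) and estimate E[||X||^4] by Monte Carlo.
X = rng.multivariate_normal(np.zeros(3), Sigma, size=1_000_000)
mc = np.mean(np.sum(X**2, axis=1) ** 2)

# Conjectured closed form: tr(Sigma)^2 + 2 tr(Sigma^2).
exact = np.trace(Sigma) ** 2 + 2 * np.trace(Sigma @ Sigma)

print(mc, exact)  # the two agree to within Monte Carlo error
```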

StubbornAtom
Yaroslav Bulatov
  • Someone already worked out an answer, but the distribution of $X^T X$ is also given [here](https://math.stackexchange.com/questions/442472/sum-of-squares-of-dependent-gaussian-random-variables/442916) – LinAlg Oct 29 '19 at 18:02
  • The linked thread above does not concern itself with a general $\Sigma$. – StubbornAtom Oct 29 '19 at 18:14
  • @StubbornAtom yes it does – LinAlg Oct 30 '19 at 14:07
  • 1
    @StubbornAtom that is not a contradiction with the answer I referred to. I have posted an elegant answer based on that. – LinAlg Oct 30 '19 at 15:39

2 Answers


Since $\lVert X\rVert^4=(X^TX)^2$, we have for $X\sim N(0,\Sigma)$,

\begin{align} E\,[\lVert X\rVert^4]=E\,[(X^TX)^2]&=\operatorname{Var}[X^TX]+(E\,[X^TX])^2 \\&=2\operatorname{tr}(\Sigma^2)+(\operatorname{tr}(\Sigma))^2 \end{align}

The expectation of $X^TX$ for any random vector $X$ is relatively straightforward, as shown here.

The variance of $X^TX$ is a bit more work, but for normal $X$ it can be obtained by generalizing the approach shown here.

For a reference regarding derivation of variance of quadratic forms in general, you can look up Linear Regression Analysis by Seber and Lee (second edition, section 1.5).
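The two ingredients above can also be verified directly from Isserlis' theorem for zero-mean Gaussians, $E[X_iX_jX_kX_l]=\Sigma_{ij}\Sigma_{kl}+\Sigma_{ik}\Sigma_{jl}+\Sigma_{il}\Sigma_{jk}$. A deterministic sketch (the example covariance is an arbitrary choice):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T  # arbitrary example covariance

# Isserlis with indices (i, i, j, j) gives
# E[X_i^2 X_j^2] = Sigma_ii Sigma_jj + 2 Sigma_ij^2,
# and E[(X^T X)^2] = sum_{i,j} E[X_i^2 X_j^2].
n = Sigma.shape[0]
fourth = sum(
    Sigma[i, i] * Sigma[j, j] + 2 * Sigma[i, j] ** 2
    for i, j in product(range(n), repeat=2)
)

mean_qf = np.trace(Sigma)       # E[X^T X] = tr(Sigma)
var_qf = fourth - mean_qf ** 2  # Var[X^T X] = E[(X^T X)^2] - (E[X^T X])^2

print(np.isclose(var_qf, 2 * np.trace(Sigma @ Sigma)))  # True
```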

StubbornAtom

Write the eigendecomposition $\Sigma = P^T \Lambda P$ with $P$ orthogonal; then:

$$X^TX = (\Sigma^{-1/2}X)^T P^T \Lambda P (\Sigma^{-1/2}X) = \sum_i \lambda_i U_i^2,$$ where $U = P \Sigma^{-1/2}X \sim N(0,I)$. So $X^TX$ is a weighted sum of independent $\chi^2_1$ variables. Therefore, the expected value of $X^TX$ is $\sum_i \lambda_i = \operatorname{tr}(\Sigma)$ and the variance is $2 \sum_i \lambda_i^2 = 2\operatorname{tr}(\Sigma^2)$.
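The trace identities used in the last step can be confirmed numerically: the eigenvalues $\lambda_i$ of $\Sigma$ satisfy $\sum_i \lambda_i = \operatorname{tr}(\Sigma)$ and $\sum_i \lambda_i^2 = \operatorname{tr}(\Sigma^2)$. A minimal sketch (the example covariance is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T  # arbitrary example covariance

# Eigenvalues of Sigma are the chi-square weights lambda_i.
lam = np.linalg.eigvalsh(Sigma)

# E[X^T X] = sum_i lambda_i = tr(Sigma)
assert np.isclose(lam.sum(), np.trace(Sigma))
# Var[X^T X] = 2 sum_i lambda_i^2 = 2 tr(Sigma^2)
assert np.isclose(2 * (lam**2).sum(), 2 * np.trace(Sigma @ Sigma))
```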

LinAlg