
If $A$ is symmetric and $Y\sim\mathcal N(0,V)$, how can I show that $Y'AY \sim \sum_{i=1}^{t} c_i\,\chi^2_i(1)$ (each $\chi^2_i(1)$ having one degree of freedom), where $c_i$ can be any scalar?

I multiplied out the canonical case where length($Y$) is two, and got $$ \begin{aligned} Y'AY &= [Y_1\; Y_2] \left[\begin{array}{cc} a_{11} & a_{12} \\ a_{12} & a_{22} \end{array}\right] \left[\begin{array}{c} Y_1 \\ Y_2 \end{array}\right] \\ &= [Y_1\; Y_2] \left[\begin{array}{c} a_{11}Y_1 + a_{12}Y_2 \\ a_{12}Y_1 + a_{22}Y_2 \end{array}\right] \\ &= a_{11}Y_1^2 + 2a_{12}Y_1Y_2 + a_{22}Y_2^2 \end{aligned} $$ Obviously the first and last terms are (scaled) chi-squared with one degree of freedom, but what about the cross term? As $n$ increases (the example above is $n = 2$), the number of cross terms increases as well.
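A quick numeric check of this expansion (illustrative values, `numpy` assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical symmetric 2x2 A and a random draw Y, for illustration only.
a11, a12, a22 = 2.0, 1.0, 3.0
A = np.array([[a11, a12], [a12, a22]])
Y = rng.standard_normal(2)

# The matrix form Y'AY and the expanded form agree, cross term included.
print(np.isclose(Y @ A @ Y,
                 a11 * Y[0]**2 + 2 * a12 * Y[0] * Y[1] + a22 * Y[1]**2))
```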

  • Uncorrelate the $Y$s using Cholesky decomposition (say $Z = V^{-1/2} Y$); set up the quadratic form of interest in terms of the uncorrelated variables $Z$; solve the eigenproblem for the matrix in the middle; go back to the $Z$s and rotate them using the orthogonal matrix made up of eigenvectors (which preserves their uncorrelatedness); look again at your quadratic form in terms of $Z$ and conclude that the $c_i$ must be the eigenvalues of $AV$ or something like that. So they are not arbitrary. – StasK Oct 23 '12 at 04:56
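A numerical sketch of this recipe (hypothetical $A$ and $V$, `numpy` assumed), checking that the weights come out as the eigenvalues of $AV$:

```python
import numpy as np

# Hypothetical symmetric A and covariance V, for illustration only.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
V = np.array([[1.0, 0.5], [0.5, 2.0]])

# Symmetric square root V^{1/2} via an eigendecomposition of V.
w, M = np.linalg.eigh(V)
V_half = M @ np.diag(np.sqrt(w)) @ M.T

# In the whitened variable Z = V^{-1/2} Y, the quadratic form becomes
# Y'AY = Z' (V^{1/2} A V^{1/2}) Z, so the weights c_i are the eigenvalues
# of V^{1/2} A V^{1/2}, which coincide with the eigenvalues of AV.
middle = V_half @ A @ V_half
print(np.sort(np.linalg.eigvalsh(middle)))      # weights c_i
print(np.sort(np.linalg.eigvals(A @ V).real))   # same values
```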

1 Answer


Perform the eigendecomposition $A = U \Lambda U'$ and let $X = U'Y$. Then

$$Y' A Y = Y'U \Lambda U'Y = X' \Lambda X$$

(1) Show that $X = U'Y = [x_1, x_2, \dots, x_t]'$ is multivariate normal

First, since $X$ is a linear transformation of the multivariate normal $Y$, $X$ is also multivariate normal. $$E(X) = E(U'Y) = U'\,0 = 0$$ $$Var(X) = U'\,Var(Y)\,U = U'U = I$$ (The last step implicitly assumes $Var(Y) = V = I$; see the comments below for the general case.)

So each $x_i$ is standard normally distributed.

(2) $X' \Lambda X = \lambda_1 x_1^2 + ... + \lambda_t x_t^2$

From (1), each $x_i$ is standard normal, so each of $x_1^2, \dots, x_t^2$ is chi-square distributed with 1 degree of freedom.

Letting $c_i = \lambda_i$, this proves that $Y'AY = \sum_{i=1}^t c_i\,\chi^2_i(1)$ in distribution.
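As a sanity check, here is a small simulation of this conclusion in the $V = I$ case the proof assumes (hypothetical $A$, `numpy` assumed), comparing quantiles of $Y'AY$ with those of the eigenvalue-weighted chi-square mixture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric A for illustration; V = I as the proof assumes.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam = np.linalg.eigvalsh(A)

n = 200_000
Y = rng.standard_normal((n, 2))             # Y ~ N(0, I)
quad = np.einsum('ni,ij,nj->n', Y, A, Y)    # Y'AY for each draw

# Independent chi-square(1) draws weighted by the eigenvalues.
mix = (rng.standard_normal((n, 2)) ** 2) @ lam

# The two samples should agree in distribution: compare quantiles.
qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(quad, qs))
print(np.quantile(mix, qs))
```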

  • sorry, but how do you know that $U'\,Var(Y)\,U = U'U$? I guess you probably mean $Cov(Y)$ (the whole covariance matrix), but that was defined in the question as $V$ ... which would give you $U'VU$; how can you then know that this equals $U'U$? – Tomas Jun 22 '20 at 22:00
  • This proof is wrong - see https://stats.stackexchange.com/a/403940 for a correct proof - and is missing a key part... where do you have $V$? You completely ignore $V$, so it can't be correct. But anyway, thanks for the inspiration! It helped me indirectly. – Tomas Jun 25 '20 at 14:33
  • @Curious you're right, the original proof is not correct, but it can be fixed easily. Eigendecompose $V$ instead, say $V = M L L' M'$, where $M$ is an orthogonal matrix and $L$ is the diagonal matrix of square roots of the eigenvalues of $V$. Then $Z = L^{-1} M' Y \sim N(0, I)$. Next, since $Y = M L Z$, observe that $Y'AY = Z' (L M' A M L) Z$. Instead of decomposing $A$, we decompose $L M' A M L$. The rest of the proof follows as before. Thanks again for pointing that out. – zhanxw Jun 26 '20 at 00:47
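A sketch of this corrected construction (hypothetical $A$ and $V$, `numpy` assumed), verifying that $Z$ has identity covariance and that the quadratic form carries over:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical symmetric A and covariance V, for illustration only.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
V = np.array([[1.0, 0.5], [0.5, 2.0]])

# Eigendecompose V = M L L' M', with L the diagonal sqrt-eigenvalue matrix.
w, M = np.linalg.eigh(V)
L = np.diag(np.sqrt(w))
L_inv = np.diag(1.0 / np.sqrt(w))

# Z = L^{-1} M' Y has identity covariance when Y ~ N(0, V).
Y = rng.multivariate_normal(np.zeros(2), V, size=100_000)
Z = Y @ M @ L_inv           # row-wise version of Z = L^{-1} M' Y
print(np.cov(Z.T))          # approximately the identity matrix

# Y'AY = Z' (L M' A M L) Z, so decompose this matrix instead of A.
B = L @ M.T @ A @ M @ L
print(np.allclose(np.einsum('ni,ij,nj->n', Y, A, Y),
                  np.einsum('ni,ij,nj->n', Z, B, Z)))
print(np.sort(np.linalg.eigvalsh(B)))          # weights c_i
print(np.sort(np.linalg.eigvals(A @ V).real))  # eigenvalues of AV match
```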