
Suppose that $A$ is a symmetric non-random matrix, $X\sim N(\mu,\Sigma)$, and $b \in \mathbb{R}^n$ is a non-random vector. What, then, is the distribution of $$X^tAX+b^tX \quad ?$$

The distribution without the linear term is solved in the answer here (Transformation of multivariate normal sum of chi-squared).

In the case of an invertible $A$ we can write $X^tAX+b^tX=(X-h)^tA(X-h)+k$, where $h=-\frac{1}{2}A^{-1}b$ and $k=-\frac{1}{4}b^tA^{-1}b$. However, the case where $A$ is not invertible is also of interest, as it arises in practice. When $n=1$, non-invertibility means $A=0$, so the distribution above is simply normal with mean $\mu_0 = b^t\mu$ and variance $\sigma_0^2 = b^t\Sigma b$. Is there a similar reduction in the more general case $n>1$ for an arbitrary non-invertible $A$? Perhaps an appropriate transformation can disentangle the quadratic form from the linear term, so that in some basis we obtain an independent sum of a normal and a linear combination of scaled non-central chi-squared variables with 1 degree of freedom.
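As a quick sanity check, here is a minimal numerical sketch of the completing-the-square identity for an invertible symmetric $A$; the construction of $A$, $b$, and $X$ below is purely illustrative.

```python
# Check X^t A X + b^t X = (X - h)^t A (X - h) + k numerically for an
# invertible symmetric A, with h = -A^{-1} b / 2 and k = -b^t A^{-1} b / 4.
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Illustrative symmetric, invertible A; arbitrary b and X.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2 + n * np.eye(n)   # diagonally dominated, hence invertible
b = rng.standard_normal(n)
X = rng.standard_normal(n)

h = -0.5 * np.linalg.solve(A, b)
k = -0.25 * b @ np.linalg.solve(A, b)

lhs = X @ A @ X + b @ X
rhs = (X - h) @ A @ (X - h) + k
print(np.isclose(lhs, rhs))   # True
```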

Attempt: Write $A=P\Lambda P^t$, where $\Lambda$ is the diagonal matrix of eigenvalues and $P$ has the corresponding unit eigenvectors as its columns, $PP^t = I$. As $A$ is not invertible, there are $r>0$ zero eigenvalues. Then we can write $$X^tAX+b^tX = X^tP\Lambda P^tX+b^tPP^tX. $$ Set $Y=P^tX$, so that $$ X^tAX+b^tX = Y^t \Lambda Y + b^t PY.$$ Assume w.l.o.g. that the zero eigenvalues occupy the last $r$ diagonal entries of $\Lambda$, with the corresponding eigenvectors as the last $r$ columns of $P$. Then $$X^tAX+b^tX = (Y^\star)^t \Lambda^\star Y^\star + (b^\star)^tY^\star+(b')^tY', $$ where $Y^\star$ is the first $n-r$ entries of $Y$ and $Y'$ the remaining $r$, $b^\star$ is the first $n-r$ entries of $P^tb$ and $b'$ the last $r$, and $\Lambda^\star \in \mathbb{R}^{(n-r)\times (n-r)}$ is the corresponding block of $\Lambda$. The first two terms can be used to complete the square, and the last term involves only those entries of $Y$ that do not appear in the first two terms; however, it is not independent of them, since the covariance matrix of $Y$, $\operatorname{Cov}(Y) = P^t\Sigma P$, is not diagonal. The goal is to identify the distribution of $X^tAX+b^tX$.
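The following minimal sketch checks the decomposition above numerically: it builds an illustrative singular symmetric $A$ and a generic $\Sigma$, splits $Y=P^tX$ into the block hit by the non-zero eigenvalues and the block hit only by the linear term, verifies the algebraic identity, and confirms that $\operatorname{Cov}(Y)=P^t\Sigma P$ is not diagonal. Variable names follow the question's notation; the construction of $A$ and $\Sigma$ is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 5, 2                       # r zero eigenvalues -> A is not invertible

# Illustrative A = Q diag(lam) Q^t with the last r eigenvalues equal to zero.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.concatenate([rng.standard_normal(n - r), np.zeros(r)])
A = Q @ np.diag(lam) @ Q.T

# Generic (non-diagonal) covariance and mean for X ~ N(mu, Sigma).
S = rng.standard_normal((n, n))
Sigma = S @ S.T
mu = rng.standard_normal(n)
b = rng.standard_normal(n)
X = rng.multivariate_normal(mu, Sigma)

# Eigendecomposition of A, ordered so the (numerically) zero eigenvalues come last.
eigval, P = np.linalg.eigh(A)
order = np.argsort(np.abs(eigval))[::-1]
eigval, P = eigval[order], P[:, order]

Y = P.T @ X
bt = P.T @ b
Y_star, Y_prime = Y[:n - r], Y[n - r:]
b_star, b_prime = bt[:n - r], bt[n - r:]
Lam_star = np.diag(eigval[:n - r])

lhs = X @ A @ X + b @ X
rhs = Y_star @ Lam_star @ Y_star + b_star @ Y_star + b_prime @ Y_prime
print(np.isclose(lhs, rhs))                      # True: the algebraic split holds
CovY = P.T @ Sigma @ P
print(np.allclose(CovY, np.diag(np.diag(CovY)))) # False: Cov(Y) is not diagonal
```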

  • Look at: https://stats.stackexchange.com/questions/262604/what-is-the-moment-generating-function-of-the-generalized-multivariate-chi-squ/318908#318908 https://stats.stackexchange.com/questions/144893/what-is-the-expected-norm-mathbb-e-lvert-x-rvert-for-a-multivariate-normal/144936#144936 and links therein. – kjetil b halvorsen Jul 04 '18 at 09:33
  • Please look up "completing the square" to learn how to convert your question into the same one with $b=0.$ The answer to that version is the [generalized chi-squared distribution](https://stats.stackexchange.com/questions/67533). – whuber Jul 04 '18 at 10:27
  • Thanks for the informative responses. This solves my problems. – Raxel Jul 04 '18 at 10:37
  • @whuber : What do we do in the completing-the-square part when the symmetric $A$ is not invertible? That is, we can write $X^t AX +b^tX = (X-h)^t A (X-h)+k$ where $h=-0.5 A^{-1}b$ and $k=-0.25\,b^tA^{-1}b$. – Raxel Jul 04 '18 at 18:58
  • I invite you to contemplate the non-invertible situation when $n=1:$ you should immediately be able to determine the answer in that case. In higher dimensions it's the same, because you first will diagonalize $A,$ thereby reducing it to a collection of one-dimensional problems. Clearly this creates some complications, so if your interest lies in non-invertible $A,$ then please--by all means--edit your post to reflect that and we can re-open it. – whuber Jul 04 '18 at 19:19
  • @whuber : Indeed, it is happening in my hypothesis testing. I have edited the question above with an update on my progress and this more specific question. – Raxel Jul 04 '18 at 19:47
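A minimal Monte Carlo sketch of the reduction the comments point to, under one standard assumption: whiten first, writing $X=\mu+LZ$ with $\Sigma=LL^t$ and $Z\sim N(0,I)$, and then diagonalize $L^tAL=QDQ^t$. In the basis $W=Q^tZ$ the statistic splits into an independent sum $\sum_i d_iW_i^2+c_iW_i+c_0$; each term with $d_i\neq 0$ completes the square to a scaled non-central chi-squared with 1 df, and the $d_i=0$ terms collapse to a normal (the usual generalized chi-squared construction). The setup of $A$, $\Sigma$, $\mu$, $b$ below is illustrative, not taken from the question.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, N = 5, 2, 200_000

# Illustrative singular symmetric A (r zero eigenvalues), generic Sigma, mu, b.
Qrot, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Qrot @ np.diag(np.r_[rng.standard_normal(n - r), np.zeros(r)]) @ Qrot.T
S = rng.standard_normal((n, n))
Sigma = S @ S.T
mu, b = rng.standard_normal(n), rng.standard_normal(n)

# Direct Monte Carlo sample of X^t A X + b^t X with X ~ N(mu, Sigma).
X = rng.multivariate_normal(mu, Sigma, size=N)
direct = np.einsum('ij,jk,ik->i', X, A, X) + X @ b

# Whitened / diagonalized form: coefficients d (quadratic), c (linear), c0 (constant)
# in the W = Q^t Z basis, where the W_i are independent standard normals.
L = np.linalg.cholesky(Sigma)
d, Q = np.linalg.eigh(L.T @ A @ L)
c = Q.T @ L.T @ (2 * A @ mu + b)
c0 = mu @ A @ mu + b @ mu
W = rng.standard_normal((N, n))
decomposed = (W ** 2) @ d + W @ c + c0

# The two samples should agree in distribution (compare moments or quantiles).
print(direct.mean(), decomposed.mean())
print(direct.std(), decomposed.std())
```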

0 Answers