
Let $X$ be a $p$-dimensional random vector that is asymptotically normal, i.e. $$\sqrt{n}(X - \mu) \stackrel{d}\longrightarrow N(0, \Sigma),$$ and let $H$ be a random symmetric $p\times p$ matrix whose elements $h_{ij}$ are each asymptotically normal: $$\sqrt{n}(h_{ij} - \eta_{ij}) \stackrel{d}\longrightarrow N(0, \sigma_{ij}^2).$$ This is a pretty strong assumption: certainly stronger than convergence in probability. In any case, there is some "true" limiting symmetric matrix, call it $K = (\eta_{ij})$, and each element of $H$ is asymptotically normal around the corresponding element of $K$ (does this even make sense?).
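To try to make that last sentence precise, I suppose what I really mean is joint asymptotic normality of the vectorized matrix, for some $p^2 \times p^2$ covariance matrix $\Omega$ (this is my own attempted formalization): $$\sqrt{n}\,\bigl(\operatorname{vec}(H) - \operatorname{vec}(K)\bigr) \stackrel{d}\longrightarrow N(0, \Omega).$$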

Consider the random quadratic form $X^THX$, which is a scalar. It seems clear that $$X^THX \stackrel{p}\longrightarrow \mu^TK\mu$$ (by Slutsky's theorem and the continuous mapping theorem). From what I've seen, the asymptotic distribution of a quadratic form is usually taken to be a weighted sum of $\chi^2$ variables. However, can we say anything about potential asymptotic normality instead?
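To spell out the step I have in mind (my own sketch), the difference telescopes as $$X^THX - \mu^TK\mu = (X-\mu)^THX + \mu^T(H-K)X + \mu^TK(X-\mu),$$ so after multiplying by $\sqrt{n}$, each term is a $\sqrt{n}$-scale asymptotically normal factor times factors that converge in probability ($HX \stackrel{p}\longrightarrow K\mu$, $X \stackrel{p}\longrightarrow \mu$). This is what makes me suspect a normal limit whenever the joint limit of $\sqrt{n}(X-\mu)$ and $\sqrt{n}\operatorname{vec}(H-K)$ is normal.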

For instance, I'm interested in making a statement of the form $$\sqrt{n}(X^THX - \mu^TK\mu) \stackrel{d}\longrightarrow N(0, \tau^2)$$ for some $\tau^2 < \infty$. Is there any condition we must impose on the joint distribution of $X$ and the elements of $H$ (for instance, finite 2nd or 4th cross-moments)? What if $X$ and $H$ are not independent (for instance, if elements of $H$ are somehow correlated with, or even functions of, elements of $X$)?
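For what it's worth, here is a minimal simulation sketch of one concrete (hypothetical) dependent case I had in mind: $X$ is a sample mean and $H$ is the sample second-moment matrix built from the same data, so that $K = \Sigma + \mu\mu^T$ and $X$ and $H$ are strongly dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 3, 2000, 5000
mu = np.array([1.0, -0.5, 2.0])
Sigma = np.diag([1.0, 2.0, 0.5])
K = Sigma + np.outer(mu, mu)  # limit of the sample second-moment matrix

stats = np.empty(reps)
for r in range(reps):
    Z = rng.multivariate_normal(mu, Sigma, size=n)
    X = Z.mean(axis=0)   # X: sample mean, sqrt(n)-asymptotically normal
    H = Z.T @ Z / n      # H: sample second-moment matrix, H -> K in probability
    stats[r] = np.sqrt(n) * (X @ H @ X - mu @ K @ mu)

# If the conjectured CLT holds, the scaled statistic should look normal:
print("mean:", stats.mean(), "sd:", stats.std())
print("skewness:", ((stats - stats.mean())**3).mean() / stats.std()**3)
```

Here $X$ and $H$ are both functions of the same sample $Z$, which is exactly the dependent situation the last question above asks about; of course a simulation is no substitute for a proof.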

This is just personal curiosity! Also, if there are any good references, I'd love to read them.

  • "...certainly stronger than convergence in probability": convergence in distribution is indeed stronger than convergence in probability (in fact it implies convergence in quadratic mean), but only if two "regularity" conditions are satisfied, see https://stats.stackexchange.com/a/379971/28746 – Alecos Papadopoulos Jul 25 '19 at 22:39

0 Answers