
Assume two maximum likelihood estimators $a,b$, each of size $p$, with corresponding estimated covariance matrices $V^a,V^b$. In fact, $a$ and $b$ are two regression coefficient vectors.

Denote by $q=a-b$ the vector of differences, and let $W=q^T(V^a+V^b)^{-1}q$ be the Wald statistic, which is also the squared Mahalanobis norm of $q$. I know that $W$ is bounded, say $W<k$.

Now, let $x$ be a vector of size $p$. How do I show that $E[|x^Ta-x^Tb|]$ is also bounded? It looks trivial and intuitive, but somehow I'm missing something along the way.

I started by using the bound $W<k$ to claim that there exists a $p$-component vector of corresponding per-component bounds, but then got stuck.
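For concreteness, here is a small numerical sketch of the relation I suspect is at play: writing $M=V^a+V^b$, Cauchy–Schwarz in the $M$-inner product gives $|x^Tq| \le \sqrt{x^TMx}\,\sqrt{W}$. All the data below is made up just to check the algebra:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4
a = rng.normal(size=p)                   # two "estimated" coefficient vectors
b = rng.normal(size=p)
A = rng.normal(size=(p, p))
M = A @ A.T + p * np.eye(p)              # stand-in for V^a + V^b (positive definite)
x = rng.normal(size=p)

q = a - b
W = q @ np.linalg.solve(M, q)            # Wald statistic = squared Mahalanobis norm of q
lhs = abs(x @ a - x @ b)                 # |x^T q|
rhs = np.sqrt(x @ M @ x) * np.sqrt(W)    # Cauchy-Schwarz bound in the M-inner product
print(lhs <= rhs)                        # the inequality |x^T q| <= sqrt(x^T M x) sqrt(W)
```

So if $x^TMx$ can itself be controlled, the bound on $W$ would transfer to $|x^Ta-x^Tb|$; the part I can't pin down is what happens under the expectation when $M$ is random.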

Any ideas?

Spätzle
  • Isn't this just the [power norm inequality](https://stats.stackexchange.com/a/509139/919)? – whuber Jan 20 '22 at 15:52
  • The Mahalanobis isn't an $L^p$ norm – Spätzle Jan 20 '22 at 18:05
  • $W,$ as a function of $q$ for fixed $V^a+V^b,$ is *explicitly* the square of an $L^2$ norm. The meaning and status of your symbols are unclear. Are you perhaps considering $V^a$ and $V^b$ as *random variables* (estimated variance matrices)? – whuber Jan 20 '22 at 19:43
  • yes, they are estimated. I thought it was clear from the definition of $a,b$ as regression coefficient vectors. – Spätzle Jan 20 '22 at 21:02
  • No, that is not implied. Because there is an important distinction between true values and estimated values, one usually is careful to write "*estimated* covariance matrix" in such situations. – whuber Jan 20 '22 at 22:00

0 Answers