
I am working on a conjecture about necessary and sufficient conditions for the covariance matrix of a $p$-dimensional random vector to be singular.

To get there, I first need the conditions under which the covariance matrix of a 2-dimensional random vector, $X=(X_1, X_2)^T$, is singular. Since a matrix is singular if and only if its determinant is 0, I concluded that the covariance matrix is singular iff $\sigma_{X_1}^2 \sigma_{X_2}^2 = \sigma_{X_1X_2}^2$. However, I am not sure how to generalize this to $p$ dimensions.
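As a sanity check, the 2×2 condition and its determinant form can be verified numerically (a sketch with numpy; the specific numbers are made up):

```python
import numpy as np

# A 2x2 covariance matrix is singular iff var1 * var2 == cov12^2.
var1, var2, cov12 = 4.0, 9.0, 6.0          # cov12^2 = 36 = var1 * var2
Sigma = np.array([[var1, cov12],
                  [cov12, var2]])

det = var1 * var2 - cov12**2               # the 2x2 determinant
print(np.isclose(det, np.linalg.det(Sigma)))   # True

# In p dimensions the criterion is the same: singular iff det(Sigma) == 0.
print(np.isclose(np.linalg.det(Sigma), 0.0))   # True: this Sigma is singular
```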

EM823823
  • Your formula doesn't look like a determinant. Please tell us what you mean by the $\sigma^2_{X_i}.$ What exactly do you need to generalize, given that determinants are well-defined, understood, explained in many texts, and you have already articulated a correct criterion for singularity in any number of dimensions? – whuber Jul 29 '20 at 20:55
  • @whuber Thank you for pointing that out for me. I rewrote it. – EM823823 Jul 29 '20 at 20:56
  • 1
    Do you mean that $\sigma_{X_1X_2}$ is the covariance? If so, then your equality is equivalent to the determinant being zero. – whuber Jul 29 '20 at 20:57
  • @whuber Finding the determinant of a p-dimensional matrix is very tedious. I was hoping there is a simpler way to get the conditions. – EM823823 Jul 29 '20 at 20:57
  • @whuber Yes, $\sigma_{X_1,X_2}$ is the covariance of $X_1, X_2$. – EM823823 Jul 29 '20 at 20:59
  • Okay. What, then, would the generalization look like? Are you looking for a formula for the determinant? For alternative characterizations of singularity? An efficient algorithm to check for singularity? (They exist: look up "row reduction".) Something else? – whuber Jul 29 '20 at 21:00
  • @whuber Either or. – EM823823 Jul 29 '20 at 21:03
  • @StubbornAtom I believe it could be about sample covariance matrix. The problem states that $X= (X_1, ..., X_p)$ is a random vector. – EM823823 Jul 30 '20 at 15:16
  • I guess dispersion matrix of $X$ is singular iff $a'X$ is a constant with probability $1$ for any non-zero vector $a$. – StubbornAtom Jul 30 '20 at 15:39
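StubbornAtom's characterization can be illustrated numerically: if some non-zero $a$ makes $a'X$ constant with probability $1$, the covariance matrix is singular. A sketch (the example $X=(Z,\,2Z+1)$ and the vector $a$ are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(10_000)
X = np.column_stack([z, 2 * z + 1])    # X2 = 2*X1 + 1: an exact linear relation

Sigma = np.cov(X, rowvar=False)        # sample covariance matrix
a = np.array([2.0, -1.0])              # a'X = 2*X1 - X2 = -1, a constant

print(np.var(X @ a))                   # ~0: a'X is (numerically) constant
print(np.linalg.det(Sigma))            # ~0: Sigma is singular
```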

1 Answer


A matrix $A$ is a covariance matrix if and only if it is a symmetric positive semi-definite matrix (see here).

A symmetric matrix is positive definite if and only if all of its leading principal minors are strictly positive (see here).

A symmetric matrix is positive semi-definite if and only if all of its principal minors are nonnegative (see here).

I suppose you know what (leading) principal minors are. However, if $A$ is an $n\times n$ matrix, then (see here):

  • a minor is the determinant of a square submatrix $A_{IJ}$, where $I$ and $J$ are subsets of $\{1,2,\dots,n\}$ of equal size;
  • a principal minor is the determinant of $A_{IJ}$, $I=J$;
  • a leading principal minor is the determinant of $A_{IJ}$ when $I=J=\{1\}$, or $I=J=\{1,2\}$, or $I=J=\{1,2,3\}$, etc.
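These definitions translate directly into code; a sketch that computes the leading principal minors with numpy (the helper name and the example matrix are mine):

```python
import numpy as np

def leading_principal_minors(A):
    """Determinants of the top-left k-by-k submatrices, k = 1..n."""
    n = A.shape[0]
    return [np.linalg.det(A[:k, :k]) for k in range(1, n + 1)]

# Covariance of (X1, X2, X1 + X2) with Var = 2, 2 and Cov(X1, X2) = 1:
# PSD but singular, since the third component is a linear combination.
Sigma = np.array([[2.0, 1.0, 3.0],
                  [1.0, 2.0, 3.0],
                  [3.0, 3.0, 6.0]])

minors = leading_principal_minors(Sigma)
print(minors)   # ~ [2.0, 3.0, 0.0]: the last leading principal minor is zero
```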

You need a symmetric matrix which is positive semi-definite, but not positive definite: at least one $|A_{1,\dots,k;\,1,\dots,k}|$, i.e. at least one leading principal minor, must be null.

If $A$ is a $2\times 2$ matrix, the easiest solution is $a_{11}=0$ (the first leading principal minor is null), i.e. if $X_1$ is a degenerate random variable, then: $$\begin{vmatrix} 0 & 0 \\ 0 & a \end{vmatrix}=0$$ Another example is $Z\sim N(0,1)$, $\mathbf{X}=(Z,-Z)$, because (see here): $$\mathbf{\Sigma}=\begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix},\quad |\mathbf{\Sigma}|=0$$ i.e. the first leading principal minor is strictly positive, but the second one is null.
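The $(Z,-Z)$ example can be checked with a quick simulation (a sketch; the sample size is chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)
X = np.column_stack([z, -z])      # X = (Z, -Z)

Sigma = np.cov(X, rowvar=False)   # ~ [[1, -1], [-1, 1]]
print(np.linalg.det(Sigma))       # ~ 0: singular
print(np.linalg.eigvalsh(Sigma))  # ~ [0, 2]: PSD but not positive definite
```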

In general, degenerate multivariate distributions have singular covariance matrices. See Does $(X,X)'$ follow a bivariate normal distribution?.

Sergio
  • Positive-definiteness is not equivalent to non-singular. – whuber Jul 30 '20 at 13:40
  • Right, because a non-positive-definite matrix may be non-singular: a negative definite matrix, which doesn't make sense as a variance matrix, is non-singular. But a symmetric positive semi-definite matrix which is not positive definite does make sense as a variance matrix and is singular. Isn't it? – Sergio Jul 30 '20 at 16:16
  • Yes, that's true: but you don't need to apply all of Sylvester's criteria to determine whether the matrix is singular. There are simpler and more efficient solutions. The most efficient I can think of is to triangularize the matrix using Gaussian reduction: as soon as a zero is found on the diagonal you can stop and declare the matrix singular, but if you complete the process with no zeros on the diagonal, the matrix is invertible. – whuber Jul 30 '20 at 17:17
  • But as soon as a zero is found on the diagonal I can't stop and declare the matrix positive semi-definite. Singular is not equivalent to positive semi-definiteness, non-singular is not equivalent to positive definiteness :) – Sergio Jul 30 '20 at 17:40
  • The question concerns determining whether a *covariance matrix* is singular. Thus, no testing for positive semi-definiteness is needed. – whuber Jan 07 '22 at 18:17
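whuber's row-reduction check (above) can be sketched as follows (a minimal illustration; the function name is mine, and a tolerance stands in for an exact zero test):

```python
import numpy as np

def is_singular(A, tol=1e-12):
    """Gaussian elimination with partial pivoting; stop at the first zero pivot."""
    U = np.array(A, dtype=float)
    n = U.shape[0]
    for k in range(n):
        # bring the largest remaining entry in column k to the pivot position
        p = k + np.argmax(np.abs(U[k:, k]))
        if np.abs(U[p, k]) < tol:
            return True               # (near-)zero pivot: singular, stop early
        U[[k, p]] = U[[p, k]]
        U[k+1:] -= np.outer(U[k+1:, k] / U[k, k], U[k])
    return False                      # n non-zero pivots: invertible

print(is_singular(np.array([[1.0, -1.0], [-1.0, 1.0]])))  # True
print(is_singular(np.eye(3)))                             # False
```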