If a covariance matrix is non-singular, does this imply that the correlation matrix is also non-singular?
My guess is that it depends on the mean vector in $K_{X} = R_{X} - m_X {m_X}^H$.
Not sure though.
If $X$ is a random vector with mean vector $\mu$, then the covariance matrix is given by $\DeclareMathOperator{\E}{\mathbb{E}} \Sigma = \E (X-\mu)(X-\mu)^T$. The variances sit on the diagonal; write $D = \operatorname{Diag}(\Sigma)$ for the diagonal matrix containing them. If $\Sigma$ is nonsingular then so is $D$, that is, all the variances are positive.
Now the correlation matrix can be written $$ R = D^{-1/2}\Sigma D^{-1/2}, $$ and since $D^{-1/2}$ is nonsingular it follows that $R$ and $\Sigma$ are jointly singular or non-singular. In particular, the answer does not depend on the mean vector.
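A quick numerical sketch of this relation, using a small made-up covariance matrix (the example values are arbitrary, not from the question): since $\det R = \det\Sigma / \prod_i \Sigma_{ii}$ and all variances are positive, the two determinants vanish together.

```python
import numpy as np

# A hypothetical 3x3 covariance matrix (positive definite, so nonsingular)
Sigma = np.array([[4.0, 2.0, 0.5],
                  [2.0, 3.0, 1.0],
                  [0.5, 1.0, 2.0]])

# D^{-1/2}: inverse square roots of the variances on the diagonal
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(Sigma)))

# Correlation matrix R = D^{-1/2} Sigma D^{-1/2}
R = D_inv_sqrt @ Sigma @ D_inv_sqrt

# det(R) = det(Sigma) / prod(variances), so R is singular iff Sigma is
print(np.linalg.det(R))
print(np.linalg.det(Sigma) / np.prod(np.diag(Sigma)))
```

Both printed values agree, and `R` has unit diagonal as a correlation matrix should.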
To answer the extra question in the comments: if the covariance matrix (and hence the correlation matrix$^\dagger$) is singular, then there is some linear subspace (of dimension equal to the rank of the covariance matrix) such that, with probability 1, the random vector $X$ lives in a translate of that subspace. So there is some deterministic structure, but $X$ is still random. Even a constant is a random variable! Random variables are just functions, see What is meant by a "random variable"?. Just as in calculus, constants can be seen as functions. Some details here.
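To illustrate the singular case, here is a small simulation (my own construction, not from the answer): take $X = AZ + \mu$ with $A$ of rank 2 mapping into $\mathbb{R}^3$, so the covariance $AA^T$ is singular and $X$ is confined to a plane.

```python
import numpy as np

rng = np.random.default_rng(0)

# X = A Z + mu with A of rank 2: the third coordinate of A Z is the sum
# of the first two, so the covariance A A^T is singular
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
mu = np.array([1.0, 2.0, 3.0])
Z = rng.standard_normal((2, 10000))
X = (A @ Z).T + mu

Sigma = A @ A.T                      # theoretical covariance matrix
print(np.linalg.matrix_rank(Sigma))  # 2, so Sigma is singular

# X - mu stays in the plane x3 = x1 + x2 with probability 1
residual = (X[:, 2] - mu[2]) - (X[:, 0] - mu[0]) - (X[:, 1] - mu[1])
print(np.max(np.abs(residual)))      # 0 up to floating-point error
```

Every sample satisfies the same linear constraint, yet each coordinate of $X$ is still a perfectly ordinary random variable.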
$^\dagger$If there are zeros on the diagonal of $D$, then replace the inverse by the Moore–Penrose generalized inverse.