
Consider three normally distributed random variables, $X,Y,Z$ where $Cov(X,Y)<0$ and $Cov(X,Z)>0$. Can we say anything about the sign of $Cov(Y,Z)$?

Intuitively, $Y$ goes down when $X$ goes up, while $Z$ goes up when $X$ goes up, so I would guess $Cov(Y,Z)<0$.


Set $\rho=Corr(X,Y)<0$ and assume the variables are jointly normal with, without loss of generality, unit variances. Then $Y=\rho X+\sqrt{1-\rho^2}\epsilon$, where $\epsilon\sim N(0,1)$ is orthogonal to $X$, and \begin{align} Cov(Y,Z) &= Cov\left(\rho X+\sqrt{1-\rho^2}\epsilon,Z\right) \\ &=\underbrace{\rho Cov(X,Z)}_{<0} + \sqrt{1-\rho^2}Cov(\epsilon,Z). \end{align} Is there an argument why $Cov(\epsilon,Z)$ should be zero or negative? Or are additional assumptions necessary?


Let's write $Z=r X+\sqrt{1-r^2}\epsilon_Z$, with $r=Corr(X,Z)>0$ and $\epsilon_Z\sim N(0,1)$ orthogonal to $X$. Then \begin{align} Cov(\epsilon,Z) = \sqrt{1-r^2}Cov(\epsilon,\epsilon_Z). \end{align} I cannot see why $Cov(\epsilon,\epsilon_Z)$ should be zero or negative.
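
For concreteness, here is a small simulation sketch (the values $\rho=-0.3$, $r=0.3$ and the residual correlations $\pm 0.5$ are arbitrary choices, not anything implied by the problem) that builds $Y$ and $Z$ exactly as in the decompositions above; the sign of $Cov(Y,Z)$ seems to follow the sign of $Cov(\epsilon,\epsilon_Z)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

rho, r = -0.3, 0.3   # Corr(X,Y) < 0, Corr(X,Z) > 0 (arbitrary example values)

for corr_eps in (-0.5, 0.5):   # chosen correlation between the two residuals
    X = rng.standard_normal(n)
    eps = rng.standard_normal(n)
    # eps_Z is correlated with eps but independent of X
    eps_Z = corr_eps * eps + np.sqrt(1 - corr_eps**2) * rng.standard_normal(n)
    Y = rho * X + np.sqrt(1 - rho**2) * eps
    Z = r * X + np.sqrt(1 - r**2) * eps_Z
    print(np.cov(X, Y)[0, 1] < 0,        # True in both cases
          np.cov(X, Z)[0, 1] > 0,        # True in both cases
          round(np.cov(Y, Z)[0, 1], 3))  # negative for corr_eps = -0.5, positive for +0.5
```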

Alex
  • The answer is definitely **yes.** See https://stats.stackexchange.com/questions/72790 for one particular case and how it can be analyzed. A correct answer has been posted after you accepted an incorrect one. Please reconsider your decision. – whuber Dec 19 '21 at 15:39
  • Oh, here **yes** means “yes, we can say something about the sign of cov(y,z)”, which answers the question in the body, and I think Jarle’s answer is the better one. I, on the other hand, despite reading the whole text, answered the title question directly, i.e. “Does cov(x,y) < 0 and cov(x,z) > 0 imply cov(y,z) < 0”, and the answer is **no**, generally speaking. – gunes Dec 19 '21 at 16:47

2 Answers


Let $a=\operatorname{corr}(X,Y)$, $b=\operatorname{corr}(X,Z)$ and $c=\operatorname{corr}(Y,Z)$. The eigenvalues of the correlation matrix satisfy
$$
\left|\begin{matrix} 1-\lambda & a & b \\ a & 1-\lambda & c \\ b & c & 1-\lambda \end{matrix}\right|=0,
$$
which after some algebra simplifies to
$$
(1-\lambda)^3 - (1-\lambda)(a^2+b^2+c^2)+2abc=0.
$$
For given values of $a$ and $b$, the correlation matrix is positive semi-definite when $c$ lies in a closed interval. At the endpoints of this interval, one of the eigenvalues is zero, implying that
$$
1-a^2-b^2-c^2+2abc=0.
$$
Solving this quadratic equation for $c$, we find that the endpoints of the interval of possible values of $c$ (plotted below) for given values of $a$ and $b$ are
$$
ab\pm \sqrt{(a^2-1)(b^2-1)}.
$$
If $a$ and $b$ are of opposite signs, the whole interval for $c$ thus contains only negative values if
$$
ab+\sqrt{(a^2-1)(b^2-1)}<0.
$$
Moving $ab$ to the right-hand side and squaring both sides (both are non-negative since $ab<0$) yields
$$
(a^2-1)(b^2-1)<a^2b^2,
$$
which simplifies to the condition
$$
1<a^2+b^2.
$$
Thus, to summarise, if $a=\operatorname{corr}(X,Y)$ and $b=\operatorname{corr}(X,Z)$ are of opposite signs and $(a,b)$ lies outside the unit circle, then $c=\operatorname{corr}(Y,Z)$ is necessarily negative.

[Figure: the endpoints $ab\pm\sqrt{(a^2-1)(b^2-1)}$ of the interval of possible values of $c$, plotted as functions of $a$ and $b$.]
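
As a quick numerical sanity check of the endpoint formula (the particular values of $a$ and $b$ below are arbitrary illustrations), one can compare the two regimes:

```python
import numpy as np

def c_interval(a, b):
    """Endpoints of the feasible interval for c = corr(Y,Z), given
    a = corr(X,Y) and b = corr(X,Z), from 1 - a^2 - b^2 - c^2 + 2abc = 0."""
    half_width = np.sqrt((a**2 - 1) * (b**2 - 1))
    return a * b - half_width, a * b + half_width

# Opposite signs and a^2 + b^2 > 1: the whole interval is negative.
print(c_interval(-0.9, 0.8))   # approximately (-0.98, -0.46)

# Opposite signs but a^2 + b^2 < 1: c can take either sign.
print(c_interval(-0.5, 0.5))   # approximately (-1.0, 0.5)
```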

Jarle Tufto

No (to the title question). Any positive semi-definite matrix is also a valid covariance matrix. So, if there exists a $\rho>0$ such that the following matrix is positive semi-definite, then you have a counterexample (it gives $Cov(X,Y)=-\rho<0$ and $Cov(X,Z)=\rho>0$, yet $Cov(Y,Z)=\rho>0$):

$$\Sigma=\begin{bmatrix}1&-\rho&\rho\\-\rho &1&\rho\\\rho&\rho&1\end{bmatrix}$$

This is the case for $\rho=0.1$, for example. The conditions under which the implication does hold are correctly laid out in Jarle’s answer.
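
For instance, a quick check (with numpy; the sample size is arbitrary) that $\Sigma$ is a valid correlation matrix for $\rho=0.1$ and that it produces the claimed covariance signs:

```python
import numpy as np

rho = 0.1
Sigma = np.array([[1.0, -rho,  rho],
                  [-rho, 1.0,  rho],
                  [ rho,  rho, 1.0]])

# All eigenvalues are non-negative, so Sigma is a valid correlation matrix.
print(np.linalg.eigvalsh(Sigma))   # roughly [0.8, 1.1, 1.1]

# Sample (X, Y, Z) from it and verify the covariance signs.
rng = np.random.default_rng(0)
X, Y, Z = rng.multivariate_normal(np.zeros(3), Sigma, size=500_000).T
print(np.cov(X, Y)[0, 1],   # negative
      np.cov(X, Z)[0, 1],   # positive
      np.cov(Y, Z)[0, 1])   # positive, so the implication fails here
```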

gunes
  • This answers a different question: it shows that the sign might not be determined in a particular case. It does not answer the general question that was asked, though. – whuber Dec 19 '21 at 15:40