
What is the covariance matrix of $$f = 2x + 3y$$ if the random vectors $x, y$ are independent and have covariance matrices $\Sigma_x$ and $\Sigma_y$?

I know the sample covariance matrix of a random vector is given as

$\Sigma_x = \frac{1}{N}\sum_{i=1}^N(\vec{x}_i - \vec{\mu})(\vec{x}_i - \vec{\mu})^T$.

But how should it be derived when $f$ is a sum of two random variables? Thanks.

P.S.: I am taking a Model Identification course and trying to grasp the underlying statistics. It would be really helpful if someone could point me to web resources or MOOCs where these kinds of questions are discussed.

1 Answer


The covariance operator is additive over uncorrelated sums. For example, if $u$ and $v$ are independent, then they are uncorrelated, hence $\text{cov}(u+v) = \text{cov}(u) + \text{cov}(v)$.

In your example, by independence of $x$ and $y$, which implies independence of $2x$ and $3y$, we have $$ \text{cov}(2x+3y) = \text{cov}(2x) + \text{cov}(3y) = 4 \text{cov}(x) + 9\text{cov}(y) = 4\Sigma_x + 9\Sigma_y. $$
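This identity is easy to check numerically. The sketch below (the specific covariance matrices are made up for illustration) draws many independent samples of $x$ and $y$ with known covariances and compares the empirical covariance of $f = 2x + 3y$ against $4\Sigma_x + 9\Sigma_y$:

```python
import numpy as np

rng = np.random.default_rng(42)
n, d = 200_000, 2  # many samples so the empirical estimate is close

# Hypothetical covariance matrices, chosen only for illustration
Sigma_x = np.array([[2.0, 0.5], [0.5, 1.0]])
Sigma_y = np.array([[1.0, -0.3], [-0.3, 0.5]])

# Independent zero-mean Gaussian samples of x and y
x = rng.multivariate_normal(np.zeros(d), Sigma_x, size=n)
y = rng.multivariate_normal(np.zeros(d), Sigma_y, size=n)

f = 2 * x + 3 * y
emp = np.cov(f, rowvar=False)        # empirical covariance of f
theory = 4 * Sigma_x + 9 * Sigma_y   # cov(2x) + cov(3y)

print(np.round(emp, 2))
print(np.round(theory, 2))
```

The two printed matrices agree up to sampling error, which shrinks as `n` grows.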


Why $\text{cov}(2x) = 4 \text{cov}(x)$? Consider a zero-mean random vector $x$ and let $z = \alpha x$ where $\alpha \in \mathbb R$ is deterministic. We note that $z$ is also zero-mean (why?), and have $$ \text{cov}(\alpha x) = \text{cov}(z) = E (zz^T) = E [(\alpha x) (\alpha x)^T] = \alpha^2 E(xx^T)= \alpha^2 \text{cov}(x). $$
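The scaling identity $\text{cov}(\alpha x) = \alpha^2\,\text{cov}(x)$ holds exactly for the sample covariance as well, since every term in the sum picks up the same factor $\alpha^2$. A minimal check:

```python
import numpy as np

rng = np.random.default_rng(0)
# A 3-dimensional random vector, 1000 samples (rows = variables for np.cov)
X = rng.standard_normal((3, 1000))

alpha = 2.0
lhs = np.cov(alpha * X)          # cov of the scaled vector
rhs = alpha**2 * np.cov(X)       # alpha^2 times the original cov

print(np.allclose(lhs, rhs))  # True: the identity is exact, not approximate
```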

passerby51