I often see multiplications with covariance matrices in the literature, but I have never really understood what multiplying by the covariance matrix achieves. Given $\Sigma r = s$, with $\Sigma$ being the covariance matrix of $n$ random variables $X_i$, can someone give me an intuitive explanation of what $s$ tells me?
What I (at least think I) already understand is the principle of covariance in general, and one interpretation of the covariance matrix in terms of a basis, where the $i$th basis vector consists of the covariances between the random variable $X_i$ and $X_j$ for $1 \leq j \leq n$.
Some intuition I already gathered is as follows:
By computing $\Sigma r$ we weight the random variables according to $r$. For a fixed $i$, the entry $s_i$ then gives us the sum of the covariances between $X_i$ and $X_j$ for $1 \leq j \leq n$, each weighted by $r_j$, which I read as a measure of how strongly $X_i$ "covaries" in the direction of $r$.
But what does "in the direction of $r$" mean here, and what is this result actually useful for? (A small numerical check of my reading is sketched below.)
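To make my intuition concrete for myself, I wrote the following small numpy sketch (my own illustration, with made-up data and an arbitrary $r$, not taken from any reference). It checks numerically that the $i$th entry of $\Sigma r$ equals the covariance of $X_i$ with the linear combination $r^T X$, i.e. $\left(\Sigma r\right)_i = \operatorname{Cov}\!\left(X_i, \sum_j r_j X_j\right)$, which is what I mean by "covarying in the direction of $r$":

```python
import numpy as np

rng = np.random.default_rng(0)
# 1000 samples of 3 correlated random variables (synthetic data for illustration)
X = rng.standard_normal((1000, 3)) @ rng.standard_normal((3, 3))

Sigma = np.cov(X, rowvar=False)        # empirical covariance matrix (3 x 3)
r = np.array([0.5, -1.0, 2.0])         # an arbitrary weight vector

s = Sigma @ r                          # the product in question

# Covariance of each X_i with the weighted combination r^T X, computed directly
combo = X @ r
s_direct = np.array([np.cov(X[:, i], combo)[0, 1] for i in range(3)])

print(np.allclose(s, s_direct))        # True (up to floating point)
```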
Often I also see a quantity like this: $r^T \Sigma^{-1} r$.
What would this value be useful for?
(And I am aware of the nice properties of the eigenvalues and eigenvectors of $\Sigma$.)
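For completeness, this is how I currently compute that quadratic form numerically, again just a sketch with a synthetic positive-definite matrix standing in for a real covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3.0 * np.eye(3)      # synthetic positive-definite "covariance" matrix
r = np.array([0.5, -1.0, 2.0])

# r^T Sigma^{-1} r, computed by solving Sigma x = r instead of forming the inverse
q = r @ np.linalg.solve(Sigma, r)
print(q)
```

I can compute the number, but I still lack the intuition for what it represents.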