Say I have a collection of $n$ random variables $X_i$ that don't necessarily have any special properties like independence or identical distribution.
Is it true in general that $\text{Var}\left(\sum_{i=1}^n a_iX_i\right)=\sum_{i=1}^n\sum_{j=1}^n a_ia_j\text{Cov}\left(X_i,X_j\right)$, where the $a_i$ are constants?
I got to that equation by looking at the formula $\text{Var}\left(aX+bY\right)=a^2\text{Var}\left(X\right)+b^2\text{Var}\left(Y\right)+2ab\,\text{Cov}\left(X,Y\right)$, viewing variance as a special case of covariance, and thinking about adding successively more variables.
If so, it looks like I could calculate the variance of a sum of random variables by adding up all the entries of their variance-covariance matrix (weighted by the $a_ia_j$), which would be interesting, since the linear combination itself is just a one-dimensional quantity.
I think it clearly holds in the two-variable case, but I couldn't find any discussion of the general $n$-variable case.
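To convince myself, I tried a quick numerical check (a sketch in plain Python with made-up, deliberately dependent data; the variable names and distributions are arbitrary). Since sample covariance is bilinear just like the population version, the identity should hold exactly, up to floating point, for any dataset:

```python
import random

random.seed(0)
n_vars, n_obs = 3, 1000
a = [2.0, -1.0, 0.5]  # arbitrary constants a_i

# Dependent, non-identically distributed variables (no special properties assumed)
base = [random.gauss(0, 1) for _ in range(n_obs)]
data = [
    [b + random.gauss(0, 0.5) for b in base],         # X_1
    [2 * b + random.gauss(5, 2) for b in base],       # X_2, correlated with X_1
    [random.expovariate(1.0) for _ in range(n_obs)],  # X_3, independent of the others
]

def cov(xs, ys):
    """Sample covariance; cov(x, x) is the sample variance."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

# Left side: variance of the linear combination sum_i a_i X_i
combo = [sum(a[i] * data[i][k] for i in range(n_vars)) for k in range(n_obs)]
lhs = cov(combo, combo)

# Right side: double sum a_i a_j Cov(X_i, X_j) over the covariance matrix
rhs = sum(a[i] * a[j] * cov(data[i], data[j])
          for i in range(n_vars) for j in range(n_vars))

print(abs(lhs - rhs) < 1e-9 * abs(lhs))  # the two sides agree
```

The two sides agree to floating-point precision, which at least supports the conjecture empirically.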