OK, after doing some more reading, the assumption in the question is wrong: covariance actually does define an inner product, provided (as I understand it) you first remove the mean of each variable. That is, covariance satisfies the following:
- Symmetry: $Cov(X,Y) = Cov(Y,X)$ (conjugate symmetry is trivial here, since the variables are real)
- Linearity in the first argument: $Cov(aX+bY,Z) = a\cdot Cov(X,Z) + b\cdot Cov(Y,Z)$
- Positive semi-definiteness: $Cov(X,X) \geq 0$, with equality only for $X$ constant. (This is where the removal of the mean comes in: $Cov(X,Y) = \langle X-\bar{X},Y-\bar{Y}\rangle$, so that $Var(X) = 0 \iff X_i = \bar{X}\ \forall\ i$ [$X$ is constant, so removing the mean gives the zero vector])
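The three axioms above are easy to sanity-check numerically. Here's a minimal sketch (the helper names `cov`, `mean` and the sample data are mine, just for illustration):

```python
# Numerically sanity-check the inner-product axioms for mean-removed covariance.
import random

def mean(v):
    return sum(v) / len(v)

def cov(x, y):
    # covariance with the mean of each variable removed, as in the axioms above
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)

random.seed(0)
n = 1000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [random.gauss(0, 1) for _ in range(n)]
Z = [random.gauss(0, 1) for _ in range(n)]
a, b = 2.0, -3.0

# symmetry: Cov(X, Y) = Cov(Y, X)
assert abs(cov(X, Y) - cov(Y, X)) < 1e-12
# linearity in the first argument: Cov(aX + bY, Z) = a Cov(X, Z) + b Cov(Y, Z)
lhs = cov([a * xi + b * yi for xi, yi in zip(X, Y)], Z)
rhs = a * cov(X, Z) + b * cov(Y, Z)
assert abs(lhs - rhs) < 1e-9
# positive semi-definiteness: Cov(X, X) = Var(X) >= 0
assert cov(X, X) >= 0
```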
This inner product induces a norm, again with the mean of the random variable removed: $\|X-\bar{X}\| = \sqrt{Cov(X,X)} = \sqrt{Var(X)}$, i.e. the standard deviation of $X$.
And the norm in turn induces a metric: $d(X,Y) = \|(X-\bar{X})-(Y-\bar{Y})\| = \sqrt{Var(X-Y)}$. (Strictly speaking this is only a pseudometric on random variables, since $d(X,Y) = 0$ whenever $X$ and $Y$ differ by a constant; it's a genuine metric on the mean-centred variables.)
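A quick numerical check of the induced norm and metric (again, `norm`, `dist` and the data are illustrative names I've made up, not standard APIs):

```python
# Sketch: the induced norm is the standard deviation, and
# d(X, Y) = sqrt(Var(X - Y)) behaves like a (pseudo)metric.
import math
import random

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((a - m) ** 2 for a in v) / len(v)

def norm(x):
    # induced norm: ||X - mean(X)|| = sqrt(Var(X)), the standard deviation
    return math.sqrt(var(x))

def dist(x, y):
    # induced metric: d(X, Y) = sqrt(Var(X - Y))
    return norm([a - b for a, b in zip(x, y)])

random.seed(1)
n = 500
X = [random.gauss(0, 1) for _ in range(n)]
Y = [random.gauss(0, 1) for _ in range(n)]
Z = [random.gauss(0, 1) for _ in range(n)]

assert dist(X, Y) == dist(Y, X)                      # symmetry
assert dist(X, X) == 0                               # d(X, X) = 0
assert dist(X, Z) <= dist(X, Y) + dist(Y, Z) + 1e-9  # triangle inequality
# d(X, Y) = 0 whenever X and Y differ by a constant -> only a pseudometric
assert dist(X, [a + 5 for a in X]) < 1e-9
```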
So in a sense, covariance does define a metric, just not in the way I first thought. It's been a whole 6 months since I studied any linear algebra, so I might have some of those slightly screwed up. Please correct me if I'm wrong!
The same applies to distance covariance, as is pretty clear from the Wikipedia article, although it's a different inner product.
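For concreteness, here's a sketch of the (biased) sample distance covariance as described on Wikipedia: each variable's pairwise-distance matrix is double-centred, and the statistic is the average of their elementwise product, which is exactly an inner product of the centred matrices. (Function names `_dcenter` and `dcov2` are my own.)

```python
# Minimal sketch of the (biased) sample squared distance covariance.
import random

def _dcenter(x):
    # double-centre the pairwise-distance matrix of x:
    # subtract row and column means, add back the grand mean
    n = len(x)
    M = [[abs(a - b) for b in x] for a in x]
    row = [sum(r) / n for r in M]
    col = [sum(M[i][j] for i in range(n)) / n for j in range(n)]
    grand = sum(row) / n
    return [[M[i][j] - row[i] - col[j] + grand for j in range(n)] for i in range(n)]

def dcov2(x, y):
    # squared sample distance covariance: the "inner product" (average
    # elementwise product) of the two double-centred distance matrices
    n = len(x)
    A, B = _dcenter(x), _dcenter(y)
    return sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n ** 2

random.seed(2)
X = [random.gauss(0, 1) for _ in range(200)]
Y = [xi ** 2 for xi in X]  # dependent on X, though uncorrelated with it

assert dcov2(X, Y) == dcov2(Y, X)  # symmetric, like an inner product
assert dcov2(X, X) > 0             # "norm" of a non-constant variable is positive
```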