
Simple question, yet surprisingly difficult to find an answer online.

I know that for a random variable $X$, we define the $k$-th moment as $$\int X^k \, dP = \int x^k f(x) \, dx,$$ where the equality holds when the law of $X$ can be written as $P_X = f \cdot m$, for a density $f$ and Lebesgue measure $m$.
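(For concreteness, here is a quick numerical sketch of that identity for $k = 3$ and an Exponential(1) variable, whose $k$-th moment is $k!$; the distribution and sample size are just for illustration.)

```python
# Sketch: the k-th moment of X ~ Exponential(1), computed two ways.
# Both numbers should be close to k! (here k = 3, so 6).
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
k = 3

# 1) "Integral over the sample space": average X^k over many draws of X.
x = rng.exponential(scale=1.0, size=1_000_000)
mc_moment = np.mean(x**k)

# 2) "Integral against the density": integrate x^k * f(x) over [0, inf).
density = lambda t: np.exp(-t)   # density of Exponential(1)
int_moment, _ = quad(lambda t: t**k * density(t), 0, np.inf)

print(mc_moment, int_moment)     # both approximately 6
```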

So, what is the $k$-th moment of, say, $(X,Y)$? $\int (X,Y) \, dP$ doesn't really seem like the answer to me...

Silverfish
Charac

2 Answers


There isn't a single "the" moment here, since there are many of them; moments of bivariate random variables are indexed by two indices, not one.

So rather than the $k$-th moment $\mu_k$, you have the $(j,k)$-th moments $\mu_{j,k}$ (sometimes written $\mu_{jk}$ when that's not ambiguous). We might speak of $\mu_{1,1}$, the $(1,1)$ moment, or $\mu_{1,2}$, the $(1,2)$ moment, or $\mu_{2,2}$, and so on.

These are sometimes called mixed moments.

So generalizing your one-dimensional continuous example,

$$\mu_{j,k} = \iint x^j y^k f(x,y) \, dx \, dy$$

This generalizes to higher dimensions.
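For example, here is a small simulation sketch of the $(1,1)$ mixed moment of a bivariate normal (NumPy, with parameters chosen purely for illustration); in that case $\mu_{1,1} = E[XY] = \rho\sigma_x\sigma_y + \mu_x\mu_y$, which the sample average reproduces:

```python
# Sketch: the (1,1) mixed moment E[XY] of a bivariate normal, two ways.
# The parameters below are made up; the analytic value is rho*sx*sy + mx*my.
import numpy as np

rng = np.random.default_rng(0)
mx, my, sx, sy, rho = 1.0, -2.0, 1.5, 0.5, 0.6

mean = [mx, my]
cov = [[sx**2, rho * sx * sy],
       [rho * sx * sy, sy**2]]

xy = rng.multivariate_normal(mean, cov, size=1_000_000)
mu_11_mc = np.mean(xy[:, 0] * xy[:, 1])    # sample average of x^1 * y^1
mu_11_exact = rho * sx * sy + mx * my      # Cov(X,Y) + E[X]E[Y]

print(mu_11_mc, mu_11_exact)               # both approximately -1.55
```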

Glen_b

As @Glen_b♦ has mentioned, the moment generalizes to the cross-moment in higher dimensions (related concepts: the joint moment generating function, the joint characteristic function and cumulants).

That said, to me this definition doesn't feel fully equivalent to the univariate moment, because each cross-moment evaluates to a real number, whereas for, say, a multivariate normal vector the mean is a vector and the variance is a matrix. I speculate that one might instead define higher-dimensional "moments" using derivatives of the joint characteristic function $\varphi_\mathbf{X}(\mathbf{t})=E[e^{i\mathbf{t}'\mathbf{X}}]$, where the $k$-th derivative is generalized as a rank-$k$ tensor (so the second-order derivative would be a Hessian matrix).
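To make that speculation a bit more concrete, here is a rough sketch of the idea (my own illustration, using NumPy, finite differences and made-up parameters): the gradient and Hessian of an empirical characteristic function at $\mathbf{t}=\mathbf{0}$ recover the mean vector $E[\mathbf{X}]$ and the second-moment matrix $E[\mathbf{X}\mathbf{X}']$, since $\nabla\varphi(\mathbf{0}) = iE[\mathbf{X}]$ and $\nabla^2\varphi(\mathbf{0}) = -E[\mathbf{X}\mathbf{X}']$.

```python
# Sketch: recover the mean vector and second-moment matrix of a random vector X
# from numerical derivatives of its *empirical* characteristic function
# phi(t) = mean(exp(i t'X)) at t = 0.  All parameters below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
mean = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.3],
                [0.3, 0.5]])
X = rng.multivariate_normal(mean, cov, size=200_000)   # samples, shape (n, d)
d = X.shape[1]

def phi(t):
    """Empirical characteristic function at a length-d vector t."""
    return np.mean(np.exp(1j * X @ t))

h = 1e-3
e = np.eye(d)

# Central-difference gradient: grad phi(0) = i * E[X]
grad = np.array([(phi(h * e[j]) - phi(-h * e[j])) / (2 * h) for j in range(d)])
mean_hat = (grad / 1j).real

# Central-difference Hessian: Hess phi(0) = -E[X X']
hess = np.empty((d, d), dtype=complex)
for j in range(d):
    for k in range(d):
        hess[j, k] = (phi(h * (e[j] + e[k])) - phi(h * (e[j] - e[k]))
                      - phi(h * (e[k] - e[j])) + phi(-h * (e[j] + e[k]))) / (4 * h**2)
second_moment_hat = (-hess).real

print(mean_hat)            # close to the sample mean of X
print(second_moment_hat)   # close to cov + outer(mean, mean)
```

(Here the derivative tensors stop at rank 2; higher mixed moments would come from higher-order derivative tensors of $\varphi_\mathbf{X}$ in the same way.)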

There are many other interesting related topics, such as Measures of Multivariate Skewness and Kurtosis with Applications.
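For instance, here is a hedged sketch of Mardia-style sample skewness and kurtosis measures; this is my reading of the usual definitions of the quantities studied in that line of work, not code from the paper itself:

```python
# Sketch of Mardia-style multivariate sample skewness (b1p) and kurtosis (b2p).
import numpy as np

def mardia(X):
    """Return (b1p, b2p) for an (n, p) data matrix X.

    b1p averages the cubed cross products g_ij = (x_i - xbar)' S^{-1} (x_j - xbar);
    b2p averages the squared Mahalanobis distances g_ii.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)            # centred data
    S = Xc.T @ Xc / n                  # covariance with divisor n
    G = Xc @ np.linalg.inv(S) @ Xc.T   # matrix of cross products g_ij
    b1p = np.mean(G**3)                # skewness: average of g_ij^3
    b2p = np.mean(np.diag(G)**2)       # kurtosis: average of g_ii^2
    return b1p, b2p

# For a multivariate normal sample, b1p should be near 0 and b2p near p*(p+2).
rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0, 0], np.eye(3), size=2_000)
print(mardia(X))   # roughly (0, 15) for p = 3
```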

Xi'an
Francis