
I am trying to understand the link between the moment-generating function and characteristic function. The moment-generating function is defined as: $$ M_X(t) = E(\exp(tX)) = 1 + \frac{t E(X)}{1} + \frac{t^2 E(X^2)}{2!} + \dots + \frac{t^n E(X^n)}{n!} $$

Using the series expansion $\exp(tX) = \sum_{n=0}^{\infty} \frac{t^n X^n}{n!}$, I can find all the moments of the distribution of the random variable $X$.

The characteristic function is defined as: $$ \varphi_X(t) = E(\exp(itX)) = 1 + \frac{it E(X)}{1} - \frac{t^2 E(X^2)}{2!} + \ldots + \frac{(it)^n E(X^n)}{n!} $$

I don't fully understand what additional information the imaginary number $i$ gives me. I see that $i^2 = -1$, so the characteristic function doesn't have only $+$ signs, but why do we need to subtract moments in the characteristic function? What's the mathematical idea behind this?
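To see concretely that the two expansions carry the same moment information, just rotated by powers of $i$, here is a minimal numerical sketch. It assumes, purely for illustration, $X \sim \text{Bernoulli}(1/2)$, so that $E(X^n) = 1/2$ for all $n \geq 1$:

```python
import cmath

# Illustrative assumption: X ~ Bernoulli(1/2), so E[X^n] = 1/2 for n >= 1.
moments = [1.0] + [0.5] * 19  # E[X^0], E[X^1], ..., E[X^19]

def mgf_series(t, moments):
    # M_X(t) = sum_n t^n E[X^n] / n!
    total, fact = 0.0, 1.0
    for n, m in enumerate(moments):
        total += (t ** n) * m / fact
        fact *= n + 1  # fact becomes (n+1)! for the next term
    return total

def cf_series(t, moments):
    # phi_X(t) = sum_n (i t)^n E[X^n] / n! -- the very same moments,
    # but each term is rotated by i^n = 1, i, -1, -i, ...
    total, fact = 0j, 1.0
    for n, m in enumerate(moments):
        total += ((1j * t) ** n) * m / fact
        fact *= n + 1
    return total

t = 0.7
# Closed forms for Bernoulli(1/2): M(t) = (1 + e^t)/2, phi(t) = (1 + e^{it})/2
print(abs(mgf_series(t, moments) - 0.5 * (1 + cmath.exp(t).real)))  # tiny truncation error
print(abs(cf_series(t, moments) - 0.5 * (1 + cmath.exp(1j * t))))   # tiny truncation error
```

The minus signs in the characteristic function are just $i^2 = -1$, $i^4 = +1$, and so on: each moment is multiplied by a point on the unit circle rather than by $1$.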

Ferdi
Giuseppe
  • One important point is that the moment-generating function is not always finite! (See [this question](http://stats.stackexchange.com/q/32706/2970), for example.) If you want to build a general theory, say, about convergence in distribution, you'd like it to work with as many objects as possible. The characteristic function is, of course, finite for any random variable since $|\exp(itX)| \leq 1$. – cardinal Nov 22 '12 at 15:59
  • The similarities in the Taylor expansions still allow one to read off the moments, when they exist, but note that not all distributions have moments, so the interest in these functions goes far beyond this! :) – cardinal Nov 22 '12 at 15:59
  • Another point to note is that the MGF is the Laplace transform of a random variable and the CF is the Fourier transform. There are fundamental relationships between these integral transforms; see [here](http://en.wikipedia.org/wiki/Laplace_transform#Relationship_to_other_transforms). – tchakravarty Nov 22 '12 at 17:52
  • I thought the CF is the inverse Fourier transform (and not the Fourier transform) of a probability distribution? – Giuseppe Nov 23 '12 at 10:04
  • The distinction is only a matter of sign in the exponent, and possibly a multiplicative constant. – Glen_b Mar 13 '13 at 14:28
  • @tchakravarty +1 This is exactly what I was looking for: the difference between the MGF, the Laplace transform, and the Fourier transform! – GENIVI-LEARNER Feb 14 '20 at 18:30

1 Answer


As mentioned in the comments, characteristic functions always exist, because they only require integrating a function of modulus $1$. The moment-generating function, on the other hand, need not exist, since in particular its existence requires moments of every order.

When $e^{tX}$ is integrable for all real $t$, we can define $g(z):=E[e^{zX}]$ for every complex number $z$. Then we notice that $M_X(t)=g(t)$ and $\varphi_X(t)=g(it)$.
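A quick numerical illustration of this point (an assumed example, not part of the argument above): for $X \sim N(0,1)$ one has $E[e^{zX}] = e^{z^2/2}$ for every complex $z$, so a single function $g$ yields both transforms:

```python
import cmath

# Assumed example: X ~ N(0, 1), for which E[exp(zX)] = exp(z^2 / 2).
def g(z):
    return cmath.exp(z * z / 2)

t = 1.3
M = g(t)         # MGF at t:  exp(t^2 / 2), real and >= 1
phi = g(1j * t)  # CF at t:   exp(-t^2 / 2), real and in (0, 1]
print(M.real, phi.real)
```

Evaluating the same analytic function on the real axis gives $M_X$; evaluating it on the imaginary axis gives $\varphi_X$.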

Davide Giraudo