
The following lemma can be found in Hayashi's Econometrics:

Lemma 2.1 (convergence in distribution and in moments): Let $\alpha_{sn}$ be the $s$-th moment of $z_{n}$, and $\lim_{n\to\infty}\alpha_{sn}=\alpha_{s}$ where $\alpha_{s}$ is finite (i.e., a real number). Then:

"$z_{n} \to_{d} z$" $\implies$ "$\alpha_{s}$ is the $s$-th moment of $z$."

Thus, for example, if the variances of a sequence of random variables converging in distribution converge to some finite number, then that number is the variance of the limiting distribution.

As far as I understand, no additional assumptions on $z_{n}$ can be inferred from the context. Now consider the sequence of random variables defined by $z_{n} = n\mathbb{1}_{[0,\frac{1}{n}]}$ on $[0,1]$ equipped with the uniform probability measure.

Then $z_{n} \to_{d} 0$, but $E(z_{n}) = n\cdot\frac{1}{n} = 1$ for all $n$, so $\lim_{n\to\infty}E(z_{n}) = 1 \neq 0 = E(0)$.
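For concreteness, here is a minimal simulation sketch of this sequence (assuming `numpy`; the sample size, seed, and values of $n$ are arbitrary choices):

```python
import numpy as np

# Simulate z_n = n * 1{u <= 1/n} with u ~ Uniform[0, 1]
rng = np.random.default_rng(0)
u = rng.uniform(size=1_000_000)

for n in (10, 100, 10_000):
    z = n * (u <= 1.0 / n)  # equals n on [0, 1/n] and 0 elsewhere
    print(f"n={n:>6}: P(z_n = 0) ~ {np.mean(z == 0):.4f}, E[z_n] ~ {z.mean():.3f}")

# P(z_n = 0) -> 1, so z_n converges in distribution to 0,
# yet the sample mean stays near 1 for every n.
```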

If I am reading the above lemma correctly, $\{z_n\}$ provides a counterexample.

Question: Is the lemma false? Is there a related result that specifies general conditions under which convergence in distribution implies convergence in moments?

  • What does the notation $z_n = n 1_{[0,\frac{1}{n}]}$ mean? Is it $$z_n = \begin{cases} 0 & \text{with probability } \frac{n-1}{n} \\ n & \text{with probability } \frac{1}{n} \end{cases}$$ ? – PaulG Jan 16 '22 at 15:23

2 Answers


A sufficient additional condition is that of uniform integrability, i.e., that $$\lim_{M\to\infty} \sup_n \int_{|X_n|>M}|X_n|\,dP= \lim_{M\to\infty} \sup_n E [|X_n|1_{|X_n|>M}]=0.$$ If, in addition, $X_n \to_d X$, one gets that $X$ is integrable and $\lim_{n\to\infty}E[X_n]=E[X]$.

Heuristically, this condition rules out "extreme" contributions to the integral (expectation) persisting asymptotically.

Now, this is precisely what happens in your counterexample: $z_n$ may take the diverging value $n$, even if only with vanishing probability. Somewhat more precisely, $E[|z_n|1_{\{|z_n|>M\}}]=E[z_n1_{\{z_n>M\}}]=1$ for all $n>M$. Hence $\sup_n E[z_n1_{\{z_n>M\}}]\geq 1$ for every $M$, so the supremum cannot converge to zero as $M\to\infty$ and uniform integrability fails.
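To make this concrete, here is a small exact-computation sketch (the grids of $n$ and $M$ below are arbitrary choices):

```python
# E[z_n 1{z_n > M}] = n * P(z_n = n) = 1 whenever n > M, and 0 otherwise
def tail_expectation(n: int, M: float) -> float:
    return n * (1.0 / n) if n > M else 0.0

for M in (10, 100, 1000):
    sup_over_n = max(tail_expectation(n, M) for n in range(1, 100_000))
    print(f"M={M:>5}: sup_n E[z_n 1(z_n > M)] = {sup_over_n}")

# The supremum equals 1 for every M, so it cannot vanish as M -> infinity:
# uniform integrability fails.
```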

A sufficient condition for uniform integrability is $$\sup_n E[|X_n|^{1+\epsilon}]<\infty$$ for some $\epsilon>0$.

And while failing a sufficient condition is of course no proof of a lack of uniform integrability, it is even more direct to see that this condition is not satisfied here, since $$E[|z_n|^{1+\epsilon}]=n^{1+\epsilon}\cdot\frac{1}{n}=n^\epsilon,$$ which evidently does not have a finite supremum over $n$.
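Again as a sketch, this moment computation can be checked directly (here $\epsilon = 0.5$ and the values of $n$ are arbitrary choices):

```python
# E[|z_n|^(1+eps)] = n^(1+eps) * (1/n) = n**eps, which is unbounded in n
eps = 0.5
for n in (10, 100, 10_000):
    moment = n ** (1 + eps) * (1.0 / n)  # value^(1+eps) times its probability
    print(f"n={n:>6}: E[|z_n|^(1+eps)] = {moment:.1f} (= n**eps = {n ** eps:.1f})")
```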

Christoph Hanck

Indeed, this is a known erratum of the book (see the errata PDF on its website): as stated, the lemma omits the moment-boundedness condition $$\exists\,\delta>0,\ M<\infty:\quad E(|z_n|^{s+\delta}) \le M \;\; \forall n.$$

Alecos Papadopoulos