
Does the average $\frac{\sum_{i=1}^n X_i}{n}$ converge to a normal distribution as $n \to \infty$? Here the $X_i$ are independently distributed Laplace samples with zero mean and different standard deviations $\sigma_i$.

I know the general Central Limit Theorem could be applied here. However, I am not sure whether the Laplace distribution satisfies the Lyapunov condition.

Phong Le
  • The Lyapunov condition seems to hold (do you want the calculations?), but the Lyapunov CLT says something about $\sum\frac{X_i}{\sigma_i}$, not $\sum\frac{X_i}{n}$. The first will converge to a standard normal, but the second may not (if $(\sigma_i)_i$ grows fast enough). Typo, or intended? If the latter, the Lyapunov CLT may not be what you are looking for. – Stephan Kolassa Jul 28 '20 at 08:28
  • Update: it looks like the Lyapunov condition does not hold in general, either, at least not for all possible sequences $(\sigma_i^2)$. – Stephan Kolassa Jul 28 '20 at 09:50
  • Incidentally, [*The Laplace Distribution and Generalizations* by Kotz, Kozubowski & Podgórski](https://link.springer.com/book/10.1007/978-1-4612-0173-1) looks helpful. – Stephan Kolassa Jul 29 '20 at 07:18

2 Answers


TL;DR

You cannot use either the Lyapunov or the Lindeberg CLT to say anything about the convergence in distribution of $\frac{1}{s_n}\sum_{i=1}^n X_i$ (where $s_n^2=\sum_{i=1}^n\sigma_i^2$) without additional conditions on the sequence of variances $(\sigma_i^2)$.

Neither CLT would say anything about $\frac{1}{n}\sum_{i=1}^n X_i$. If the sequence of variances $(\sigma_i^2)$ grows fast enough, I strongly doubt that this average converges to anything reasonable.
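
To illustrate that last point, here is a minimal simulation sketch (my own illustrative choice of a fast-growing scale sequence, $b_i = 2^i$, using NumPy); the running averages are dominated by the newest, huge-scale term and keep jumping around rather than settling down:

```python
# Sketch: running averages (1/n) * sum(X_i) for X_i ~ Laplace(0, b_i) with b_i = 2^i.
# This only illustrates the "variances grow fast" case; it is not a proof.
import numpy as np

rng = np.random.default_rng(0)

n_max = 30
b = 2.0 ** np.arange(1, n_max + 1)           # scales grow geometrically

for path in range(3):                        # three independent sample paths
    x = rng.laplace(loc=0.0, scale=b)        # one draw per scale b_i
    running_avg = np.cumsum(x) / np.arange(1, n_max + 1)
    print(np.round(running_avg[-5:], 1))     # the last few running averages keep jumping
```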


Assume that $X_i\sim\text{Laplace}(0,b_i)$ for a parameter $b_i>0$. Then $\sigma_i^2=2b_i^2$. As above, let

$$ s_n^2=\sum_{i=1}^n\sigma_i^2=2\sum_{i=1}^n b_i^2.$$

The key property we need is that $|X_i|\sim\text{Exp}\big(\frac{1}{b_i}\big)$. This allows us to easily calculate the expectations we need in the Lyapunov or Lindeberg CLTs.
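
As a quick sanity check of this property (a Monte Carlo sketch, not part of the argument), the first few absolute moments of a simulated Laplace sample should match the exponential moments $b^k\,k!$:

```python
# Monte Carlo check that |X| ~ Exp(1/b) when X ~ Laplace(0, b):
# the absolute moments E|X|^k should equal b^k * k!.
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
b = 1.5
x = np.abs(rng.laplace(loc=0.0, scale=b, size=1_000_000))

for k in (1, 2, 3):
    print(k, round(float(np.mean(x ** k)), 3), b ** k * factorial(k))
```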


The condition for the Lyapunov CLT is that there is some $\delta>0$ such that

$$\lim_{n\to\infty}\frac{1}{s_n^{2+\delta}} \sum_{i=1}^nE\big(|X_i|^{2+\delta}\big) =0. $$

We have

$$ E\big(|X_i|^{2+\delta}\big) = b_i^{2+\delta}\Gamma(\delta+3), $$

so

$$ \frac{1}{s_n^{2+\delta}} \sum_{i=1}^nE\big(|X_i|^{2+\delta}\big) = \frac{\sum_{i=1}^n b_i^{2+\delta}\Gamma(\delta+3)}{\big(\sum_{i=1}^n 2b_i^2\big)^\frac{2+\delta}{2}} = \frac{\Gamma(\delta+3)}{2^\frac{2+\delta}{2}}\frac{\sum_{i=1}^n b_i^{2+\delta}}{\big(\sum_{i=1}^n b_i^2\big)^\frac{2+\delta}{2}}. $$

So the condition is that there is some $\delta>0$ such that

$$\frac{\sum_{i=1}^n b_i^{2+\delta}}{\big(\sum_{i=1}^n b_i^2\big)^\frac{2+\delta}{2}} \to 0. $$

However, this does not hold in general. Consider $b_i=\frac{1}{i}$. Recall that $\sum_{i=1}^\infty\frac{1}{i^2}=\frac{\pi^2}{6}$. So the denominator in the fraction goes to $\frac{\pi^{2+\delta}}{\sqrt{6}^{2+\delta}}$, whereas the numerator converges to $\sum_{i=1}^\infty i^{-(2+\delta)}=\zeta(2+\delta)$, another finite but nonzero limit. So the condition that the fraction goes to zero does not hold for this choice of $(b_i)$.
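
As a numerical illustration of this ratio (a sketch with the illustrative choice $\delta = 1$): for a constant sequence $b_i \equiv 1$ it vanishes like $1/\sqrt{n}$, whereas for $b_i = 1/i$ it levels off near $\zeta(3)/(\pi^2/6)^{3/2} \approx 0.57$:

```python
# Numerically evaluate sum(b_i^(2+d)) / (sum(b_i^2))^((2+d)/2) for delta = 1.
import numpy as np

def lyapunov_ratio(b, delta=1.0):
    num = np.cumsum(b ** (2 + delta))
    den = np.cumsum(b ** 2) ** ((2 + delta) / 2)
    return num / den

n = 100_000
i = np.arange(1, n + 1, dtype=float)
checkpoints = [99, 9_999, 99_999]                              # n = 100, 10_000, 100_000

print(np.round(lyapunov_ratio(np.ones(n))[checkpoints], 4))    # -> 0, like 1/sqrt(n)
print(np.round(lyapunov_ratio(1.0 / i)[checkpoints], 4))       # levels off around 0.57
```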


The condition for the Lindeberg CLT is that for all $\epsilon>0$,

$$ \lim_{n\to\infty}\frac{1}{s_n^2}\sum_{i=1}^nE\big(X_i^21_{|X_i|>\epsilon s_n}\big) = 0. $$

The expectation here is just a moment of a left-truncated exponential distribution. We have

$$ E\big(X_i^21_{|X_i|>k}\big) = \int_k^\infty \frac{x^2}{b_i}e^{-\frac{x}{b_i}}\,dx = e^{-\frac{k}{b_i}}(2b_i^2+2b_ik+k^2). $$

So the Lindeberg condition is that

$$ \frac{1}{s_n^2}\sum_{i=1}^nE\big(X_i^21_{|X_i|>\epsilon s_n}\big) = \sum_{i=1}^n e^{-\frac{\epsilon s_n}{b_i}}\frac{2b_i^2+2b_i\epsilon s_n+\epsilon^2 s_n^2}{s_n^2} \to 0. $$

But that again does not hold in general: consider any sequence $(b_i)$ for which the cumulative variances $s_n^2$ stay bounded (e.g. $b_i=\frac{1}{i}$ again). Then $s_n\to s_\infty<\infty$, and already the $i=1$ term of the sum converges to a strictly positive constant, so the sum cannot tend to zero.
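
And a numerical illustration of the Lindeberg sum for the bounded case (a sketch using $b_i = 1/i$ and the illustrative choice $\epsilon = 0.1$), showing that it stabilizes at a strictly positive value instead of tending to zero:

```python
# Evaluate the Lindeberg sum for b_i = 1/i, where s_n stays bounded.
import numpy as np

def lindeberg_sum(b, eps):
    s2 = 2.0 * np.sum(b ** 2)        # s_n^2 = 2 * sum(b_i^2)
    s = np.sqrt(s2)
    terms = np.exp(-eps * s / b) * (2 * b ** 2 + 2 * b * eps * s + (eps * s) ** 2) / s2
    return terms.sum()

for n in (10, 100, 1_000, 10_000):
    b = 1.0 / np.arange(1, n + 1)
    print(n, round(lindeberg_sum(b, eps=0.1), 4))   # stabilizes at a nonzero value
```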

Stephan Kolassa

The other answer by Stephan Kolassa gives an excellent analysis of the Lyapunov condition in this case. However, I think it is also fruitful to look at this problem using moment generating functions. In your problem you have independent random variables $X_i \sim \text{Laplace}(0, \sigma_i/\sqrt{2})$, so the moment generating function of each, evaluated at $t/n$, is:

$$\begin{align} \varphi_{i}(t/n) \equiv \mathbb{E}(\exp(tX_i/n)) = \frac{1}{1 - \sigma_i^2 t^2/2n^2} &= 1 + \frac{\sigma_i^{2}}{2} \cdot \frac{t^2}{n^2} + \mathcal{O}(n^{-4}). \\[6pt] \end{align}$$

Letting $\bar{X}_n \equiv \sum_{i=1}^n X_i/n$ denote the sample mean of interest, this random variable has moment generating function:

$$\begin{align} \varphi_{\bar{X}_n}(t) = \prod_{i=1}^n \varphi_{i}(t/n) &= \prod_{i=1}^n \frac{1}{1 - \sigma_i^2 t^2/2n^2}. \\[6pt] \end{align}$$

For large $n$, substituting the expansion of each factor gives the asymptotic form:

$$\begin{align} \varphi_{\bar{X}_n}(t) &\approx \prod_{i=1}^n \Bigg( 1 + \frac{\sigma_i^{2}}{2} \cdot \frac{t^{2}}{n^{2}} \Bigg). \\[6pt] \end{align}$$

In the special case where $\sigma_1 = \sigma_2 = \sigma_3 = \cdots$ this function converges to an exponential function in $t^2$, which is the moment generating function of a normal distribution. In the more general case, the moment generating function need not converge to an exponential function in $t^2$, and so the distribution of the sample mean need not converge to a normal distribution.
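
To see this contrast numerically, one can evaluate the exact product directly (a sketch; the scale sequences $\sigma_i \equiv 1$ and $\sigma_i = i$ are my own illustrative choices):

```python
# Evaluate the exact MGF of the sample mean, prod_i 1 / (1 - sigma_i^2 t^2 / (2 n^2)),
# at a fixed t for equal scales versus growing scales.
import numpy as np

def mgf_mean(sigma, t):
    n = len(sigma)
    return float(np.prod(1.0 / (1.0 - sigma ** 2 * t ** 2 / (2.0 * n ** 2))))

t = 1.0
for n in (10, 100, 1_000):
    equal   = mgf_mean(np.ones(n), t)                 # sigma_i = 1
    growing = mgf_mean(np.arange(1.0, n + 1), t)      # sigma_i = i
    print(n, round(equal, 4), f"{growing:.3g}")
# With equal scales the MGF tends to 1 (the MGF of the degenerate limit at zero);
# with sigma_i = i it blows up, so there is no fixed limiting MGF.
```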

If you would like to go further than this, I suggest you look into conditions on the $\sigma_i$ values that will allow you to get a useful convergence result for the above asymptotic form. It may be possible to simplify this asymptotic form under some conditions on these values, but I will leave this to you to investigate.

Ben
  • (+1). Ben, when I calculate the scaled MGF, I got $1/(1 \color{red}{-} \frac{s^{2}t^{2}}{2n^{2}})$ (note the minus sign in the denominator) but I could have made an error. – COOLSerdash Jul 29 '20 at 06:32
  • You're right --- corrected. – Ben Jul 29 '20 at 09:54