
Let $\{X_k\}$ be a sequence of dependent random variables with mean 0. Define $\bar{Y}_k = \frac{1}{\sqrt k}\sum_{i=1}^k X_i$.

Let $\{W_k\}$ be a sequence of i.i.d. random variables, independent of $\{X_k\}$, with mean 1 and variance 1. Define $\bar{Z}_k = \frac{1}{\sqrt k}\sum_{i=1}^k W_i X_i$.

It is known that $\bar{Y}_k \implies \mathcal{N}(0,V)$, where $\implies$ denotes convergence in distribution, and $V$ is some covariance matrix.

Can we also say that $\bar{Z}_k$ converges to the same distribution? How would one prove this rigorously? And does this imply that $\bar{Y}_k$ and $\bar{Z}_k$ are asymptotically equivalent, in some sense?

The main problem I see here is the dependence in the sequence $\{X_k\}$. If they were i.i.d., the variance of the sum would be the sum of the variances, and the proof would be straightforward.

user3294195

1 Answer


In general, no.

For example, suppose the $A_k$ are i.i.d. $N(0,1)$, the $B_k$ are i.i.d. $N(0,1)$ independent of the $A_k$, and $X_{2k-1}=A_k+B_k$, $X_{2k}= A_k-B_k$. Then the $B_k$ all cancel in the partial sums, and $$\bar Y_{2k}= \frac{2}{\sqrt{2k}}\sum_{j=1}^k A_j\sim N(0, 2).$$

Now suppose $W_k$ is binary, taking the values 0 and 2 with equal probability (so it has mean 1 and variance 1). Within each pair, $B_k$ cancels only when its two multipliers happen to be equal, which occurs with probability 1/2; about half of the $B_k$ survive with coefficient $\pm 2$. The variance of $\bar Z_{2k}$ works out to 4, larger than that of $\bar Y_{2k}$.
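A quick Monte Carlo check of this counterexample (a sketch using NumPy; the number of pairs and replications are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
k, reps = 250, 4000           # k pairs -> n = 2k observations per replication

A = rng.standard_normal((reps, k))
B = rng.standard_normal((reps, k))
X = np.empty((reps, 2 * k))
X[:, 0::2] = A + B            # first member of each pair
X[:, 1::2] = A - B            # second member: the B's cancel in the plain sum
W = 2.0 * rng.integers(0, 2, size=(reps, 2 * k))  # 0 or 2: mean 1, variance 1

ybar = X.sum(axis=1) / np.sqrt(2 * k)
zbar = (W * X).sum(axis=1) / np.sqrt(2 * k)
print(np.var(ybar), np.var(zbar))  # roughly 2 and 4
```

The empirical variance of $\bar Y_{2k}$ comes out near 2 and that of $\bar Z_{2k}$ near 4, so the two limits differ.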

Update: more generally, since the $W_i$ have mean 1 and variance 1 and are independent of everything else, $\mathrm{cov}(W_iX_i,W_jX_j)=\mathrm{cov}(X_i,X_j)$ for $i\neq j$, while $\mathrm{var}[W_iX_i]=E[W_i^2]\,\mathrm{var}[X_i]=2\,\mathrm{var}[X_i]$, so $$\mathrm{var}[\bar Z_k] = \mathrm{var}[\bar Y_k]+\frac{1}{k}\sum_{i=1}^k \mathrm{var}[X_i].$$ The multipliers add the average variance of the $X_i$ on top of $\mathrm{var}[\bar Y_k]$, so the two limits cannot agree unless $\frac{1}{k}\sum_{i=1}^k \mathrm{var}[X_i]\to 0$.
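With mean-1, variance-1 multipliers, $\mathrm{var}[W_iX_i]=2\,\mathrm{var}[X_i]$ and $\mathrm{cov}(W_iX_i,W_jX_j)=\mathrm{cov}(X_i,X_j)$, so $\mathrm{var}[\bar Z_k]-\mathrm{var}[\bar Y_k]=\frac{1}{k}\sum_{i=1}^k\mathrm{var}[X_i]$ exactly, whatever the dependence. A numerical check of this gap, using a hypothetical stationary AR(1) sequence for the $X_i$ (the value of $\rho$ and the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, rho = 400, 4000, 0.5

# Stationary AR(1): X_t = rho * X_{t-1} + eps_t, with var[X_t] = 1/(1 - rho**2)
eps = rng.standard_normal((reps, n))
X = np.empty((reps, n))
X[:, 0] = eps[:, 0] / np.sqrt(1 - rho**2)   # draw X_0 from the stationary law
for t in range(1, n):
    X[:, t] = rho * X[:, t - 1] + eps[:, t]

W = 2.0 * rng.integers(0, 2, size=(reps, n))  # 0 or 2: mean 1, variance 1

ybar = X.sum(axis=1) / np.sqrt(n)
zbar = (W * X).sum(axis=1) / np.sqrt(n)
# The gap should be close to var[X_t] = 1/(1 - rho**2) = 4/3
print(np.var(zbar) - np.var(ybar))
```

The empirical gap sits near $1/(1-\rho^2)=4/3$, matching the identity rather than $\mathrm{var}[\bar Y_k]$ itself.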

Independence (or at least exchangeability) actually matters for these multiplier results; it's not just a technical convenience.

Thomas Lumley