Follow-up question to this one, which was answered negatively by Thomas Lumley. We reprint it here for convenience.
In this SE question, it is stated that there is a central limit theorem for the sample median, namely
$$ \sqrt{n}(Y_n - m) \xrightarrow{d} N(0, [2f(m)]^{-2}), $$
as $n\to\infty$ where
- $Y_n$ is the sample median from $n$ iid samples,
- $m$ is the population median,
- $f$ is the PDF (assumed to exist) of the distribution we're sampling from.
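To make the statement concrete, here is a small Monte Carlo sketch (my own illustration, not from the linked answer) checking the asymptotic variance for a standard normal, where $m = 0$, $f(0) = 1/\sqrt{2\pi}$, and hence $[2f(0)]^{-2} = \pi/2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1001, 5000

# Sample medians of n iid N(0,1) draws, rescaled as in the CLT statement.
medians = np.median(rng.standard_normal((reps, n)), axis=1)
scaled = np.sqrt(n) * medians

# Asymptotic variance [2 f(0)]^{-2} with f(0) = 1/sqrt(2*pi).
asymptotic_var = np.pi / 2

print(scaled.var(), asymptotic_var)  # should be close for large n
```

The empirical variance of $\sqrt{n}\,Y_n$ lands near $\pi/2 \approx 1.571$, consistent with the theorem (for the normal this is uncontroversial; the question below is about heavier tails).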
If I'm not mistaken, this result holds even if the original distribution doesn't have finite variance (e.g. the Cauchy distribution).
Is it necessarily true that the variances converge as well? That is, does
$$ nE[(Y_n-m)^2] \to [2f(m)]^{-2}? $$
As shown in the linked answer, the answer is no. Let $F$ be the CDF of the distribution. First, note that the PDF of the sample median of $n = 2k+1$ iid samples is
$$ f_{Y_{2k+1}}(x) = (2k+1) \binom{2k}{k} F(x)^k (1-F(x))^k f(x). $$
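As a sanity check on this density formula, one can verify numerically that it integrates to 1. A sketch using SciPy, taking the underlying distribution to be standard normal (my choice for illustration) and $k = 3$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import comb
from scipy.stats import norm

k = 3  # sample size n = 2k + 1 = 7

def median_pdf(x):
    # Density of the sample median: (2k+1) * C(2k, k) * F^k * (1-F)^k * f
    F = norm.cdf(x)
    return (2 * k + 1) * comb(2 * k, k) * F**k * (1 - F) ** k * norm.pdf(x)

total, _ = quad(median_pdf, -np.inf, np.inf)
print(total)  # ≈ 1.0, as a density must be
```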
Thomas Lumley's counterexample is a distribution whose CDF $F$ looks like the following.
$$ F(x) \overset{x\to -\infty}{\sim} \frac{1}{\log |x|} \\ 1-F(x) \overset{x\to \infty}{\sim} \frac{1}{\log |x|}. $$
In this case, the second moment of $Y_{2k+1}$ is
$$ \begin{align*} E[Y_{2k+1}^2] &= \int_{-\infty}^\infty x^2 f_{Y_{2k+1}}(x) \,\mathrm{d}x \\ &= (2k+1)\binom{2k}{k}\int_{-\infty}^\infty x^2 F(x)^k (1-F(x))^k f(x) \,\mathrm{d}x. \end{align*} $$
In the left tail we have $F(x) \sim 1/\log|x|$ and $f(x) \sim 1/(|x| \log^2|x|)$, so the integrand grows like $|x|/(\log|x|)^{k+2}$ as $x \to -\infty$. Hence the integral is infinite for all $k$, from which the negative result follows.
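The divergence can be seen numerically as well. Below is a sketch (my own, under the assumption that the left tail is exactly $F(x) = 1/\log|x|$ for $x \le -e^2$, so that $f(x) = 1/(|x|\log^2|x|)$ there) showing that the truncated second-moment integral keeps growing as the truncation point moves out:

```python
import numpy as np
from scipy.integrate import quad

k = 3  # any fixed k gives the same conclusion

def integrand(x):
    # Left tail of the counterexample: F(x) = 1/log|x|, f(x) = 1/(|x| log^2|x|).
    a = np.abs(x)
    F = 1.0 / np.log(a)
    f = 1.0 / (a * np.log(a) ** 2)
    return x**2 * F**k * (1 - F) ** k * f

cut = -np.exp(2)  # where F = 1/2 in this parametrization
vals = [quad(integrand, -T, cut, limit=300)[0] for T in (1e3, 1e5, 1e7)]
print(vals)  # truncated integrals grow without bound as T increases
```

Each extension of the truncation point multiplies the integral by orders of magnitude, matching the $|x|/(\log|x|)^{k+2}$ growth of the integrand.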
My follow up question is: if we add the extra constraint that the CDF $F$ satisfies
$$ F(x) \overset{x\to-\infty}{=} o(|x|^{-\epsilon}) \\ 1-F(x) \overset{x\to\infty}{=} o(|x|^{-\epsilon}) \\ $$
for some $\epsilon>0$ (that is, the tails of the distribution decay at least polynomially), is it now true that the variances converge?
As a particular case, is it true for the Cauchy distribution?
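A quick Monte Carlo experiment for the standard Cauchy (a sketch of my own, suggestive rather than a proof): there $m = 0$, $f(0) = 1/\pi$, and the conjectured limit is $[2f(0)]^{-2} = \pi^2/4 \approx 2.47$. Note that for the Cauchy, $E[Y_{2k+1}^2]$ is finite once $k \ge 2$, so the quantities below are well defined for $n \ge 5$.

```python
import numpy as np

rng = np.random.default_rng(1)
reps = 5000

# Estimate n * E[(Y_n - m)^2] for the standard Cauchy and compare with
# the conjectured limit [2 f(0)]^{-2} = pi^2 / 4, where f(0) = 1/pi.
target = np.pi**2 / 4
estimates = {}
for n in (11, 101, 1001):
    medians = np.median(rng.standard_cauchy((reps, n)), axis=1)
    estimates[n] = n * np.mean(medians**2)

print(estimates, target)
```

For increasing $n$ the estimates appear to approach $\pi^2/4$, but a simulation cannot settle the uniform-integrability question the post is really asking.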