
I have been struggling quite a bit with reconciling my intuitive understanding of probability distributions with the weird properties that almost all topologies on probability distributions possess.

For example, consider a mixture random variable $X_n$: draw from a Gaussian centered at 0 with variance 1 and, with probability $\frac{1}{n}$, add $n$ to the result. The sequence of such random variables converges (weakly and in total variation) to a Gaussian centered at 0 with variance 1, but the mean of $X_n$ is always $1$ and the variance diverges to $+\infty$. I really don't like saying that this sequence converges because of that.
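To spell out the numbers, write $X_n = Z + n B_n$ with $Z \sim N(0,1)$ and $B_n \sim \mathrm{Bern}(1/n)$ independent (which is how I read the construction above). Then

$$\mathbb{E}[X_n] = 0 + n \cdot \tfrac{1}{n} = 1, \qquad \mathrm{Var}(X_n) = 1 + n^2 \cdot \tfrac{1}{n}\Big(1 - \tfrac{1}{n}\Big) = n,$$

while the total variation distance between the law of $X_n$ and $N(0,1)$ is at most $P(B_n = 1) = \tfrac{1}{n}$, because the two laws can be coupled so that they agree on the event $\{B_n = 0\}$.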

It took me quite some time to recall everything I had forgotten about topologies, but I finally figured out what was so unsatisfying to me about such examples: the limit of the sequence is not a conventional distribution. In the example above, the limit is a weird "Gaussian of mean 1 and infinite variance". In topological terms, the set of probability distributions isn't complete under the weak topology (nor under total variation, nor any of the other topologies I've looked at).

I'm then faced with the following questions:

  • Does there exist a topology under which the set of probability distributions is complete?

  • If not, does that absence reflect an interesting property of the set of probability distributions, or is it just boring?

Note: I have phrased my question in terms of "probability distributions". These can't form a closed set because they can converge to Dirac masses and the like, which don't have a pdf. But probability measures still aren't closed under the weak topology, so my question remains.
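A concrete instance of the kind of degeneracy I have in mind: $N(0, \tfrac{1}{n})$ converges weakly to the Dirac mass $\delta_0$, which has no pdf, while its total variation distance to $\delta_0$ stays equal to $1$ for every $n$.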

Crossposted to MathOverflow: https://mathoverflow.net/questions/226339/topologies-for-which-the-ensemble-of-probability-measures-is-complete?noredirect=1#comment558738_226339

Guillaume Dehaene
  • You discovered that the set of all probability distributions is not **compact**. I think compactness is the word you need, not completeness. The relevant concept of compactness in this setting is often called **tightness**. See for instance http://stats.stackexchange.com/questions/180139/implications-of-point-wise-convergence-of-the-mgf-reference-request/181038#181038 – kjetil b halvorsen Dec 17 '15 at 16:03
  • @kjetilbhalvorsen I think it is **precompact** instead of compact due to Skorohod's Theorem. – Henry.L Mar 15 '17 at 19:09
  • What exactly is the problem with the example given? Is it that (weak, say) convergence does not imply convergence of moments? Why should it? And what does this have to do with completeness (limit exists in the given example)? – Michael Aug 24 '19 at 00:07

1 Answer


Looking at the question from a narrower statistical angle (the general mathematical/topological issue is valid), the fact that the sequence of moments may not converge to the moments of the limiting distribution is a well-known phenomenon. This, in principle, does not automatically cast doubt on the existence of a well-behaved limiting distribution of the sequence.

The limiting distribution of the above sequence $\{X_n\}$ (the mixture defined in the question) is a well-behaved $N(0,1)$ distribution with finite moments. It is the sequence of moments that does not converge to the moments of the limit. But this is a different sequence, a sequence composed of functions of our random variables (integrals, densities, and such), not the sequence of the random variables themselves, whose limiting distribution we are interested in.
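A minimal simulation sketch (assuming `numpy` and `scipy` are available; the helper `sample_xn` below is purely illustrative, not taken from any referenced post) makes the distinction concrete: draws of $X_n$ look more and more like $N(0,1)$ draws as $n$ grows, even while the sample mean stays near $1$ and the sample variance grows roughly like $n$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sample_xn(n, size):
    """Draw `size` realizations of X_n = Z + n * B_n with Z ~ N(0,1) and B_n ~ Bernoulli(1/n)."""
    z = rng.normal(size=size)
    b = rng.binomial(1, 1.0 / n, size=size)
    return z + n * b

for n in (10, 100, 1000):
    x = sample_xn(n, 200_000)
    # Kolmogorov-Smirnov distance between the empirical CDF and the standard normal CDF
    ks = stats.kstest(x, "norm")
    print(f"n={n:5d}  mean={x.mean():6.3f}  var={x.var():9.1f}  KS statistic={ks.statistic:.4f}")
```

The Kolmogorov-Smirnov statistic shrinks roughly like $1/n$, while the sample variance grows roughly like $n$: the sequence of distributions converges even though the sequence of second moments does not.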

Alecos Papadopoulos