
I've stumbled upon the following paper by Alan Hájek (https://www.jstor.org/stable/40267419), in which the author states that the frequentist interpretation of probabilities as limiting relative frequencies violates countable additivity:

"To see this, start with a countably infinite event space—for definiteness, consider an infinite lottery, with tickets 1, 2, 3, … Let $A_i$ = ‘ticket i is drawn’. Suppose that we have a denumerable sequence of draws (with replacement), and as it happens, each ticket is drawn exactly once. Then the limiting relative frequency of each ticket being drawn is 0; and so according to the hypothetical frequentist, $P(A_i)$ = 0 for all i, and so $\sum _{i=1}^\infty P(A_i ) = 0 $. But $A1 \cup A2 \cup A3 \cup ...$ is an event that happens every time, so its limiting relative frequency is 1. "

So what I take from this example is that not every imaginable probability scenario can be translated into Kolmogorov's probability axiomatization: there is no probability distribution on the natural numbers that assigns equal probability to every natural number, so it is not really surprising that countable additivity is violated in the example. I think the Bayesian notion of probability suffers from a similar problem, as there is no prior probability distribution that expresses indifference between 1, 2, 3, … .
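The non-existence claim is the standard one-line argument: if some distribution assigned the same probability $c \geq 0$ to every natural number, countable additivity would force

```latex
1 \;=\; P(\mathbb{N})
  \;=\; \sum_{n=1}^{\infty} P(\{n\})
  \;=\; \sum_{n=1}^{\infty} c
  \;=\;
  \begin{cases}
    0      & \text{if } c = 0,\\
    \infty & \text{if } c > 0,
  \end{cases}
```

and neither value equals 1, so no such uniform distribution on $\mathbb{N}$ exists under Kolmogorov's axioms.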

My question is therefore: is there a solution to this problem? Surely there are situations (such as the lottery example) where we would like to model something that the usual axiomatization does not allow. Is there some way out of this, e.g. an alternative axiomatization that allows us to model the above lottery?

  • In fact some Bayesian formalizations insist on only finite additivity. See https://stats.stackexchange.com/q/126056/17230. – Scortchi - Reinstate Monica Mar 14 '18 at 17:52
  • But I've read that in order to be coherent your beliefs have to satisfy countable additivity https://www.jstor.org/stable/40072238 (I didn't read the proof). So giving up countable additivity seems to give up the whole coherence thing. – Sebastian Mar 14 '18 at 17:56
  • That paper (which I haven't read either) post-dates most of the argument about countable additivity. If it's right, I suspect coherence would be considered a clincher. – Scortchi - Reinstate Monica Mar 14 '18 at 18:21
  • Before concluding that such an example destroys frequentist probability, one would have to investigate under what conditions countable additivity is saved. Furthermore, what you get in your example is a random measure, and maybe one could show that, in some sense, such bad examples occur only with probability (limiting frequency) zero? For some alternatives to limiting-frequency definitions, see https://stats.stackexchange.com/questions/332026/can-we-think-of-a-probability-in-both-the-classical-and-subjective-sense-simulta/332218#332218 – kjetil b halvorsen Mar 14 '18 at 19:18
  • @kjetilbhalvorsen Can you please elaborate your comment? What exactly is the random measure? – Sebastian Apr 20 '18 at 12:29

0 Answers