I've come across the following paper by Alan Hájek (https://www.jstor.org/stable/40267419),
in which the author argues that the frequentist interpretation of probabilities as limiting relative frequencies violates countable additivity:
"To see this, start with a countably infinite event space—for definiteness, consider an infinite lottery, with tickets 1, 2, 3, … Let $A_i$ = ‘ticket i is drawn’. Suppose that we have a denumerable sequence of draws (with replacement), and as it happens, each ticket is drawn exactly once. Then the limiting relative frequency of each ticket being drawn is 0; and so according to the hypothetical frequentist, $P(A_i)$ = 0 for all i, and so $\sum _{i=1}^\infty P(A_i ) = 0 $. But $A1 \cup A2 \cup A3 \cup ...$ is an event that happens every time, so its limiting relative frequency is 1. "
What I take from this example is that not every imaginable probability scenario can be translated into Kolmogorov's axiomatization of probability, since there is no probability distribution on the natural numbers that assigns equal probability to every natural number. So it is not really surprising that countable additivity is violated in the example. I think the Bayesian notion of probability suffers from a similar problem, as there is no prior distribution that expresses indifference between 1, 2, 3, ….
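To make that non-existence claim precise (this is the standard argument, not something specific to the paper): a "uniform" distribution on $\mathbb{N}$ would have to assign the same value $c$ to every singleton, but countable additivity then forces

$$
1 = P(\mathbb{N}) = \sum_{i=1}^\infty P(\{i\}) = \sum_{i=1}^\infty c =
\begin{cases} 0 & \text{if } c = 0,\\ \infty & \text{if } c > 0,\end{cases}
$$

which is a contradiction either way.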
My question is therefore: is there a solution to this problem? Surely there are situations (such as the lottery example) where we would like to model something that the usual axiomatization does not allow. Is there some way out of this, e.g. an alternative axiomatization that allows us to model the above lottery?