
To my understanding:

Flipping a coin has a discrete 1/2 probability of coming up heads or tails, and every iteration of that trial resets the probability back to 1/2. So it could be heads every time, or heads 20% of the time. Both of these outcomes are perfectly valid. Yet the more you repeat the experiment, the more the results tend to normalize toward 50% heads and 50% tails as you approach infinity.

So my question is: what governs these outcomes and causes distributions to equalize/find equilibrium?

Edit: To clarify, I am not looking for mathematical proofs or theorems, I am simply looking for intuition regarding the ontology of this matter as it applies to our reality and the law of large numbers. I'm simply demonstrating my question in the context of a coin toss experiment.

    Statistics - there is not a physical law, per se, just math. –  Jul 13 '21 at 19:27
  • 6
    This is the law of large numbers. – Dave Jul 13 '21 at 19:36
  • 2
    There is nothing to cause as the later flips do not compensate for the earlier flips, they just make the earlier flips less influential by weight of numbers. – Henry Jul 13 '21 at 19:41
  • @Dave yes that's exactly what I'm referring to :)! I didn't know the LLN was a thing (I'm new to this), but that's the phenomenon I am referencing. Why does it happen and what in our universe governs that? – spyter Jul 13 '21 at 19:42
  • 3
    @spyter Is it fair to say that you're asking for [tag:intuition] about how the [tag:law-of-large-numbers] comes about, rather than a restatement of the proof of the theorem? Note that "every iteration of that trial resets the probability back to 1/2" describes *[tag:independence]*. – Sycorax Jul 13 '21 at 19:46
  • 1
    @Sycorax yes, I'm asking an ontological question here. I believe the LLN works, and there are mathematical proofs for it. My question is - why does it work? Is there something in our reality's nature that is innately governing the distributions and causing them to eventually equalize? – spyter Jul 13 '21 at 20:35
  • @spyter Perhaps you could [edit] your question to clarify that you're asking for intuition about the law of large numbers, and that you're motivating that inquiry in the context of a coin toss experiment. – Sycorax Jul 13 '21 at 20:40
  • 7
    This "equalization" is reminiscent of a common misunderstanding. The *counts* of heads and tails do *not* tend to equality. Quite the contrary: they tend to diverge. The equalization is that of *relative frequencies.* Ultimately this is a postulate, rather than a fact about the universe, because the universe has no way to test an asymptotic proposition. Far before any asymptotics kick in, the "coin" will wear down or otherwise physically change its characteristics. This is why commenters are right to insist on the purely mathematical nature of this phenomenon. – whuber Jul 13 '21 at 21:27
  • 2
    Does this answer your question? [Regression to the mean vs gambler's fallacy](https://stats.stackexchange.com/questions/204397/regression-to-the-mean-vs-gamblers-fallacy) – Peter O. Jul 13 '21 at 21:46
  • @Sycorax - I updated my question appropriately. I think you're exactly on topic with what I'm trying to ask. Basically, I am curious as to "why" the law of large numbers works. I am using the coin toss analogy to keep things simple and demonstrate my question. To my understanding, the coin will land heads nearly 50% of the time if the trials are done in repetition approaching infinity. Yet, each trial is independent from the others and it should be truly random as to the outcomes. I would expect random distributions, rather than 50/50. That's where my confusion is - if that makes sense? – spyter Jul 14 '21 at 01:11
  • @PeterO. - In some ways, yes, I think it's somewhat related. I'm actually asking the opposite of the gambler's fallacy. I believe each coin toss should be truly random and there should be no influence from previous trials (contrary to the gambler's fallacy). But the law of large numbers says that if I do the trials a large number of times (approaching infinity), I'll end up with roughly 50/50 for heads/tails in the end. To me this seems not to imply randomness. – spyter Jul 14 '21 at 01:16
  • 3
    You seem to be missing how "randomness" works across sums of variates and hence averages. Let $X_i$ be $1$ if the $i$th toss is a head and $0$ otherwise and let the coin be fair & the trials independent. While the multivariate distribution of $X_1, X_2 ... X_L$ (for some large value $L$, say) is uniform over $\{0,1\}^L$ (which is what I assume you mean by "random"), when you *sum* those $L$ variates, there are vastly more ways to get a sum close to $L/2$ than there are close to $0$ or $L$, i.e. many more combinations that yield a proportion near $1/2$. As $L$ increases this effect increases... – Glen_b Jul 14 '21 at 01:46
  • 4
    ... and the proportion-of-heads distribution becomes more concentrated around $1/2$. You already see this start at $L=2$; there are two ways to get 50% heads but only one way to get 0% or 100% heads. – Glen_b Jul 14 '21 at 01:48
  • 2
    “I believe the LLN works, and there are mathematical proofs for it. My question is - why does it work?”—That’s precisely what the proofs exist to show. – Arya McCarthy Jul 14 '21 at 05:11
  • 2
    As the number of tosses increases, although one gets closer to the mean as a percentage error, one gets further from the mean in actual number of coin tosses. So it "works" and it "doesn't work" depending on your POV. The same thing happens for Poisson distributions as for binomial distributions. Proving it is simple: just look at the calculations for the mean and standard deviation (SD), i.e., $\lim_{n\to\infty}\frac{\text{SD}}{\text{mean}}\to0$ but $\text{SD}\to\infty$. – Carl Jul 14 '21 at 22:07
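The counting argument in the comments above (vastly more outcome sequences yield a proportion near 1/2, while the spread of the raw *count* keeps growing) can be checked numerically. The sketch below is not from the thread; it computes, exactly from binomial coefficients for a fair coin, the probability that the proportion of heads lands within 5% of 1/2, alongside the SD of the count versus the SD of the proportion:

```python
from math import ceil, comb, floor, sqrt

for n in (10, 100, 1000):
    # Exact probability that the proportion of heads is within 0.05 of 1/2,
    # i.e. the count of heads lies in [0.45*n, 0.55*n], for a fair coin.
    lo, hi = ceil(0.45 * n), floor(0.55 * n)
    p_near_half = sum(comb(n, k) for k in range(lo, hi + 1)) / 2**n

    # SD of the *count* of heads grows like sqrt(n)...
    sd_count = sqrt(n * 0.5 * 0.5)
    # ...while the SD of the *proportion* shrinks like 1/sqrt(n).
    sd_prop = sd_count / n

    print(f"n={n:5d}  P(|p̂-0.5|≤0.05)={p_near_half:.3f}  "
          f"SD(count)={sd_count:6.2f}  SD(proportion)={sd_prop:.4f}")
```

The first column grows toward 1 while the count's SD diverges: exactly the point that relative frequencies equalize even though the counts of heads and tails do not.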
