
Let's say I have an 'unfair' coin, for which I'm interested in estimating the probability of 'heads', i.e. its 'p' value.

Knowing nothing about the coin, the distribution of probable 'p' values is a uniform distribution from 0 to 1.

How can I update this distribution of probable 'p' values after observing X heads out of Y trials?

Aaron

2 Answers


A Bayesian approach seems natural here. Since the prior on $p$ is Uniform(0,1), the posterior is Beta(X+1, Y-X+1), as jbowman already answered.

So I'll add the details here. Let $x_i$, $i=1,\ldots,Y$, be the observed outcomes of the coin tosses, coding 'heads' as 1, so that $\sum_{i=1}^Y x_i = X$ is the number of heads. Writing $X_i$ for the corresponding random variables, let $\theta = \Pr(X_i = 1)$; then $X_i \sim \mathrm{Bernoulli}(\theta)$, with 'heads' as the 'success'.

The prior for $\theta$ is $f(\theta) = \mathbf{1}\{\theta \in (0,1)\}$, i.e. Uniform(0,1). Since $\sum_{i=1}^Y X_i \sim \mathrm{Binomial}(Y,\theta)$, the likelihood is $\Pr\bigl(\sum_{i=1}^Y X_i = X \mid \theta\bigr) = K\,\theta^X(1-\theta)^{Y-X}$, where $K = \binom{Y}{X}$.

Now the posterior of $\theta$ is
$$f(\theta \mid D) = \frac{f(D \mid \theta)\, f(\theta)}{\int_0^1 f(D \mid \theta)\, f(\theta)\, d\theta},$$
where $D = \sum_{i=1}^Y x_i$ is the 'data'. Therefore
$$f(\theta \mid D) = \frac{K\,\theta^X(1-\theta)^{Y-X}}{K \int_0^1 \theta^X(1-\theta)^{Y-X}\, d\theta} = \frac{\theta^X(1-\theta)^{Y-X}}{B(X+1,\, Y-X+1)},$$
which is the density of the Beta distribution with parameters $X+1$ and $Y-X+1$; here $B(\cdot,\cdot)$ denotes the Beta function (written as the function, not the distribution, to avoid confusion).
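For concreteness, here is a minimal Python sketch of this result, using NumPy and SciPy; the values of X and Y are illustrative assumptions, not numbers from the question:

```python
import numpy as np
from scipy import stats

# Observed data: X heads out of Y tosses (illustrative values only)
X, Y = 7, 10

# Under the Uniform(0,1) prior, the posterior is Beta(X + 1, Y - X + 1)
posterior = stats.beta(X + 1, Y - X + 1)

print("posterior mean:", posterior.mean())               # equals (X + 1) / (Y + 2)
print("95% credible interval:", posterior.interval(0.95))

# Posterior density on a grid, e.g. for plotting
theta = np.linspace(0, 1, 201)
density = posterior.pdf(theta)
```

Note that the normalizing constant never has to be computed by hand; identifying the kernel $\theta^X(1-\theta)^{Y-X}$ as a Beta density is enough.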

saipk

I have worked through an example of updating the posterior for $p$ after each observation, starting with a flat prior, with a graph of the posterior density of $p$ after each update. It is basically the same as the other answers, but graphed, and done sequentially after each new flip; a sketch of the idea follows the link below.

https://stats.stackexchange.com/a/371520/223056
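For readers who want the sequential version without following the link, here is a minimal Python sketch; the simulated flips, the seed, and the true 'p' of 0.3 are illustrative assumptions, not values from the linked answer:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_p = 0.3                        # assumed 'heads' probability for the simulation
flips = rng.random(50) < true_p     # 50 simulated tosses; True means 'heads'

# Flat prior Beta(1, 1); by conjugacy, each flip just increments one parameter:
# 'heads' adds 1 to the first parameter, 'tails' adds 1 to the second.
a, b = 1, 1
for flip in flips:
    a += int(flip)
    b += 1 - int(flip)
    current_posterior = stats.beta(a, b)   # posterior after this flip; plot its pdf here

print(f"final posterior: Beta({a}, {b}), posterior mean = {a / (a + b):.3f}")
```

Updating after every flip or once on the full batch gives the same final posterior; the sequential version just lets you watch the distribution tighten around $p$.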

ken