
Say we have two coins with unknown success probabilities $p_1$ and $p_2$, and we take a Bayesian approach to learn about them.

To do so, we first set our prior: $P_1\sim Beta(1,1)$ and $P_2\sim Beta(1,1)$.

Tossing both coins, we update each prior in the usual way.

For example, suppose we run 10 rounds of tossing and observe 4 heads on coin 1 and 8 heads on coin 2.

The posteriors are then $P_1\sim Beta(5,7)$ and $P_2\sim Beta(9,3)$.
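The conjugate update above can be sketched in a few lines of Python (not part of the original question; the function name `update` is just for illustration). A $Beta(a,b)$ prior plus $h$ heads and $t$ tails gives a $Beta(a+h,\, b+t)$ posterior:

```python
# Conjugate Beta-Binomial update: prior Beta(a, b), data = heads out of tosses.
def update(a, b, heads, tosses):
    tails = tosses - heads
    return a + heads, b + tails

post1 = update(1, 1, 4, 10)  # coin 1: 4 heads in 10 tosses -> (5, 7)
post2 = update(1, 1, 8, 10)  # coin 2: 8 heads in 10 tosses -> (9, 3)
print(post1, post2)
```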

My question is: what happens if we receive an "extra" piece of information that says "$p_1$ is greater than $p_2$"?

Is there any systematic way to accommodate this extra piece of information into the posterior?

Andeanlll
    You could simply constrain the support of the joint prior over $(P_1, P_2)$ to the set $\{ P_1 > P_2\}$. – πr8 Aug 24 '21 at 08:17

2 Answers


What you're referring to in the first part of the question is the beta-binomial model, where the binomial distribution serves as the likelihood and the beta as the prior; by conjugacy, the posterior is also a beta distribution.

Your problem description in the second part is a different scenario, because it is multivariate. If you know that $p_1 > p_2$, the parameters are dependent, so you are talking about a multivariate distribution for the parameters (rather than two univariate beta distributions). In that case you cannot use two independent beta-binomial models. The constraint can be imposed by choosing a multivariate prior for the parameters. For such a model you won't have a closed-form solution, so you would need MCMC or some other kind of approximate inference.
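One simple form of approximate inference here is rejection sampling (a sketch, not part of the original answer): draw from the two unconstrained posteriors $Beta(5,7)$ and $Beta(9,3)$ and keep only the draws satisfying $p_1 > p_2$. This targets the joint posterior under a prior that is flat on the constrained region:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Draw from the unconstrained, independent posteriors...
p1 = rng.beta(5, 7, n)
p2 = rng.beta(9, 3, n)

# ...then keep only draws consistent with the constraint p1 > p2.
keep = p1 > p2
print("acceptance rate:", keep.mean())
print("constrained posterior means:", p1[keep].mean(), p2[keep].mean())
```

Note that because the data here favor $p_2 > p_1$, the acceptance rate is low; for sharper constraints or more data, a proper MCMC sampler over the constrained region would be more efficient.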

Tim
  • Thank you for the answer, I got your point. So the new information correlates the two separate inference problems, which makes the problem more complicated. Then, alternatively, if we consider only one coin $p_1$ and the new information is $p_1\geq 1/2$, what would the resulting inference be? – Andeanlll Aug 24 '21 at 09:18
  • In our example, do we simply have $P_1\sim Beta(5,7)$ with $P_1\geq 1/2$ as the posterior? Or can we not have something like $P_1\sim Beta(5,7)$ at all, since $Beta(5,7)$ has its mean below $1/2$? – Andeanlll Aug 24 '21 at 09:20
  • @Andeanlll I believe this thread answers your question https://stats.stackexchange.com/questions/538935/what-happens-if-i-change-the-range-of-a-flat-prior-for-bayesian-inference – Tim Aug 24 '21 at 09:31

For the case where you want to use the restriction $p_{1} \geq 1/2$, your prior will be of the following form:

$$\pi(p_{1})=\begin{cases} \text{Beta}(a,b), & p_{1}\geq 1/2 \\ 0, & \text{otherwise} \end{cases} = \text{Trunc-Beta}(a,b)$$

Hence, your posterior can be derived as

$$\pi(p_{1}|x) \propto L(p_{1};x)\cdot \text{Beta}(p_{1};a,b)\cdot\mathbb{I}(p_{1}\geq 1/2) = L(p_{1};x)\cdot \text{Trunc-Beta}(a,b)$$

The truncated Beta density, restricted to the interval $[1/2,1]$, is obtained by renormalizing:

$$\text{Trunc-Beta}(a,b) = \frac{\text{Beta}(a,b)}{F(1)-F(1/2)}$$ where $F$ is the CDF of the $Beta(a,b)$ distribution (so $F(1)=1$); for exact calculations you can check https://en.wikipedia.org/wiki/Beta_distribution.

Essentially, what happens to your posterior when you add the extra information $p_{1}\geq 1/2$ is this: even though your likelihood may put weight on values of $p_{1}$ below $1/2$, your prior zeroes them out, because it assigns zero mass to values below $1/2$.
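The truncated posterior is easy to compute numerically. A sketch (not from the original answer), using `scipy` with the posterior parameters $a=5$, $b=7$ from the example:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

a, b = 5, 7                          # posterior parameters from the example
Z = 1 - stats.beta.cdf(0.5, a, b)    # normalizing constant F(1) - F(1/2)

# Density of the posterior truncated to [1/2, 1]
def trunc_pdf(p):
    return np.where(p >= 0.5, stats.beta.pdf(p, a, b) / Z, 0.0)

# Posterior mean under the truncation, by numerical integration
mean, _ = quad(lambda p: p * trunc_pdf(p), 0.5, 1)
print("normalizer:", Z, "truncated posterior mean:", mean)
```

As expected, the truncated posterior mean lies above $1/2$ even though the untruncated $Beta(5,7)$ has mean $5/12 < 1/2$.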

In the case where you have $p_{1}\geq p_{2}$, then your prior will be of the form

$$\pi(p_{1},p_{2}) = \left\{\begin{matrix} f(p_{1},p_{2}), & p_{1}\geq p_{2} \\ 0,& otherwise \end{matrix}\right.$$

and the posterior

$$\pi(p_{1},p_{2}|x)\propto L_{1}(p_{1};x_{1})\cdot L_{2}(p_{2};x_{2})\cdot f(p_{1},p_{2})\cdot\mathbb{I}(p_{1}>p_{2})$$ Similarly, the product of the likelihoods $L_{1}(p_{1};x_{1})\cdot L_{2}(p_{2};x_{2})$ may put weight on the region $p_{1}<p_{2}$, but your prior again assigns zero mass to it.
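For the two-coin example this joint posterior can also be evaluated on a grid (a sketch, assuming a prior that is flat on $\{p_1 > p_2\}$, so $f(p_1,p_2)\propto \mathbb{I}(p_1>p_2)$; the likelihood kernels are then the $Beta(5,7)$ and $Beta(9,3)$ densities up to a constant):

```python
import numpy as np
from scipy import stats

# Grid over (p1, p2), avoiding the endpoints 0 and 1
g = np.linspace(0.001, 0.999, 400)
P1, P2 = np.meshgrid(g, g, indexing="ij")

# Unnormalized joint posterior: Beta kernels times the indicator p1 > p2
post = stats.beta.pdf(P1, 5, 7) * stats.beta.pdf(P2, 9, 3) * (P1 > P2)
post /= post.sum()  # normalize over the grid

# Marginal posterior means under the constraint
mean_p1 = (P1 * post).sum()
mean_p2 = (P2 * post).sum()
print("E[p1 | p1 > p2, x]:", mean_p1, " E[p2 | p1 > p2, x]:", mean_p2)
```

The constrained means respect the ordering: the mass where $p_1 < p_2$ has been zeroed out, so the posterior mean of $p_1$ ends up above that of $p_2$.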

Fiodor1234