Suppose that we have $X_1, \dots, X_n$ iid such that $X_i \mid \theta \sim Ber(\theta)$, with prior $\theta \sim g(\theta)$ where

$$g(\theta) = 0.6 Beta(2,1) + 0.4 Beta(1,1) = 1.2 \theta + 0.4$$

Doing the calculations, it is possible to show that $\theta \mid X$ is also a mixture of betas. Writing $x = \sum_{i=1}^n X_i$, define

$$A = \frac{\Gamma(x+2) \Gamma(n-x+1)}{\Gamma(n+3)}, \qquad B = \frac{\Gamma(x+1) \Gamma(n-x+1)}{\Gamma(n+2)}$$

$\Rightarrow f(\theta \mid X) = \frac{1.2A}{1.2A+0.4B} Beta(x+2, n-x+1) + \frac{0.4B}{1.2A+0.4B} Beta(x+1, n-x+1)$
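
Indeed, expanding the unnormalized posterior shows where $A$ and $B$ come from: they are exactly the normalizing constants of the two terms,

$$f(\theta \mid X) \propto g(\theta)\,\theta^x(1-\theta)^{n-x} = 1.2\,\theta^{x+1}(1-\theta)^{n-x} + 0.4\,\theta^{x}(1-\theta)^{n-x},$$

and integrating each term over $(0,1)$ gives $1.2A + 0.4B$ as the overall normalizing constant.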

But as we can see, the weights in this case depend on the sample $X$. Usually, to sample from a mixture of distributions, we use the standard two-step process: first draw a component according to the mixture weights, then draw from that component (see the sketch below). But what should I do if the weights depend on $X$?
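
Here is a minimal sketch of that two-step process for a two-component beta mixture with fixed weights, using the prior from above (the function name and seed are just for illustration):

```python
import numpy as np

def sample_beta_mixture(weights, params, size, rng):
    """Two-step mixture sampling: pick a component, then draw from it."""
    # Step 1: draw a component label for each sample, using the weights.
    labels = rng.choice(len(weights), size=size, p=weights)
    # Step 2: draw from the Beta distribution of the chosen component.
    a = np.array([params[k][0] for k in labels])
    b = np.array([params[k][1] for k in labels])
    return rng.beta(a, b)

# The prior from the question: 0.6 Beta(2,1) + 0.4 Beta(1,1).
rng = np.random.default_rng(0)
draws = sample_beta_mixture([0.6, 0.4], [(2, 1), (1, 1)], size=10_000, rng=rng)
```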

I just gave a random example to make it clear, but I want to know how to approach a case like this.

Giiovanna

1 Answer

You are in a Bayesian setting: that the weights depend on $x$ (and not on $X$) does not matter, since $x$ is observed. Simulating from this posterior is thus like simulating from any other mixture. (Not that you really need to simulate.)

Hence, once $X$ is observed to take the value $x$, you can simplify the terms $A$ and $B$, up to their common factor $\Gamma(x+1)\Gamma(n-x+1)/\Gamma(n+2)$, into $$A \propto \frac{x+1}{n+2}, \qquad B \propto 1\,,$$ and select the component of the mixture with probabilities proportional to $1.2A$ and $0.4B$, respectively.
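
A minimal sketch of that selection step, assuming the data are summarized by $n$ and $x$ (the variable names and seed are illustrative):

```python
import numpy as np

def sample_posterior(n, x, size, rng):
    """Draw from theta | X, the two-component beta mixture posterior."""
    # Simplified weight terms: A is proportional to (x+1)/(n+2), B to 1.
    A = (x + 1) / (n + 2)
    w1 = 1.2 * A / (1.2 * A + 0.4 * 1.0)  # weight of the Beta(x+2, n-x+1) part
    # Pick a component for each draw, then draw from that component.
    pick_first = rng.random(size) < w1
    return np.where(
        pick_first,
        rng.beta(x + 2, n - x + 1, size=size),
        rng.beta(x + 1, n - x + 1, size=size),
    )

rng = np.random.default_rng(1)
theta_draws = sample_posterior(n=20, x=13, size=10_000, rng=rng)
```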

Xi'an