Algorithm 1)

Step 1: Obtain a sample $y$ from the distribution of $Y$ (with density $g$) and a sample $u$ from the uniform distribution on $(0,1)$.

Step 2: Check whether $u < f(y)/\{M\,g(y)\}$.

If true, accept $y$ as a sample from $f$.

Else, reject the value of $y$ and return to the sampling step.
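As an illustration, here is a minimal Python sketch of this acceptance-rejection loop, assuming a Beta(2, 5) target $f$ and a Uniform(0, 1) proposal $g$ (so that $M$ can be taken as the maximum of $f$); the target, proposal, and variable names are illustrative choices, not part of the algorithm statement.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

# Illustrative target f = Beta(2, 5) and proposal g = Uniform(0, 1), so g(y) = 1.
# M must satisfy f(y) <= M * g(y) for all y; the maximum of f (at its mode) works here.
a, b = 2, 5
M = beta.pdf((a - 1) / (a + b - 2), a, b)

def accept_reject(n):
    samples = []
    while len(samples) < n:
        y = rng.uniform(0, 1)                  # Step 1: y ~ g
        u = rng.uniform(0, 1)                  #         u ~ U(0, 1)
        if u < beta.pdf(y, a, b) / (M * 1.0):  # Step 2: accept if u < f(y) / (M g(y))
            samples.append(y)                  # accepted y is a draw from f
    return np.array(samples)

draws = accept_reject(10_000)
```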


Algorithm 2)

There are model parameters $\theta$ described by a prior $\pi(\theta)$, and a forwards-simulation model for the data $x$, defined by $\pi(x|\theta)$. It is clear that a simple algorithm for simulating from the desired posterior $\pi(\theta|x)$ can be obtained as follows. First simulate from the joint distribution $\pi(\theta,x)$ by simulating $\theta^\star\sim\pi(\theta)$ and then $x^\star\sim \pi(x|\theta^\star)$. This gives a sample $(\theta^\star,x^\star)$ from the joint distribution. A simple rejection algorithm which rejects the proposed pair unless $x^\star$ matches the true data $x$ clearly gives a sample from the required posterior distribution. (Source: https://darrenjw.wordpress.com/2013/03/31/introduction-to-approximate-bayesian-computation-abc/)

  1. Sample $\theta^\star \sim \pi(\theta)$.
  2. Sample $x^\star \sim \pi(x|\theta^\star)$.
  3. If $x^\star = x$, keep $\theta^\star$ as a sample from $\pi(\theta|x)$; otherwise reject.
  4. Return to step 1.
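
As an illustration, here is a minimal Python sketch of this rejection scheme, assuming a Beta(1, 1) prior on the success probability $\theta$ of a Binomial observation, so that the data are discrete and an exact match $x^\star = x$ occurs with positive probability; the model and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setting: prior theta ~ Beta(1, 1), data x ~ Binomial(n_trials, theta).
# Discrete data keep P(x* == x) positive, so exact matching is feasible.
n_trials, x_obs = 10, 7

def abc_rejection(n_samples):
    accepted = []
    while len(accepted) < n_samples:                 # 4. loop back to step 1 until enough draws
        theta_star = rng.beta(1, 1)                  # 1. theta* ~ pi(theta)
        x_star = rng.binomial(n_trials, theta_star)  # 2. x* ~ pi(x | theta*)
        if x_star == x_obs:                          # 3. keep theta* only on an exact match
            accepted.append(theta_star)
    return np.array(accepted)

posterior_draws = abc_rejection(5_000)
# In this conjugate example the exact posterior is Beta(1 + x_obs, 1 + n_trials - x_obs),
# which the accepted draws should approximate.
```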

1 Answer


The two algorithms are related, if only at a formal level.

The first one is called acceptance-rejection and is one of the most common generic algorithms for producing pseudo-random variates from a target density $f$. Check our book (Chapter 2) for instance.

The second algorithm operates in a Bayesian setting when one wants to simulate a posterior distribution $\pi(\theta|x)$. It is called ABC, for Approximate Bayesian computation (and is again mentioned in our book). First, it is more specialised than acceptance-rejection in that it aims at a conditional distribution and uses specific simulation densities like $\pi$ and $f(\cdot|\theta)$. Second, it can be seen as a case of acceptance-rejection where the acceptance probability is either $0$ or $1$:

  1. Simulate $(\theta,x^\star)\sim \pi(\theta)\times f(x^\star|\theta)=g(\theta,x^\star)$
  2. Accept the simulation with probability $$\dfrac{\pi(\theta)\times f(x^\star|\theta)\,\mathbb{I}_{x^\star=x}}{g(\theta,x^\star)}=\mathbb{I}_{x^\star=x}$$

since the constant $M$ is then equal to $1$.
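
To make the correspondence concrete, here is a minimal Python sketch (same illustrative Beta-Binomial setting as above, not part of the original answer) that writes the ABC step in the generic acceptance-rejection form, with the acceptance probability reducing to the indicator $\mathbb{I}_{x^\star=x}$ because $M=1$.

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, x_obs = 10, 7  # illustrative Beta(1, 1)-Binomial setting, as above

def abc_as_accept_reject(n_samples):
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.beta(1, 1)                          # (theta, x*) ~ g(theta, x*) = pi(theta) f(x* | theta)
        x_star = rng.binomial(n_trials, theta)
        accept_prob = 1.0 if x_star == x_obs else 0.0   # target / (M * g) with M = 1 collapses to the indicator
        if rng.uniform(0, 1) < accept_prob:             # generic accept-reject comparison u < accept_prob
            accepted.append(theta)
    return np.array(accepted)

draws = abc_as_accept_reject(5_000)
```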

Xi'an