Let $z$ be the observations and $x$ the parameter we want to infer. Assuming that we know the prior $p(x)$, Bayes' theorem gives $p(x|z) = p(z|x)p(x)/p(z)$, where $p(z)$ is the marginal likelihood. To sample from $p(x|z)$, we can use MCMC as long as we can evaluate $p(z|x)$ and $p(x)$.
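For that tractable case, something like the following Metropolis-Hastings sketch is what I have in mind (the standard-normal prior and Gaussian likelihood here are just placeholders I picked for illustration):

```python
import numpy as np

def log_prior(x):
    # placeholder standard-normal prior p(x)
    return -0.5 * x**2

def log_likelihood(z, x):
    # placeholder Gaussian likelihood p(z|x) with known noise scale
    return -0.5 * ((z - x) / 0.5) ** 2

def metropolis(z, n_samples=5000, step=0.5, rng=np.random.default_rng(0)):
    x = 0.0
    samples = []
    for _ in range(n_samples):
        # symmetric random-walk proposal
        x_prop = x + step * rng.normal()
        # acceptance ratio uses only the unnormalized posterior,
        # so the marginal likelihood p(z) is never needed
        log_accept = (log_prior(x_prop) + log_likelihood(z, x_prop)
                      - log_prior(x) - log_likelihood(z, x))
        if np.log(rng.uniform()) < log_accept:
            x = x_prop
        samples.append(x)
    return np.array(samples)

posterior_samples = metropolis(z=1.2)
```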
However, what if the likelihood $p(z|x)$ is not easy to compute? Such cases arise often in my application. For example, suppose $x$ is the initial state and $z$ is the observed final state resulting from propagating a discrete-time stochastic dynamics. In that case $p(z|x)$ can only be obtained through a multiple integral, which is often intractable. Concretely, if the transition from $x_{t}$ to $x_{t+1}$ is described by the kernel $p(x_{t+1}|x_t)$, then the likelihood of the final state $x_N$ given the initial state $x_0$ is $p(x_N|x_0) = \int \cdots \int p(x_N|x_{N-1}) \cdots p(x_1|x_0)\, dx_1 \cdots dx_{N-1}$.
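To make the setting concrete, here is a toy simulator of the kind of dynamics I mean (the nonlinear random-walk kernel is only a placeholder): drawing $z = x_N$ given $x_0$ is trivial, but evaluating $p(x_N|x_0)$ would require the integral over all intermediate states written above.

```python
import numpy as np

def simulate_final_state(x0, N=10, rng=np.random.default_rng(0)):
    # placeholder transition kernel p(x_{t+1} | x_t):
    # a nonlinear drift plus Gaussian noise, so the marginal over
    # intermediate states has no closed form
    x = x0
    for _ in range(N):
        x = x + 0.1 * np.sin(x) + 0.1 * rng.normal()
    return x  # the observed final state z = x_N

# forward simulation is cheap; the likelihood p(z | x0) is not
z = simulate_final_state(x0=0.0)
```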
It seems that probabilistic programming languages can handle this kind of inference, mainly via MCMC, so I guess there is some way to do it...
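For instance, I imagine something like the following NumPyro sketch, where the intermediate states are declared as latent variables so the sampler integrates over them and $p(z|x_0)$ never has to be evaluated explicitly (the prior, transition, and observation distributions are placeholders, not my actual model):

```python
import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(z_obs=None, N=10):
    # placeholder prior p(x_0) on the initial state
    x = numpyro.sample("x0", dist.Normal(0.0, 1.0))
    # intermediate states x_1, ..., x_{N-1} are latent variables;
    # the sampler marginalizes over them instead of computing the integral
    for t in range(1, N):
        x = numpyro.sample(f"x{t}", dist.Normal(x + 0.1 * jnp.sin(x), 0.1))
    # noisy observation of the final state x_N
    numpyro.sample("z", dist.Normal(x, 0.05), obs=z_obs)

mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(random.PRNGKey(0), z_obs=jnp.array(1.0))
print(mcmc.get_samples()["x0"].mean())  # posterior mean of the initial state
```

Is this the right way to think about it, or is there a better approach (e.g. likelihood-free / ABC methods) when the dynamics can only be simulated?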