
Let $z$ be the observations and $x$ be the parameter we want to infer. Assuming we know the prior $p(x)$, Bayes' theorem gives $p(x|z) = p(z|x)p(x)/p(z)$, where $p(z)$ is the marginal likelihood. To sample from $p(x|z)$, it is enough to be able to compute $p(z|x)$ and $p(x)$, in which case we can use MCMC.
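For concreteness, here is a minimal random-walk Metropolis sketch of that standard case in Python; the standard-normal prior and Gaussian likelihood are placeholder assumptions, not my actual model:

```python
import numpy as np

def log_prior(x):
    return -0.5 * x**2                      # placeholder standard-normal prior p(x)

def log_likelihood(z, x):
    return -0.5 * np.sum((z - x)**2)        # placeholder likelihood p(z|x)

def metropolis(z, n_steps=10_000, step=0.5, rng=np.random.default_rng(0)):
    """Random-walk Metropolis targeting p(x|z) ∝ p(z|x) p(x)."""
    x = 0.0
    log_post = log_prior(x) + log_likelihood(z, x)
    samples = []
    for _ in range(n_steps):
        x_prop = x + step * rng.normal()    # symmetric Gaussian proposal
        log_post_prop = log_prior(x_prop) + log_likelihood(z, x_prop)
        if np.log(rng.uniform()) < log_post_prop - log_post:  # accept/reject
            x, log_post = x_prop, log_post_prop
        samples.append(x)
    return np.array(samples)

samples = metropolis(z=np.array([1.2, 0.8, 1.1]))
```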

However, what if $p(z|x)$ is not easy to compute? This situation arises often in my application. For example, suppose $x$ is the initial state and $z$ is the final state observed after propagating a discrete stochastic dynamics. In that case $p(z|x)$ can only be obtained as a multiple integral, which is often intractable. Concretely, if the stochastic transition from $x_t$ to $x_{t+1}$ has density $p(x_{t+1}|x_t)$, then with $x = x_0$ and $z = x_N$ we have $p(z|x) = \int \ldots \int p(x_N|x_{N-1}) \ldots p(x_1|x_0) \, dx_1 \ldots dx_{N-1}$, integrating out the intermediate states.
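To illustrate the gap, here is a sketch of such a dynamics (the nonlinear drift term is made up for illustration): drawing $z = x_N$ given $x_0$ by forward simulation is trivial, yet the density $p(x_N|x_0)$ has no closed form:

```python
import numpy as np

# Forward simulation of the stochastic dynamics x_t -> x_{t+1}.
# Sampling z = x_N given x_0 is cheap, but evaluating p(x_N | x_0)
# would require integrating out x_1, ..., x_{N-1}.

def step(x, rng, sigma=0.1):
    return x + 0.1 * np.sin(x) + sigma * rng.normal()   # one draw from p(x_{t+1} | x_t)

def simulate(x0, N=50, rng=np.random.default_rng(0)):
    x = x0
    for _ in range(N):
        x = step(x, rng)
    return x   # a draw of z = x_N; no likelihood evaluation is available

z = simulate(x0=1.0)
```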

It seems that probabilistic programming languages can handle this kind of inference, mostly via MCMC, so I guess there is some way to do it...

  • Have you considered completion by auxiliary variables (the ones appearing in the integrals) or a form of slice sampling? If worst comes to worst, `ABC` is a likelihood-free solution that has become popular for handling complex likelihood functions (a sketch follows the comments below). – Xi'an Jul 10 '21 at 08:41
  • If you have differential equations or integrals (the latter can usually be recast as differential equations) that are not too crazy, then Stan has various solvers. That may be somewhat slow, but Stan and brms now let you parallelize not just across chains but also within a chain (a good idea if the likelihood is computationally expensive to evaluate). – Björn Jul 10 '21 at 10:20
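To make the ABC suggestion in the first comment concrete, here is a minimal rejection-ABC sketch built on the toy dynamics from the snippet above; the standard-normal prior, the tolerance `eps`, and the absolute-difference distance are all illustrative assumptions:

```python
import numpy as np

def step(x, rng, sigma=0.1):
    return x + 0.1 * np.sin(x) + sigma * rng.normal()   # same toy dynamics as above

def simulate(x0, N=50, rng=np.random.default_rng(0)):
    x = x0
    for _ in range(N):
        x = step(x, rng)
    return x

# Rejection ABC: draw x0 from the prior, push it through the simulator,
# and keep it when the simulated final state lands within eps of the
# observed z. No evaluation of p(z|x) is ever needed.
def abc_rejection(z_obs, n_samples=500, eps=0.05, rng=np.random.default_rng(1)):
    accepted = []
    while len(accepted) < n_samples:
        x0 = rng.normal()                 # draw from the (assumed) prior p(x)
        z_sim = simulate(x0, rng=rng)     # forward-simulate the dynamics
        if abs(z_sim - z_obs) < eps:      # likelihood-free accept/reject
            accepted.append(x0)
    return np.array(accepted)

posterior_draws = abc_rejection(z_obs=1.3)
```

The accepted draws approximate the posterior $p(x|z)$, with the approximation sharpening as `eps` shrinks (at the cost of more rejected simulations).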
