I want to estimate the KL divergence between two continuous distributions $f$ and $g$. However, I can't write down the density for either $f$ or $g$. I can sample from both $f$ and $g$ via some method (for example, Markov chain Monte Carlo).
The KL divergence from $f$ to $g$ is defined like this:
$$D_{\mathrm{KL}}(f \parallel g) = \int_{-\infty}^{\infty} f(x) \log\left(\frac{f(x)}{g(x)}\right) \mathrm{d}x$$
This is the expectation of $\log\left(\frac{f(x)}{g(x)}\right)$ with respect to $f$, so you could imagine some Monte Carlo estimate
$$\frac{1}{N}\sum_{i=1}^N \log\left(\frac{f(x_i)}{g(x_i)}\right)$$
where $i$ indexes $N$ samples that are drawn from $f$ (i.e. $x_i \sim f()$ for $i = 1, \ldots, N$).
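To make the estimator concrete, here is a minimal sketch of it on a hypothetical example where the densities *are* known ($f = \mathcal{N}(0,1)$, $g = \mathcal{N}(1,4)$, chosen only so the estimate can be checked against the closed-form Gaussian KL):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy stand-ins with known densities (assumption for illustration only):
# f = N(0, 1), g = N(1, 2^2).
f = norm(loc=0.0, scale=1.0)
g = norm(loc=1.0, scale=2.0)

# Draw N samples x_i ~ f.
N = 100_000
x = f.rvs(size=N, random_state=rng)

# Monte Carlo estimate: (1/N) * sum_i log(f(x_i) / g(x_i)).
kl_mc = np.mean(f.logpdf(x) - g.logpdf(x))

# Closed-form KL between the two Gaussians, for comparison:
# log(s_g/s_f) + (s_f^2 + (m_f - m_g)^2) / (2 s_g^2) - 1/2.
kl_exact = np.log(2.0 / 1.0) + (1.0 + 1.0) / (2 * 4.0) - 0.5

print(f"MC estimate: {kl_mc:.4f}, exact: {kl_exact:.4f}")
```

The estimate converges to the exact value at the usual $O(1/\sqrt{N})$ Monte Carlo rate — but, as stated below, it requires evaluating both densities pointwise.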
However, since I can't evaluate $f()$ or $g()$ pointwise, I can't even compute this Monte Carlo estimate. What is the standard way of estimating the KL divergence in this situation?
EDIT: I do NOT know the unnormalized density for either $f()$ or $g()$, so methods that require evaluating an unnormalized density won't work either.