We have a bivariate random variable $(X,Y)$ whose joint distribution is challenging to sample from directly.
If we knew how to sample from the conditionals $(X|Y)$ and $(Y|X)$, we could draw samples from the joint using Gibbs sampling by iterating: $$x_{t+1} \sim (X|Y=y_t)$$ $$y_{t+1} \sim (Y|X=x_{t+1}).$$
Suppose, however, that we do not know how to sample from one of the conditionals (say, $Y|X$), but that, for each $x$, we know how to sample from an approximation $\tilde{Y}_x \approx (Y|X=x)$.
If we assume that $\tilde{Y}_x$ is close to $(Y|X=x)$ in some sense (for example, that the Kullback-Leibler divergence between the two distributions is smaller than $\varepsilon$ for every $x$), can we obtain results about the convergence of the pseudo-Gibbs chain: $$x_{t+1} \sim (X|Y=y_t)$$ $$y_{t+1} \sim \tilde{Y}_{x_{t+1}}?$$
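As a toy instance of the question (my own construction, not from the problem statement): take the bivariate normal target with correlation $\rho$, keep the exact conditional for $X|Y$, but replace $Y|X=x \sim N(\rho x, 1-\rho^2)$ by an approximation $\tilde{Y}_x = N(\rho x, (1-\rho^2)(1+\delta))$ with inflated variance. The KL divergence between the true and approximate conditionals is then the same small constant for every $x$. A short calculation on the variance recursion gives a stationary marginal variance $\operatorname{Var}(y) = 1 + \delta/(1+\rho^2)$ for the pseudo chain, so the chain still converges, but to a perturbed stationary distribution whose error is controlled by $\delta$.

```python
import random
import statistics

def pseudo_gibbs(rho, delta, n_steps, seed=1):
    # Pseudo-Gibbs chain: exact conditional draw for X | Y, but Y is drawn
    # from the approximation N(rho*x, (1-rho^2)*(1+delta)) instead of the
    # true conditional N(rho*x, 1-rho^2).  delta=0 recovers exact Gibbs.
    rng = random.Random(seed)
    sd_exact = (1 - rho**2) ** 0.5
    sd_approx = ((1 - rho**2) * (1 + delta)) ** 0.5
    x, y = 0.0, 0.0
    ys = []
    for _ in range(n_steps):
        x = rng.gauss(rho * y, sd_exact)    # x_{t+1} ~ (X | Y = y_t)
        y = rng.gauss(rho * x, sd_approx)   # y_{t+1} ~ approximate Y-tilde_{x_{t+1}}
        ys.append(y)
    return ys

ys = pseudo_gibbs(rho=0.8, delta=0.1, n_steps=20000)
var_y = statistics.variance(ys[2000:])
print(round(var_y, 2))
```

With $\rho = 0.8$ and $\delta = 0.1$, the predicted stationary variance is $1 + 0.1/1.64 \approx 1.06$ rather than the true marginal variance $1$, illustrating the kind of "close but biased" limit one might hope to bound in terms of $\varepsilon$.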