Consider the following setup: Let $\Omega$ be a finite (but humongous) state space and $\pi:\Omega\to[0,1]$ be a probability mass function. It seems to me that when people want to "sample" in this setting, their main motivation is to estimate the expected value of a function $f:\Omega\to\mathbb{R}$ under $\pi$, i.e.
$$\mathbb{E}_\pi(f)=\sum_{\omega\in\Omega}\pi(\omega)f(\omega).$$
This is the case for asymptotic counting, $p$-value estimation in hypothesis testing, and many more (it seems to me that the whole MCMC business is about just that). To approximate $\mathbb{E}_\pi(f)$, one tries to get as many samples $\omega_1,\ldots,\omega_n\in\Omega$ from $\pi$ as possible, because the law of large numbers ensures that $\frac{1}{n}\sum_{i=1}^n f(\omega_i)$ converges almost surely to $\mathbb{E}_\pi(f)$. If sampling from $\pi$ directly is hard, one constructs a Markov chain $(X_t)_{t\in\mathbb{N}}$ on $\Omega$ whose stationary distribution is $\pi$ (for example via the Metropolis-Hastings algorithm).
In this scenario, the number of samples $n$ is supposed to be large!
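For concreteness, here is a minimal sketch (not part of the original question) of the kind of estimate I have in mind; the toy target, the random-walk proposal, and the function $f$ below are illustrative assumptions only:

```python
# Minimal sketch: Metropolis-Hastings on a toy state space Omega = {0,...,K-1},
# used to estimate E_pi[f] by the sample mean along the chain.
# The target pi, the proposal, and f are illustrative choices, not anything specific.
import random

K = 1000                                   # size of the toy state space

def pi_unnorm(w):                          # unnormalized target weight pi(w)
    return (w + 1) ** -1.5

def f(w):                                  # function whose expectation we want
    return w

def metropolis_hastings(n_samples, x0=0):
    samples = []
    x = x0
    for _ in range(n_samples):
        y = (x + random.choice([-1, 1])) % K          # symmetric random-walk proposal
        accept_prob = min(1.0, pi_unnorm(y) / pi_unnorm(x))
        if random.random() < accept_prob:
            x = y
        samples.append(x)
    return samples

chain = metropolis_hastings(200_000)
estimate = sum(f(x) for x in chain) / len(chain)      # approximates E_pi(f)
print(estimate)
```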
Now, my question is the following: Are there situations or applications in statistics (different from approximating an expected value!) where only a single sample $x\in\Omega$ from the distribution $\pi$ is needed?
EDIT: Prompted by the discussion in the comments of this post, let me be more precise: Does anybody know of applications in statistics where the task is to find only a single (say uniform) sample from the lattice points of a polytope, that is, from a set of the form
$$ \Omega=\{u\in\mathbb{Z}^d: Au\le b\}? $$
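To make the target concrete, here is a minimal sketch (illustrative, not part of the question itself) of drawing one exactly uniform lattice point from a toy 2-D polytope by rejection from a bounding box; the matrix $A$, the vector $b$, and the box size are assumptions for illustration, and for the huge polytopes the question is really about, plain rejection would of course be hopeless:

```python
# Minimal sketch: one uniform lattice point of a toy polytope {u in Z^2 : A u <= b}
# via rejection sampling from an integer bounding box. A, b, and the box are
# illustrative assumptions; this only works when rejection has a decent success rate.
import random

A = [[1, 1], [-1, 0], [0, -1]]             # constraints: x + y <= 10, x >= 0, y >= 0
b = [10, 0, 0]

def in_polytope(u):
    return all(sum(a_i * u_i for a_i, u_i in zip(row, u)) <= b_k
               for row, b_k in zip(A, b))

def one_uniform_lattice_point(box=10):
    while True:                            # reject until a point lands in the polytope
        u = (random.randint(0, box), random.randint(0, box))
        if in_polytope(u):
            return u                       # a single exact uniform sample

print(one_uniform_lattice_point())
```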