How can one measure the accuracy of the probability distribution of, say, a physical quantity? I know one good candidate is the entropy, which quantifies the missing information about the system (cf. Appendix A of E. T. Jaynes, Phys. Rev. 106, 620 (1957)): the lower the entropy, the better the system is known. However, one can also think of the variance as a way to evaluate uncertainty: the lower the variance, the better we know the system.$^*$
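It is worth noting that for Gaussian distributions the two measures agree: the differential entropy of $\mathcal{N}(\mu,\sigma^2)$ is
$$H = \frac{1}{2}\ln\!\left(2\pi e\,\sigma^2\right),$$
which grows monotonically with the variance, so a lower variance automatically means a lower entropy. Any disagreement between the two rankings therefore requires non-Gaussian distributions.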
Imagine now that one has two probability distributions for the same variable, $P_1$ and $P_2$. Suppose $P_1$ has the lower entropy and $P_2$ the lower variance (a concrete example of such a pair is sketched below). Is it even possible to say that one provides better knowledge of the system than the other? Could some figure of merit other than the entropy and the variance answer such a question?
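To show that such a pair exists, here is a minimal numerical sketch (using discrete distributions and the Shannon entropy in nats for simplicity; the distributions are illustrative choices of mine, not taken from any reference). $P_1$ puts equal mass on two widely separated points, giving low entropy but large variance, while $P_2$ is uniform over three nearby points, giving higher entropy but small variance:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def variance(x, p):
    """Variance of a discrete distribution with values x and probabilities p."""
    x, p = np.asarray(x, dtype=float), np.asarray(p, dtype=float)
    mean = np.sum(p * x)
    return np.sum(p * (x - mean) ** 2)

# P1: equal mass on two distant points -> low entropy, large variance
x1, p1 = [-10.0, 10.0], [0.5, 0.5]

# P2: uniform over three nearby points -> higher entropy, small variance
x2, p2 = [-1.0, 0.0, 1.0], [1/3, 1/3, 1/3]

print(f"P1: H = {entropy(p1):.3f} nats, Var = {variance(x1, p1):.3f}")
print(f"P2: H = {entropy(p2):.3f} nats, Var = {variance(x2, p2):.3f}")
```

Running this gives $H(P_1) = \ln 2 \approx 0.693$ with $\mathrm{Var}(P_1) = 100$, versus $H(P_2) = \ln 3 \approx 1.099$ with $\mathrm{Var}(P_2) \approx 0.667$: each distribution wins by one criterion and loses by the other.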
$^*$ As a side comment, the Heisenberg uncertainty principle is formulated in terms of the variance, not the entropy, of the probability distribution $|\psi(x)|^2$, where $\psi(x)$ is the wavefunction.
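For reference, the standard variance-based statement is
$$\sigma_x\,\sigma_p \;\geq\; \frac{\hbar}{2},$$
with $\sigma_x$ and $\sigma_p$ the standard deviations of position and momentum computed from $|\psi(x)|^2$ and its momentum-space counterpart.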