Estimating the mean, or expected value, of a continuous random variable (r.v.) from its empirical distribution is known to be difficult, more so than estimating the variance. Estimates of both the mean and the variance from finite samples are therefore prone to estimation error.
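One way to quantify this estimation error concretely is a Monte Carlo experiment. The sketch below is illustrative, not definitive: it assumes standard normal data, a sample size of n = 100, and 10,000 trials, and it measures the root-mean-square error (RMSE) of the sample mean and the unbiased sample variance against the known true values.

```python
# A minimal sketch (assumptions: N(0, 1) data, n = 100, 10,000 trials)
# measuring the sampling error of the mean and variance estimators.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 10_000

samples = rng.standard_normal((trials, n))   # true mean 0, true variance 1
mean_estimates = samples.mean(axis=1)
var_estimates = samples.var(axis=1, ddof=1)  # unbiased sample variance

# Root-mean-square error of each estimator against the true values.
print("RMSE of sample mean:    ", np.sqrt(np.mean((mean_estimates - 0.0) ** 2)))
print("RMSE of sample variance:", np.sqrt(np.mean((var_estimates - 1.0) ** 2)))
```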
Entropy (for discrete r.v.'s) and differential entropy (for continuous ones) are sometimes regarded as measures that capture the entire statistical distribution of a r.v., and in that sense can outshine extensions to higher moments. But how reliably can the entropy of an empirical distribution be estimated, compared to its mean and variance? Is entropy less prone to estimation error?
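One way to make this question concrete is to extend the same Monte Carlo experiment to an entropy estimator. The sketch below is one possible setup, not the only one: it assumes SciPy's spacing-based `scipy.stats.differential_entropy` estimator and the same standard normal data, whose true differential entropy is ½ ln(2πe) ≈ 1.419 nats.

```python
# A minimal sketch (same assumptions: N(0, 1) data, n = 100, 10,000 trials)
# measuring the sampling error of a differential entropy estimator.
import numpy as np
from scipy.stats import differential_entropy

rng = np.random.default_rng(0)
n, trials = 100, 10_000

# True differential entropy of N(0, 1): 0.5 * ln(2 * pi * e)
true_entropy = 0.5 * np.log(2 * np.pi * np.e)

entropy_estimates = np.array([
    differential_entropy(rng.standard_normal(n)) for _ in range(trials)
])
print("RMSE of entropy estimate:",
      np.sqrt(np.mean((entropy_estimates - true_entropy) ** 2)))
```

Comparing this RMSE against those of the mean and variance estimators from the previous sketch gives one empirical angle on the question, for this particular distribution, sample size, and choice of estimator.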