I'm trying to derive which univariate probability distribution maximizes differential entropy subject to a fixed mean $\mu$ and support $[0, \infty)$. I know that the answer is the exponential distribution, but I'm struggling to get there.
I start by defining the Lagrangian:
$L(p(x), \lambda_1, \lambda_2) = H[p(x)] + \lambda_1(\int_0^{\infty} p(x) dx - 1) + \lambda_2(\int_0^{\infty} x p(x) dx - \mu)$
Taking the functional derivative with respect to $p(x)$ and setting it to zero, I find that:
$\log p(x) = -1 + \lambda_1 + \lambda_2 x$
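Exponentiating both sides gives

$p(x) = e^{\lambda_1 - 1} e^{\lambda_2 x}$

which already has the exponential form, up to fixing the multipliers.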
I should be able to solve for $\lambda_1, \lambda_2$ using the two constraints, but I haven't been able to. My most promising attempt involves setting the mean constraint equal to $\mu$ times the normalization constraint:
$\mu \int_0^{\infty} p(x) dx = \int_0^{\infty} x p(x) dx$
Substituting $p(x) = e^{\lambda_1 - 1} e^{\lambda_2 x}$ and cancelling the common factor $e^{\lambda_1 - 1}$ from both sides, this leaves me with integrals I don't know how to evaluate:
$\mu \int_0^{\infty} e^{\lambda_2 x} dx = \int_0^{\infty} x e^{\lambda_2 x} dx$
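For what it's worth, a symbolic check with sympy (note that both integrals only converge when $\lambda_2 < 0$) does produce closed forms for the two sides:

```python
import sympy as sp

x = sp.symbols("x", nonnegative=True)
lam2 = sp.symbols("lambda_2", negative=True)  # both integrals diverge unless lambda_2 < 0

# The two integrals above, after the common factor e^{lambda_1 - 1} has cancelled
lhs = sp.integrate(sp.exp(lam2 * x), (x, 0, sp.oo))      # equals -1/lambda_2
rhs = sp.integrate(x * sp.exp(lam2 * x), (x, 0, sp.oo))  # equals 1/lambda_2**2

print(lhs, rhs)
```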