I believe you are asking whether there exists a distribution of an r.v. $X$ such that, if we have an i.i.d. sample of size $n>1$ from that distribution, it holds that
$$E[GM] = E\left[\left(\prod_{i=1}^n X_{i}\right)^{1/n}\right] = E(X)$$
Due to the i.i.d. assumption, we have
$$E\left[\left(\prod_{i=1}^n X_{i}\right)^{1/n}\right] = E\left(X_1^{1/n}\cdot ...\cdot X_n^{1/n}\right) = E\left (X_1^{1/n}\right)\cdot ...\cdot E\left(X_n^{1/n}\right) = \left[E\left(X^{1/n}\right)\right]^n$$
and so we are asking whether we can have
$$\left[E\left(X^{1/n}\right)\right]^n = E(X)$$
But by Jensen's inequality, and the fact that the power function $t\mapsto t^n$ is strictly convex for powers higher than unity, we have, for a non-degenerate random variable (i.e. one that is not almost surely constant),
$$\left[E\left(X^{1/n}\right)\right]^n < E\left[\left(X^{1/n}\right)^n\right] = E(X)$$
So no such non-degenerate distribution exists.
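A quick Monte Carlo check makes the inequality concrete. The sketch below (Python, with an Exponential(1) distribution chosen purely for illustration) estimates $E[GM]$ by simulation and compares it to $E(X)=1$ and to the closed form $\left[E\left(X^{1/n}\right)\right]^n = \Gamma(1+1/n)^n$:

```python
import numpy as np
from scipy.special import gamma

# Illustrative check of E[GM] < E[X] using an Exponential(1) distribution
# (any non-degenerate, non-negative distribution would do).
rng = np.random.default_rng(0)
n, reps = 5, 200_000                      # sample size, number of simulated samples

x = rng.exponential(scale=1.0, size=(reps, n))
gm = np.exp(np.log(x).mean(axis=1))       # geometric mean of each sample, on the log scale

print("simulated E[GM] :", gm.mean())
print("[E(X^(1/n))]^n  :", gamma(1 + 1 / n) ** n)   # exact, since E[X^s] = Gamma(1+s) for Exp(1)
print("E[X]            :", 1.0)
```

Both the simulated and the exact values of $E[GM]$ land well below $E(X)=1$, as the strict inequality requires.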
Regarding the mention of the log-normal distribution in a comment, what holds is that the geometric mean ($GM$) of a sample from a log-normal distribution is a biased but consistent estimator of the median. This is because, for the log-normal distribution, it holds that
$$E(X^s) = \exp\left\{s\mu + \frac {s^2\sigma^2}{2}\right \}$$
(where $\mu$ and $\sigma$ are the parameters of the underlying normal, not the mean and variance of the log-normal).
In our case, $s = 1/n$ so we get
$$E(GM) = \left[E\left(X^{1/n}\right)\right]^n = \left[\exp\left\{(\mu/n) + \frac {\sigma^2}{2n^2}\right \}\right]^n = \exp\left\{\mu + \frac {\sigma^2}{2n}\right \}$$
(which tells us that the geometric mean is an upward-biased estimator of the median).
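To see the bias formula in action, here is a small simulation sketch (the values of $\mu$, $\sigma$ and $n$ are arbitrary illustrative choices):

```python
import numpy as np

# Simulation check of E[GM] = exp(mu + sigma^2/(2n)) for log-normal samples;
# mu, sigma and n are illustrative values only.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.5, 1.0, 10, 200_000

x = rng.lognormal(mean=mu, sigma=sigma, size=(reps, n))
gm = np.exp(np.log(x).mean(axis=1))       # geometric mean of each sample

print("simulated E[GM]        :", gm.mean())
print("exp(mu + sigma^2/(2n)) :", np.exp(mu + sigma**2 / (2 * n)))
print("median exp(mu)         :", np.exp(mu))
```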
But
$$\lim_{n\to\infty} \left[E\left(X^{1/n}\right)\right]^n = \lim_{n\to\infty} \exp\left\{\mu + \frac {\sigma^2}{2n}\right \} = e^{\mu}$$
which is the median of the distribution. One can also show that the variance of the sample geometric mean converges to zero, and these two conditions (asymptotic unbiasedness and vanishing variance) are sufficient for the estimator to be consistent for the median,
$$GM \to_p e^{\mu}$$
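A final sketch (again with arbitrary $\mu$ and $\sigma$) illustrates the convergence: as $n$ grows, the geometric mean of a single sample settles on the median $e^{\mu}$ rather than on the mean $\exp\{\mu + \sigma^2/2\}$:

```python
import numpy as np

# Consistency sketch: the geometric mean of one log-normal sample of size n
# approaches the median exp(mu) as n grows (mu, sigma are illustrative).
rng = np.random.default_rng(2)
mu, sigma = 0.5, 1.0

for n in (10, 1_000, 100_000, 10_000_000):
    x = rng.lognormal(mean=mu, sigma=sigma, size=n)
    gm = np.exp(np.log(x).mean())
    print(f"n = {n:>10,}  GM = {gm:.4f}")

print("median exp(mu)           =", round(np.exp(mu), 4))
print("mean exp(mu + sigma^2/2) =", round(np.exp(mu + sigma**2 / 2), 4))
```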