Your question can be read in two ways: (1) how does the expected value relate to the mean of a distribution? or (2) how does the expected value of a random variable relate to the arithmetic mean of a sample?

In the first case, the answer is that they are synonyms.
As for the second case, let's look at them a little more closely. Recall that the expected value of a discrete random variable is defined as $E(X) = \sum_x x\,P(x)$. Say you take a random sample of size $N$ and observe values $x_1,x_2,\dots,x_N$, all following the same probability distribution. Some of the observed values may coincide: say $x_2, x_5, x_{N-3}$ are all equal to some value $x$, so we can say we observed $n(x) = 3$ occurrences of $x$ in the sample. Given that the sample is random and large enough, you can expect the proportion of times any particular value appears to be close to the probability of drawing that value from the distribution, i.e. $\tfrac{n(x)}{N} \approx P(x)$. Now, if we calculate the arithmetic mean, we get
$$
\frac{1}{N} \sum_{i=1}^N x_i = \frac{1}{N} \sum_x x\,n(x) = \sum_x x\,\tfrac{n(x)}{N} \approx \sum_x x\,P(x)
$$
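To see this numerically, here is a small simulation (the choice of a fair die, so $E(X) = 3.5$, is just for illustration): the mean computed directly over the sample and the mean computed via the counts $n(x)/N$ agree exactly, and both approach the expected value for large $N$.

```python
import random

random.seed(0)

# A fair six-sided die: P(x) = 1/6 for x in 1..6, so E(X) = 3.5.
N = 100_000
sample = [random.randint(1, 6) for _ in range(N)]

# Arithmetic mean computed the usual way...
mean_direct = sum(sample) / N

# ...and the same mean computed via the counts n(x)/N,
# mirroring the rearrangement in the derivation above.
counts = {x: sample.count(x) for x in set(sample)}
mean_via_counts = sum(x * n / N for x, n in counts.items())

print(mean_direct, mean_via_counts)  # both close to 3.5
```

The two sums are the same numbers grouped differently, which is exactly what the chain of equalities above says; only the final $\approx$ step depends on the sample being large.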
The same is true for continuous random variables, where we define the expected value as $E(X) = \int x\,f(x)\,dx$. Here $f$ is a probability density: probability per unit of $x$, rather than a probability itself. Notice that $P(t_i < X \le t_{i+1}) = \int_{t_i}^{t_{i+1}} f(t)\,dt$, so if we binned the continuous variable into some number of buckets, we could use these integrals to calculate the probability that $X$ falls into a particular bucket $(t_i, t_{i+1}]$. Calculating the expected value of such a binned variable is the same as for a discrete random variable, because binning discretizes it. As we move from a finite number of buckets to infinitely many infinitesimally small bins, we are talking about probability densities instead of probabilities, and about continuous random variables again, so all the calculus comes in, but the basic idea is the same.