I am confused in applying expectation in denominator.
$E(1/X)=\,?$
can it be $1/E(X)\,$?
> can it be $1/E(X)$?
No, in general it can't; Jensen's inequality tells us that if $X$ is a random variable and $\varphi$ is a convex function, then $\varphi(\text{E}[X]) \leq \text{E}\left[\varphi(X)\right]$. If $X$ is strictly positive, then $\varphi(x)=1/x$ is convex on the support of $X$, so $\text{E}[1/X]\geq 1/\text{E}[X]$; and since $1/x$ is strictly convex there, equality occurs only if $X$ has zero variance ... so in the cases we tend to be interested in, the two are generally unequal.
Assuming we're dealing with a positive variable, if it's clear to you that $X$ and $1/X$ will be inversely related ($\text{Cov}(X,1/X)\leq 0$), then, since $E(X \cdot 1/X) = E(1) = 1$, this would imply $1 - E(X) E(1/X) \leq 0$, which implies $E(X) E(1/X) \geq 1$, so $E(1/X) \geq 1/E(X)$.
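For concreteness, here is a minimal Monte Carlo sketch (not part of the original answer; it assumes Python with NumPy and an arbitrary gamma-distributed $X$) illustrating both points: the sample covariance of $X$ and $1/X$ comes out negative, and the sample mean of $1/X$ exceeds the reciprocal of the sample mean of $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=3.0, scale=1.0, size=1_000_000)   # an arbitrary strictly positive X

inv_x = 1.0 / x
print("Cov(X, 1/X):", np.cov(x, inv_x)[0, 1])   # negative
print("E[1/X]     :", inv_x.mean())             # ~ 1/(3-1) = 0.5 for this gamma
print("1/E[X]     :", 1.0 / x.mean())           # ~ 1/3
```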
> I am confused in applying expectation in denominator.
Use the law of the unconscious statistician
$$\text{E}[g(X)] = \int_{-\infty}^\infty g(x) f_X(x) dx$$
(in the continuous case)
so when $g(X) = \frac{1}{X}$, $\text{E}\left[\frac{1}{X}\right]=\int_{-\infty}^\infty \frac{f_X(x)}{x}\, dx$
In some cases the expectation can be evaluated by inspection (e.g. with gamma random variables), or by deriving the distribution of the inverse, or by other means.
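As an illustrative sketch (not from the original answer; it assumes $X \sim \text{Uniform}(1,3)$ and uses SciPy for the numerical integration), the LOTUS integral can be evaluated directly and compared with $1/\text{E}[X]$: here $\text{E}[1/X] = \tfrac{1}{2}\ln 3 \approx 0.549$ while $1/\text{E}[X] = 0.5$.

```python
from scipy import integrate, stats

dist = stats.uniform(loc=1, scale=2)   # X ~ Uniform(1, 3)

# LOTUS: E[1/X] = integral of (1/x) * f_X(x) dx over the support
e_inv_x, _ = integrate.quad(lambda x: dist.pdf(x) / x, 1, 3)

print("E[1/X] via LOTUS:", e_inv_x)            # 0.5 * ln(3) ≈ 0.549
print("1/E[X]          :", 1.0 / dist.mean())  # 0.5
```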
As Glen_b says, that's generally not the case, because the reciprocal is a non-linear function. If you want an approximation to $E(1/X)$, maybe you can use a second-order Taylor expansion around $E(X)$:
$$ E \left( \frac{1}{X} \right) \approx E\left( \frac{1}{E(X)} - \frac{1}{E(X)^2}\big(X-E(X)\big) + \frac{1}{E(X)^3}\big(X - E(X)\big)^2 \right) = \frac{1}{E(X)} + \frac{\operatorname{Var}(X)}{E(X)^3} $$ so you just need the mean and variance of $X$, and if the distribution of $X$ is symmetric this approximation can be very accurate.
EDIT: the "maybe" above is quite critical; see the comment from BioXX below.
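As a hedged check of the approximation (not part of the original answer; the choice $X \sim \text{Uniform}(8,12)$ is just an arbitrary symmetric example), the second-order formula can be compared with the exact value obtained by numerical integration:

```python
from scipy import integrate, stats

dist = stats.uniform(loc=8, scale=4)    # X ~ Uniform(8, 12), symmetric around 10
mu, var = dist.mean(), dist.var()       # mu = 10, var = 4/3

approx = 1.0 / mu + var / mu**3         # second-order Taylor approximation
exact, _ = integrate.quad(lambda x: dist.pdf(x) / x, 8, 12)   # = ln(1.5)/4

print("Taylor approximation:", approx)  # ≈ 0.10133
print("Exact E[1/X]        :", exact)   # ≈ 0.10137
```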
Others have already explained that the answer to the question is no, except in trivial cases. Below we give an approach to finding $\DeclareMathOperator{\E}{\mathbb{E}} \E \frac1{X}$ when $X>0$ with probability one and the moment generating function $M_X(t) = \E e^{tX}$ exists. An application of this method (and a generalization) is given in Expected value of $1/x$ when $x$ follows a Beta distribution; here we also give a simpler example.
First, note that $\int_0^\infty e^{-t x}\; dt = \frac1{x}$ (simple calculus exercise). Then, write $$ \E \left(\frac1{X}\right) = \int_0^\infty x^{-1} f(x)\; dx = \int_0^\infty \left( \int_0^\infty e^{-tx}\; dt \right) f(x)\; dx =\\ \int_0^\infty \left( \int_0^\infty e^{-tx} f(x) \; dx \right) \; dt = \int_0^\infty M_X(-t) \; dt $$ A simple application: Let $X$ have the exponential distribution with rate 1, that is, with density $e^{-x}, x>0$ and moment generating function $M_X(t)=\frac1{1-t}, t<1$. Then $\int_0^\infty M_X(-t)\; dt = \int_0^\infty \frac1{1+t} \; dt= \ln(1+t) \bigg\rvert_0^\infty = \infty$, so the integral definitely does not converge, that is $\E(1/X)=\infty$, which is very different from $\frac1{\E X}=\frac11=1$.
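Here is a small numerical sketch of the identity (not in the original answer; it uses an arbitrary gamma example, $X$ with shape $3$ and rate $1$, for which $M_X(-t) = (1+t)^{-3}$ and both sides should equal $\frac1{3-1} = 0.5$):

```python
import numpy as np
from scipy import integrate

# X ~ Gamma(shape=3, rate=1): M_X(t) = (1 - t)^(-3) for t < 1, and E[1/X] = 1/(3-1) = 0.5
lhs, _ = integrate.quad(lambda t: (1.0 + t) ** (-3), 0, np.inf)   # integral of M_X(-t) dt

rng = np.random.default_rng(0)
rhs = (1.0 / rng.gamma(shape=3.0, scale=1.0, size=1_000_000)).mean()  # Monte Carlo E[1/X]

print("integral of M_X(-t):", lhs)   # 0.5
print("Monte Carlo E[1/X] :", rhs)   # ~ 0.5
```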
An alternative approach to calculating $E(1/X)$ when $X$ is a positive random variable is through its moment generating function evaluated at a negative argument, $E[e^{-\lambda X}]$. Since by elementary calculus $$ \int_0^\infty e^{-\lambda x}\, d\lambda =\frac{1}{x}, $$ we have, by Fubini's theorem, $$ \int_0^\infty E[e^{-\lambda X}]\, d\lambda =E\left[\frac{1}{X}\right]. $$
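As a worked example of this approach (added here for illustration, not part of the original answer), take $X \sim \text{Gamma}(\alpha, \beta)$ with $\alpha > 1$, so that $E[e^{-\lambda X}] = \big(\tfrac{\beta}{\beta+\lambda}\big)^\alpha$; then $$ \int_0^\infty \left(\frac{\beta}{\beta+\lambda}\right)^{\alpha} d\lambda = \frac{\beta}{\alpha-1} = E\left[\frac{1}{X}\right], $$ whereas $1/E[X] = \beta/\alpha$, which is strictly smaller.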
To first give some intuition, what about using the discrete, finite-sample case to illustrate that $\text{E}(1/X)\neq 1/\text{E}(X)$ (putting aside cases such as $\text{E}(X)=0$)?
In a finite sample, using the term average for expectation is not that abusive; thus, if one has on the one hand
$\text{E}(X) = \frac{1}{N}\sum_{i=1}^N X_i$
and one has on the other hand
$\text{E}(1/X) = \frac{1}{N}\sum_{i=1}^N 1/X_i$
it becomes obvious that, with $N>1$ (and the $X_i$ not all equal),
$\text{E}(1/X) = \frac{1}{N}\sum_{i=1}^N 1/X_i \neq \frac{N}{\sum_{i=1}^N X_i} = 1/\text{E}(X)$
This leads one to say that, basically, $\text{E}(1/X)\neq 1/\text{E}(X)$, since the inverse of the (discrete) sum is not the (discrete) sum of the inverses.
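A tiny numerical sketch (an arbitrary three-point sample, added here for illustration) makes the inequality concrete:

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0])             # a small finite sample

mean_of_inverses = (1.0 / x).mean()       # (1 + 1/2 + 1/4) / 3 ≈ 0.583
inverse_of_mean = 1.0 / x.mean()          # 3 / 7 ≈ 0.429

print(mean_of_inverses, inverse_of_mean)  # unequal, with E(1/X) > 1/E(X) here
```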
Analogously, in the asymptotic (continuous) case, one has
$\text{E}(1/X)=\int_{-\infty}^\infty \frac{f(x)}{x} dx \neq 1/\int_{-\infty}^\infty xf(x) dx = 1/\text{E}(X)$.