This is the Jensen's inequality I saw in my textbook:
$$\operatorname{E}[ f(X) ] \geq f( \operatorname{E}[X] ),$$
where $f$ is a convex function.
Is this also applicable for two random variables--independent or otherwise--like this:
$$\operatorname{E}[ f(X,Y) ] \geq f( \operatorname{E}[(X,Y)] )?$$
Yes, Jensen's inequality holds for random vectors as well.
A general formulation can be found in the measure-theoretic section of the Wikipedia article:
"Let (Ω, A, μ) be a measure space such that μ(Ω) = 1. If g is a real-valued μ-integrable function and φ is a convex function on the real line, then $$\varphi\left(\int_\Omega g\, d\mu\right) \le \int_\Omega \varphi \circ g\, d\mu.$$"
This generalizes to convex functions on vector spaces (the statement above is just the restriction to one dimension); see the section "General inequality in a probabilistic setting" of the same article.
More intuitively: set $Z = (X, Y)$. Then $Z$ is a random vector, and you can apply Jensen's inequality to $Z$.
To compute the right-hand side, the expectation of the random vector $(X, Y)$ is taken componentwise. If $j(x,y)$ is the joint probability density function of $X$ and $Y$, then
$$\operatorname{E}[(X,Y)] = \left( \int\!\!\int x \, j(x,y)\,dx\,dy,\ \int\!\!\int y \, j(x,y)\,dx\,dy \right) = \left( \operatorname{E}[X], \operatorname{E}[Y] \right).$$
And for the left-hand side:
$$\operatorname{E}[g(X,Y)] = \int\int g(x,y) \, j(x,y)\,dx\,dy$$
One point we have not discussed: for these expressions to make sense, you have to make sure that $g$ is measurable and that $g(X, Y)$ is integrable.
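As a quick sanity check, here is a small Monte Carlo sketch (my own illustration, not part of the answer above) that estimates both sides of the inequality for the convex function $f(x,y) = x^2 + y^2$, using correlated normal samples, since independence is not required:

```python
# Monte Carlo check of E[f(X, Y)] >= f(E[X], E[Y]) for a convex f.
# f(x, y) = x^2 + y^2 is convex on R^2; X and Y are deliberately correlated.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Correlated normals: mean (1, -0.5), covariance with off-diagonal 0.6.
cov = [[1.0, 0.6], [0.6, 2.0]]
x, y = rng.multivariate_normal([1.0, -0.5], cov, size=n).T

def f(x, y):
    return x**2 + y**2  # convex in (x, y)

lhs = f(x, y).mean()          # estimates E[f(X, Y)]
rhs = f(x.mean(), y.mean())   # f applied to the componentwise mean

print(lhs, rhs, lhs >= rhs)
```

For these parameters the gap is large: the left side estimates $\operatorname{E}[X^2] + \operatorname{E}[Y^2] = 4.25$, while the right side is $1^2 + 0.5^2 = 1.25$.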