5

Consider the function $Y = 1 - 2 \Phi((c_j - \mu)/\sigma) + 2 \Phi^2((c_j - \mu)/\sigma)$

where $\Phi$ is the cumulative distribution function for the standard normal distribution and $c_j$ is a uniformly distributed random variable on the range -1 to 1. I would like to be able to express the expected value of $Y$ in terms of $\mu$ and $\sigma$.

It seems clear that $E(Y)$ is increasing in the distance between $E[c_j]$ and $\mu$, and decreasing as $\sigma$ increases. However, I can also see (from simulating) that the effect of $\sigma$ on $Y$ is dependent on $c_j - \mu$. This suggests that there is an interaction between $\sigma$ and $\mu$.

My question is whether there is a way of describing this relationship analytically. I'm afraid that I'm a bit useless at this, and although the simulated results might be sufficient for my purposes, I would like to be able to show this a little more concisely; I suspect I am missing something obvious. Any suggestions appreciated!

The plot below shows the results of the simulation:

[plot of simulated $E[Y]$ for various values of $\mu$ and $\sigma$]

Notes: in the simulation I am treating $c_j$ as uniformly distributed, varying the values of $\sigma$ and $\mu$ and simply plugging them into the formula above before taking an average.
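The averaging procedure described above can be sketched in a few lines (my own code, not the OP's; the function name `simulate_EY`, the sample size, and the seed are arbitrary choices):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate_EY(mu, sigma, n=200_000):
    """Monte Carlo estimate of E[Y] with c_j ~ Uniform(-1, 1)."""
    c = rng.uniform(-1.0, 1.0, size=n)
    p = norm.cdf((c - mu) / sigma)      # Phi((c_j - mu) / sigma)
    y = 1.0 - 2.0 * p + 2.0 * p**2      # Y for each draw
    return y.mean()

# E[Y] grows as mu moves away from E[c_j] = 0, and shrinks as sigma grows:
print(simulate_EY(mu=0.0, sigma=1.0))
print(simulate_EY(mu=2.0, sigma=1.0))
print(simulate_EY(mu=0.0, sigma=2.0))
```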

user2728808
  • 350
  • 2
  • 8
  • 2
    Please check your parentheses carefully in the latter half of your formula. Writing something like $\Phi(x)^2$ (where $x$ is $(c_j-\mu)/\sigma$) is ambiguous: did you mean $[\Phi(x)]^2$ (which could also be written as $\Phi^2(x)$, in analogy with $\sin^2(x)$ denoting the square of the number that is the sine of $x$)? Or was it a typo for $\Phi\left(x^2\right)$? – Dilip Sarwate Dec 22 '15 at 23:22
  • 1
    Apologies, I've clarified the equation according to your suggestion - it was supposed to denote $[\Phi(x)]^2$, though I have adopted your suggested notation. Thanks. – user2728808 Dec 22 '15 at 23:37

2 Answers

5

As $C$ varies from $-1$ to $+1$, the function $\Phi\left(\frac{C-\mu}{\sigma}\right)$ is a slowly increasing function whose value varies from $\Phi\left(\frac{-1-\mu}{\sigma}\right)$ to $\Phi\left(\frac{1-\mu}{\sigma}\right)$.

A more generic question is:

What is $E[\Phi(X)]$ when $X$ is uniformly distributed on (a,b)?

The answer can be obtained via integration by parts and use of the result $\frac{\mathrm d}{\mathrm dx}\phi(x) = -x\cdot \phi(x)$, where $\phi(x)$ is the standard normal density function. We have that
\begin{align}
E[\Phi(X)] &= \frac{1}{b-a}\int_a^b \Phi(x)\,\mathrm dx\\
&= \frac{1}{b-a}\left[\Phi(x)\cdot x\,\Big\vert_a^b - \int_a^b \phi(x)\cdot x \,\mathrm dx\right]\\
&= \frac{b\Phi(b) - a\Phi(a)}{b-a} + \frac{\phi(x)}{b-a}\Big\vert_a^b\\
&= \frac{b\Phi(b) - a\Phi(a) + \phi(b)-\phi(a)}{b-a}.
\end{align}
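This closed form is easy to sanity-check against direct numerical integration (my own sketch using SciPy; the helper name `E_Phi_uniform` is mine):

```python
from scipy.integrate import quad
from scipy.stats import norm

def E_Phi_uniform(a, b):
    """E[Phi(X)] for X ~ Uniform(a, b), using the closed form above."""
    return (b * norm.cdf(b) - a * norm.cdf(a)
            + norm.pdf(b) - norm.pdf(a)) / (b - a)

# Compare against direct numerical integration of Phi over (a, b)
a, b = -1.7, 0.4
direct = quad(norm.cdf, a, b)[0] / (b - a)
print(E_Phi_uniform(a, b), direct)  # the two values should agree
```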

Similarly,
\begin{align}
E\left[\Phi^2(X)\right] &= \frac{1}{b-a}\int_a^b \Phi^2(x)\,\mathrm dx\\
&= \frac{1}{b-a}\left[\Phi^2(x)\cdot x\,\Big\vert_a^b - \int_a^b 2\Phi(x)\phi(x)\cdot x \,\mathrm dx\right]\\
&= \frac{b\Phi^2(b) - a\Phi^2(a)}{b-a} - \frac{1}{b-a}\int_a^b 2\Phi(x)\phi(x)\cdot x \,\mathrm dx\\
&= \frac{b\Phi^2(b) - a\Phi^2(a)}{b-a} - \frac{2}{b-a}\left[-\Phi(x)\phi(x)\,\Big\vert_a^b + \int_a^b \phi^2(x)\,\mathrm dx\right]\\
&= \frac{\left(b\Phi^2(b) - a\Phi^2(a)\right)+2\Phi(b)\phi(b) -2\Phi(a)\phi(a)}{b-a}\\
&\qquad\qquad- \frac{1}{(b-a)\sqrt{\pi}}\int_a^b \frac{e^{-x^2}}{\sqrt{\pi}}\, \mathrm dx.
\end{align}
Now, $\displaystyle \frac{e^{-x^2}}{\sqrt{\pi}}$ is the density of a normal random variable $Z$ with mean $0$ and variance $\frac 12$, and so that last integral is just $P\{a < Z < b\} = \Phi\left(\sqrt{2}\,b\right)-\Phi\left(\sqrt{2}\,a\right)$. I will leave to you the task of working out the details, then plugging in $\frac{\pm 1 - \mu}{\sigma}$ for $b$ and $a$ in the above formulas and finally figuring out $E[Y]$.
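The step left to the reader can be sketched numerically (my own code; `E_Phi`, `E_Phi2`, and `E_Y` are hypothetical helper names, and `E_Y` simply substitutes $a = (-1-\mu)/\sigma$, $b = (1-\mu)/\sigma$ into $E[Y] = 1 - 2E[\Phi] + 2E[\Phi^2]$):

```python
import numpy as np
from scipy.stats import norm

Phi, phi = norm.cdf, norm.pdf

def E_Phi(a, b):
    """E[Phi(X)], X ~ Uniform(a, b): first closed form above."""
    return (b * Phi(b) - a * Phi(a) + phi(b) - phi(a)) / (b - a)

def E_Phi2(a, b):
    """E[Phi^2(X)], X ~ Uniform(a, b): second closed form above."""
    main = (b * Phi(b)**2 - a * Phi(a)**2
            + 2.0 * (Phi(b) * phi(b) - Phi(a) * phi(a))) / (b - a)
    corr = (Phi(np.sqrt(2.0) * b) - Phi(np.sqrt(2.0) * a)) / ((b - a) * np.sqrt(np.pi))
    return main - corr

def E_Y(mu, sigma):
    """E[Y] = 1 - 2 E[Phi] + 2 E[Phi^2] with a = (-1-mu)/sigma, b = (1-mu)/sigma."""
    a, b = (-1.0 - mu) / sigma, (1.0 - mu) / sigma
    return 1.0 - 2.0 * E_Phi(a, b) + 2.0 * E_Phi2(a, b)

print(E_Y(0.0, 1.0))  # ≈ 0.588
```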

Dilip Sarwate
  • 41,202
  • 4
  • 94
  • 200
  • This is great, thanks so much for the help. I am still working through this (I'm new to calculus!) but was wondering if you could explain a little more what you mean by "Now, $\displaystyle \phi^2(x) = \frac{e^{-x^2}}{2\pi}$ is just $\frac{1}{\sqrt{\pi}}$ times the density of a $N(0,\frac 12)$ random variable and so $\int_a^b \phi^2(x)\,\mathrm dx$ can be evaluated in terms of $\Phi(\cdot)$." Apologies if I'm missing something obvious, but I don't quite follow. – user2728808 Dec 23 '15 at 10:55
  • @user2728808 See revised version. – Dilip Sarwate Dec 23 '15 at 13:27
2

Random variable $Y$ can be expressed as:

$$Y \;=\; \frac{1}{2}\left(1 + \text{Erf}^2\!\left(\frac{X-\mu}{\sqrt{2}\,\sigma}\right)\right)$$

where Erf[z] denotes the error function $\frac{2}{\sqrt{\pi }}\int _0^z e^{-t^2}d t$, and where $X \sim \text{Uniform}(-1,1)$ with pdf $f(x)$:

$$f(x) = \tfrac{1}{2} \quad \text{for } -1 < x < 1$$
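The Erf representation follows because $\Phi(z) = \tfrac12\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$, so completing the square collapses $Y = 1 - 2\Phi(z) + 2\Phi^2(z)$ to $\tfrac12\left(1 + \operatorname{erf}^2(z/\sqrt{2})\right)$. A quick numerical confirmation of that identity (my own sketch, not from the answer):

```python
import numpy as np
from scipy.stats import norm
from scipy.special import erf

# Y = 1 - 2*Phi(z) + 2*Phi(z)^2 should equal (1 + erf(z/sqrt(2))^2) / 2
z = np.linspace(-4.0, 4.0, 101)
lhs = 1.0 - 2.0 * norm.cdf(z) + 2.0 * norm.cdf(z)**2
rhs = (1.0 + erf(z / np.sqrt(2.0))**2) / 2.0
print(np.max(np.abs(lhs - rhs)))  # agreement to machine precision
```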

Then, $E[Y]$ can be solved analytically as:

[exact closed-form expression for $E[Y]$ returned by Mathematica, not reproduced here]

where I am using the Expect function from the mathStatica add-on to Mathematica to do the nitty-gritties.

While the result is not necessarily pretty, it is exact and symbolic (which is what the OP was seeking), and one can differentiate it, or plot it etc.
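For readers without mathStatica, the same values can be reproduced by numerical quadrature over the Erf form of $Y$ (my own sketch; `E_Y` is a hypothetical helper, not the symbolic output above):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def E_Y(mu, sigma):
    """E[Y] for X ~ Uniform(-1, 1), integrating Y = (1 + erf^2)/2 directly."""
    integrand = lambda x: (1.0 + erf((x - mu) / (np.sqrt(2.0) * sigma))**2) / 2.0
    return quad(integrand, -1.0, 1.0)[0] / 2.0   # 1/2 is the uniform density

# Values underlying the plot below: mu = 0, 1, 2 across a range of sigma
for mu in (0.0, 1.0, 2.0):
    print(mu, [round(E_Y(mu, s), 3) for s in (0.5, 1.0, 2.0)])
```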

Here is a plot of the solution $E[Y]$, as $\sigma$ increases, when $\mu = 0$ (blue), $\mu = 1$ (orange), and $\mu = 2$ (green)

[plot of $E[Y]$ against $\sigma$ for $\mu = 0, 1, 2$]

wolfies
  • 6,963
  • 1
  • 22
  • 27