
I'm reviewing for a test, and I am not sure if I am getting the right solution.

Let $X$ and $Y$ be iid $\mathcal{N}(0, \sigma^2)$ random variables.

a. Find the distribution of $U = X^2 + Y^2$ and $V = \frac{X}{\sqrt{X^2 + Y^2}}$.

b. Are $U, V$ independent?

c. Suppose $\sin(\theta) = V$. Find the distribution of $\theta$ when $0 \le \theta \le \pi/2$.

(tentative answers):

I get

  1. $f_{U,V}(u,v) = \frac{1}{4\pi} \sigma^{-2} \exp \left[ -u/(2\sigma^2) \right]| (1-v^2)^{1/2} + (1-v^2)v^2|$,

  2. yes (the density factors and the supports do not depend on each other), and

  3. $g(\theta) = \left[\cos^2(\theta) + \cos^3(\theta)\sin^2(\theta)\right]\frac{1}{8 \pi \sigma^4}$.

Anybody recognize any of these distributions?

Taylor
  • $U$ should work out to be an exponential random variable and $V$ is the distribution of $\cos \Theta$ for $\Theta \sim U[0,2\pi)$. – Dilip Sarwate Jan 13 '13 at 04:43
  • 1. Your answer to (3) is puzzling, because the total probability (the integral of $g$) will depend on $\sigma$, whereas it cannot: it must always equal $1$. 2. For some insight into this question, read about the [Box Muller transform](http://en.wikipedia.org/wiki/Box%E2%80%93Muller_transform). – whuber Jan 13 '13 at 17:26
  • Thanks guys. (1) simplifies to $\left[ \frac{1}{2\pi}\frac{1}{\sqrt{1-v^2}}\right] \left[\frac{1}{2\sigma^2}\exp\left[ \frac{-u}{2\sigma^2}\right] \right]$, and that makes it easier to find the density of $\theta$, as well. – Taylor Jan 13 '13 at 18:24

2 Answers


Because there's a subtlety here, this question is worth a correct answer. But let's develop it with as little work as possible, in the most straightforward manner.

What subtlety? The variables $(U,V)$ do not determine $(X,Y).$
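
To make this concrete, here is a minimal numerical sketch (Python; the point $(1.3, -0.7)$ is an arbitrary choice of mine): the pairs $(x, y)$ and $(x, -y)$ produce the identical $(u, v)$.

```python
import numpy as np

x, y = 1.3, -0.7                    # an arbitrary point (x, y)
for yy in (y, -y):                  # try (x, y) and then (x, -y)
    u = x**2 + yy**2                # u = x^2 + y^2
    v = x / np.sqrt(x**2 + yy**2)   # v = x / sqrt(x^2 + y^2)
    print(u, v)                     # both iterations print the same (u, v)
```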

The change of variables from $(X,Y)$ to $(U,V)$ is two-to-one: because $(U,V)$ gives us information about $Y$ only in the form of $Y^2,$ whenever $(X,Y)$ corresponds to $(U,V),$ so does $(X,-Y).$ Almost surely, $Y\ne -Y$ (the chance of this for a Normal distribution of $Y$ is zero). This means the density of $(U,V)$ will be twice what a mindless application of the routine Calculus formulas indicates.

Those routine formulas are, of course, the computation of the Jacobian. This is just an old-fashioned term for computing the differential element $\mathrm{d}x\,\mathrm{d}y$ in terms of the new variables. One of the easiest ways to work it out starts with the formulas

$$X = V\sqrt{U};\ Y = \sqrt{1-V^2}\sqrt{U}.$$

Taking differentials,

$$\begin{aligned} \mathrm{d}x\,\mathrm{d}y &= \left(\frac{v}{2\sqrt{u}}\mathrm{d}u + \sqrt{u}\mathrm{d}v\right)\,\left(\frac{\sqrt{1-v^2}}{2\sqrt{u}}\mathrm{d}u - \frac{v\sqrt{u}}{\sqrt{1-v^2}}\mathrm{d}v\right) \\ &= \frac{1}{2\sqrt{1-v^2}}\mathrm{d}v\,\mathrm{d}u. \end{aligned}$$
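
(If you want a machine check of this differential element, here is a sketch using SymPy; the assumptions $u > 0$ and $v$ real are mine and match the support identified below.)

```python
import sympy as sp

u = sp.symbols('u', positive=True)   # u = x^2 + y^2 >= 0
v = sp.symbols('v', real=True)       # -1 <= v <= 1 on the support

x = v * sp.sqrt(u)                   # X = V*sqrt(U)
y = sp.sqrt(1 - v**2) * sp.sqrt(u)   # Y = sqrt(1 - V^2)*sqrt(U)

# Jacobian determinant of the map (u, v) -> (x, y)
J = sp.Matrix([x, y]).jacobian(sp.Matrix([u, v])).det()
print(sp.simplify(J))                # equals -1/(2*sqrt(1 - v**2)), so |J| = 1/(2*sqrt(1 - v**2))
```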

Substitute everything into the original probability element for the bivariate Normal distribution taking care to indicate what the possible values of the variables $u$ and $v$ can be. Omitting this is another pitfall that plagues the uninitiated, so I will be explicit, using $\mathcal{I}$ to represent the indicator function:

$$\begin{aligned} f_{X,Y}(x,y)\,\mathrm{d}x\,\mathrm{d}y &= \frac{1}{2\pi\sigma^2} \exp\left(-\frac{x^2+y^2}{2\sigma^2}\right)\,\mathrm{d}x\,\mathrm{d}y \\ &= \frac{1}{2\pi\sigma^2} \exp\left(-\frac{u}{2\sigma^2}\right)\, \frac{1}{2\sqrt{1-v^2}}\mathrm{d}v\,\mathrm{d}u\ \mathcal{I}(u\ge 0)\,\mathcal{I}(-1\le v\le 1). \end{aligned}$$

(It would be incorrect to write this formula without the indicator functions. Usually readers are expected to notice that $\sqrt{1-v^2}$ is defined only for $-1\le v\le 1,$ so in informal settings we can get away without indicating that explicitly; but it's not quite as noticeable that although $\exp(-u/(2\sigma^2))$ is defined everywhere, its integral diverges unless $u$ is explicitly restricted.)

Introduce the factor of $2$ from the two-to-one transformation and notice the probability element splits into a factor depending only on $u$ and one depending only on $v:$

$$f_{U,V}(u,v)\,\mathrm{d}v\,\mathrm{d}u = \left[\frac{1}{2\sigma^2} \exp\left(-\frac{u}{2\sigma^2}\right)\,\mathrm{d}u\ \mathcal{I}(u\ge 0)\right]\, \left[\frac{1}{\pi\sqrt{1-v^2}}\,\mathrm{d}v\ \mathcal{I}(-1\le v\le 1)\right].$$

In one stroke this answers (a) and (b): because the probability element factors, the variables $U$ and $V$ are independent.
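
A quick Monte Carlo sanity check of (a) and (b), as a sketch assuming NumPy and SciPy (the value $\sigma = 1.5$ and the sample size are arbitrary choices of mine):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sigma, n = 1.5, 100_000
x = rng.normal(0.0, sigma, n)
y = rng.normal(0.0, sigma, n)

u = x**2 + y**2                  # should be Exponential with mean 2*sigma^2
v = x / np.sqrt(x**2 + y**2)     # should follow the arcsine law on (-1, 1)

print(u.mean(), 2 * sigma**2)    # sample mean of U is close to 2*sigma^2
print(stats.kstest(v, lambda t: np.arcsin(t) / np.pi + 0.5).pvalue)  # large p-value for the arcsine CDF
print(np.corrcoef(u, np.abs(v))[0, 1])   # near 0, consistent with independence
```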

As a check, you may integrate the two factors separately over the set of real numbers: each integrates to $1,$ as it must for any univariate probability distribution.
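
For instance, with SymPy (a sketch; the symbol names are mine):

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
sigma = sp.symbols('sigma', positive=True)

f_U = sp.exp(-u / (2 * sigma**2)) / (2 * sigma**2)  # exponential factor, support u >= 0
f_V = 1 / (sp.pi * sp.sqrt(1 - v**2))               # arcsine factor, support -1 <= v <= 1

print(sp.integrate(f_U, (u, 0, sp.oo)))   # 1
print(sp.integrate(f_V, (v, -1, 1)))      # 1
```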

Question (c) is a routine exercise in a univariate change of variable, so discussing it doesn't add any further interest.

whuber

By the Box-Muller transformation,

$X = r\cos(\theta), \quad Y = r\sin(\theta), \qquad X, Y \overset{\text{iid}}{\sim} N(0,1) \iff \theta \sim \mathrm{Uniform}(0, 2\pi) \text{ and } r^2 \sim \chi^2_2$.
Moreover, $X$ and $Y$ are independent $\Leftrightarrow$ $\theta$ and $r$ are independent.

Also, $\sin(\theta) \sim \cos(\theta) \sim \sin(2\theta) \sim 2\sin(\theta)\cos(\theta) \sim \cos(2\theta) \sim f$, where $f(z) = \frac{1}{\pi\sqrt{1-z^2}}\, I_{[-1,1]}(z)$. For example, with $z = \sin(\theta)$ and $\theta \sim \mathrm{Uniform}(0, 2\pi)$, accounting for the two branches of the inverse sine gives
$$f(z) = \left|\frac{d}{dz}\sin^{-1}(z)\right| f_{\theta}\!\left(\sin^{-1}(z)\right) + \left|\frac{d}{dz}\left(\pi - \sin^{-1}(z)\right)\right| f_{\theta}\!\left(\pi - \sin^{-1}(z)\right) = \frac{1}{\sqrt{1-z^2}}\cdot\frac{1}{2\pi} + \frac{1}{\sqrt{1-z^2}}\cdot\frac{1}{2\pi} = \frac{1}{\pi\sqrt{1-z^2}}.$$
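
A numerical check of this density, as a sketch assuming NumPy and SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2 * np.pi, 100_000)
z = np.sin(theta)

# compare sin(theta) against the arcsine CDF F(z) = arcsin(z)/pi + 1/2 on [-1, 1]
print(stats.kstest(z, lambda t: np.arcsin(t) / np.pi + 0.5).pvalue)  # large p-value expected
```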

The other cases follow similarly.

So if $X, Y \overset{\text{iid}}{\sim} N(0,\sigma^2)$, we can write $X = \sigma r\cos(\theta)$ and $Y = \sigma r\sin(\theta)$.

Hence

$U = \sigma^2 r^2$ and $V = \frac{\sigma r\cos(\theta)}{\sigma r} = \cos(\theta) \sim f$, and since $r$ is independent of any function of $\theta$, $U$ and $V$ are independent.

Another example: $\frac{2XY}{\sqrt{X^2+Y^2}} = \frac{2\sigma^2 r^2\cos(\theta)\sin(\theta)}{\sigma r} = 2\sigma r\cos(\theta)\sin(\theta) = \sigma r\sin(2\theta) \sim \sigma r\sin(\theta) \sim N(0,\sigma^2)$.
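
A simulation check of this last claim, as a sketch ($\sigma = 2$ and the sample size are arbitrary choices of mine):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sigma, n = 2.0, 100_000
x = rng.normal(0.0, sigma, n)
y = rng.normal(0.0, sigma, n)

w = 2 * x * y / np.sqrt(x**2 + y**2)
print(stats.kstest(w, stats.norm(0, sigma).cdf).pvalue)  # large p-value: consistent with N(0, sigma^2)
```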

Masoud