Generation of Normal Random Variables

Example
Let $(X,Y)$ denote a random point in the plane and assume that the rectangular coordinates $X$ and $Y$ are independent unit normal random variables. We are interested in the joint distribution of $R,\Theta$, the polar coordinate representation of this point, which can be derived as follows.

Letting $r=g_1(x,y)=\sqrt{x^2+y^2}$ and $\theta=g_2(x,y)=\tan^{-1} y/x$, we see that

$\begin{array}{cc}
\displaystyle\frac{\partial g_1}{\partial x}=\frac{x}{\sqrt{x^2+y^2}}&
\displaystyle\frac{\partial g_1}{\partial y}=\frac{y}{\sqrt{x^2+y^2}} \\ \\
\displaystyle\frac{\partial g_2}{\partial x}=\frac{-y}{x^2+y^2}&
\displaystyle\frac{\partial g_2}{\partial y}=\frac{x}{x^2+y^2}
\end{array}$

Hence

$\begin{array}{rcl}
J(x,y)&=&\displaystyle\frac{x^2}{(x^2+y^2)^{3/2}}+\frac{y^2}{(x^2+y^2)^{3/2}} \\ \\
&=&\displaystyle\frac{1}{\sqrt{x^2+y^2}}=\frac{1}{r}
\end{array}$
As the joint density function of X and Y is
$f(x,y)=\displaystyle\frac{1}{2\pi}e^{-(x^2+y^2)/2}$
we see that the joint density function of $R=\sqrt{X^2+Y^2}$, $\Theta=\tan^{-1} Y/X$, is given by
$f(r,\theta)=\displaystyle\frac{1}{2\pi}re^{-r^2/2}\qquad 0<\theta<2\pi, 0<r<\infty$
As this joint density factors into the marginal densities of $R$ and $\Theta$, we obtain that $R$ and $\Theta$ are independent random variables, with $\Theta$ being uniformly distributed over $(0,2\pi)$ and $R$ having the Rayleigh distribution with density $f(r)=re^{-r^2/2},\qquad 0<r<\infty$. (Thus, for instance, when one is aiming at a target in the plane, if the horizontal and vertical miss distances are independent unit normals, then the absolute value of the error has the above Rayleigh distribution.)
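As a quick numerical sanity check, the Rayleigh result can be verified by simulation. The sketch below (not part of the original text; variable names are illustrative) draws independent unit normals and compares the sample mean of $R$ against the Rayleigh mean $\sqrt{\pi/2}\approx 1.2533$, using only Python's standard library.

```python
import math
import random

random.seed(0)  # reproducible run

# Draw independent unit normals X, Y and compute R = sqrt(X^2 + Y^2);
# the Rayleigh density r*exp(-r^2/2) has mean sqrt(pi/2) ~= 1.2533.
n = 100_000
r_samples = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
mean_r = sum(r_samples) / n
print(mean_r)  # should be close to sqrt(pi/2)
```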

The above result is quite interesting, for it certainly is not evident a priori that a random vector whose coordinates are independent unit normal random variables will have an angle of orientation that is not only uniformly distributed, but is also independent of the vector's distance from the origin.

If we wanted the joint distribution of $R^2, \Theta$, then, as the transformation $d=g_1(x,y)=x^2+y^2$ and $\theta=g_2(x,y)=\tan^{-1} y/x$ has Jacobian

$J=\left \vert
\begin{array}{cc}
2x&2y \\ \\
\displaystyle\frac{-y}{x^2+y^2}&\displaystyle\frac{x}{x^2+y^2}
\end{array}\right \vert=2$
we see that
$f(d,\theta)=\displaystyle\frac{1}{2}e^{-d/2}\frac{1}{2\pi}
\qquad 0<d<\infty,0<\theta<2\pi$
Therefore, $R^2$ and $\Theta$ are independent, with $R^2$ having an exponential distribution with parameter $\frac{1}{2}$. But as $R^2=X^2+Y^2$, it follows, by definition, that $R^2$ has a chi-squared distribution with 2 degrees of freedom. Hence we have a verification of the result that the exponential distribution with parameter $\frac{1}{2}$ is the same as the chi-squared distribution with 2 degrees of freedom.
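This equivalence can also be checked empirically. A minimal sketch (an illustration under the stated assumptions, not from the text): both the exponential distribution with parameter $\frac{1}{2}$ and the chi-squared distribution with 2 degrees of freedom have mean 2, so the sample mean of $X^2+Y^2$ should be near 2.

```python
import random

random.seed(1)  # reproducible run

# R^2 = X^2 + Y^2 for independent unit normals X, Y should be
# exponential with parameter 1/2 (equivalently, chi-squared with
# 2 degrees of freedom); both have mean 2.
n = 100_000
d_samples = [random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 for _ in range(n)]
mean_d = sum(d_samples) / n
print(mean_d)  # should be close to 2
```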

The above result can be used to simulate (or generate) normal random variables by making a suitable transformation on uniform random variables. Let $U_1$ and $U_2$ be independent random variables, each uniformly distributed over (0,1). We will transform $U_1, U_2$ into the polar coordinate representation $(R,\Theta)$ of the random vector $(X_1,X_2)$. From the above, $R^2$ and $\Theta$ will be independent, and, in addition, $R^2=X_1^2+X_2^2$ will have an exponential distribution with parameter $\lambda=\frac{1}{2}$. But $-2\log U_1$ has such a distribution since, for $x>0$,

$\begin{array}{rcl}
P\{-2\log U_1<x\}&=&P\left\{\log U_1>-\displaystyle\frac{x}{2}\right\} \\ \\
&=&P\{U_1>e^{-x/2}\} \\ \\
&=&1-e^{-x/2}
\end{array}$
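The CDF computation above can be confirmed by simulation; the following sketch (illustrative, using only the standard library) compares the empirical distribution function of $-2\log U$ at a single point against $1-e^{-x/2}$.

```python
import math
import random

random.seed(2)  # reproducible run

# Verify P{-2 log U < x} = 1 - exp(-x/2) at x = 1 for U uniform on (0,1).
# Using 1 - random.random() keeps the argument of log in (0, 1].
n = 100_000
samples = [-2 * math.log(1.0 - random.random()) for _ in range(n)]
x = 1.0
empirical = sum(s < x for s in samples) / n
theoretical = 1 - math.exp(-x / 2)
print(empirical, theoretical)
```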
Also, as $2\pi U_2$ is a uniform $(0,2\pi)$ random variable, we can use it to generate $\Theta$. That is, if we let
$\begin{array}{rcl}
R^2&=&-2\log U_1 \\ \\
\Theta&=&2\pi U_2
\end{array}$
then $R^2$ can be taken to be the square of the distance from the origin and $\Theta$ as the angle of orientation of $(X_1,X_2)$. As $X_1=R\cos\Theta$, $X_2=R\sin\Theta$, we obtain that
$\begin{array}{rcl}
X_1&=&\sqrt{-2\log U_1}\cos (2\pi U_2) \\ \\
X_2&=&\sqrt{-2\log U_1}\sin (2\pi U_2)
\end{array}$
are independent unit normal random variables.          $\rule[0.02em]{1.0mm}{1.5mm}$
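The two formulas above, commonly known as the Box–Muller method, translate directly into code. The sketch below (an illustration, not the text's own program; function and variable names are my own) generates pairs of unit normals from pairs of uniforms and checks the sample mean and variance.

```python
import math
import random

def box_muller(u1, u2):
    """Map two independent Uniform(0,1) draws to two independent unit
    normals via R^2 = -2 log(u1), Theta = 2*pi*u2, as derived above."""
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

random.seed(3)  # reproducible run
normals = []
for _ in range(50_000):
    # 1 - random.random() lies in (0, 1], so log is always defined.
    x1, x2 = box_muller(1.0 - random.random(), random.random())
    normals.extend((x1, x2))

mean = sum(normals) / len(normals)
var = sum((v - mean) ** 2 for v in normals) / len(normals)
print(mean, var)  # should be close to 0 and 1
```

In practice one would simply call a library routine such as `random.gauss`; the point here is that the transformation derived in this section is all that routine needs.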