The Joint Probability Law of Functions of Random Variables

In Section 9 we treated in some detail the problem of obtaining the individual probability law of a function of random variables. It is natural to consider next the problem of obtaining the joint probability law of several random variables which arise as functions. In principle, this problem is no different from those previously considered; however, the details are more complicated. Consequently, in this section we content ourselves with stating an often-used formula for the joint probability density function of random variables $Y_1, Y_2, \ldots, Y_n$ which arise as functions of jointly continuous random variables $X_1, X_2, \ldots, X_n$:
\[\begin{aligned} Y_{1}&=g_{1}(X_{1}, X_{2}, \ldots, X_{n}),\\ Y_{2}&=g_{2}(X_{1}, X_{2}, \ldots, X_{n}),\\ &\vdots\\ Y_{n}&=g_{n}(X_{1}, X_{2}, \ldots, X_{n}). \end{aligned}\tag{10.1}\]
We consider only the case in which the functions $g_1, g_2, \ldots, g_n$ have continuous first partial derivatives at all points $(x_1, x_2, \ldots, x_n)$ and are such that the Jacobian
\[ J = \begin{vmatrix} \dfrac{\partial g_{1}}{\partial x_{1}} & \cdots & \dfrac{\partial g_{1}}{\partial x_{n}} \\ \vdots & & \vdots \\ \dfrac{\partial g_{n}}{\partial x_{1}} & \cdots & \dfrac{\partial g_{n}}{\partial x_{n}} \end{vmatrix} \neq 0 \tag{10.2}\]
at all points $(x_1, x_2, \ldots, x_n)$. Let $B$ be the set of points $(y_1, y_2, \ldots, y_n)$ such that the equations
\[\begin{aligned} y_{1}&=g_{1}\left(x_{1}, x_{2}, \ldots, x_{n}\right),\\ y_{2}&=g_{2}\left(x_{1}, x_{2}, \ldots, x_{n}\right),\\ &\vdots\\ y_{n}&=g_{n}\left(x_{1}, x_{2}, \ldots, x_{n}\right) \end{aligned}\tag{10.3}\]
possess at least one solution $(x_1, x_2, \ldots, x_n)$. The set of equations in (10.3) then possesses exactly one solution, which we denote by
\[\begin{aligned} x_{1}&=g_{1}^{-1}\left(y_{1}, y_{2}, \ldots, y_{n}\right),\\ x_{2}&=g_{2}^{-1}\left(y_{1}, y_{2}, \ldots, y_{n}\right),\\ &\vdots\\ x_{n}&=g_{n}^{-1}\left(y_{1}, y_{2}, \ldots, y_{n}\right). \end{aligned}\tag{10.4}\]

If $X_1, X_2, \ldots, X_n$ are jointly continuous random variables, whose joint probability density function $f_{X_1, \ldots, X_n}(x_1, \ldots, x_n)$ is continuous at all but a finite number of points in $n$-dimensional space, then the random variables $Y_1, Y_2, \ldots, Y_n$ defined by (10.1) are jointly continuous, with a joint probability density function given by
\[ f_{Y_1, \ldots, Y_n}(y_1, \ldots, y_n) = f_{X_1, \ldots, X_n}(x_1, \ldots, x_n)\,|J|^{-1} \tag{10.5}\]
if $(y_1, \ldots, y_n)$ belongs to $B$, where $x_1, \ldots, x_n$ are given by (10.4); $f_{Y_1, \ldots, Y_n}(y_1, \ldots, y_n) = 0$ for $(y_1, \ldots, y_n)$ not belonging to $B$.

It should be noted that (10.5) extends (8.18). We leave it to the reader to formulate a similar extension of (8.22).
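As a numerical sanity check on (10.5), here is a short sketch in Python (an illustration, not from the text). It assumes a hypothetical pair of independent standard normal variables $X$, $Y$ and the one-to-one map $U = e^X$, $V = X + Y$, whose Jacobian is $J = e^x$ and whose inverse is $x = \log u$, $y = v - \log u$; a Monte Carlo estimate of the density of $(U, V)$ at a point is compared with the value (10.5) predicts.

```python
import math
import random

random.seed(0)

def f_xy(x, y):
    # assumed joint pdf: X and Y independent standard normals
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

def f_uv(u, v):
    # formula (10.5) for the hypothetical map u = exp(x), v = x + y:
    # inverse x = log u, y = v - log u; Jacobian J = exp(x) = u (u > 0)
    x = math.log(u)
    return f_xy(x, v - x) / u

# Monte Carlo check: the fraction of simulated (U, V) pairs landing in a
# small box around (u0, v0), divided by the box area, should be close to
# the density that (10.5) assigns to that point.
n, h = 200_000, 0.2
u0, v0 = 1.0, 0.0
hits = 0
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    u, v = math.exp(x), x + y
    if abs(u - u0) <= h / 2 and abs(v - v0) <= h / 2:
        hits += 1
print(round(hits / (n * h * h), 3), round(f_uv(u0, v0), 3))
```

The box-count estimate converges to the density as the sample grows and the box shrinks, which is exactly the limit relation used in the proof sketch that follows.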

We omit the proof that the random variables $Y_1, Y_2, \ldots, Y_n$ are jointly continuous and possess a joint probability density function. We sketch a proof of the formula given by (10.5) for the joint probability density function. One may show that for any real numbers $y_1, y_2, \ldots, y_n$
\[ f_{Y_1, \ldots, Y_n}(y_1, \ldots, y_n) = \lim_{h \to 0} \frac{1}{h^{n}}\, P\left[y_1 < Y_1 \leq y_1 + h, \ldots, y_n < Y_n \leq y_n + h\right]. \tag{10.6}\]
The probability on the right-hand side of (10.6) is equal to
\[ \int \cdots \int_{C_h} f_{X_1, \ldots, X_n}(x_1, \ldots, x_n)\, dx_1 \cdots dx_n, \tag{10.7}\]

in which $C_h$ is the set of points $(x_1, \ldots, x_n)$ such that $y_j < g_j(x_1, \ldots, x_n) \leq y_j + h$ for $j = 1, 2, \ldots, n$.

Now, if $(y_1, \ldots, y_n)$ does not belong to $B$, then for sufficiently small values of $h$ there are no points in $C_h$, and the probability in (10.7) is 0. From the fact that the quantities in (10.6), whose limit is being taken, are 0 for sufficiently small values of $h$, it follows that $f_{Y_1, \ldots, Y_n}(y_1, \ldots, y_n) = 0$ for $(y_1, \ldots, y_n)$ not in $B$. Thus (10.5) is proved for points not in $B$. To prove (10.5) for points in $B$, we use the celebrated formula for change of variables in multiple integrals (see R. Courant, Differential and Integral Calculus, Interscience, New York, 1937, Vol. II, p. 253, or T. Apostol, Mathematical Analysis, Addison-Wesley, Reading, Massachusetts, 1957, p. 271) to transform the integral on the right-hand side of (10.7) into the integral
\[ \int_{y_1}^{y_1 + h} \cdots \int_{y_n}^{y_n + h} f_{X_1, \ldots, X_n}\bigl(g_1^{-1}(y_1', \ldots, y_n'), \ldots, g_n^{-1}(y_1', \ldots, y_n')\bigr)\, |J|^{-1}\, dy_1' \cdots dy_n'. \tag{10.8}\]

Replacing the probability on the right-hand side of (10.6) by the integral in (10.8) and then taking the limits indicated in (10.6), we finally obtain (10.5).

Example 10A. Let $X$ and $Y$ be jointly continuous random variables. Let $U = X + Y$ and $V = Y$. For any real numbers $u$ and $v$ show that
\[ f_{U,V}(u, v) = f_{X,Y}(u - v, v). \tag{10.9}\]

Solution

Let $g_1(x, y) = x + y$ and $g_2(x, y) = y$. The equations $u = x + y$ and $v = y$ clearly have as their solution $x = u - v$ and $y = v$. The Jacobian is given by
\[ J = \begin{vmatrix} \dfrac{\partial g_1}{\partial x} & \dfrac{\partial g_1}{\partial y} \\[4pt] \dfrac{\partial g_2}{\partial x} & \dfrac{\partial g_2}{\partial y} \end{vmatrix} = \begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix} = 1. \]

In view of these facts, (10.9) is an immediate consequence of (10.5).
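The passage from a joint density of this kind to the probability law of the sum can be checked numerically. The following sketch (an illustration, not from the text) assumes $X$ and $Y$ independent, exponentially distributed with parameter 1, takes $f_{U,V}(u,v) = f_{X,Y}(u-v,\,v)$ for $U = X + Y$, $V = Y$, and integrates out $v$; the result agrees with the known density $u e^{-u}$ of the sum of two such variables.

```python
import math

def f_xy(x, y):
    # hypothetical choice: X and Y independent exponential(1) variables
    return math.exp(-x - y) if x > 0 and y > 0 else 0.0

def f_uv(u, v):
    # joint density of U = X + Y, V = Y: f_UV(u, v) = f_XY(u - v, v),
    # the Jacobian of the transformation being 1
    return f_xy(u - v, v)

def marginal_u(u, steps=4000):
    # integrating out v (midpoint rule over 0 < v < u) should recover
    # the density of the sum U = X + Y, namely u * exp(-u)
    dv = u / steps
    return sum(f_uv(u, (k + 0.5) * dv) for k in range(steps)) * dv

u = 1.5
print(round(marginal_u(u), 4), round(u * math.exp(-u), 4))
```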

In exactly the same way one may establish the following result:

Example 10B. Let $X$ and $Y$ be jointly continuous random variables. Let
\[ U = \frac{X}{Y}, \qquad V = Y. \tag{10.10}\]
Then for any real numbers $u$ and $v$ such that $v \neq 0$,
\[ f_{U,V}(u, v) = |v|\, f_{X,Y}(uv, v). \tag{10.11}\]
It should be noted that we immediately obtain from (10.11) the formula for $f_{X/Y}(u)$ given by (9.13), since
\[ f_{X/Y}(u) = \int_{-\infty}^{\infty} f_{U,V}(u, v)\, dv = \int_{-\infty}^{\infty} |v|\, f_{X,Y}(uv, v)\, dv. \tag{10.12}\]
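The quotient formula can be exercised the same way. The sketch below (an illustration; the standard normal choice for $X$ and $Y$ is an assumption) takes the joint density $|v|\,f_{X,Y}(uv, v)$ of $U = X/Y$, $V = Y$, integrates over $v$, and compares the result with the Cauchy density $1/\pi(1+u^2)$, the known probability law of the quotient of two independent standard normals.

```python
import math

def f_xy(x, y):
    # assumed joint pdf: X and Y independent standard normals
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

def f_uv(u, v):
    # joint density of U = X/Y, V = Y: f_UV(u, v) = |v| f_XY(uv, v)
    return abs(v) * f_xy(u * v, v)

def quotient_density(u, lim=8.0, steps=8000):
    # integrate f_UV(u, v) over v to recover the density of U = X/Y;
    # the integrand is negligible beyond |v| = lim for normal X, Y
    dv = 2 * lim / steps
    return sum(f_uv(u, -lim + k * dv) for k in range(steps + 1)) * dv

u = 0.7
cauchy = 1 / (math.pi * (1 + u * u))  # known density of the quotient
print(round(quotient_density(u), 4), round(cauchy, 4))
```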

Example 10C. Rotation of axes. Let $X$ and $Y$ be jointly distributed random variables. Let
\[ U = X \cos\theta + Y \sin\theta, \qquad V = -X \sin\theta + Y \cos\theta \tag{10.13}\]
for some angle $\theta$ in the interval $0 \leq \theta \leq 2\pi$. The Jacobian of this transformation is equal to 1, and the inverse transformation is $x = u \cos\theta - v \sin\theta$, $y = u \sin\theta + v \cos\theta$. Then
\[ f_{U,V}(u, v) = f_{X,Y}(u \cos\theta - v \sin\theta,\ u \sin\theta + v \cos\theta). \tag{10.14}\]
To illustrate the use of (10.14), consider two jointly normally distributed random variables $X$ and $Y$ with a joint probability density function given by (9.31), with $m_1 = m_2 = 0$. Then
\[ f_{U,V}(u, v) = \frac{1}{2\pi\sigma_U\sigma_V\sqrt{1-\rho_{U,V}^{2}}} \exp\left\{-\frac{1}{2(1-\rho_{U,V}^{2})}\left[\frac{u^{2}}{\sigma_U^{2}} - \frac{2\rho_{U,V}\,uv}{\sigma_U\sigma_V} + \frac{v^{2}}{\sigma_V^{2}}\right]\right\}, \tag{10.15}\]
where
\[\begin{aligned} \sigma_U^{2} &= \sigma_1^{2}\cos^{2}\theta + 2\rho\sigma_1\sigma_2\sin\theta\cos\theta + \sigma_2^{2}\sin^{2}\theta,\\ \sigma_V^{2} &= \sigma_1^{2}\sin^{2}\theta - 2\rho\sigma_1\sigma_2\sin\theta\cos\theta + \sigma_2^{2}\cos^{2}\theta,\\ \rho_{U,V}\,\sigma_U\sigma_V &= (\sigma_2^{2} - \sigma_1^{2})\sin\theta\cos\theta + \rho\sigma_1\sigma_2(\cos^{2}\theta - \sin^{2}\theta). \end{aligned}\]

From (10.15) one sees that two random variables $U$ and $V$, obtained by a rotation of axes from jointly normally distributed random variables $X$ and $Y$, are jointly normally distributed. Further, if the angle of rotation $\theta$ is chosen so that
\[ \tan 2\theta = \frac{2\rho\sigma_1\sigma_2}{\sigma_1^{2} - \sigma_2^{2}}, \tag{10.16}\]

then $\rho_{U,V} = 0$, and $U$ and $V$ are independent normally distributed random variables. Thus, by a suitable rotation of axes, two jointly normally distributed random variables may be transformed into two independent normally distributed random variables.
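The decorrelating angle is easy to compute directly. In the sketch below (with hypothetical parameter values, not from the text), the covariance of the rotated pair is evaluated from the bilinearity of the covariance, and the angle $\theta$ satisfying $\tan 2\theta = 2\rho\sigma_1\sigma_2/(\sigma_1^2 - \sigma_2^2)$ is seen to make it vanish.

```python
import math

# hypothetical parameters for the jointly normal pair (X, Y)
s1, s2, rho = 2.0, 1.0, 0.6

def cov_uv(theta):
    # covariance of the rotated pair U = X cos t + Y sin t,
    # V = -X sin t + Y cos t, from bilinearity of the covariance:
    # Cov(U, V) = (s2^2 - s1^2) sin t cos t + rho s1 s2 cos 2t
    return ((s2 ** 2 - s1 ** 2) * math.sin(theta) * math.cos(theta)
            + rho * s1 * s2 * math.cos(2 * theta))

# the decorrelating angle: tan 2t = 2 rho s1 s2 / (s1^2 - s2^2);
# atan2 picks a valid branch even when s1 = s2
theta = 0.5 * math.atan2(2 * rho * s1 * s2, s1 ** 2 - s2 ** 2)
print(round(theta, 4), abs(cov_uv(theta)) < 1e-9)
```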

Theoretical Exercises

10.1. Let $X$ and $Y$ be independent random variables, each exponentially distributed with parameter $\lambda$. Show that the random variables $X + Y$ and $X/Y$ are independent.

10.2. Let $X$ and $Y$ be independent random variables, each normally distributed with parameters $m$ and $\sigma^2$. Show that $X + Y$ and $X - Y$ are independent.

10.3. Let $X$ and $Y$ be independent random variables, $\chi^2$-distributed with $m$ and $n$ degrees of freedom, respectively. Show that $X + Y$ and $X/Y$ are independent.

10.4. Let $X_1$, $X_2$, and $X_3$ be independent identically normally distributed random variables. Let $\bar{X} = \tfrac{1}{3}(X_1 + X_2 + X_3)$ and $S = \sum_{j=1}^{3}(X_j - \bar{X})^2$. Show that $\bar{X}$ and $S$ are independent.

10.5. Generation of a random sample of a normally distributed random variable. Let $U_1$ and $U_2$ be independent random variables, each uniformly distributed on the interval 0 to 1. Show that the random variables
\[ X_1 = (-2 \log U_1)^{1/2} \cos 2\pi U_2, \qquad X_2 = (-2 \log U_1)^{1/2} \sin 2\pi U_2 \]
are independent random variables, each normally distributed with mean 0 and variance 1. (For a discussion of this result, see G. E. P. Box and Mervin E. Muller, "A note on the generation of random normal deviates," Annals of Mathematical Statistics, Vol. 29 (1958), pp. 610-611.)
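The transformation of the preceding exercise is the Box-Muller method, and it is easy to try numerically. The sketch below (a consistency check, not a proof) generates a sample and checks that the empirical mean, variance, and covariance of the two coordinates are close to 0, 1, and 0, respectively.

```python
import math
import random

random.seed(1)

def box_muller(u1, u2):
    # the transformation of theoretical exercise 10.5: two uniforms
    # mapped to two independent standard normal variates
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

n = 100_000
xs, ys = [], []
for _ in range(n):
    # 1 - random() lies in (0, 1], keeping log(u1) finite
    x, y = box_muller(1.0 - random.random(), random.random())
    xs.append(x)
    ys.append(y)

mean_x = sum(xs) / n
mean_y = sum(ys) / n
var_x = sum(v * v for v in xs) / n - mean_x ** 2
cov_xy = sum(a * b for a, b in zip(xs, ys)) / n - mean_x * mean_y
print(round(mean_x, 3), round(var_x, 3), round(cov_xy, 3))
```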

Exercises

10.1. Let $X$ and $Y$ be independent random variables, each exponentially distributed with parameter $\lambda$. Find the joint probability density function of $U$ and $V$, in which (i) $U = X + Y$, $V = X - Y$; (ii) $U = \text{minimum}(X, Y)$, $V = \text{maximum}(X, Y)$.

 

Answer

(i) $f_{U,V}(u, v) = \tfrac{1}{2}\lambda^{2} e^{-\lambda u}$ if $|v| < u$, $0$ otherwise; (ii) $f_{U,V}(u, v) = 2\lambda^{2} e^{-\lambda(u+v)}$ if $0 < u < v$, and $0$ otherwise.
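The answer to part (ii) can be checked by simulation. The sketch below (with the hypothetical choice $\lambda = 1$) counts the fraction of simulated (minimum, maximum) pairs falling in a small box and compares it, after dividing by the box area, with the claimed density $2\lambda^2 e^{-\lambda(u+v)}$ for $0 < u < v$.

```python
import math
import random

random.seed(2)
lam = 1.0  # hypothetical rate parameter for the check

def f_minmax(u, v):
    # claimed answer to exercise 10.1(ii): joint pdf of the minimum U
    # and maximum V of two independent exponential(lam) variables
    return 2 * lam ** 2 * math.exp(-lam * (u + v)) if 0 < u < v else 0.0

n, h = 200_000, 0.2
u0, v0 = 0.5, 1.5
hits = 0
for _ in range(n):
    x = random.expovariate(lam)
    y = random.expovariate(lam)
    u, v = min(x, y), max(x, y)
    if abs(u - u0) <= h / 2 and abs(v - v0) <= h / 2:
        hits += 1
print(round(hits / (n * h * h), 3), round(f_minmax(u0, v0), 3))
```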

 

10.2. Let $X$ and $Y$ have joint probability density function given by
\[ f_{X,Y}(x, y) = \frac{1}{\pi} \quad \text{if } x^{2} + y^{2} \leq 1, \qquad 0 \quad \text{otherwise.} \]
Find the joint probability density function of $R$ and $\Theta$, in which $R = \sqrt{X^{2} + Y^{2}}$ and $\Theta = \tan^{-1}(Y/X)$. Show that, and explain why, $\Theta$ is uniformly distributed but $R$ is not.

10.3. Let $X$ and $Y$ be independent random variables, each uniformly distributed over the interval 0 to 1. Find the individual and joint probability density functions of the random variables $U$ and $V$, in which $U = X + Y$ and $V = X - Y$.

 

Answer

$f_{U,V}(u, v) = \tfrac{1}{2}$ if $0 < u + v < 2$ and $0 < u - v < 2$, $0$ otherwise; $f_U(u) = u$ for $0 < u \leq 1$, $f_U(u) = 2 - u$ for $1 < u < 2$, $0$ otherwise; $f_V(v) = 1 + v$ for $-1 < v \leq 0$; $f_V(v) = 1 - v$ for $0 < v < 1$, $0$ otherwise.
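The triangular marginal densities can likewise be checked by simulation. The sketch below (an illustration, taking $U = X + Y$ and $V = X - Y$ for uniform $X$, $Y$ on the unit interval) estimates each density at one point and compares it with the claimed answer.

```python
import random

random.seed(3)

# claimed marginal densities for U = X + Y and V = X - Y
def f_u(u):
    return u if 0 < u <= 1 else (2 - u if 1 < u < 2 else 0.0)

def f_v(v):
    return 1 - abs(v) if -1 < v < 1 else 0.0

# one-dimensional box-count density estimates at u = 0.5 and v = 0.5
n, h = 200_000, 0.1
hits_u = hits_v = 0
for _ in range(n):
    x, y = random.random(), random.random()
    if abs(x + y - 0.5) <= h / 2:
        hits_u += 1
    if abs(x - y - 0.5) <= h / 2:
        hits_v += 1
print(round(hits_u / (n * h), 2), round(hits_v / (n * h), 2))
```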

 

10.4. Two voltages, $V_1$ and $V_2$, are independently and normally distributed with parameters $m$ and $\sigma$. These are combined to give two new voltages, $U_1 = V_1 + V_2$ and $U_2 = V_1 - V_2$. Find the joint probability density function of $U_1$ and $U_2$. Are $U_1$ and $U_2$ independent?