The Probability Law of a Function of Random Variables

In this section we develop formulas for the probability law of a random variable $Z$ which arises as a function of jointly distributed random variables $X_1, X_2, \ldots, X_n$. All of the formulas developed in this section are consequences of the following basic theorem.

Theorem 9A. Let $X_1, \ldots, X_n$ be jointly distributed random variables, with joint probability law $P_{X_1, \ldots, X_n}[\cdot]$. Let $g(x_1, \ldots, x_n)$ be a function of $n$ real variables, and let $Z = g(X_1, \ldots, X_n)$. Then, for any real number $z$,
$$F_Z(z) = P[Z \le z] = P_{X_1, \ldots, X_n}\big[\{(x_1, \ldots, x_n):\ g(x_1, \ldots, x_n) \le z\}\big]. \tag{9.1}$$

The proof of theorem 9A is immediate, since the event that $Z \le z$ is logically equivalent to the event that $g(X_1, \ldots, X_n) \le z$, which is the event that the observed values of the random variables $X_1, \ldots, X_n$ lie in the set of $n$-tuples $\{(x_1, \ldots, x_n):\ g(x_1, \ldots, x_n) \le z\}$.

We are especially interested in the case in which the random variables $X_1, \ldots, X_n$ are jointly continuous, with joint probability density $f_{X_1, \ldots, X_n}$. Then (9.1) may be written
$$F_Z(z) = \underset{g(x_1,\ldots,x_n)\,\le\,z}{\int \cdots \int} f_{X_1,\ldots,X_n}(x_1,\ldots,x_n)\,dx_1 \cdots dx_n. \tag{9.2}$$

To begin with, let us obtain the probability law of the sum of two jointly continuous random variables $X_1$ and $X_2$, with a joint probability density function $f_{X_1,X_2}$. Let $Z = X_1 + X_2$. Then
$$F_Z(z) = \underset{x_1+x_2\,\le\,z}{\iint} f_{X_1,X_2}(x_1,x_2)\,dx_1\,dx_2 = \int_{-\infty}^{\infty} dx_1 \int_{-\infty}^{z-x_1} dx_2\, f_{X_1,X_2}(x_1,x_2). \tag{9.3}$$
By differentiation of the last equation in (9.3), we obtain the formula for the probability density function of $X_1 + X_2$: for any real number $z$,
$$f_{X_1+X_2}(z) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x, z-x)\,dx. \tag{9.4}$$

If the random variables $X_1$ and $X_2$ are independent, then for any real number $z$
$$f_{X_1+X_2}(z) = \int_{-\infty}^{\infty} f_{X_1}(x)\, f_{X_2}(z-x)\,dx. \tag{9.5}$$

The mathematical operation involved in (9.5) arises in many parts of mathematics. Consequently, it has been given a name. Consider three functions $f$, $f_1$, and $f_2$, which are such that for every real number $z$
$$f(z) = \int_{-\infty}^{\infty} f_1(x)\, f_2(z-x)\,dx;$$
the function $f$ is then said to be the convolution of the functions $f_1$ and $f_2$, and in symbols we write $f = f_1 * f_2$.

In terms of the notion of convolution, we may express (9.5) as follows. The probability density function of the sum of two independent continuous random variables is the convolution of the probability density functions $f_{X_1}$ and $f_{X_2}$ of the random variables.
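As a concrete illustration (not part of the original text), the convolution (9.5) can be checked numerically. The sketch below, with hypothetical function names, convolves two Uniform(0, 1) densities by a Riemann sum and compares the result with the known triangular density on $[0, 2]$.

```python
# Numerical check of f_{X1+X2} = f_{X1} * f_{X2} (equation (9.5)) for two
# independent Uniform(0, 1) random variables.  The exact answer is the
# triangular density: f(z) = z on [0, 1] and 2 - z on [1, 2].
# Illustrative sketch; function names are hypothetical.

def uniform_density(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def convolve_densities(f1, f2, z, lo=-1.0, hi=3.0, n=20_000):
    """Approximate (f1 * f2)(z) = integral of f1(x) f2(z - x) dx by a midpoint sum."""
    dx = (hi - lo) / n
    return sum(f1(lo + (k + 0.5) * dx) * f2(z - lo - (k + 0.5) * dx)
               for k in range(n)) * dx

def triangular(z):
    """Exact density of the sum of two independent Uniform(0, 1) variables."""
    if 0.0 <= z <= 1.0:
        return z
    if 1.0 < z <= 2.0:
        return 2.0 - z
    return 0.0

for z in (0.25, 0.5, 1.0, 1.5, 1.75):
    assert abs(convolve_densities(uniform_density, uniform_density, z) - triangular(z)) < 1e-2
```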

One can prove similarly that if the random variables $X_1$ and $X_2$ are jointly discrete, then the probability mass function of their sum, $X_1 + X_2$, for any real number $z$ is given by
$$p_{X_1+X_2}(z) = \sum_{x} p_{X_1,X_2}(x, z-x), \tag{9.6}$$
in which the sum is over all possible values $x$ of $X_1$; if, in addition, $X_1$ and $X_2$ are independent,
$$p_{X_1+X_2}(z) = \sum_{x} p_{X_1}(x)\, p_{X_2}(z-x). \tag{9.7}$$
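The discrete convolution (9.7) can be illustrated with the familiar example of two fair dice; the sketch below (hypothetical names, not from the text) convolves the two probability mass functions exactly with rational arithmetic.

```python
# Discrete analogue, equation (9.7): pmf of the sum of two independent fair
# dice obtained by convolving the individual pmfs.  Illustrative sketch.
from fractions import Fraction

die = {k: Fraction(1, 6) for k in range(1, 7)}

def pmf_of_sum(p1, p2):
    """p_{X1+X2}(z) = sum over x of p_{X1}(x) p_{X2}(z - x)."""
    out = {}
    for x, px in p1.items():
        for y, py in p2.items():
            out[x + y] = out.get(x + y, Fraction(0)) + px * py
    return out

total = pmf_of_sum(die, die)
assert total[7] == Fraction(6, 36)   # the familiar peak at a sum of 7
assert total[2] == Fraction(1, 36)
assert sum(total.values()) == 1      # a genuine probability mass function
```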

In the same way that we proved (9.4) we may prove the formulas for the probability density function of the difference, product, and quotient of two jointly continuous random variables: for any real number $z$,
$$f_{X_1-X_2}(z) = \int_{-\infty}^{\infty} f_{X_1,X_2}(z+x,\, x)\,dx, \tag{9.8}$$
$$f_{X_1 X_2}(z) = \int_{-\infty}^{\infty} f_{X_1,X_2}\!\left(x, \frac{z}{x}\right) \frac{dx}{|x|}, \tag{9.9}$$
$$f_{X_1/X_2}(z) = \int_{-\infty}^{\infty} |x|\, f_{X_1,X_2}(zx,\, x)\,dx. \tag{9.10}$$
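As an unofficial numerical check of the quotient formula, one may take $X_1$ and $X_2$ to be independent standard normal random variables; the quotient formula then reproduces the Cauchy density $1/\pi(1+z^2)$, in agreement with theorem 9B(iv). The names below are illustrative.

```python
# Numerical check of the quotient formula: for independent standard normal
# X1, X2, the integral of |x| f_{X1}(z x) f_{X2}(x) dx recovers the Cauchy
# density 1 / (pi (1 + z^2)).  Illustrative sketch, not from the text.
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def quotient_density(z, lo=-10.0, hi=10.0, n=40_000):
    """Midpoint-rule approximation of the quotient formula at the point z."""
    dx = (hi - lo) / n
    return sum(abs(x) * phi(z * x) * phi(x)
               for x in (lo + (k + 0.5) * dx for k in range(n))) * dx

for z in (0.0, 0.5, 1.0, 2.0):
    assert abs(quotient_density(z) - 1.0 / (math.pi * (1.0 + z * z))) < 1e-5
```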

We next consider the function of two variables given by $g(x_1, x_2) = \sqrt{x_1^2 + x_2^2}$ and obtain the probability law of $Z = \sqrt{X_1^2 + X_2^2}$. Suppose one is taking a walk in a plane; starting at the origin, one takes a step of magnitude $X_1$ in one direction and then in a perpendicular direction one takes a step of magnitude $X_2$. One will then be at a distance from the origin given by $\sqrt{X_1^2 + X_2^2}$. Similarly, suppose one is shooting at a target; let $X_1$ and $X_2$ denote the coordinates of the shot, taken along perpendicular axes, the center of which is the target. Then $\sqrt{X_1^2 + X_2^2}$ is the distance from the target to the point hit by the shot.

The distribution function of $Z = \sqrt{X_1^2 + X_2^2}$ clearly satisfies $F_Z(z) = 0$ for $z < 0$, and for $z \ge 0$
$$F_Z(z) = \underset{x_1^2 + x_2^2\,\le\,z^2}{\iint} f_{X_1,X_2}(x_1,x_2)\,dx_1\,dx_2. \tag{9.11}$$

We express the double integral in (9.11) by means of polar coordinates. We have, letting $x_1 = r\cos\theta$ and $x_2 = r\sin\theta$,
$$F_Z(z) = \int_0^{2\pi} d\theta \int_0^{z} dr\, r\, f_{X_1,X_2}(r\cos\theta,\, r\sin\theta). \tag{9.12}$$

If $X_1$ and $X_2$ are jointly continuous, then $\sqrt{X_1^2 + X_2^2}$ is continuous, with a probability density function obtained by differentiating (9.12) with respect to $z$. Consequently,
$$f_{\sqrt{X_1^2+X_2^2}}(y) = \begin{cases} y \displaystyle\int_0^{2\pi} d\theta\, f_{X_1,X_2}(y\cos\theta,\, y\sin\theta), & \text{for } y > 0, \\ 0, & \text{for } y < 0, \end{cases} \tag{9.13}$$
$$f_{X_1^2+X_2^2}(y) = \begin{cases} \dfrac{1}{2} \displaystyle\int_0^{2\pi} d\theta\, f_{X_1,X_2}(\sqrt{y}\cos\theta,\, \sqrt{y}\sin\theta), & \text{for } y > 0, \\ 0, & \text{for } y < 0, \end{cases} \tag{9.14}$$
where (9.14) follows from (9.13) and (8.8).
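A quick numerical check of (9.14), not part of the text: for independent standard normal $X_1$ and $X_2$, the integral around the circle of radius $\sqrt{y}$ evaluates to $\frac{1}{2}e^{-y/2}$, the $\chi^2$ density with two degrees of freedom obtained again in example 9C.

```python
# Numerical check of (9.14): for independent standard normals, the density of
# X1^2 + X2^2 should be (1/2) e^{-y/2}.  Illustrative sketch.
import math

def joint_normal(x1, x2):
    """Joint density of two independent standard normal variables."""
    return math.exp(-(x1 * x1 + x2 * x2) / 2.0) / (2.0 * math.pi)

def density_of_sum_of_squares(y, n=10_000):
    """(9.14): half the integral of the joint density around the circle of radius sqrt(y)."""
    dtheta = 2.0 * math.pi / n
    r = math.sqrt(y)
    return 0.5 * sum(joint_normal(r * math.cos(k * dtheta), r * math.sin(k * dtheta))
                     for k in range(n)) * dtheta

for y in (0.5, 1.0, 2.0, 4.0):
    assert abs(density_of_sum_of_squares(y) - 0.5 * math.exp(-y / 2.0)) < 1e-9
```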

The formulas given in this section provide tools for the solution of a great many problems of theoretical and applied probability theory, as examples 9A to 9F indicate. In particular, the important problem of finding the probability distribution of the sum of two independent random variables can be treated by using (9.5) and (9.7). One may prove results such as the following:

Theorem 9B. Let $X_1$ and $X_2$ be independent random variables.

(i) If $X_1$ is normally distributed with parameters $m_1$ and $\sigma_1$ and $X_2$ is normally distributed with parameters $m_2$ and $\sigma_2$, then $X_1 + X_2$ is normally distributed with parameters $m_1 + m_2$ and $\sqrt{\sigma_1^2 + \sigma_2^2}$.

(ii) If $X_1$ obeys a binomial probability law with parameters $n_1$ and $p$ and $X_2$ obeys a binomial probability law with parameters $n_2$ and $p$, then $X_1 + X_2$ obeys a binomial probability law with parameters $n_1 + n_2$ and $p$.

(iii) If $X_1$ is Poisson distributed with parameter $\lambda_1$ and $X_2$ is Poisson distributed with parameter $\lambda_2$, then $X_1 + X_2$ is Poisson distributed with parameter $\lambda_1 + \lambda_2$.

(iv) If $X_1$ obeys a Cauchy probability law with parameters $\alpha_1$ and $\beta_1$ and $X_2$ obeys a Cauchy probability law with parameters $\alpha_2$ and $\beta_2$, then $X_1 + X_2$ obeys a Cauchy probability law with parameters $\alpha_1 + \alpha_2$ and $\beta_1 + \beta_2$.

(v) If $X_1$ obeys a gamma probability law with parameters $r_1$ and $\lambda$ and $X_2$ obeys a gamma probability law with parameters $r_2$ and $\lambda$, then $X_1 + X_2$ obeys a gamma probability law with parameters $r_1 + r_2$ and $\lambda$.

A proof of part (i) of theorem 9B is given in example 9A. The other parts of theorem 9B are left to the reader as exercises. A proof of theorem 9B from another point of view is given in section 4 of Chapter 9.
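Part (ii) of theorem 9B can be verified directly for small parameter values by convolving the two binomial probability mass functions, as in the illustrative sketch below (the particular values $n_1 = 3$, $n_2 = 4$, $p = 0.3$ are arbitrary).

```python
# A direct finite check of theorem 9B(ii): convolving Binomial(3, p) with
# Binomial(4, p) pmfs reproduces the Binomial(7, p) pmf.  Illustrative sketch.
from math import comb

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p = 0.3
n1, n2 = 3, 4
for z in range(n1 + n2 + 1):
    # discrete convolution, equation (9.7)
    conv = sum(binom_pmf(n1, p, x) * binom_pmf(n2, p, z - x)
               for x in range(max(0, z - n2), min(n1, z) + 1))
    assert abs(conv - binom_pmf(n1 + n2, p, z)) < 1e-12
```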

Example 9A. Let $X_1$ and $X_2$ be independent random variables; $X_1$ is normally distributed with parameters $m_1$ and $\sigma_1$, whereas $X_2$ is normally distributed with parameters $m_2$ and $\sigma_2$. Show that their sum is normally distributed, with parameters $m$ and $\sigma$ satisfying the relations
$$m = m_1 + m_2, \qquad \sigma^2 = \sigma_1^2 + \sigma_2^2. \tag{9.15}$$

 

Solution

By (9.5),
$$f_{X_1+X_2}(z) = \frac{1}{2\pi\sigma_1\sigma_2} \int_{-\infty}^{\infty} \exp\left\{-\frac{1}{2}\left[\frac{(x-m_1)^2}{\sigma_1^2} + \frac{(z-x-m_2)^2}{\sigma_2^2}\right]\right\} dx.$$
By (6.9) of Chapter 4, it follows that
$$f_{X_1+X_2}(z) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(z-m)^2/2\sigma^2} \left\{\frac{1}{\sqrt{2\pi}\,\sigma'} \int_{-\infty}^{\infty} e^{-(x-m')^2/2\sigma'^2}\,dx\right\}, \tag{9.16}$$
where

$$m = m_1 + m_2, \qquad \sigma^2 = \sigma_1^2 + \sigma_2^2, \qquad \sigma' = \frac{\sigma_1\sigma_2}{\sigma}, \qquad m' = m_1 + \frac{\sigma_1^2}{\sigma^2}(z - m).$$

However, the expression in braces in equation (9.16) is equal to 1. Therefore, it follows that $X_1 + X_2$ is normally distributed with parameters $m$ and $\sigma$, given by (9.15).

Example 9B. The assembly of parts. It is often the case that a dimension of an assembled article is the sum of the dimensions of several parts. An electrical resistance may be the sum of several electrical resistances. The weight or thickness of the article may be the sum of the weights or thicknesses of individual parts. The probability law of the individual dimensions may be known; what is of interest is the probability law of the dimension of the assembled article. An answer to this question may be obtained from (9.5) and (9.7) if the individual dimensions are independent random variables. For example, let us consider two 10-ohm resistors assembled in series. Suppose that, in fact, the resistances of the resistors are independent random variables, each obeying a normal probability law with mean 10 ohms and standard deviation $\sigma$. The unit, consisting of the two resistors assembled in series, has resistance equal to the sum of the individual resistances; therefore, the resistance of the unit obeys a normal probability law with mean 20 ohms and standard deviation $\sigma\sqrt{2}$ ohms. Now suppose one wishes to measure the resistance of the unit, using an ohmmeter whose error of measurement is a random variable obeying a normal probability law with mean 0 and standard deviation $\sigma_0$. The measured resistance of the unit is then a random variable obeying a normal probability law with mean 20 ohms and standard deviation $\sqrt{2\sigma^2 + \sigma_0^2}$ ohms.
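The resistor computation can be sketched numerically: convolving the two normal densities, as in (9.5), reproduces a normal density with the summed mean and the root-sum-square standard deviation. The numerical values below ($m = 10$ ohms, $\sigma = 0.5$ ohm) are hypothetical choices for illustration.

```python
# Example 9B numerically: convolving two normal densities (the two resistances)
# yields a normal density with mean m1 + m2 and variance s1^2 + s2^2.
# The resistance spread sigma = 0.5 ohm is a hypothetical value.
import math

def normal_pdf(x, m, s):
    return math.exp(-((x - m) ** 2) / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))

def convolve_at(z, m1, s1, m2, s2, lo=5.0, hi=15.0, n=40_000):
    """Midpoint-rule evaluation of the convolution (9.5) at the point z."""
    dx = (hi - lo) / n
    return sum(normal_pdf(x, m1, s1) * normal_pdf(z - x, m2, s2)
               for x in (lo + (k + 0.5) * dx for k in range(n))) * dx

m1 = m2 = 10.0   # two nominally 10-ohm resistors
s1 = s2 = 0.5    # hypothetical spread
for z in (19.0, 20.0, 21.0):
    exact = normal_pdf(z, m1 + m2, math.hypot(s1, s2))
    assert abs(convolve_at(z, m1, s1, m2, s2) - exact) < 1e-6
```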

Example 9C. Let $X_1$ and $X_2$ be independent random variables, each normally distributed with parameters $m = 0$ and $\sigma$. Then
$$f_{X_1,X_2}(x_1,x_2) = \frac{1}{2\pi\sigma^2}\, e^{-(x_1^2+x_2^2)/2\sigma^2}.$$
Consequently, for $y > 0$
$$f_{\sqrt{X_1^2+X_2^2}}(y) = \frac{y}{\sigma^2}\, e^{-y^2/2\sigma^2}, \qquad f_{X_1^2+X_2^2}(y) = \frac{1}{2\sigma^2}\, e^{-y/2\sigma^2}.$$
In words, $\sqrt{X_1^2+X_2^2}$ has a Rayleigh distribution with parameter $\sigma$, whereas $X_1^2+X_2^2$ has a $\chi^2$ distribution with parameters $n = 2$ and $\sigma$.
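A simulation sketch of example 9C (illustrative, not from the text): sampling $\sqrt{X_1^2 + X_2^2}$ for independent $N(0, \sigma^2)$ components and comparing the empirical distribution function with the Rayleigh distribution function $1 - e^{-y^2/2\sigma^2}$.

```python
# Simulation check of example 9C: for independent N(0, sigma^2) components,
# sqrt(X1^2 + X2^2) should follow a Rayleigh law with distribution function
# 1 - exp(-y^2 / (2 sigma^2)).  Illustrative sketch; sigma = 2 is arbitrary.
import math
import random

random.seed(1)
sigma = 2.0
n = 200_000
samples = [math.hypot(random.gauss(0.0, sigma), random.gauss(0.0, sigma))
           for _ in range(n)]

for y in (1.0, 2.0, 4.0):
    empirical = sum(s <= y for s in samples) / n
    exact = 1.0 - math.exp(-y * y / (2.0 * sigma * sigma))
    assert abs(empirical - exact) < 0.01
```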

Example 9D. The probability distribution of the envelope of narrow-band noise. A family of random variables $\{X(t)\}$, defined for $t \ge 0$, is said to represent a narrow-band noise voltage [see S. O. Rice, "Mathematical Analysis of Random Noise," Bell System Tech. Jour., Vol. 24 (1945), p. 81] if $X(t)$ is represented in the form
$$X(t) = X_c(t)\cos\omega t - X_s(t)\sin\omega t,$$
in which $\omega$ is a known frequency, whereas $X_c(t)$ and $X_s(t)$ are independent normally distributed random variables with means 0 and equal variances $\sigma^2$. The envelope $R(t)$ of $X(t)$ is then defined as
$$R(t) = \sqrt{X_c^2(t) + X_s^2(t)}.$$
In view of example 9C, it is seen that the envelope has a Rayleigh distribution with parameter $\sigma$.

Example 9E. Let $X_1$ and $X_2$ be independent random variables, such that $X_1$ is normally distributed with mean 0 and variance $\sigma^2$ and $X_2$ has a $\chi^2$ distribution with parameters $n$ and $\sigma$. Show that the quotient $X_1\big/\sqrt{X_2/n}$ has Student's distribution with parameter $n$.

 

Solution

By (9.10), the probability density function of for any real number is given by

 

where

By making the change of variable , it follows that

from which one may immediately deduce that the probability density function of is given by (4.15) of Chapter 4.

Example 9F. Distribution of the range. A ship is shelling a target on an enemy shore line, firing $n$ independent shots, all of which may be assumed to fall on a straight line and to be distributed according to the distribution function $F(x)$, with probability density function $f(x)$. Define the range (or span) $R$ of the attack as the length of the interval between the locations of the extreme shells. Find the probability density function of $R$.

 

Solution

Let $X_1, \ldots, X_n$ be independent random variables representing the coordinates locating the positions of the shots. The range may be written $R = U - V$, in which $U = \text{maximum}(X_1, \ldots, X_n)$ and $V = \text{minimum}(X_1, \ldots, X_n)$. The joint distribution function $F_{U,V}(u,v)$ is found as follows. If $v \ge u$, then $F_{U,V}(u,v)$ is the probability that simultaneously $X_1 \le u, \ldots, X_n \le u$; consequently,
$$F_{U,V}(u,v) = F^n(u),$$

 

since the event $[U \le u]$ implies the event $[V \le v]$ when $v \ge u$. If $v < u$, then $F_{U,V}(u,v)$ is the probability that simultaneously $X_1 \le u, \ldots, X_n \le u$, but not simultaneously $v < X_1 \le u, \ldots, v < X_n \le u$; consequently,
$$F_{U,V}(u,v) = F^n(u) - [F(u) - F(v)]^n.$$

The joint probability density of $U$ and $V$ is then obtained by differentiation. It is given by
$$f_{U,V}(u,v) = \frac{\partial^2}{\partial u\,\partial v} F_{U,V}(u,v) = \begin{cases} n(n-1)\,[F(u) - F(v)]^{n-2} f(u) f(v), & \text{for } u > v, \\ 0, & \text{otherwise.} \end{cases} \tag{9.23}$$

From (9.8) and (9.23) it follows that the probability density function of the range $R = U - V$ of $n$ independent continuous random variables, whose individual distribution functions are all equal to $F(x)$ and whose individual probability density functions are all equal to $f(x)$, is given by
$$f_R(r) = n(n-1) \int_{-\infty}^{\infty} [F(v+r) - F(v)]^{n-2} f(v+r)\, f(v)\,dv, \qquad \text{for } r > 0. \tag{9.24}$$

The distribution function of $R$ is then given by
$$F_R(r) = n \int_{-\infty}^{\infty} [F(v+r) - F(v)]^{n-1} f(v)\,dv, \qquad \text{for } r > 0. \tag{9.25}$$

Equations (9.24) and (9.25) can be explicitly evaluated only in a few cases, such as that in which each random variable is uniformly distributed on the interval 0 to 1. Then from (9.24) it follows that
$$f_R(r) = \begin{cases} n(n-1)\, r^{n-2}(1-r), & \text{for } 0 \le r \le 1, \\ 0, & \text{otherwise.} \end{cases} \tag{9.26}$$
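Formula (9.26) lends itself to a quick check (an illustrative sketch, not part of the text): the density $n(n-1)r^{n-2}(1-r)$ integrates to the distribution function $F_R(r) = n r^{n-1} - (n-1) r^n$, and a simulation of the range of $n = 10$ uniform observations agrees.

```python
# Check of (9.26): for n Uniform(0, 1) observations, the range density
# n(n-1) r^{n-2} (1 - r) integrates to F_R(r) = n r^{n-1} - (n-1) r^n,
# and simulation agrees.  Illustrative sketch.
import random

def range_pdf(r, n):
    return n * (n - 1) * r ** (n - 2) * (1.0 - r)

def range_cdf(r, n):
    return n * r ** (n - 1) - (n - 1) * r ** n

# the density integrates (midpoint rule) to the stated distribution function
n = 10
steps = 100_000
dr = 0.8 / steps
approx = sum(range_pdf((k + 0.5) * dr, n) for k in range(steps)) * dr
assert abs(approx - range_cdf(0.8, n)) < 1e-6

# Monte Carlo agreement for P[R > 0.8]
random.seed(2)
trials = 100_000
hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    hits += (max(xs) - min(xs)) > 0.8
assert abs(hits / trials - (1.0 - range_cdf(0.8, n))) < 0.01
```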

 

A Geometrical Method for Finding the Probability Law of a Function of Several Random Variables. Consider jointly continuous random variables $X_1, \ldots, X_n$, and the random variable $Z = g(X_1, \ldots, X_n)$. Suppose that the joint probability density function of $X_1, \ldots, X_n$ has the property that it is constant on the surface in $n$-dimensional space obtained by setting $g(x_1, \ldots, x_n)$ equal to a constant; more precisely, suppose that there is a function of a real variable, denoted by $\varphi(\cdot)$, such that
$$f_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = \varphi\big(g(x_1,\ldots,x_n)\big). \tag{9.27}$$

If (9.27) holds and $\varphi(\cdot)$ is a continuous function, we obtain a simple formula for the probability density function of the random variable $Z$: for any real number $z$,
$$f_Z(z) = \varphi(z)\, \frac{d}{dz} V(z), \tag{9.28}$$
in which $V(z)$ represents the volume within the surface in $n$-dimensional space with equation $g(x_1,\ldots,x_n) = z$; in symbols,
$$V(z) = \underset{g(x_1,\ldots,x_n)\,\le\,z}{\int \cdots \int} dx_1 \cdots dx_n.$$

We sketch a proof of (9.28). Let $h > 0$. Then, by the law of the mean for integrals,
$$F_Z(z+h) - F_Z(z) = \underset{z\,<\,g(x_1,\ldots,x_n)\,\le\,z+h}{\int \cdots \int} \varphi\big(g(x_1,\ldots,x_n)\big)\,dx_1 \cdots dx_n = \varphi\big(g(x_1',\ldots,x_n')\big)\,\big[V(z+h) - V(z)\big]$$

for some point $(x_1', \ldots, x_n')$ in the set $\{(x_1,\ldots,x_n):\ z < g(x_1,\ldots,x_n) \le z+h\}$. Now, as $h$ tends to 0, $\varphi\big(g(x_1',\ldots,x_n')\big)$ tends to $\varphi(z)$, assuming $\varphi(\cdot)$ is a continuous function, and $\big[V(z+h) - V(z)\big]/h$ tends to $V'(z)$. From these facts one immediately obtains (9.28).
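The geometrical method can be checked exactly in a small case (an illustrative sketch, not from the text): for $n = 3$ standard normal variables, $\varphi(z) = (2\pi)^{-3/2} e^{-z/2}$ and $V(z) = \frac{4}{3}\pi z^{3/2}$, so (9.28) yields the $\chi^2$ density with three degrees of freedom.

```python
# The geometrical method (9.28) for n = 3 standard normals:
# phi(z) = (2 pi)^{-3/2} e^{-z/2} and V(z) = (4/3) pi z^{3/2} (the ball of
# radius sqrt(z)), so f_Z(z) = phi(z) V'(z) should be the chi-square density
# with 3 degrees of freedom.  Illustrative sketch.
import math

def phi(z):
    """Constant value of the joint density on the surface g = z."""
    return (2.0 * math.pi) ** -1.5 * math.exp(-z / 2.0)

def volume_derivative(z):
    """d/dz of (4/3) pi z^{3/2}."""
    return 2.0 * math.pi * math.sqrt(z)

def chi2_pdf_3(z):
    """Chi-square density with 3 degrees of freedom."""
    return math.sqrt(z) * math.exp(-z / 2.0) / math.sqrt(2.0 * math.pi)

for z in (0.5, 1.0, 3.0, 7.0):
    assert abs(phi(z) * volume_derivative(z) - chi2_pdf_3(z)) < 1e-12
```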

We illustrate the use of (9.28) by obtaining a basic formula, which generalizes example 9C.

Example 9G. Let $X_1, \ldots, X_n$ be independent random variables, each normally distributed with mean 0 and variance 1. Let $Z = X_1^2 + \cdots + X_n^2$. Show that
$$f_Z(y) = \begin{cases} \dfrac{1}{2^{n/2}\,\Gamma(n/2)}\, y^{(n/2)-1}\, e^{-y/2}, & \text{for } y > 0, \\ 0, & \text{for } y \le 0, \end{cases}$$
where $\Gamma(\cdot)$ denotes the gamma function. In words, $Z$ has a $\chi^2$ distribution with parameters $n$ and $\sigma = 1$.

 

Solution

Define $g(x_1,\ldots,x_n) = x_1^2 + \cdots + x_n^2$ and $\varphi(z) = (2\pi)^{-n/2} e^{-z/2}$. Then (9.27) holds. Now $V(z)$ is the volume within a sphere in $n$-dimensional space of radius $\sqrt{z}$. Clearly, $V(z) = 0$ for $z \le 0$, and for $z > 0$ the volume is proportional to the $n$th power of the radius, so that $V(z) = C z^{n/2}$ for some constant $C$. Then $V'(z) = C\,(n/2)\, z^{(n/2)-1}$. By (9.28), $f_Z(y) = 0$ for $y \le 0$, and for $y > 0$, $f_Z(y) = C'\, y^{(n/2)-1} e^{-y/2}$ for some constant $C'$. To obtain $C'$, use the normalization condition $\int_0^\infty f_Z(y)\,dy = 1$, which yields $C' = 1\big/\big(2^{n/2}\,\Gamma(n/2)\big)$. The proof is complete.
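As an unofficial sanity check of example 9G, the sketch below integrates the stated $\chi^2$ density numerically and confirms that it has total mass 1 and mean $n$.

```python
# Check of example 9G: the chi-square density with n degrees of freedom,
# y^{n/2 - 1} e^{-y/2} / (2^{n/2} Gamma(n/2)), has total mass 1 and mean n.
# Illustrative sketch (even n only, to keep the integrand bounded at 0).
import math

def chi2_pdf(y, n):
    return y ** (n / 2 - 1) * math.exp(-y / 2) / (2 ** (n / 2) * math.gamma(n / 2))

for n in (2, 4, 6, 10):
    hi = 40.0 + 4.0 * n          # far enough into the tail
    steps = 200_000
    dy = hi / steps
    total = 0.0
    mean = 0.0
    for k in range(steps):       # midpoint rule
        y = (k + 0.5) * dy
        p = chi2_pdf(y, n)
        total += p * dy
        mean += y * p * dy
    assert abs(total - 1.0) < 1e-4
    assert abs(mean - n) < 1e-3
```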

 

Example 9H. The energy of an ideal gas is $\chi^2$ distributed. Consider an ideal gas composed of $N$ particles of respective masses $m_1, \ldots, m_N$. Let $(u_k, v_k, w_k)$ denote the velocity components at a given time instant of the $k$th particle. Assume that the total energy $E$ of the gas is given by its kinetic energy
$$E = \frac{1}{2} \sum_{k=1}^{N} m_k \big(u_k^2 + v_k^2 + w_k^2\big).$$

Assume that the joint probability density function of the $3N$ velocities is proportional to $e^{-E/kT}$, in which $k$ is Boltzmann's constant and $T$ is the absolute temperature of the gas; in statistical mechanics one says that the state of the gas has as its probability law Gibbs's canonical distribution. The energy $E$ of the gas is a random variable whose probability density function may be derived by the geometrical method. For $z > 0$,
$$f_E(z) = C\, e^{-z/kT}\, V'(z)$$
for some constant $C$, in which $V(z)$ is the volume within the ellipsoid in $3N$-dimensional space consisting of all $3N$-tuples of velocities whose kinetic energy is no greater than $z$. One may show that $V(z) = C' z^{3N/2}$ for some constant $C'$, in the same way that $V(z)$ is shown in example 9G to be proportional to $z^{n/2}$. Consequently, for $z > 0$,
$$f_E(z) = C''\, z^{(3N/2)-1}\, e^{-z/kT}.$$
In words, $E$ has a $\chi^2$ distribution with parameters $n = 3N$ and $\sigma = \sqrt{kT/2}$.

We leave it for the reader to verify the validity of the next example.

Example 9I. The joint normal distribution. Consider two jointly normally distributed random variables $X_1$ and $X_2$; that is, $X_1$ and $X_2$ have a joint probability density function
$$f_{X_1,X_2}(x_1,x_2) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\, e^{-\frac{1}{2}Q(x_1,x_2)} \tag{9.31}$$

for some constants $m_1$, $m_2$, $\sigma_1 > 0$, $\sigma_2 > 0$, and $-1 < \rho < 1$, in which the function $Q(x_1, x_2)$, for any two real numbers $x_1$ and $x_2$, is defined by
$$Q(x_1,x_2) = \frac{1}{1-\rho^2}\left[\left(\frac{x_1-m_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1-m_1}{\sigma_1}\right)\left(\frac{x_2-m_2}{\sigma_2}\right) + \left(\frac{x_2-m_2}{\sigma_2}\right)^2\right].$$

The curve $Q(x_1, x_2) = $ constant is an ellipse. Let $Z = Q(X_1, X_2)$. Then $f_Z(z) = \frac{1}{2} e^{-z/2}$ for $z > 0$.

Theoretical Exercises

Various probability laws (or equivalently, probability distributions), which are of importance in statistics, arise as the probability laws of various functions of normally distributed random variables.

9.1. The $\chi^2$ distribution. Show that if $X_1, \ldots, X_n$ are independent random variables, each normally distributed with parameters $m = 0$ and $\sigma$, and if $Z = X_1^2 + \cdots + X_n^2$, then $Z$ has a $\chi^2$ distribution with parameters $n$ and $\sigma$.

9.2. The $\chi$ distribution. Show that if $X_1, \ldots, X_n$ are independent random variables, each normally distributed with parameters $m = 0$ and $\sigma$, then
$$\sqrt{X_1^2 + \cdots + X_n^2}$$
has a $\chi$ distribution with parameters $n$ and $\sigma$.

9.3. Student's distribution. Show that if $X_1, \ldots, X_n, X_{n+1}$ are independent random variables, each normally distributed with parameters $m = 0$ and $\sigma$, then the random variable
$$T = \frac{X_{n+1}}{\sqrt{(X_1^2 + \cdots + X_n^2)/n}}$$
has as its probability law Student's distribution with parameter $n$ (which, it should be noted, is independent of $\sigma$)!

9.4. The $F$ distribution. Show that if $U$ and $V$ are independent random variables, $\chi^2$ distributed with $m$ and $n$ degrees of freedom, respectively, then the quotient $(U/m)\big/(V/n)$ obeys the $F$ distribution with parameters $m$ and $n$. Consequently, conclude that if $X_1, \ldots, X_m, Y_1, \ldots, Y_n$ are independent random variables, each normally distributed with parameters $0$ and $\sigma$, then the random variable
$$F = \frac{\frac{1}{m}\big(X_1^2 + \cdots + X_m^2\big)}{\frac{1}{n}\big(Y_1^2 + \cdots + Y_n^2\big)}$$
has as its probability law the $F$ distribution with parameters $m$ and $n$. In statistics the parameters $m$ and $n$ are spoken of as "degrees of freedom."

9.5. Show that if $X$ has a binomial distribution with parameters $n_1$ and $p$, if $Y$ has a binomial distribution with parameters $n_2$ and $p$, and $X$ and $Y$ are independent, then $X + Y$ has a binomial distribution with parameters $n_1 + n_2$ and $p$.

9.6. Show that if $X$ has a Poisson distribution with parameter $\lambda_1$, if $Y$ has a Poisson distribution with parameter $\lambda_2$, and $X$ and $Y$ are independent, then $X + Y$ is Poisson distributed with parameter $\lambda_1 + \lambda_2$.

9.7 . Show that if and are independently and uniformly distributed over the interval to , then

9.8. Prove the validity of the assertion made in example 9I. Identify the probability law of $Q(X_1, X_2)$. Find the probability law of $\sqrt{Q(X_1, X_2)}$.

9.9. Let $X_1$ and $X_2$ have a joint probability density function given by (9.31). Show that the sum $X_1 + X_2$ is normally distributed, with parameters $m = m_1 + m_2$ and $\sigma = \sqrt{\sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2}$.

9.10. Let $X_1$ and $X_2$ have a joint probability density function given by equation (9.31), with $m_1 = m_2 = 0$. Show that

If $X_1$ and $X_2$ are independent, then the quotient $X_1/X_2$ has a Cauchy distribution.

9.11. Use the proof of example 9G to prove that the volume of an $n$-dimensional sphere of radius $r$ is given by
$$V_n(r) = \frac{\pi^{n/2}\, r^n}{\Gamma\!\left(\frac{n}{2} + 1\right)}.$$

Prove that the surface area of the sphere is given by $S_n(r) = \dfrac{d}{dr} V_n(r) = \dfrac{n\, \pi^{n/2}\, r^{n-1}}{\Gamma\!\left(\frac{n}{2} + 1\right)}$.
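The sphere-volume formula of exercise 9.11 can be checked against the familiar low-dimensional cases and a Monte Carlo estimate (an illustrative sketch, not part of the text):

```python
# Exercise 9.11's formula: the volume of the n-dimensional sphere of radius r
# is pi^{n/2} r^n / Gamma(n/2 + 1).  Checked against the familiar n = 2, 3
# cases and a Monte Carlo estimate for n = 4.  Illustrative sketch.
import math
import random

def sphere_volume(n, r):
    return math.pi ** (n / 2) * r ** n / math.gamma(n / 2 + 1)

assert abs(sphere_volume(2, 1.5) - math.pi * 1.5 ** 2) < 1e-12            # disk
assert abs(sphere_volume(3, 2.0) - (4.0 / 3.0) * math.pi * 2.0 ** 3) < 1e-12  # ball

# Monte Carlo: fraction of the cube [-1, 1]^4 (volume 16) inside the unit ball
random.seed(3)
trials = 200_000
inside = sum(sum(random.uniform(-1.0, 1.0) ** 2 for _ in range(4)) <= 1.0
             for _ in range(trials))
estimate = 16.0 * inside / trials
assert abs(estimate - sphere_volume(4, 1.0)) < 0.1
```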

9.12. Prove that it is impossible for two independent identically distributed random variables, $X$ and $Y$, each taking the values 1 to 6, to have the property that $P[X + Y = k] = \frac{1}{11}$ for $k = 2, 3, \ldots, 12$. Consequently, conclude that it is impossible to weight a pair of dice so that the probability of occurrence of every sum from 2 to 12 will be the same.

9.13 . Prove that if two independent identically distributed random variables, and , each taking the values 1 to 6 , have the property that their sum will satisfy for , , and then for .

Exercises

9.1. Suppose that the load $X$ on an airplane wing is a random variable obeying a normal probability law with mean 1000 and variance 14,400, whereas the load $Y$ that the wing can withstand is a random variable obeying a normal probability law with mean 1260 and variance 2500. Assuming that $X$ and $Y$ are independent, find the probability that $X < Y$ (that the load encountered by the wing is less than the load the wing can withstand).

 

Answer

$P[X < Y] = \Phi(2) \doteq 0.977$.

 

In exercises 9.2 to 9.4 let $X$ and $Y$ be independently and uniformly distributed over the interval 0 to 1.

9.2 . Find and sketch the probability density function of (i) , (ii) , (iii) .

9.3. (i) maximum $(X, Y)$, (ii) minimum $(X, Y)$.

 

Answer

(i) $f(z) = 2z$ for $0 \le z \le 1$; 0 otherwise. (ii) $f(z) = 2(1 - z)$ for $0 \le z \le 1$; 0 otherwise.

 

9.4 . (i) , (ii) .

In exercises 9.5 to 9.7 let $X$ and $Y$ be independent random variables, each normally distributed with parameters $m = 0$ and $\sigma$.

9.5 . Find and sketch the probability density function of (i) , (ii) , (iii) , (iv) , (v) .

 

Answer

(i), (ii) Normal with mean 0, variance ; (iii) for ; 0 otherwise; (iv), (v) normal with mean 0, variance .

 

9.6 . (i) , (ii) .

9.7 . (i) , (ii) .

 

Answer

.

 

9.8 . Let , and be independent random variables, each normally distributed with parameters and . Find and sketch the probability density functions of (i) , (ii) , (iii) , (iv) .

9.9. Let $X_1$, $X_2$, and $X_3$ be independent random variables, each exponentially distributed with parameter $\lambda$. Find the probability density function of (i) $X_1 + X_2 + X_3$, (ii) minimum $(X_1, X_2, X_3)$, (iii) maximum $(X_1, X_2, X_3)$, (iv) .

 

Answer

(i) Gamma with parameters $r = 3$ and $\lambda$; (ii) exponential with parameter $3\lambda$;

 

(iii) $f(z) = 3\lambda e^{-\lambda z}\big(1 - e^{-\lambda z}\big)^2$ for $z > 0$; 0 otherwise; (iv) for otherwise.

9.10 . Find and sketch the probability density function of if and are independent random variables, each normally distributed with mean 0 and variance .

9.11. The envelope of a narrow-band noise is sampled periodically, the samples being sufficiently far apart to assure independence. In this way $n$ independent random variables $R_1, \ldots, R_n$ are observed, each of which is Rayleigh distributed with parameter $\sigma$. Let $R_{\max} = \text{maximum}(R_1, \ldots, R_n)$ be the largest value in the sample. Find the probability density function of $R_{\max}$.

 

Answer

$f_{R_{\max}}(z) = \dfrac{nz}{\sigma^2}\, e^{-z^2/2\sigma^2}\left(1 - e^{-z^2/2\sigma^2}\right)^{n-1}$ for $z > 0$.

 

9.12. Let $S$ be the magnitude of the velocity of a particle whose velocity components are independent random variables, each normally distributed with mean 0 and variance $kT/m$, in which $k$ is Boltzmann's constant, $T$ is the absolute temperature of the medium in which the particle is immersed, and $m$ is the mass of the particle. Describe the probability law of $S$.

9.13. Let $U_1, \ldots, U_n$ be independent random variables, uniformly distributed over the interval 0 to 1. Describe the probability law of $-2 \log_e (U_1 U_2 \cdots U_n)$. Using this result, describe a procedure for forming a random sample of a random variable with a $\chi^2$ distribution with $2n$ degrees of freedom.

9.14 . Let and be independent random variables, each exponentially distributed with parameter . Find the probability density function of .

9.15. Show that if $X_1, \ldots, X_n$ are independent identically distributed random variables, whose minimum, minimum $(X_1, \ldots, X_n)$, obeys an exponential probability law with parameter $n\lambda$, then each of the random variables obeys an exponential probability law with parameter $\lambda$. If you prefer to solve the problem for the special case that $n = 2$, this will suffice.

Hint: $X$ obeys an exponential probability law with parameter $\lambda$ if and only if $F_X(x) = 1 - e^{-\lambda x}$ or 0, depending on whether $x \ge 0$ or $x < 0$.

9.16. Let $X_1, \ldots, X_n$ be independent random variables (i) uniformly distributed on the interval $-1$ to 1, (ii) exponentially distributed with mean 2. Find the distribution of the range $R = \text{maximum}(X_1, \ldots, X_n) - \text{minimum}(X_1, \ldots, X_n)$.

9.17 . Find the probability that in a random sample of size of a random variable uniformly distributed on the interval 0 to 1 the range will exceed 0.8.

 

Answer

.

 

9.18 . Determine how large a random sample one must take of a random variable uniformly distributed on the interval 0 to 1 in order that the probability will be more than 0.95 that the range will exceed 0.90.

9.19. The random variable $X$ represents the amplitude of a sine wave; $Y$ represents the amplitude of a cosine wave. Both are independently and uniformly distributed over the interval 0 to 1.

(i) Let the random variable $R$ represent the amplitude of their resultant; that is, $R = \sqrt{X^2 + Y^2}$. Find and sketch the probability density function of $R$.

(ii) Let the random variable $\Theta$ represent the phase angle of the resultant; that is, $\Theta = \tan^{-1}(Y/X)$. Find and sketch the probability density function of $\Theta$.

 

Answer

See the answer to exercise 10.3.

 

9.20. The noise output of a quadratic detector in a radio receiver can be represented as $X^2 + Y^2$, where $X$ and $Y$ are independently and normally distributed with parameters $m = 0$ and $\sigma$. If, in addition to noise, a signal is present, the output is represented by $(X + a)^2 + (Y + b)^2$, where $a$ and $b$ are given constants. Find the probability density function of the output of the detector, assuming that (i) noise alone is present, (ii) both signal and noise are present.

9.21. Consider three jointly distributed random variables $X_1$, $X_2$, and $X_3$, with a joint probability density function

Find the probability density function of the sum $X_1 + X_2 + X_3$.

 

Answer

for otherwise.