Expectations of Jointly Distributed Random Variables

Consider two jointly distributed random variables $X$ and $Y$. The expectation of a function $g(x, y)$ of two real variables is defined as follows:

If the random variables $X$ and $Y$ are jointly continuous, with joint probability density function $f_{X,Y}(x, y)$, then

$$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f_{X,Y}(x, y)\, dx\, dy. \tag{2.1}$$

If the random variables $X$ and $Y$ are jointly discrete, with joint probability mass function $p_{X,Y}(x, y)$, then

$$E[g(X, Y)] = \sum_{x} \sum_{y} g(x, y)\, p_{X,Y}(x, y). \tag{2.2}$$

If the random variables $X$ and $Y$ have joint distribution function $F_{X,Y}(x, y)$, then

$$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, dF_{X,Y}(x, y), \tag{2.3}$$

where the two-dimensional Stieltjes integral may be defined in a manner similar to that in which the one-dimensional Stieltjes integral was defined in section 6 of Chapter 5.

On the other hand, $Z = g(X, Y)$ is a random variable, with expectation

$$E[Z] = \int_{-\infty}^{\infty} z\, dF_Z(z), \qquad E[Z] = \int_{-\infty}^{\infty} z\, f_Z(z)\, dz, \qquad E[Z] = \sum_{z} z\, p_Z(z), \tag{2.4}$$

depending on whether the probability law of $Z$ is specified by its distribution function, probability density function, or probability mass function.

It is a basic fact of probability theory that for any jointly distributed random variables $X$ and $Y$ and any Borel function $g(\cdot, \cdot)$,

$$E[Z] = E[g(X, Y)], \tag{2.5}$$

in the sense that if either of the expectations in (2.5) exists then so does the other, and the two are equal. A rigorous proof of (2.5) is beyond the scope of this book.

In view of (2.5) we have two ways of computing the expectation of a function of jointly distributed random variables. Equation (2.5) generalizes (1.5). Similarly, (1.11) may also be generalized.
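To make the two computations concrete, here is a minimal numerical sketch in Python (not from the text): a small hypothetical joint probability mass function, with $g(x, y) = (x + y)^2$, evaluated once through the joint law as in (2.2) and once through the probability law of $Z = g(X, Y)$ as in (2.4).

```python
# Two ways of computing E[g(X, Y)] for a small hypothetical discrete law.
# Way 1 uses the joint pmf directly; way 2 first builds the pmf of Z = g(X, Y).

# Hypothetical joint probability mass function p[(x, y)]; weights sum to 1.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 1): 0.3, (2, 0): 0.15, (2, 2): 0.25}

g = lambda x, y: (x + y) ** 2

# Way 1: sum g(x, y) p(x, y) over the joint probability law, as in (2.2).
e_joint = sum(g(x, y) * prob for (x, y), prob in p.items())

# Way 2: find the pmf of Z = g(X, Y), then take E[Z] as in (2.4).
pmf_z = {}
for (x, y), prob in p.items():
    z = g(x, y)
    pmf_z[z] = pmf_z.get(z, 0.0) + prob
e_z = sum(z * prob for z, prob in pmf_z.items())

assert abs(e_joint - e_z) < 1e-12  # the two expectations agree, as (2.5) asserts
print(e_joint, e_z)                # both equal 6.0 for this pmf
```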

Let $X$, $Y$, and $Z$ be random variables such that $Z = g(X, Y)$ for some Borel function $g(\cdot, \cdot)$. Then for any Borel function $h(\cdot)$,

$$E[h(Z)] = E[h(g(X, Y))]. \tag{2.6}$$

The most important property possessed by the operation of expectation of a random variable is its linearity property: if $X$ and $Y$ are jointly distributed random variables with finite expectations $E[X]$ and $E[Y]$, then the sum $X + Y$ has a finite expectation, given by

$$E[X + Y] = E[X] + E[Y]. \tag{2.7}$$

Let us sketch a proof of (2.7) in the case that $X$ and $Y$ are jointly continuous. The reader may gain some idea of how (2.7) is proved in general by consulting the proof of (6.22) in Chapter 2.

From (2.5) it follows that

$$E[X + Y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x + y)\, f_{X,Y}(x, y)\, dx\, dy. \tag{2.8}$$

Now

$$\begin{aligned}
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x\, f_{X,Y}(x, y)\, dx\, dy &= \int_{-\infty}^{\infty} x\, f_X(x)\, dx = E[X], \\
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y\, f_{X,Y}(x, y)\, dx\, dy &= \int_{-\infty}^{\infty} y\, f_Y(y)\, dy = E[Y].
\end{aligned} \tag{2.9}$$

The integral on the right-hand side of (2.8) is equal to the sum of the integrals on the left-hand sides of (2.9). The proof of (2.7) is now complete.

The moments and moment-generating function of jointly distributed random variables are defined by a direct generalization of the definitions given for a single random variable. For any two nonnegative integers $m$ and $n$ we define $E[X^m Y^n]$ as a moment of the jointly distributed random variables $X$ and $Y$. The sum $m + n$ is called the order of the moment. For the moments of orders 1 and 2 we have the following names: $E[X]$ and $E[Y]$ are, respectively, the means of $X$ and $Y$, whereas $E[X^2]$ and $E[Y^2]$ are, respectively, the mean squares of $X$ and $Y$. The moment $E[XY]$ is called the product moment.

We next define the central moments of the random variables $X$ and $Y$. For any two nonnegative integers $m$ and $n$, we define $E[(X - E[X])^m (Y - E[Y])^n]$ as a central moment of order $m + n$. We are again particularly interested in the central moments of orders 1 and 2. The central moments $E[X - E[X]]$ and $E[Y - E[Y]]$ of order 1 both vanish, whereas $E[(X - E[X])^2]$ and $E[(Y - E[Y])^2]$ are, respectively, the variances of $X$ and $Y$. The central moment $E[(X - E[X])(Y - E[Y])]$ is called the covariance of the random variables $X$ and $Y$ and is written $\operatorname{Cov}[X, Y]$; in symbols,

$$\operatorname{Cov}[X, Y] = E[(X - E[X])(Y - E[Y])].$$

We leave it to the reader to prove that the covariance is equal to the product moment minus the product of the means; in symbols,

$$\operatorname{Cov}[X, Y] = E[XY] - E[X]\,E[Y]. \tag{2.10}$$

The covariance derives its importance from the role it plays in the basic formula for the variance of the sum of two random variables:

$$\operatorname{Var}[X + Y] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X, Y]. \tag{2.11}$$

To prove (2.11), we write

$$E[(X + Y)^2] - (E[X + Y])^2 = E[X^2] - (E[X])^2 + E[Y^2] - (E[Y])^2 + 2\left(E[XY] - E[X]\,E[Y]\right),$$

from which (2.11) follows by (1.8) and (2.10).
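The identities (2.10) and (2.11) are easy to check numerically. The following sketch (in Python, reusing the hypothetical joint pmf from the earlier example) compares the central-moment and product-moment forms of the covariance and verifies the variance-of-the-sum formula.

```python
# Numerical check of (2.10) and (2.11) on a hypothetical joint pmf.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 1): 0.3, (2, 0): 0.15, (2, 2): 0.25}

# Expectation of an arbitrary function h(x, y) under the joint law, as in (2.2).
E = lambda h: sum(h(x, y) * prob for (x, y), prob in p.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)

# Central-moment form of the covariance versus the product-moment form (2.10).
cov_central = E(lambda x, y: (x - mx) * (y - my))
cov_product = E(lambda x, y: x * y) - mx * my
assert abs(cov_central - cov_product) < 1e-12

# Variance via (1.8): Var[h] = E[h^2] - (E[h])^2.
var = lambda h: E(lambda x, y: h(x, y) ** 2) - E(h) ** 2

# Variance of the sum versus the right-hand side of (2.11).
lhs = var(lambda x, y: x + y)
rhs = var(lambda x, y: x) + var(lambda x, y: y) + 2 * cov_product
assert abs(lhs - rhs) < 1e-12
```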

The joint moment-generating function $\psi_{X,Y}(t_1, t_2)$ is defined for any two real numbers $t_1$ and $t_2$ by

$$\psi_{X,Y}(t_1, t_2) = E\left[e^{t_1 X + t_2 Y}\right]. \tag{2.12}$$

The moments can be read off from the power-series expansion of the moment-generating function, since formally

$$\psi_{X,Y}(t_1, t_2) = \sum_{m=0}^{\infty} \sum_{n=0}^{\infty} \frac{t_1^m\, t_2^n}{m!\, n!}\, E[X^m Y^n].$$

In particular, the means, variances, and covariance of $X$ and $Y$ may be expressed in terms of the derivatives of the moment-generating function:

$$\begin{aligned}
E[X] &= \frac{\partial \psi}{\partial t_1}(0, 0), &
E[Y] &= \frac{\partial \psi}{\partial t_2}(0, 0), \\
\operatorname{Var}[X] &= \frac{\partial^2 \psi}{\partial t_1^2}(0, 0) - \left(\frac{\partial \psi}{\partial t_1}(0, 0)\right)^2, &
\operatorname{Var}[Y] &= \frac{\partial^2 \psi}{\partial t_2^2}(0, 0) - \left(\frac{\partial \psi}{\partial t_2}(0, 0)\right)^2, \\
\operatorname{Cov}[X, Y] &= \rlap{$\displaystyle \frac{\partial^2 \psi}{\partial t_1\, \partial t_2}(0, 0) - \frac{\partial \psi}{\partial t_1}(0, 0)\, \frac{\partial \psi}{\partial t_2}(0, 0),$}
\end{aligned}$$

in which $\psi = \psi_{X,Y}(t_1, t_2)$.
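As an illustration of reading moments off a moment-generating function by differentiation, the following sketch (assuming Python with SymPy, and using the jointly normal mgf derived below in (2.22) with hypothetical parameter values) recovers the means, variances, and covariance.

```python
# Reading moments off a joint mgf by symbolic differentiation.
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
m1, m2, s1, s2, rho = 1, -2, 3, 4, sp.Rational(1, 2)  # assumed parameter values

# The jointly normal mgf of (2.22).
psi = sp.exp(t1*m1 + t2*m2
             + sp.Rational(1, 2)*(t1**2*s1**2 + 2*rho*t1*t2*s1*s2 + t2**2*s2**2))

at0 = {t1: 0, t2: 0}
EX  = sp.diff(psi, t1).subs(at0)               # mean of X1
EY  = sp.diff(psi, t2).subs(at0)               # mean of X2
VX  = sp.diff(psi, t1, 2).subs(at0) - EX**2    # variance of X1
VY  = sp.diff(psi, t2, 2).subs(at0) - EY**2    # variance of X2
COV = sp.diff(psi, t1, t2).subs(at0) - EX*EY   # covariance of X1 and X2

print(EX, EY, VX, VY, COV)  # -> 1, -2, 9, 16, 6  (covariance = rho*s1*s2)
```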


Example 2A. The joint moment-generating function and covariance of jointly normal random variables. Let $X_1$ and $X_2$ be jointly normally distributed random variables with a joint probability density function

$$\begin{aligned}
f_{X_1, X_2}(x_1, x_2) = {}& \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x_1-m_1}{\sigma_1}\right)^2\right.\right. \\
&\left.\left. {}-2\rho\left(\frac{x_1-m_1}{\sigma_1}\right)\left(\frac{x_2-m_2}{\sigma_2}\right)+\left(\frac{x_2-m_2}{\sigma_2}\right)^2\right]\right\}.
\end{aligned} \tag{2.18}$$

The joint moment-generating function is given by

$$\psi_{X_1, X_2}(t_1, t_2) = E\left[e^{t_1 X_1 + t_2 X_2}\right] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{t_1 x_1 + t_2 x_2}\, f_{X_1, X_2}(x_1, x_2)\, dx_1\, dx_2. \tag{2.19}$$

To evaluate the integral in (2.19), let us note that since

$$\frac{1}{1-\rho^2}\left[\left(\frac{x_1-m_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1-m_1}{\sigma_1}\right)\left(\frac{x_2-m_2}{\sigma_2}\right) + \left(\frac{x_2-m_2}{\sigma_2}\right)^2\right] = \left(\frac{x_1-m_1}{\sigma_1}\right)^2 + \frac{\left[x_2 - m_2 - \rho(\sigma_2/\sigma_1)(x_1-m_1)\right]^2}{\sigma_2^2(1-\rho^2)},$$

we may write

$$f_{X_1, X_2}(x_1, x_2) = \frac{1}{\sigma_1}\,\phi\left(\frac{x_1-m_1}{\sigma_1}\right) \frac{1}{\sigma_2\sqrt{1-\rho^2}}\,\phi\left(\frac{x_2 - m_2 - \rho(\sigma_2/\sigma_1)(x_1-m_1)}{\sigma_2\sqrt{1-\rho^2}}\right), \tag{2.20}$$

in which $\phi(y) = (2\pi)^{-1/2} e^{-y^2/2}$ is the normal density function. Using our knowledge of the moment-generating function of a normal law, we may perform the integration with respect to the variable $x_2$ in the integral in (2.19). We thus determine that $\psi_{X_1, X_2}(t_1, t_2)$ is equal to

$$\begin{aligned}
\int_{-\infty}^{\infty} dx_1\, \frac{1}{\sigma_1}\,\phi\left(\frac{x_1-m_1}{\sigma_1}\right) & \exp\left(t_1 x_1\right) \exp\left\{t_2\left[m_2+\frac{\sigma_2}{\sigma_1}\rho\left(x_1-m_1\right)\right]\right\} \\
& \times \exp\left[\frac{1}{2} t_2^2 \sigma_2^2\left(1-\rho^2\right)\right] \\
={}& \exp\left[\frac{1}{2} t_2^2 \sigma_2^2\left(1-\rho^2\right)+t_2 m_2-t_2 \frac{\sigma_2}{\sigma_1}\rho m_1\right] \\
& \times \exp\left[m_1\left(t_1+t_2 \frac{\sigma_2}{\sigma_1}\rho\right)+\frac{1}{2}\sigma_1^2\left(t_1+t_2 \frac{\sigma_2}{\sigma_1}\rho\right)^2\right].
\end{aligned} \tag{2.21}$$

By combining terms in (2.21), we finally obtain that

$$\psi_{X_1, X_2}(t_1, t_2) = \exp\left[t_1 m_1 + t_2 m_2 + \frac{1}{2}\left(t_1^2\sigma_1^2 + 2\rho\, t_1 t_2\, \sigma_1\sigma_2 + t_2^2\sigma_2^2\right)\right]. \tag{2.22}$$

The covariance is given by

$$\operatorname{Cov}[X_1, X_2] = \frac{\partial^2 \psi}{\partial t_1\, \partial t_2}(0, 0) - \frac{\partial \psi}{\partial t_1}(0, 0)\,\frac{\partial \psi}{\partial t_2}(0, 0) = \rho\,\sigma_1\sigma_2. \tag{2.23}$$

Thus, if two random variables are jointly normally distributed, their joint probability law is completely determined from a knowledge of their first and second moments, since $\rho = \operatorname{Cov}[X_1, X_2]/(\sigma_1\sigma_2)$.
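A Monte Carlo sanity check of (2.22) and (2.23) is straightforward; the sketch below (assuming Python with NumPy, and hypothetical parameter values) compares the empirical moment-generating function and covariance of simulated jointly normal samples with the closed forms.

```python
# Monte Carlo check of the jointly normal mgf (2.22) and covariance (2.23).
import numpy as np

m1, m2, s1, s2, rho = 0.5, -1.0, 1.0, 2.0, 0.6   # assumed parameters
t1, t2 = 0.3, 0.2                                # test point for the mgf

rng = np.random.default_rng(0)
mean = [m1, m2]
cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]
x1, x2 = rng.multivariate_normal(mean, cov, size=1_000_000).T

# Empirical mgf versus the closed form (2.22).
emp_mgf = np.mean(np.exp(t1*x1 + t2*x2))
thy_mgf = np.exp(t1*m1 + t2*m2
                 + 0.5*(t1**2*s1**2 + 2*rho*t1*t2*s1*s2 + t2**2*s2**2))

# Empirical covariance versus rho*sigma1*sigma2 as in (2.23).
emp_cov = np.mean((x1 - x1.mean()) * (x2 - x2.mean()))
print(emp_mgf, thy_mgf)    # close to each other
print(emp_cov, rho*s1*s2)  # both near 1.2
```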

The foregoing notions may be extended to the case of $n$ jointly distributed random variables $X_1, X_2, \ldots, X_n$. For any Borel function $g$ of $n$ real variables, the expectation of the random variable $g(X_1, X_2, \ldots, X_n)$ may be expressed in terms of the joint probability law of $X_1, X_2, \ldots, X_n$.

If $X_1, X_2, \ldots, X_n$ are jointly continuous, with a joint probability density function $f_{X_1, \ldots, X_n}$, it may be shown that

$$E[g(X_1, \ldots, X_n)] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} g(x_1, \ldots, x_n)\, f_{X_1, \ldots, X_n}(x_1, \ldots, x_n)\, dx_1 \cdots dx_n. \tag{2.24}$$

If $X_1, X_2, \ldots, X_n$ are jointly discrete, with a joint probability mass function $p_{X_1, \ldots, X_n}$, it may be shown that

$$E[g(X_1, \ldots, X_n)] = \sum_{x_1} \cdots \sum_{x_n} g(x_1, \ldots, x_n)\, p_{X_1, \ldots, X_n}(x_1, \ldots, x_n). \tag{2.25}$$

The joint moment-generating function of $n$ jointly distributed random variables $X_1, X_2, \ldots, X_n$ is defined by

$$\psi_{X_1, \ldots, X_n}(t_1, \ldots, t_n) = E\left[e^{t_1 X_1 + \cdots + t_n X_n}\right]. \tag{2.26}$$

It may also be proved that if $Z$ and $X_1, X_2, \ldots, X_n$ are random variables such that $Z = g(X_1, \ldots, X_n)$ for some Borel function $g$ of $n$ real variables, then for any Borel function $h$ of one real variable

$$E[h(Z)] = E[h(g(X_1, \ldots, X_n))]. \tag{2.27}$$
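As a small illustration of (2.24), not taken from the text, the sketch below approximates $E[\max(X_1, X_2, X_3)]$ for independent random variables uniform on $(0, 1)$; the threefold integral of $g = \max$ against the joint density (which equals 1 on the unit cube) gives the value $3/4$.

```python
# Monte Carlo approximation of the threefold integral in (2.24) for g = max.
import numpy as np

rng = np.random.default_rng(1)
x = rng.random((1_000_000, 3))   # samples of (X1, X2, X3), uniform on the cube
print(x.max(axis=1).mean())      # approximately 3/4
```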

Theoretical Exercises


2.1. Linearity property of the expectation operation. Let $X$ and $Y$ be jointly discrete random variables with finite means. Show that (2.7) holds.


2.2. Let $X$ and $Y$ be jointly distributed random variables whose joint moment-generating function has a logarithm given by

in which is a random variable with probability density function , and are known functions, and . Show that

Moment-generating functions of the form of (2.28) play an important role in the mathematical theory of the phenomenon of shot noise in radio tubes.


2.3. The random telegraph signal. For $t \geq 0$ let $X(t) = U(-1)^{N(t)}$, where $U$ is a discrete random variable such that $P[U = 1] = P[U = -1] = \tfrac{1}{2}$, $\{N(t),\ t \geq 0\}$ is a family of random variables such that $N(0) = 0$, and for any times $t_1 < t_2 < \cdots < t_n$ the random variables $U, N(t_2) - N(t_1), \ldots, N(t_n) - N(t_{n-1})$ are independent. For any $t > s \geq 0$, suppose that $N(t) - N(s)$ obeys (i) a Poisson probability law with parameter $\nu(t - s)$, (ii) a binomial probability law with parameters $n = t - s$ and $p$. Show that for any $t$, $E[X(t)] = 0$, and for any $s$ and $t$

$$E[X(s)X(t)] = \begin{cases} e^{-2\nu|t-s|} & \text{in the Poisson case,} \\ (1 - 2p)^{|t-s|} & \text{in the binomial case.} \end{cases}$$

Regarded as a random function of time, $X(t)$ is called a "random telegraph signal". Note: in the binomial case, $t - s$ takes only integer values.
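For readers who wish to experiment, the following simulation sketch (assuming Python with NumPy) checks the Poisson case of the exercise under the assumptions stated above.

```python
# Simulation of the random telegraph signal X(t) = U(-1)^N(t), Poisson case:
# checks E[X(t)] = 0 and E[X(s)X(t)] = exp(-2*nu*|t - s|).
import numpy as np

nu, s, t, n = 1.5, 0.4, 1.0, 1_000_000
rng = np.random.default_rng(2)

u = rng.choice([-1, 1], size=n)                 # the random sign U
ns = rng.poisson(nu * s, size=n)                # N(s)
nt = ns + rng.poisson(nu * (t - s), size=n)     # N(t), via an independent increment
xs = u * (-1) ** ns                             # X(s)
xt = u * (-1) ** nt                             # X(t)

print(xs.mean(), xt.mean())                         # both near 0
print((xs * xt).mean(), np.exp(-2 * nu * (t - s)))  # both near exp(-1.8)
```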

Exercises


2.1. An ordered sample of size 5 is drawn without replacement from an urn containing 8 white balls and 4 black balls. For $j = 1, \ldots, 5$ let $X_j$ be equal to 1 or 0, depending on whether the ball drawn on the $j$th draw is white or black. Find the mean and variance of each $X_j$ and the covariance of $X_i$ and $X_j$ for $i \neq j$.

 

Answer

Mean, $\tfrac{2}{3}$; variance, $\tfrac{2}{9}$; covariances, $-\tfrac{2}{99}$.
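A simulation sketch (assuming Python with NumPy) that checks these values:

```python
# Simulating ordered samples of size 5 drawn without replacement.
import numpy as np

rng = np.random.default_rng(3)
urn = np.array([1] * 8 + [0] * 4)  # 1 = white, 0 = black
draws = np.array([rng.permutation(urn)[:5] for _ in range(200_000)])

print(draws.mean(axis=0))                      # each entry near 2/3
print(draws[:, 0].var())                       # near 2/9
print(np.cov(draws[:, 0], draws[:, 1])[0, 1])  # near -2/99 = -0.0202...
```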

 


2.2. An urn contains 12 balls, of which 8 are white and 4 are black. A ball is drawn and its color noted. The ball drawn is then replaced; at the same time 2 balls of the same color as the ball drawn are added to the urn. The process is repeated until 5 balls have been drawn. For $j = 1, \ldots, 5$ let $X_j$ be equal to 1 or 0, depending on whether the ball drawn on the $j$th draw is white or black. Find the mean and variance of each $X_j$ and the covariance of $X_i$ and $X_j$ for $i \neq j$.


2.3. Let $X$ and $Y$ be the coordinates of 2 points randomly chosen on the unit interval. Let $D = |X - Y|$ be the distance between the points. Find the mean, variance, and third and fourth moments of $D$.

 

Answer

$E[D^n] = \dfrac{2}{(n+1)(n+2)}$ for $n = 1, 2, 3, 4$; $\operatorname{Var}[D] = \tfrac{1}{18}$.
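A Monte Carlo check of this answer (assuming Python with NumPy):

```python
# Checking E[D^n] = 2/((n+1)(n+2)) and Var[D] = 1/18 for D = |X - Y|,
# with X and Y independent uniforms on the unit interval.
import numpy as np

rng = np.random.default_rng(4)
x, y = rng.random(1_000_000), rng.random(1_000_000)
d = np.abs(x - y)

for n in range(1, 5):
    print(n, (d ** n).mean(), 2 / ((n + 1) * (n + 2)))
print(d.var(), 1 / 18)  # Var[D] = 1/6 - 1/9 = 1/18
```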

 


2.4. Let $X$ and $Y$ be independent, normally and identically distributed random variables, with mean $m$ and variance $\sigma^2$. Find the mean of the random variable $\max(X, Y)$.

Hint: for any real numbers $a$ and $b$ show that $\max(a, b) = \tfrac{1}{2}(a + b + |a - b|)$, and use the fact that $X - Y$ is normally distributed with mean 0 and variance $2\sigma^2$.


2.5. Let $X$ and $Y$ be jointly normally distributed with means 0, variances 1, and covariance $\rho$. Find $E[\max(X, Y)]$.

 

Answer

$E[\max(X, Y)] = \sqrt{(1 - \rho)/\pi}$.

 


2.6. Let $X$ and $Y$ have a joint moment-generating function

in which and are positive constants such that . Find .


2.7. Let $X$ and $Y$ have a joint moment-generating function

in which and are positive constants such that . Find .

 

Answer

Means, 1; variances, 0.5; covariance, .

 


2.8. Let $X$ and $Y$ be jointly distributed random variables whose joint moment-generating function has a logarithm given by (2.28), with the random variable appearing there uniformly distributed over the interval $-1$ to $1$, and

in which are given constants such that . Find , .


2.9 . Do exercise 2.8 under the assumption that is .

 

Answer

Means, 4; variances, 6; covariance, .