Independent Random Variables

In section 2 of Chapter 3 we defined the notion of a series of independent trials. In this section we define the notion of independent random variables. This notion plays the same role in the theory of jointly distributed random variables that the notion of independent trials plays in the theory of sample description spaces consisting of trials. We consider first the case of two jointly distributed random variables.

Let X and Y be jointly distributed random variables, with individual distribution functions F_X(·) and F_Y(·), respectively, and joint distribution function F_{X,Y}(·, ·). We say that the random variables X and Y are independent if for any two Borel sets of real numbers B_1 and B_2 the events [X is in B_1] and [Y is in B_2] are independent; that is,

    P[X is in B_1, Y is in B_2] = P[X is in B_1] P[Y is in B_2].    (6.1)

The foregoing definition may be expressed equivalently: the random variables X and Y are independent if, for any event A depending only on the random variable X and any event B depending only on the random variable Y, P[AB] = P[A] P[B], so that the events A and B are independent.

It may be shown that if (6.1) holds for sets B_1 and B_2 that are infinitely extended intervals of the form {x: x ≤ a} and {y: y ≤ b}, for any real numbers a and b, then (6.1) holds for any Borel sets B_1 and B_2 of real numbers. We therefore have the following equivalent formulation of the notion of the independence of two jointly distributed random variables X and Y.

Two jointly distributed random variables X and Y are independent if their joint distribution function may be written as the product of their individual distribution functions F_X(·) and F_Y(·), in the sense that, for any real numbers x and y,

    F_{X,Y}(x, y) = F_X(x) F_Y(y).    (6.2)
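The factorization (6.2) can be checked numerically by simulation; the following sketch (in Python; the exponential marginals, sample size, and evaluation point are assumed choices for illustration, not part of the text) compares an empirical joint distribution function with the product of the empirical marginals for independently generated samples:

```python
import random

random.seed(1)

# Draw independent samples of X and Y (here: exponential with mean 1,
# an assumed choice purely for illustration).
n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]

def empirical_joint_cdf(x, y):
    """Fraction of sample points with X <= x and Y <= y."""
    return sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / n

def empirical_marginal_cdf(data, t):
    """Fraction of sample points at or below t."""
    return sum(1 for a in data if a <= t) / n

# Independence says F_{X,Y}(x, y) = F_X(x) F_Y(y) -- equation (6.2).
x0, y0 = 0.7, 1.3
joint = empirical_joint_cdf(x0, y0)
product = empirical_marginal_cdf(xs, x0) * empirical_marginal_cdf(ys, y0)
print(joint, product)  # the two estimates should agree closely
```

Since the samples are generated independently, the two printed numbers differ only by sampling error, of order 1/sqrt(n).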

Similarly, two jointly continuous random variables X and Y are independent if their joint probability density function may be written as the product of their individual probability density functions f_X(·) and f_Y(·), in the sense that, for any real numbers x and y,

    f_{X,Y}(x, y) = f_X(x) f_Y(y).    (6.3)

Equation (6.3) follows from (6.2) by differentiating both sides of (6.2), first with respect to x and then with respect to y. Equation (6.2) follows from (6.3) by integrating both sides of (6.3).
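The passage from (6.2) to (6.3) can be verified numerically; a minimal sketch (the exponential distribution functions used here are assumed examples) approximates the mixed partial derivative of the product of two distribution functions by finite differences and compares it with the product of the densities:

```python
import math

# Assumed example marginals: exponential with mean 1.
F = lambda t: 1 - math.exp(-t)   # distribution function
f = lambda t: math.exp(-t)       # density, dF/dt

def F_joint(x, y):
    # (6.2): joint distribution function of independent X and Y
    return F(x) * F(y)

def mixed_partial(g, x, y, h=1e-4):
    # Central finite-difference approximation of d^2 g / dx dy.
    return (g(x + h, y + h) - g(x + h, y - h)
            - g(x - h, y + h) + g(x - h, y - h)) / (4 * h * h)

x0, y0 = 0.5, 1.5
print(mixed_partial(F_joint, x0, y0))  # approximates the joint density
print(f(x0) * f(y0))                   # (6.3): product of the densities
```

The two printed values agree to several decimal places, as the differentiation argument predicts.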

Similarly, two jointly discrete random variables X and Y are independent if their joint probability mass function may be written as the product of their individual probability mass functions p_X(·) and p_Y(·), in the sense that, for all real numbers x and y,

    p_{X,Y}(x, y) = p_X(x) p_Y(y).    (6.4)
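For jointly discrete random variables, the factorization p_{X,Y}(x, y) = p_X(x) p_Y(y) can be verified exactly; as a minimal sketch (the two fair dice are an assumed example), take X and Y to be the faces shown by two independent fair dice:

```python
from fractions import Fraction
from itertools import product

# Assumed example: two independent fair dice.
p_X = {x: Fraction(1, 6) for x in range(1, 7)}
p_Y = {y: Fraction(1, 6) for y in range(1, 7)}

# Under independence, the joint pmf is the product of the marginals.
p_joint = {(x, y): p_X[x] * p_Y[y] for x, y in product(p_X, p_Y)}

# Check the factorization at every point, and that the joint pmf sums to 1.
assert all(p_joint[(x, y)] == p_X[x] * p_Y[y] for x, y in p_joint)
assert sum(p_joint.values()) == 1
print(p_joint[(3, 4)])  # -> 1/36
```

Exact rational arithmetic (Fraction) makes the check an identity rather than a floating-point approximation.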

Two random variables X and Y that do not satisfy the foregoing relations are said to be dependent or nonindependent.

Example 6A. Independent and dependent random variables. In example 5A the random variables X and Y are independent in the case of sampling with replacement but are dependent in the case of sampling without replacement. In either case the random variables X and Y are identically distributed. In example 5B the random variables X and Y are independent and identically distributed. It may be seen from the definitions given at the end of this section that the random variables considered in example 5C are independent and identically distributed.

Independent random variables have the following exceedingly important property:

Theorem 6A. Let the random variables U and V be obtained from the random variables X and Y by functional transformations, so that U = g(X) and V = h(Y) for some Borel functions g(·) and h(·) of a real variable. Independence of the random variables X and Y implies independence of the random variables U and V.

This assertion is proved as follows. First, for any set B of real numbers, write g^{-1}(B) for the set of real numbers x such that g(x) is in B. It is clear that the event that U is in B occurs if and only if the event that X is in g^{-1}(B) occurs. Similarly, for any set C the events that V is in C and that Y is in h^{-1}(C) occur, or fail to occur, together. Consequently, by (6.1),

    P[U is in B, V is in C] = P[X is in g^{-1}(B), Y is in h^{-1}(C)]
                            = P[X is in g^{-1}(B)] P[Y is in h^{-1}(C)]
                            = P[U is in B] P[V is in C],

and the proof of theorem 6A is concluded.
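Theorem 6A can be illustrated exactly in the discrete case; the following sketch (the particular probability mass functions and the functions g(x) = x² and h(y) = |y| are illustrative assumptions) computes the joint pmf of (U, V) induced by independent X and Y and verifies that it factors:

```python
from fractions import Fraction
from itertools import product
from collections import defaultdict

# Assumed independent discrete X and Y.
p_X = {-1: Fraction(1, 4), 0: Fraction(1, 2), 1: Fraction(1, 4)}
p_Y = {-2: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}

g = lambda x: x * x   # U = g(X)
h = lambda y: abs(y)  # V = h(Y)

# Joint pmf of (U, V) induced by the product (independent) pmf of (X, Y).
p_UV = defaultdict(Fraction)
for x, y in product(p_X, p_Y):
    p_UV[(g(x), h(y))] += p_X[x] * p_Y[y]

# Marginal pmfs of U and V.
p_U = defaultdict(Fraction)
p_V = defaultdict(Fraction)
for (u, v), prob in p_UV.items():
    p_U[u] += prob
    p_V[v] += prob

# Theorem 6A: the joint pmf of (U, V) factors into the marginals.
assert all(p_UV[(u, v)] == p_U[u] * p_V[v] for u, v in p_UV)
print(dict(p_U), dict(p_V))
```

Note that g collapses x = ±1 to the same value of U, yet the factorization survives, exactly as the theorem asserts.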

Example 6B. Sound intensity is often measured in decibels. A reference level of intensity I_0 is adopted. Then a sound of intensity I is reported as having N decibels, where

    N = 10 log_10 (I / I_0).

Now, if I_1 and I_2 are the sound intensities at two different points on a city street, let N_1 and N_2 be the corresponding sound intensities measured in decibels. If the original sound intensities I_1 and I_2 are independent random variables, then from theorem 6A it follows that N_1 and N_2 are independent random variables.
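In code, the decibel transformation is a fixed Borel function of a single intensity; a minimal sketch (the reference level 10^-12 and the two sample intensities are assumed values for illustration):

```python
import math

def decibels(intensity, reference=1e-12):
    """Sound level N = 10 * log10(I / I0); I0 is an assumed reference level."""
    return 10 * math.log10(intensity / reference)

# Two intensities (assumed values) at different points on the street.
i1, i2 = 1e-6, 2.5e-5
n1, n2 = decibels(i1), decibels(i2)
print(round(n1, 2), round(n2, 2))  # -> 60.0 73.98

# Since each N depends only on its own I through a fixed Borel function,
# independence of I1 and I2 carries over to N1 and N2 (theorem 6A).
```

The point of theorem 6A is precisely that no further calculation is needed: the transformation acts on each variable separately.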

The foregoing notions extend at once to several jointly distributed random variables. We define n jointly distributed random variables X_1, X_2, …, X_n as independent if any one of the following equivalent conditions holds: (i) for any Borel sets B_1, B_2, …, B_n of real numbers,

    P[X_1 is in B_1, …, X_n is in B_n] = P[X_1 is in B_1] ··· P[X_n is in B_n];

(ii) for any real numbers x_1, x_2, …, x_n,

    F_{X_1, …, X_n}(x_1, …, x_n) = F_{X_1}(x_1) ··· F_{X_n}(x_n);

(iii) if the random variables are jointly continuous, then for any real numbers x_1, x_2, …, x_n,

    f_{X_1, …, X_n}(x_1, …, x_n) = f_{X_1}(x_1) ··· f_{X_n}(x_n);

(iv) if the random variables are jointly discrete, then for all real numbers x_1, x_2, …, x_n,

    p_{X_1, …, X_n}(x_1, …, x_n) = p_{X_1}(x_1) ··· p_{X_n}(x_n).
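Condition (iv) can be checked exactly for three variables; a minimal sketch with three independent fair coin indicators (an assumed example):

```python
from fractions import Fraction
from itertools import product

# Assumed example: three independent fair coin indicators X1, X2, X3,
# each with p(0) = p(1) = 1/2.
p = {0: Fraction(1, 2), 1: Fraction(1, 2)}

def joint_pmf(x1, x2, x3):
    # Condition (iv): the joint pmf is the product of the marginals.
    return p[x1] * p[x2] * p[x3]

# Every one of the 2^3 outcomes has probability 1/8, and they sum to 1.
total = sum(joint_pmf(*outcome) for outcome in product(p, p, p))
print(joint_pmf(1, 0, 1), total)  # -> 1/8 1
```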

Theoretical Exercises

6.1. Give an example of 3 random variables, X_1, X_2, X_3, which are independent when taken two at a time but not independent when taken together. Hint: Let A_1, A_2, A_3 be events that have the properties asserted; see example 1C of Chapter 3. Define X_i = 1 or 0, depending on whether the event A_i has or has not occurred.

6.2. Give an example of two random variables, X and Y, which are not independent, but such that X² and Y² are independent. Does such an example prove that the converse of theorem 6A is false?

6.3. Factorization rule for the probability density function of independent random variables. Show that n jointly continuous random variables X_1, …, X_n are independent if and only if their joint probability density function may be written, for all real numbers x_1, …, x_n, in the form

    f_{X_1, …, X_n}(x_1, …, x_n) = g_1(x_1) g_2(x_2) ··· g_n(x_n)

in terms of some Borel functions g_1(·), g_2(·), …, g_n(·).

Exercises

6.1. The output of a certain electronic apparatus is measured at 5 different times. Let X_1, …, X_5 be the observations obtained. Assume that X_1, …, X_5 are independent random variables, each Rayleigh distributed with parameter α. Find the probability that the maximum of the observations exceeds a given level c; that is, find P[max(X_1, …, X_5) > c]. (Recall that f_X(x) = (x/α²) exp(−x²/2α²) for x > 0 and is equal to 0 elsewhere.)
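A sketch of the computation for this exercise, under assumed values α = 1 and threshold c = 2 (the exercise's actual numbers are not recoverable here): for a Rayleigh variable with the density recalled above, P[X > c] = exp(−c²/2α²), so by independence P[max(X_1, …, X_5) ≤ c] = (1 − exp(−c²/2α²))^5.

```python
import math

# Assumed values for illustration only.
alpha = 1.0
c = 2.0
n = 5

# Rayleigh survival function: P[X > c] = exp(-c^2 / (2 alpha^2)).
surv = math.exp(-c * c / (2 * alpha * alpha))

# By independence, P[max(X1, ..., X5) <= c] = (1 - surv)^5,
# so the maximum exceeds c with the complementary probability.
p_max_exceeds = 1 - (1 - surv) ** n
print(round(p_max_exceeds, 4))
```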

 

Answer

.

 

6.2. Suppose 10 identical radar sets have a failure law following the exponential distribution. The sets operate independently of one another and have a known failure rate. What length of time will all 10 radar sets operate satisfactorily with a probability of 0.99?
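A sketch of the reasoning with an assumed mean life (the exercise's actual failure rate is not recoverable here, so θ = 1000 hours is a hypothetical value): if each lifetime is exponential with mean θ, the probability that all 10 sets survive past time t is e^(−10t/θ) by independence, and setting this equal to 0.99 gives t = −(θ/10) ln 0.99.

```python
import math

theta = 1000.0    # assumed mean life in hours (hypothetical value)
n_sets = 10
reliability = 0.99

# P[all n sets survive past t] = (e^(-t/theta))^n = e^(-n t / theta).
# Solve e^(-n t / theta) = reliability for t.
t = -(theta / n_sets) * math.log(reliability)
print(round(t, 3))  # length of time, in hours

# Check: plugging t back in recovers the target reliability.
assert abs(math.exp(-n_sets * t / theta) - reliability) < 1e-12
```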

6.3. Let X and Y be jointly continuous random variables, with a probability density function

(i) Are X and Y independent random variables?

(ii) Are X and Y identically distributed random variables?

(iii) Are X and Y normally distributed random variables?

(iv) Find . Hint : Use polar coordinates.

(v) Are and independent random variables? Hint: Use theorem 6A.

(vi) Find .

(vii) Find the individual probability density functions of and . [Use (8.8) .] 
(viii) Find the joint probability density function of and . [Use (6.3) .]

(ix) Would you expect that ?

(x) Would you expect that ?

 

Answer

(i) Yes; (ii) yes; (iii) yes; (iv) ; (v) yes; (vi) 0.8426; (vii) for otherwise; (viii) for ; = 0 otherwise; (ix) yes; (x) no.

 

6.4. Let X_1, X_2, and X_3 be independent random variables, each uniformly distributed on the interval 0 to 1. Determine the number a such that

(i) P[at least one of the numbers X_1, X_2, X_3 is greater than a] equals a given value;

(ii) P[at least 2 of the numbers X_1, X_2, X_3 are greater than a] equals a given value.

Hint : To obtain a numerical answer, use the table of binomial probabilities.

6.5. Consider two events A and B whose probabilities P[A], P[B], and P[AB] are given. Let the random variables X and Y be defined as follows: X = 1 or 0, depending on whether the event A has or has not occurred, and Y = 1 or 0, depending on whether the event B has or has not occurred. State whether each of the following statements is true or false:

(i) The random variables X and Y are independent;

(ii) ;

(iii) ;

(iv) The random variable is uniformly distributed on the interval 0 to 1;

(v) The random variables X and Y are identically distributed.

 

Answer

(i) True; (ii) false; (iii) true; (iv) false; (v) false.

 

6.6. Show that the two random variables X and Y considered in exercise 5.7 are independent if their joint probability mass function is given by Table 5A, and are dependent if their joint probability mass function is given by Table 5B.

In exercises 6.7 to 6.9 let X and Y be independent random variables, each uniformly distributed over the interval 0 to 1.

6.7 . Find (i) , (ii) .

 

Answer

(i) 0.125; (ii) 0.875.

 

6.8 . Find (i) , (ii) , (iii) .

6.9 . Find (i) , (ii) , (iii) .

 

Answer

(i) 0.393; (ii) ; (iii) .