The notion of independence of two random variables leads to important simplifications in the computation of expectations. We begin by considering the properties of expectations of products of random variables. Let X and Y be jointly distributed random variables.
Theorem 3A: If the random variables X and Y are independent, then for any Borel functions g and h,

$$E[g(X)h(Y)] = E[g(X)]\,E[h(Y)], \tag{3.2}$$

if the expectations on the right side of (3.2) exist.
To prove equation (3.2), it suffices to prove it in the form

$$E[XY] = E[X]\,E[Y],$$

since independence of X and Y implies independence of the random variables g(X) and h(Y).
Now suppose that we modify (3.2) and ask only that it hold for the functions g(x) = x and h(y) = y; that is, we ask only that

$$E[XY] = E[X]\,E[Y]. \tag{3.4}$$

For reasons that are explained after (3.7), two random variables X and Y satisfying (3.4) are said to be uncorrelated.
For uncorrelated random variables the formula given by (2.11) for the variance of the sum of two random variables becomes particularly elegant; the variance of the sum of two uncorrelated random variables is equal to the sum of their variances. Indeed, by (2.11),

$$\operatorname{Var}[X + Y] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X, Y],$$

so that

$$\operatorname{Var}[X + Y] = \operatorname{Var}[X] + \operatorname{Var}[Y]$$

if and only if X and Y are uncorrelated (that is, if and only if Cov[X, Y] = 0).
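The additivity of variance for uncorrelated random variables is easy to verify numerically. The following sketch (a hypothetical example, not from the text) computes Var[X + Y] from the joint distribution of two independent discrete random variables and compares it with Var[X] + Var[Y]:

```python
from itertools import product

# Hypothetical example: X uniform on {1, 2}, Y uniform on {1, 2, 3},
# taken to be independent (hence uncorrelated).
px = {1: 0.5, 2: 0.5}
py = {1: 1/3, 2: 1/3, 3: 1/3}

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# Distribution of X + Y, built from the product joint probability function.
sum_pmf = {}
for (x, p1), (y, p2) in product(px.items(), py.items()):
    sum_pmf[x + y] = sum_pmf.get(x + y, 0.0) + p1 * p2

# Var[X + Y] agrees with Var[X] + Var[Y].
print(var(sum_pmf), var(px) + var(py))
```

Here Var[X] = 1/4 and Var[Y] = 2/3, and both computations give 11/12.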
Two random variables that are independent are uncorrelated, for if (3.2) holds then, a fortiori, (3.4) holds. The converse is not true in general; an example of two uncorrelated random variables that are not independent is given in theoretical exercise 3.2. In the important special case in which X and Y are jointly normally distributed, however, it may be shown that X and Y are independent if and only if they are uncorrelated.
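A standard counterexample showing that uncorrelated does not imply independent (not necessarily the one intended in theoretical exercise 3.2) takes X uniform on {−1, 0, 1} and Y = X². A minimal numerical check:

```python
# Hypothetical example: X uniform on {-1, 0, 1}, Y = X**2.
support = [-1, 0, 1]
p = 1 / 3

ex = sum(x * p for x in support)            # E[X] = 0
ey = sum(x ** 2 * p for x in support)       # E[Y] = E[X^2] = 2/3
exy = sum(x * x ** 2 * p for x in support)  # E[XY] = E[X^3] = 0

cov = exy - ex * ey
print(cov)  # 0.0, so X and Y are uncorrelated

# Yet they are not independent:
# P[X = 0, Y = 0] = 1/3, while P[X = 0] * P[Y = 0] = (1/3) * (1/3).
print(1 / 3 == (1 / 3) * (1 / 3))  # False
```

The value of Y is completely determined by X, yet their covariance vanishes because the symmetry of X about 0 kills E[XY].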
The correlation coefficient ρ(X, Y) of two jointly distributed random variables X and Y, whose variances are positive, is defined by

$$\rho(X, Y) = \frac{\operatorname{Cov}[X, Y]}{\sigma_X\,\sigma_Y}. \tag{3.7}$$

In view of (3.7) and (3.5), two random variables X and Y are uncorrelated if and only if ρ(X, Y) = 0.
The correlation coefficient provides a measure of how good a prediction of the value of one of the random variables can be formed on the basis of an observed value of the other. It is subsequently shown that

$$-1 \le \rho(X, Y) \le 1. \tag{3.8}$$

Further

$$\rho(X, Y) = 1 \quad\text{if and only if}\quad P[Y = aX + b] = 1 \text{ for some constants } a > 0 \text{ and } b, \tag{3.9}$$

and

$$\rho(X, Y) = -1 \quad\text{if and only if}\quad P[Y = aX + b] = 1 \text{ for some constants } a < 0 \text{ and } b. \tag{3.10}$$

From (3.9) and (3.10) it follows that if the correlation coefficient equals 1 or −1 then there is perfect prediction; to a given value of one of the random variables there is one and only one value that the other random variable can assume. What is even more striking is that the prediction is then linear: with probability one, one of the random variables is a linear function of the other.
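That a linear relation with positive slope forces the correlation coefficient to equal 1 can be checked directly. The following sketch uses a hypothetical discrete X and sets Y = 2X + 3:

```python
import math

# Hypothetical example: X uniform on {0, 1, 2} and Y = 2 * X + 3,
# a linear function with positive slope, so rho(X, Y) should equal 1.
xs = [0, 1, 2]
p = 1 / 3
ys = [2 * x + 3 for x in xs]

ex = sum(x * p for x in xs)
ey = sum(y * p for y in ys)
cov = sum((x - ex) * (y - ey) * p for x, y in zip(xs, ys))
sigma_x = math.sqrt(sum((x - ex) ** 2 * p for x in xs))
sigma_y = math.sqrt(sum((y - ey) ** 2 * p for y in ys))

rho = cov / (sigma_x * sigma_y)
print(rho)  # 1.0, up to floating-point rounding
```

Replacing the slope 2 by any negative constant yields ρ = −1 instead.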
Relations (3.8), (3.9), and (3.10) are consequences of the following important theorem.
Theorem 3B. For any two jointly distributed random variables X and Y whose second moments exist,

$$(E[XY])^2 \le E[X^2]\,E[Y^2]. \tag{3.11}$$

Applied to the random variables X − E[X] and Y − E[Y], inequality (3.11) yields

$$(\operatorname{Cov}[X, Y])^2 \le \operatorname{Var}[X]\,\operatorname{Var}[Y]. \tag{3.12}$$
We prove (3.11) as follows. Define, for any real number t,

$$q(t) = E[(tX + Y)^2] = t^2 E[X^2] + 2t\,E[XY] + E[Y^2].$$

Since q(t) ≥ 0 for every real t, the quadratic in t has at most one real root, so its discriminant must be nonpositive:

$$(2E[XY])^2 - 4\,E[X^2]\,E[Y^2] \le 0,$$

which is precisely (3.11).
The inequalities given by (3.11) and (3.12) are usually referred to as Schwarz’s inequality or Cauchy’s inequality.
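Schwarz's inequality is easy to spot-check numerically. The following sketch evaluates both sides of (3.11) for a small, arbitrarily chosen joint probability function (the numbers are hypothetical):

```python
# Hypothetical joint probability function of (X, Y) on {1, 2} x {1, 2}.
joint = {(1, 1): 0.2, (1, 2): 0.3, (2, 1): 0.4, (2, 2): 0.1}

exy = sum(x * y * p for (x, y), p in joint.items())
ex2 = sum(x ** 2 * p for (x, y), p in joint.items())
ey2 = sum(y ** 2 * p for (x, y), p in joint.items())

# Schwarz's inequality: (E[XY])^2 <= E[X^2] * E[Y^2].
print(exy ** 2, ex2 * ey2)
print(exy ** 2 <= ex2 * ey2)  # True
```

Here (E[XY])² = 4.0 while E[X²]E[Y²] = 5.5, so the inequality is strict; equality would require the discriminant argument's quadratic to have a double root, that is, a linear relation between X and Y.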
Conditions for Independence. It is important to note the difference between two random variables being independent and being uncorrelated. They are uncorrelated if and only if (3.4) holds. It may be shown that they are independent if and only if (3.2) holds for all Borel functions g and h for which the expectations exist. Other equivalent criteria are collected in the following theorem.
Theorem 3C. Two jointly distributed random variables X and Y are independent if and only if any one of the following criteria is satisfied.

(i) Criterion in terms of probability functions. For any Borel sets A and B,

$$P[X \in A,\, Y \in B] = P[X \in A]\,P[Y \in B].$$

(ii) Criterion in terms of distribution functions. For any two real numbers x and y,

$$F_{X,Y}(x, y) = F_X(x)\,F_Y(y).$$

(iii) Criterion in terms of expectations. For any two Borel functions g and h whose expectations exist,

$$E[g(X)h(Y)] = E[g(X)]\,E[h(Y)].$$

(iv) Criterion in terms of moment-generating functions (if they exist). For any two real numbers t₁ and t₂,

$$\psi_{X,Y}(t_1, t_2) = \psi_X(t_1)\,\psi_Y(t_2).$$
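Criterion (iv) can be illustrated for two independent discrete random variables: the joint moment-generating function, computed from the product joint probability function, factors into the product of the marginal moment-generating functions. (The distributions below are hypothetical.)

```python
import math

# Hypothetical independent pair: X uniform on {0, 1}, Y uniform on {0, 1, 2}.
px = {0: 0.5, 1: 0.5}
py = {0: 1/3, 1: 1/3, 2: 1/3}

def mgf(pmf, t):
    # Moment-generating function E[e^{tZ}] of a discrete random variable.
    return sum(p * math.exp(t * v) for v, p in pmf.items())

def joint_mgf(t1, t2):
    # Joint mgf E[e^{t1 X + t2 Y}] under the product joint probability function.
    return sum(p1 * p2 * math.exp(t1 * x + t2 * y)
               for x, p1 in px.items() for y, p2 in py.items())

t1, t2 = 0.3, -0.7
print(abs(joint_mgf(t1, t2) - mgf(px, t1) * mgf(py, t2)) < 1e-12)  # True
```

The factorization holds at every pair (t₁, t₂) at which the moment-generating functions exist; checking a single point, as above, is only a sanity check, not a proof of independence.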
Theoretical Exercises
3.1. The standard deviation has the properties of the operation of taking the absolute value of a number: show first that for any 2 real numbers,
Hint: Square both sides of the equations. Show next that for any 2 random variables,
Give an example to prove that the variance does not satisfy similar relationships.
3.2. Show that independent random variables are uncorrelated. Give an example to show that the converse is false.
Hint: Let
3.3. Prove that if
3.4. Let
Express
3.6. Let
3.7. Let
3.8. Suppose that
3.9. In an urn containing
Exercises
3.1. Consider 2 events
Answer
3.2. Consider a sample of size 2 drawn with replacement (without replacement) from an urn containing 4 balls, numbered 1 to 4. Let
3.3. Two fair coins, each with faces numbered 1 and 2, are thrown independently. Let
Answer
3.4. Let
3.5. Let
Answer
3.6. Let
3.7. Consider the random variables whose joint moment-generating function is given in exercise 2.6. Find
Answer
3.8. Consider the random variables whose joint moment-generating function is given in exercise 2.7. Find
3.9. Consider the random variables whose joint moment-generating function is given in exercise 2.8. Find
Answer
3.10. Consider the random variables whose joint moment-generating function is given in exercise 2.9. Find