Independent Events and Families of Events

The notions of independent and dependent events play a central role in probability theory. Certain relations, which recur again and again in probability problems, may be given a general formulation in terms of these notions. If the events $A$ and $B$ have the property that the conditional probability of $A$, given $B$, is equal to the unconditional probability of $A$, one intuitively feels that the event $A$ is statistically independent of $B$, in the sense that the probability of $A$ having occurred is not affected by the knowledge that $B$ has occurred. We are thus led to the following formal definition.

Definition of an Event Being Independent of an Event Which Has Positive Probability. Let $A$ and $B$ be events defined on the same probability space. Assume $P[B] > 0$, so that $P[A \mid B]$ is well defined.

The event $A$ is said to be independent (or statistically independent) of the event $B$ if the conditional probability of $A$, given $B$, is equal to the unconditional probability of $A$; in symbols, $A$ is independent of $B$ if
\begin{align}
P[A \mid B] = P[A]. \tag{1.1}
\end{align}

Now suppose that both $A$ and $B$ have positive probability. Then both $P[A \mid B]$ and $P[B \mid A]$ are well defined, and from (4.6) of Chapter 2 it follows that
\begin{align}
P[AB] = P[A \mid B] \, P[B] = P[B \mid A] \, P[A]. \tag{1.2}
\end{align}

If $A$ is independent of $B$, it then follows that $B$ is independent of $A$, since from (1.1) and (1.2) it follows that $P[B \mid A] = P[B]$. It further follows from (1.1) and (1.2) that
\begin{align}
P[AB] = P[A] \, P[B]. \tag{1.3}
\end{align}

By means of (1.3), a definition may be given of two events being independent, in which the two events play a symmetrical role.

Definition of Independent Events. Let $A$ and $B$ be events defined on the same probability space. The events $A$ and $B$ are said to be independent if (1.3) holds.

Example 1A. Consider the problem of drawing with replacement a sample of size 2 from an urn containing four white and two red balls. Let $A$ denote the event that the first ball drawn is white and $B$, the event that the second ball drawn is white. By (2.5) in Chapter 2, $P[AB] = \frac{4 \cdot 4}{6 \cdot 6} = \frac{4}{9}$, whereas $P[A] = P[B] = \frac{4}{6} = \frac{2}{3}$. In view of (1.3), the events $A$ and $B$ are independent.
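The arithmetic of Example 1A can be checked by brute force (a sketch, not part of the original text): with replacement, the 36 ordered pairs of draws are equally likely, so exact probabilities follow by counting.

```python
from fractions import Fraction

# Urn of Example 1A: four white balls (W) and two red balls (R).
balls = ["W"] * 4 + ["R"] * 2

# With replacement, the 36 ordered pairs of draws are equally likely.
outcomes = [(b1, b2) for b1 in balls for b2 in balls]

def prob(event):
    # Probability of an event under equally likely descriptions.
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

p_A  = prob(lambda o: o[0] == "W")                    # first ball white
p_B  = prob(lambda o: o[1] == "W")                    # second ball white
p_AB = prob(lambda o: o[0] == "W" and o[1] == "W")    # both white

print(p_A, p_B, p_AB)     # 2/3 2/3 4/9
assert p_AB == p_A * p_B  # (1.3) holds: A and B are independent
```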

Two events that do not satisfy (1.3) are said to be dependent (although a more precise terminology would be nonindependent). Clearly, to say that two events are dependent is not very informative, for two events, $A$ and $B$, are dependent if and only if $P[AB] \neq P[A]P[B]$. However, it is possible to classify dependent events to a certain extent, and this is done later. (See section 5.)

It should be noted that two mutually exclusive events, $A$ and $B$, are independent if and only if $P[A]P[B] = 0$, which is so if and only if either $A$ or $B$ has probability zero.

Example 1B. Mutually exclusive events. Let a sample of size 2 be drawn from an urn containing six balls, of which four are white. Let $A$ denote the event that exactly one of the balls drawn is white, and let $B$ denote the event that both balls drawn are white. The events $A$ and $B$ are mutually exclusive and are not independent, whether the sample is drawn with or without replacement.

Example 1C. A paradox? Choose a summer day at random on which both the Dodgers and the Giants are playing baseball games. Let $A$ be the event that the Dodgers win, and let $B$ be the event that the Giants win. If the Dodgers and the Giants are not playing each other, then we may consider the events $A$ and $B$ as independent but not mutually exclusive. If the Giants and the Dodgers are playing each other, then we may consider the events $A$ and $B$ as mutually exclusive but not independent. To resolve this paradox, one need note only that the probability space on which the events $A$ and $B$ are defined is not the same in the two cases. (See example 2B.)

The notions of independent events and of conditional probability may be extended to more than two events. Suppose one has three events $A$, $B$, and $C$ defined on a probability space. What are we to mean by the conditional probability of the event $C$, given that the events $A$ and $B$ have occurred, denoted by $P[C \mid A, B]$? From the point of view of the frequency interpretation of probability, by $P[C \mid A, B]$ we mean the fraction of occurrences of both $A$ and $B$ on which $C$ also occurs. Consequently, we make the formal definition that
\begin{align}
P[C \mid A, B] = \frac{P[ABC]}{P[AB]} \tag{1.4}
\end{align}
if $P[AB] > 0$; $P[C \mid A, B]$ is undefined if $P[AB] = 0$.

Next, what do we mean by the statement that the event $C$ is independent of the events $A$ and $B$? It would seem that we should mean that the conditional probability of $C$, given either $A$ or $B$ or the intersection $AB$, is equal to the unconditional probability of $C$. We therefore make the following formal definition.

The events $A$, $B$, and $C$, defined on the same probability space, are said to be independent (or statistically independent) if
\begin{align}
P[AB] = P[A]P[B], \quad P[AC] = P[A]P[C], \quad P[BC] = P[B]P[C], \tag{1.5} \\
P[ABC] = P[A]P[B]P[C]. \tag{1.6}
\end{align}

If (1.5) and (1.6) hold, it then follows that (assuming that the events $A$, $B$, $C$ and their pairwise intersections have positive probability, so that the conditional probabilities written below are well defined) \begin{align} & P[A \mid B, C]=P[A \mid B]=P[A \mid C]=P[A] \\ & P[B \mid A, C]=P[B \mid A]=P[B \mid C]=P[B] \tag{1.7} \\ & P[C \mid A, B]=P[C \mid A]=P[C \mid B]=P[C]. \end{align}

Conversely, if all the relations in (1.7) hold, then all the relations in (1.5) and (1.6) hold.

It is to be emphasized that (1.5) does not imply (1.6), so that three events, $A$, $B$, and $C$, which are pairwise independent [in the sense that (1.5) holds], are not necessarily independent. To see this, consider the following example.

Example 1D. Pairwise independent events that are not independent. Let a ball be drawn from an urn containing four balls, numbered 1 to 4, so that the sample description space is $S = \{1, 2, 3, 4\}$. Assume that $S$ possesses equally likely descriptions. The events $A = \{1, 2\}$, $B = \{1, 3\}$, and $C = \{1, 4\}$ satisfy (1.5) but do not satisfy (1.6). Indeed, $P[ABC] = \frac{1}{4} \neq \frac{1}{8} = P[A]P[B]P[C]$. The reader may find it illuminating to explain in words why $P[A \mid B, C] = 1$.
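The claims of Example 1D can be verified mechanically (a sketch; the events $A = \{1,2\}$, $B = \{1,3\}$, $C = \{1,4\}$ are the standard choice for this construction):

```python
from fractions import Fraction
from itertools import combinations

S = {1, 2, 3, 4}                     # four equally likely descriptions
A, B, C = {1, 2}, {1, 3}, {1, 4}     # standard choice of events

def prob(E):
    # Probability of an event E under equally likely descriptions.
    return Fraction(len(E), len(S))

# Pairwise independence (1.5) holds for each of the three pairs ...
for X, Y in combinations([A, B, C], 2):
    assert prob(X & Y) == prob(X) * prob(Y)

# ... but (1.6) fails: P[ABC] = 1/4, while P[A]P[B]P[C] = 1/8.
assert prob(A & B & C) == Fraction(1, 4)
assert prob(A) * prob(B) * prob(C) == Fraction(1, 8)
```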

Example 1E. The joint credibility of witnesses. Consider an automobile accident on a city street in which car I stops suddenly and is hit from behind by car II. Suppose that three persons, whom we call 1, 2, and 3, witness the accident. Suppose the probability that each witness has correctly observed that car I stopped suddenly is estimated by having the witnesses observe a number of contrived incidents about which each is then questioned. Assume that it is found that person 1 has probability 0.9 of stating that car I stopped suddenly, person 2 has probability 0.8 of stating that car I stopped suddenly, and person 3 has probability 0.7 of stating that car I stopped suddenly. Let $A$, $B$, and $C$ denote, respectively, the events that persons 1, 2, and 3 will state that car I stopped suddenly. Assuming that $A$, $B$, and $C$ are independent events, what is the probability that (i) all three witnesses will state that car I stopped suddenly, (ii) exactly two of them will state that car I stopped suddenly?

 

Solution

By independence, the probability that all three witnesses will state that car I stopped suddenly is given by $P[ABC] = P[A]P[B]P[C] = (0.9)(0.8)(0.7) = 0.504$. It is subsequently shown that if $A$, $B$, and $C$ are independent events, then $A$, $B$, and $C^c$ (and, similarly, $A$, $B^c$, $C$ and $A^c$, $B$, $C$) are independent events. Consequently, the probability that exactly two of the witnesses will state that car I stopped suddenly is given by
\begin{align}
P[ABC^c] + P[AB^cC] + P[A^cBC] = (0.9)(0.8)(0.3) + (0.9)(0.2)(0.7) + (0.1)(0.8)(0.7) = 0.398.
\end{align}

The probability that at least two of the witnesses will state that car I stopped suddenly is $0.504 + 0.398 = 0.902$. It should be noted that the sample description space on which the events $A$, $B$, and $C$ are defined is the space of 3-tuples $(s_1, s_2, s_3)$, in which $s_1$ is equal to "yes" or "no," depending on whether person 1 says that car I did or did not stop suddenly; components $s_2$ and $s_3$ are defined similarly with respect to persons 2 and 3.
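The arithmetic of Example 1E can be confirmed by enumerating the eight "yes"/"no" 3-tuples of the sample description space (a sketch under the stated independence assumption):

```python
from itertools import product

p = {"A": 0.9, "B": 0.8, "C": 0.7}   # P[A], P[B], P[C] from Example 1E

def prob_of(n_yes):
    # Sum the probabilities of all 3-tuples with exactly n_yes "yes" components;
    # independence makes each tuple's probability a product of three factors.
    total = 0.0
    for says in product([True, False], repeat=3):
        if sum(says) == n_yes:
            q = 1.0
            for witness, said_yes in zip("ABC", says):
                q *= p[witness] if said_yes else 1 - p[witness]
            total += q
    return total

print(round(prob_of(3), 3))                # 0.504: all three state it
print(round(prob_of(2), 3))                # 0.398: exactly two state it
print(round(prob_of(3) + prob_of(2), 3))   # 0.902: at least two state it
```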

 

We next define the notions of independence and of conditional probability for $n$ events $A_1, A_2, \ldots, A_n$.

We define the conditional probability of $A_n$, given that the events $A_1, A_2, \ldots, A_{n-1}$ have occurred, denoted by $P[A_n \mid A_1, A_2, \ldots, A_{n-1}]$, by
\begin{align} P\left[A_{n} \mid A_{1}, A_{2}, \ldots, A_{n-1}\right] & =P\left[A_{n} \mid A_{1} A_{2} \cdots A_{n-1}\right] \tag{1.8} \\ & =\frac{P\left[A_{1} A_{2} \cdots A_{n}\right]}{P\left[A_{1} A_{2} \cdots A_{n-1}\right]} \end{align} if $P[A_1 A_2 \cdots A_{n-1}] > 0$.

We define the events $A_1, A_2, \ldots, A_n$ as independent (or statistically independent) if, for every choice of $k$ integers $i_1 < i_2 < \cdots < i_k$ from 1 to $n$ (where $k = 2, 3, \ldots, n$),
\begin{align}
P[A_{i_1} A_{i_2} \cdots A_{i_k}] = P[A_{i_1}] \, P[A_{i_2}] \cdots P[A_{i_k}]. \tag{1.9}
\end{align}

Equation (1.9) implies that for any choice of $k$ integers $i_1, i_2, \ldots, i_k$ from 1 to $n$ (for which the following conditional probability is defined) and for any integer $j$ from 1 to $n$ not equal to $i_1, i_2, \ldots, i_k$, one has
\begin{align}
P[A_j \mid A_{i_1}, A_{i_2}, \ldots, A_{i_k}] = P[A_j]. \tag{1.10}
\end{align}
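Definition (1.9) requires the product rule for every subset of two or more of the events, not merely for the full intersection. For events given as subsets of a finite space with equally likely descriptions, the check over all subsets might be sketched as follows (an illustrative helper, not from the original text):

```python
from fractions import Fraction
from itertools import combinations

def independent(events, S):
    """Check (1.9): every choice of k >= 2 of the events satisfies the product rule."""
    def prob(E):
        return Fraction(len(E), len(S))
    for k in range(2, len(events) + 1):
        for subset in combinations(events, k):
            intersection = set(S)
            for E in subset:
                intersection &= E
            lhs = prob(intersection)
            rhs = Fraction(1)
            for E in subset:
                rhs *= prob(E)
            if lhs != rhs:
                return False
    return True

# The events of Example 1D pass every pairwise check but fail (1.9) for k = 3.
S = {1, 2, 3, 4}
assert independent([{1, 2}, {1, 3}], S)
assert not independent([{1, 2}, {1, 3}, {1, 4}], S)
```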

We next consider families of independent events, for independent events never occur alone. Let $\mathcal{A}$ and $\mathcal{B}$ be two families of events; that is, $\mathcal{A}$ and $\mathcal{B}$ are sets whose members are events on some sample description space $S$.

Two families of events $\mathcal{A}$ and $\mathcal{B}$ are said to be independent if any two events $A$ and $B$, selected from $\mathcal{A}$ and $\mathcal{B}$, respectively, are independent. More generally, $k$ families of events $\mathcal{A}_1, \mathcal{A}_2, \ldots, \mathcal{A}_k$ are said to be independent if any set of $k$ events $A_1, A_2, \ldots, A_k$ (where $A_1$ is selected from $\mathcal{A}_1$, $A_2$ is selected from $\mathcal{A}_2$, and so on, until $A_k$ is selected from $\mathcal{A}_k$) is independent, in the sense that it satisfies the relation
\begin{align}
P[A_1 A_2 \cdots A_k] = P[A_1] \, P[A_2] \cdots P[A_k]. \tag{1.11}
\end{align}

As an illustration of the fact that independent events occur in families, let us consider two independent events, $A$ and $B$, which are defined on a sample description space $S$. Define the families $\mathcal{A}$ and $\mathcal{B}$ by
\begin{align}
\mathcal{A} = \{A, A^c, S, \emptyset\}, \qquad \mathcal{B} = \{B, B^c, S, \emptyset\}, \tag{1.12}
\end{align}
so that $\mathcal{A}$ consists of $A$, its complement $A^c$, the certain event $S$, and the impossible event $\emptyset$, and, similarly, $\mathcal{B}$ consists of $B$, $B^c$, $S$, and $\emptyset$.

We now show that if the events $A$ and $B$ are independent, then the families of events $\mathcal{A}$ and $\mathcal{B}$ defined by (1.12) are independent. In order to prove this assertion, we must verify the validity of (1.11) with $k = 2$ for each pair of events, one from each family, that may be chosen. Since each family has four members, there are sixteen such pairs. We verify (1.11) for only four of these pairs, namely $(A, B)$, $(A, B^c)$, $(A, S)$, and $(A, \emptyset)$, and leave to the reader the verification of (1.11) for the remaining twelve pairs. We have that $A$ and $B$ satisfy (1.11) by hypothesis. Next, we show that $A$ and $B^c$ satisfy (1.11). By (5.2) of Chapter 1, $P[AB^c] = P[A] - P[AB]$. Since, by hypothesis, $P[AB] = P[A]P[B]$, it follows that $P[AB^c] = P[A](1 - P[B]) = P[A]P[B^c]$, for $P[B^c] = 1 - P[B]$ by (5.3) of Chapter 1. Next, $A$ and $S$ satisfy (1.11), since $AS = A$ and $P[S] = 1$, so that $P[AS] = P[A] = P[A]P[S]$. Next, $A$ and $\emptyset$ satisfy (1.11), since $A\emptyset = \emptyset$ and $P[\emptyset] = 0$, so that $P[A\emptyset] = 0 = P[A]P[\emptyset]$.
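The sixteen verifications can also be delegated to a short script (a sketch over a finite space with equally likely descriptions; the particular events chosen here, two independent events built from two tosses of a die, are illustrative assumptions):

```python
from fractions import Fraction

S = set(range(36))                   # descriptions of two tosses of a die
A = {s for s in S if s // 6 == 2}    # first toss shows a 3
B = {s for s in S if s % 6 == 2}     # second toss shows a 3

def prob(E):
    return Fraction(len(E), len(S))

# The two families of (1.12): the event, its complement, S, and the empty set.
fam_A = [A, S - A, S, set()]
fam_B = [B, S - B, S, set()]

# A and B are independent, so all 16 pairs satisfy the product rule (1.11).
assert prob(A & B) == prob(A) * prob(B)
for X in fam_A:
    for Y in fam_B:
        assert prob(X & Y) == prob(X) * prob(Y)
```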

More generally, by the same considerations, we may prove the following important theorem, which expresses (1.9) in a very concise form.

Theorem. Let $A_1, A_2, \ldots, A_n$ be $n$ events on a probability space. The events $A_1, A_2, \ldots, A_n$ are independent if and only if the families of events $\mathcal{A}_1, \mathcal{A}_2, \ldots, \mathcal{A}_n$, where $\mathcal{A}_i = \{A_i, A_i^c, S, \emptyset\}$, are independent.

Theoretical Exercises

1.1. Consider $n$ independent events $A_1, A_2, \ldots, A_n$. Show that
\begin{align}
P[A_1 \cup A_2 \cup \cdots \cup A_n] = 1 - P[A_1^c] \, P[A_2^c] \cdots P[A_n^c].
\end{align}
Consequently, obtain the probability that in 6 independent tosses of a fair die the number 3 will appear at least once. Answer: $1 - \left(\frac{5}{6}\right)^6 = 0.665$, approximately.
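The numerical answer stated in exercise 1.1 can be confirmed in one line (a sketch):

```python
# At least one 3 in 6 independent tosses of a fair die: by the identity of
# exercise 1.1, this is 1 minus the probability of no 3 on any toss.
p = 1 - (5 / 6) ** 6
print(round(p, 3))   # 0.665
```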

1.2. Let the events $A_1, A_2, \ldots, A_n$ be independent, and let $p_k = P[A_k]$ for $k = 1, 2, \ldots, n$. Let $p$ be the probability that none of the events will occur. Show that $p = (1 - p_1)(1 - p_2) \cdots (1 - p_n)$.

1.3. Let the events $A_1, A_2, \ldots, A_n$ be independent and have equal probability $p$. Show that the probability that exactly $k$ of the events will occur is (for $k = 0, 1, \ldots, n$)
\begin{align}
\binom{n}{k} p^{k} (1-p)^{n-k}.
\end{align}

Hint: there are $\binom{n}{k}$ ways of choosing which $k$ of the $n$ events occur, and each such choice has probability $p^k (1-p)^{n-k}$.

1.4. The multiplicative rule for the probability of the intersection of $n$ events. Show that, for $n$ events $A_1, A_2, \ldots, A_n$ for which $P[A_1 A_2 \cdots A_{n-1}] > 0$,
\begin{align}
P[A_1 A_2 \cdots A_n] = P[A_1] \, P[A_2 \mid A_1] \, P[A_3 \mid A_1 A_2] \cdots P[A_n \mid A_1 A_2 \cdots A_{n-1}].
\end{align}

1.5. Let $A_1$ and $A_2$ be independent events. In terms of $P[A_1]$ and $P[A_2]$, express, for $k = 0, 1, 2$, (i) P[exactly $k$ of the events $A_1$ and $A_2$ will occur], (ii) P[at least $k$ of the events $A_1$ and $A_2$ will occur], (iii) P[at most $k$ of the events $A_1$ and $A_2$ will occur].

1.6. Let $A_1$, $A_2$, and $A_3$ be independent events. In terms of $P[A_1]$, $P[A_2]$, and $P[A_3]$, express, for $k = 0, 1, 2, 3$, (i) P[exactly $k$ of the events will occur], (ii) P[at least $k$ of the events will occur], (iii) P[at most $k$ of the events will occur].

Exercises

1.1. Let a sample of size 4 be drawn with replacement (without replacement) from an urn containing 6 balls, of which 4 are white. Let $A$ denote the event that the ball drawn on the first draw is white, and let $B$ denote the event that the ball drawn on the fourth draw is white. Are $A$ and $B$ independent? Prove your answers.

 

Answer

Yes, since $P[AB] = \frac{4 \cdot 4}{6 \cdot 6} = \frac{4}{9}$ and $P[A]P[B] = \frac{2}{3} \cdot \frac{2}{3} = \frac{4}{9}$.

(No, since $P[AB] = \frac{4 \cdot 3}{6 \cdot 5} = \frac{2}{5}$ and $P[A]P[B] = \frac{2}{3} \cdot \frac{2}{3} = \frac{4}{9}$.)

1.2. Let a sample of size 4 be drawn with replacement (without replacement) from an urn containing 6 balls, of which 4 are white. Let $A$ denote the event that exactly 1 of the balls drawn on the first 2 draws is white. Let $B$ be the event that the ball drawn on the fourth draw is white. Are $A$ and $B$ independent? Prove your answers.

1.3. (Continuation of 1.2). Let $A$ and $B$ be as defined in exercise 1.2. Let $C$ be the event that exactly 2 white balls are drawn in the 4 draws. Are $A$, $B$, and $C$ independent? Are $A$ and $C$ independent? Prove your answers.

 

Answer

No.

 

1.4. Consider example 1E. Find the probability that (i) both persons 1 and 2 will state that car I stopped suddenly, (ii) neither person 1 nor person 2 will state that car I stopped suddenly, (iii) at least 1 of persons 1, 2, and 3 will state that car I stopped suddenly.

1.5. A manufacturer of sports cars enters 3 drivers in a race. Let $A_1$ be the event that driver 1 "shows" (that is, he is among the first 3 drivers in the race to cross the finish line), let $A_2$ be the event that driver 2 shows, and let $A_3$ be the event that driver 3 shows. Assume that the events $A_1$, $A_2$, $A_3$ are independent and that $P[A_1] = P[A_2] = P[A_3] = 0.1$. Compute the probability that (i) none of the drivers will show, (ii) at least 1 will show, (iii) at least 2 will show, (iv) all of them will show.

 

Answer

(i) 0.729; (ii) 0.271; (iii) 0.028; (iv) 0.001.

 

1.6 . Compute the probabilities asked for in exercise 1.5 under the assumption that .

1.7. A manufacturer of sports cars enters $n$ drivers in a race. For $i = 1, 2, \ldots, n$ let $A_i$ be the event that the $i$th driver shows (see exercise 1.5). Assume that the events $A_1, A_2, \ldots, A_n$ are independent and have equal probability $p$. Show that the probability that exactly $k$ of the $n$ drivers will show is $\binom{n}{k} p^k (1-p)^{n-k}$ for $k = 0, 1, \ldots, n$.

1.8. Suppose you have to choose a team of 3 persons to enter a race. The rules of the race are that a team must consist of 3 people whose respective probabilities $p_1$, $p_2$, $p_3$ of showing add up to a given constant $c$; that is, $p_1 + p_2 + p_3 = c$. What probabilities of showing would you desire the members of your team to have in order to maximize the probability that at least 1 member of your team will show? (Assume independence.)

1.9. Let $A$ and $B$ be 2 independent events such that the probability is $\frac{1}{6}$ that they will occur simultaneously and $\frac{1}{3}$ that neither of them will occur. Find $P[A]$ and $P[B]$; are $P[A]$ and $P[B]$ uniquely determined?

 

Answer

Possible values for $P[A]$ and $P[B]$ are $P[A] = \frac{1}{2}$, $P[B] = \frac{1}{3}$ and $P[A] = \frac{1}{3}$, $P[B] = \frac{1}{2}$.

 

1.10. Let $A$ and $B$ be 2 independent events such that the probability is $\frac{1}{6}$ that they will occur simultaneously and $\frac{1}{3}$ that $A$ will occur and $B$ will not occur. Find $P[A]$ and $P[B]$; are $P[A]$ and $P[B]$ uniquely determined?