Consider a sequence of jointly distributed random variables $Z_1, Z_2, \ldots, Z_n, \ldots$, together with a random variable $Z$ jointly distributed with them. There are several senses in which the sequence $\{Z_n\}$ may be said to converge to $Z$.
We consider first the notion of convergence with probability one. We say that the sequence $\{Z_n\}$ converges to $Z$ with probability one if
$$P\left[\lim_{n \to \infty} Z_n = Z\right] = 1.$$
The sequence $\{Z_n\}$ is said to converge to $Z$ in mean square if $E[|Z_n|^2]$ and $E[|Z|^2]$ are finite and
$$\lim_{n \to \infty} E\left[\,|Z_n - Z|^2\,\right] = 0.$$
The sequence $\{Z_n\}$ is said to converge to $Z$ in probability if, for every $\varepsilon > 0$,
$$\lim_{n \to \infty} P\left[\,|Z_n - Z| > \varepsilon\,\right] = 0. \tag{1.1}$$
Convergence in probability derives its importance from the fact that, like convergence with probability one, it can be defined without assuming that any moments exist, whereas convergence in mean square requires the existence of finite second moments. It is immediate that if convergence in mean square holds then so does convergence in probability; one need only consider the following form of Chebyshev's inequality: for any $\varepsilon > 0$,
$$P\left[\,|Z_n - Z| > \varepsilon\,\right] \le \frac{E\left[\,|Z_n - Z|^2\,\right]}{\varepsilon^2}. \tag{1.2}$$
If the right-hand side of (1.2) tends to zero as $n \to \infty$, then so does the left-hand side, which is precisely (1.1).
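As a numerical sanity check (an illustration, not part of the text), the following sketch estimates the left-hand side of (1.2) by Monte Carlo for a hypothetical sequence in which $Z_n$ is the mean of $n$ fair coin flips, so that $Z_n$ converges in mean square to $Z = 1/2$ with $E[|Z_n - 1/2|^2] = 1/(4n)$:

```python
import random

def estimate_tail_prob(n, eps, trials=20000, seed=0):
    """Monte Carlo estimate of P[|Z_n - 1/2| > eps], where Z_n is the
    mean of n fair coin flips (so E[|Z_n - 1/2|^2] = 1/(4n))."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) > eps:
            hits += 1
    return hits / trials

n, eps = 100, 0.1
prob = estimate_tail_prob(n, eps)
# Chebyshev bound (1.2): E[|Z_n - 1/2|^2] / eps^2 = 1/(4*100) / 0.01 = 0.25
chebyshev_bound = (1 / (4 * n)) / eps**2
```

The estimated probability (roughly 0.04 here) falls well below the Chebyshev bound of 0.25, as the inequality only guarantees an upper bound, not a sharp one.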
The relation that exists between convergence with probability one and convergence in probability is best understood by considering the following characterization of convergence with probability one, which we state without proof. Let $\{Z_n\}$ and $Z$ be jointly distributed random variables. Then the sequence $\{Z_n\}$ converges to $Z$ with probability one if and only if, for every $\varepsilon > 0$,
$$\lim_{n \to \infty} P\left[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,\right] = 0. \tag{1.3}$$
On the other hand, the sequence $\{Z_n\}$ converges to $Z$ in probability if and only if (1.1) holds for every $\varepsilon > 0$. Now the event $[\,|Z_n - Z| > \varepsilon\,]$ is contained in the event $[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,]$, so that
$$P\left[\,|Z_n - Z| > \varepsilon\,\right] \le P\left[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,\right],$$
and (1.3) implies (1.1). Thus, if a sequence converges with probability one, it also converges in probability; the converse, however, does not hold.
Convergence with probability one of the sequence $\{Z_n\}$ to $Z$ means that, for almost every point $\omega$ of the sample space on which the random variables are defined, the sequence of numbers $Z_n(\omega)$ converges to $Z(\omega)$. On the other hand, convergence in probability of the sequence asserts only that, for each large $n$ taken by itself, $Z_n$ is close to $Z$ with high probability; it says nothing about the joint behavior of $Z_n, Z_{n+1}, \ldots$, and in particular it does not guarantee that $Z_n(\omega)$ converges for any individual $\omega$.
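The classical example of this gap is the "typewriter" sequence of indicator functions on the unit interval with Lebesgue measure (an illustration not taken from the text): the measure of the set where $f_n = 1$ tends to zero, so $f_n \to 0$ in probability, yet for every $\omega$ the sequence $f_n(\omega)$ takes the value 1 infinitely often and hence converges for no $\omega$. A deterministic sketch:

```python
def block_index(n):
    """Return (k, j): index n falls at position j within the k-th block,
    where block k consists of the 2**k indices 2**k - 1, ..., 2**(k+1) - 2."""
    k = 0
    while n >= 2**k:
        n -= 2**k
        k += 1
    return k, n

def f(n, omega):
    """n-th 'typewriter' function: the indicator of the dyadic interval
    [j/2**k, (j+1)/2**k), which sweeps across [0, 1) as n runs through
    block k."""
    k, j = block_index(n)
    return 1 if j / 2**k <= omega < (j + 1) / 2**k else 0

def support_measure(n):
    """Lebesgue measure of the set where f_n = 1, namely 2**-k."""
    k, _ = block_index(n)
    return 2.0 ** -k

# The support measure tends to 0, so f_n -> 0 in probability ...
measures = [support_measure(n) for n in (0, 10, 1000, 10**6)]
# ... yet every omega lands in exactly one interval per block, so
# f_n(omega) = 1 once in each of blocks 0..9 below (10 times in all).
ones = sum(f(n, 0.3) for n in range(2**10 - 1))
```

Since each sweep of the interval hits every $\omega$ exactly once, no sample point ever sees a convergent sequence of values, even though the probability of the exceptional set shrinks to zero.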
The following theorem gives a condition under which convergence in mean square implies convergence with probability one.
Theorem 1A. If a sequence $\{Z_n\}$ of random variables converges to a random variable $Z$ in mean square rapidly enough that
$$\sum_{n=1}^{\infty} E\left[\,|Z_n - Z|^2\,\right] < \infty, \tag{1.4}$$
then the sequence converges to $Z$ with probability one.
Proof. For any $\varepsilon > 0$ and any $n$, the event $[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,]$ is contained in the union over $m \ge n$ of the events $[\,|Z_m - Z| > \varepsilon\,]$; hence, by Boole's inequality,
$$P\left[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,\right] \le \sum_{m=n}^{\infty} P\left[\,|Z_m - Z| > \varepsilon\,\right]. \tag{1.5}$$
Applying Chebyshev's inequality (1.2) to each term on the right-hand side of (1.5), we obtain
$$P\left[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,\right] \le \frac{1}{\varepsilon^2} \sum_{m=n}^{\infty} E\left[\,|Z_m - Z|^2\,\right]. \tag{1.6}$$
From (1.6) it follows that, under the hypothesis (1.4), for every $\varepsilon > 0$
$$\lim_{n \to \infty} P\left[\,|Z_m - Z| > \varepsilon \ \text{for some } m \ge n\,\right] = 0,$$
since the tail of a convergent series tends to zero. By the characterization (1.3), the sequence $\{Z_n\}$ converges to $Z$ with probability one, which completes the proof.
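To see the bound (1.6) behave numerically, suppose (hypothetically) that $E[|Z_m - Z|^2] = 1/m^2$, a rate satisfying (1.4) since $\sum 1/m^2 < \infty$. The sketch below evaluates a truncation of the right-hand side of (1.6) and shows it decreasing toward zero as $n$ grows, as the proof requires:

```python
def rhs_of_bound(n, eps, terms=10**5):
    """Truncated right-hand side of (1.6) for the hypothetical rate
    E[|Z_m - Z|^2] = 1/m**2: (1/eps**2) * sum_{m=n}^{n+terms-1} 1/m**2.
    Terms beyond the truncation contribute less than 1/(n + terms)."""
    tail = sum(1.0 / m**2 for m in range(n, n + terms))
    return tail / eps**2

eps = 0.1
# The tail sum behaves like 1/n, so the bound shrinks as n grows.
bounds = [rhs_of_bound(n, eps) for n in (10, 100, 1000)]
```

Note that for small $n$ the bound may exceed 1 and is vacuous; it is only its eventual decay to zero that matters for the conclusion of Theorem 1A.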