Solution of the Problem of the Addition of Independent Random Variables by the Method of Characteristic Functions

By the use of characteristic functions, we may give a solution to the problem of the addition of independent random variables. Let $X_1, X_2, \ldots, X_n$ be independent random variables, with respective characteristic functions $\phi_{X_1}(u), \phi_{X_2}(u), \ldots, \phi_{X_n}(u)$. Let $S_n = X_1 + X_2 + \cdots + X_n$ be their sum. To know the probability law of $S_n$, it suffices to know its characteristic function $\phi_{S_n}(u)$. However, it is immediate, from the properties of independent random variables, that for every real number $u$

$$\phi_{S_n}(u) = E\!\left[e^{iuS_n}\right] = \prod_{k=1}^{n} E\!\left[e^{iuX_k}\right] = \phi_{X_1}(u)\,\phi_{X_2}(u)\cdots\phi_{X_n}(u), \tag{4.1}$$

or, equivalently, $\log \phi_{S_n}(u) = \log \phi_{X_1}(u) + \cdots + \log \phi_{X_n}(u)$. Thus, in terms of characteristic functions, the problem of the addition of independent random variables is given by (4.1) a simple and concise solution, which may also be stated in words: the probability law of a sum of independent random variables has as its characteristic function the product of the characteristic functions of the individual random variables.
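The multiplicative rule can be checked numerically. The following Python sketch (illustrative only; the two discrete probability laws are arbitrary choices, not taken from the text) computes the characteristic function of a sum of two independent discrete random variables in two ways: directly, from the convolved probability mass function, and as the product of the individual characteristic functions.

```python
import cmath

def char_fn(pmf, u):
    """Characteristic function E[e^{iuX}] of a discrete pmf {value: prob}."""
    return sum(p * cmath.exp(1j * u * x) for x, p in pmf.items())

def pmf_of_sum(pmf1, pmf2):
    """Pmf of the sum of two independent discrete random variables (convolution)."""
    out = {}
    for x1, p1 in pmf1.items():
        for x2, p2 in pmf2.items():
            out[x1 + x2] = out.get(x1 + x2, 0.0) + p1 * p2
    return out

# Two arbitrary discrete probability laws (hypothetical, for illustration).
die1 = {1: 0.3, 2: 0.2, 3: 0.5}
die2 = {1: 0.25, 4: 0.75}
sum_pmf = pmf_of_sum(die1, die2)

for u in (0.0, 0.7, 2.3):
    lhs = char_fn(sum_pmf, u)                  # phi_{X+Y}(u), computed directly
    rhs = char_fn(die1, u) * char_fn(die2, u)  # phi_X(u) * phi_Y(u), by (4.1)
    assert abs(lhs - rhs) < 1e-12
```

The two sides agree to floating-point precision, as (4.1) requires.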

In this section we consider certain cases in which (4.1) leads to an exact evaluation of the probability law of $S_n$. In Chapter 10 we show how (4.1) may be used to give a general approximate evaluation of the probability law of $S_n$.

There are various ways, given the characteristic function $\phi_{S_n}(u)$ of the sum $S_n$, in which one can deduce from it the probability law of $S_n$.

It may happen that $\phi_{S_n}(u)$ will coincide with the characteristic function of a known probability law. For example, for each $k = 1, 2, \ldots, n$ suppose that $X_k$ is normally distributed with mean $m_k$ and variance $\sigma_k^2$. Then $\phi_{X_k}(u) = \exp\!\left(ium_k - \tfrac{1}{2}u^2\sigma_k^2\right)$, and, by (4.1),

$$\phi_{S_n}(u) = \exp\!\left(iu\sum_{k=1}^{n} m_k - \frac{1}{2}u^2\sum_{k=1}^{n}\sigma_k^2\right).$$

We recognize $\phi_{S_n}(u)$ as the characteristic function of the normal distribution with mean $m_1 + m_2 + \cdots + m_n$ and variance $\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2$. Therefore, the sum $S_n$ is normally distributed with mean $\sum_{k=1}^{n} m_k$ and variance $\sum_{k=1}^{n}\sigma_k^2$. By using arguments of this type, we have the following theorem.
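The identity just used can be verified numerically. The sketch below (illustrative; the parameter values are arbitrary choices) checks that the product of normal characteristic functions $\exp(ium_k - \tfrac{1}{2}u^2\sigma_k^2)$ equals the characteristic function of a single normal law with the summed mean and variance.

```python
import cmath

def normal_cf(u, m, var):
    """Characteristic function of a normal law with mean m and variance var."""
    return cmath.exp(1j * u * m - 0.5 * u * u * var)

# Arbitrary (m_k, sigma_k^2) pairs, chosen for illustration.
params = [(1.0, 2.0), (-0.5, 0.25), (3.0, 1.5)]
m_tot = sum(m for m, _ in params)   # summed mean
v_tot = sum(v for _, v in params)   # summed variance

for u in (0.0, 0.4, 1.9):
    prod = 1.0 + 0j
    for m, v in params:
        prod *= normal_cf(u, m, v)  # product over summands, as in (4.1)
    assert abs(prod - normal_cf(u, m_tot, v_tot)) < 1e-12
```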

Theorem 4A. Let $S_n = X_1 + X_2 + \cdots + X_n$ be the sum of $n$ independent random variables.

(i) If, for $k = 1, \ldots, n$, $X_k$ is $N(m_k, \sigma_k^2)$, then $S_n$ is $N\!\left(m_1 + \cdots + m_n,\ \sigma_1^2 + \cdots + \sigma_n^2\right)$.
(ii) If, for $k = 1, \ldots, n$, $X_k$ is binomially distributed with parameters $n_k$ and $p$, then $S_n$ is binomially distributed with parameters $n_1 + \cdots + n_n$ and $p$.
(iii) If, for $k = 1, \ldots, n$, $X_k$ is Poisson distributed with parameter $\lambda_k$, then $S_n$ is Poisson distributed with parameter $\lambda_1 + \cdots + \lambda_n$.
(iv) If, for $k = 1, \ldots, n$, $X_k$ is $\chi^2$ distributed with $r_k$ degrees of freedom, then $S_n$ is $\chi^2$ distributed with $r_1 + \cdots + r_n$ degrees of freedom.
(v) If, for $k = 1, \ldots, n$, $X_k$ is Cauchy distributed with parameters $\alpha_k$ and $\beta_k$, then $S_n$ is Cauchy distributed with parameters $\alpha_1 + \cdots + \alpha_n$ and $\beta_1 + \cdots + \beta_n$.
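Part (iii) of the theorem admits a direct numerical spot-check: the convolution of two Poisson probability mass functions should coincide with the Poisson pmf for the summed parameter. The parameter values below are arbitrary illustrative choices.

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """Poisson probability P{X = k} for parameter lam."""
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2 = 2.0, 3.5  # arbitrary illustrative parameters
for k in range(10):
    # Convolution: P{X + Y = k} = sum_j P{X = j} P{Y = k - j}.
    conv = sum(poisson_pmf(lam1, j) * poisson_pmf(lam2, k - j) for j in range(k + 1))
    assert abs(conv - poisson_pmf(lam1 + lam2, k)) < 1e-12
```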

One may be able to invert the characteristic function $\phi_{S_n}(u)$ of $S_n$ to obtain its distribution function or probability density function. In particular, if $|\phi_{S_n}(u)|$ is absolutely integrable over $-\infty < u < \infty$, then $S_n$ has a probability density function, given for any real number $x$ by

$$f_{S_n}(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iux}\,\phi_{S_n}(u)\,du. \tag{4.2}$$

In order to evaluate the infinite integral in (4.2), one will generally have to use the theory of complex integration and the calculus of residues.
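When a closed-form evaluation by residues is not needed, the inversion integral (4.2) can also be approximated numerically. The sketch below (an illustrative numerical stand-in, not a substitute for the analytic theory) inverts the standard normal characteristic function $\phi(u) = e^{-u^2/2}$ by a trapezoidal rule on a truncated range and compares against the exact density $e^{-x^2/2}/\sqrt{2\pi}$.

```python
import cmath
from math import pi, sqrt, exp

def invert_cf(phi, x, lim=10.0, n=2000):
    """Approximate (1/2pi) * integral of e^{-iux} phi(u) du over [-lim, lim]
    by the trapezoidal rule; valid when phi decays rapidly beyond lim."""
    h = 2 * lim / n
    total = 0.0
    for k in range(n + 1):
        u = -lim + k * h
        w = 0.5 if k in (0, n) else 1.0
        total += w * (cmath.exp(-1j * u * x) * phi(u)).real
    return total * h / (2 * pi)

phi_normal = lambda u: cmath.exp(-0.5 * u * u)  # standard normal cf
for x in (0.0, 1.0, -2.0):
    exact = exp(-0.5 * x * x) / sqrt(2 * pi)
    assert abs(invert_cf(phi_normal, x) - exact) < 1e-6
```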

Even if one is unable to invert the characteristic function to obtain the probability law of $S_n$ in closed form, the characteristic function can still be used to obtain the moments and cumulants of $S_n$. Indeed, cumulants assume their real importance from the study of the sums of independent random variables, because they are additive over the summands. More precisely, if $X_1, X_2, \ldots, X_n$ are independent random variables whose $r$th cumulants exist, then the $r$th cumulant of the sum $S_n = X_1 + X_2 + \cdots + X_n$ exists and is equal to the sum of the $r$th cumulants of the individual random variables $X_1, X_2, \ldots, X_n$. In symbols,

$$\kappa_r[X_1 + X_2 + \cdots + X_n] = \kappa_r[X_1] + \kappa_r[X_2] + \cdots + \kappa_r[X_n]. \tag{4.3}$$

Equation (4.3) follows immediately from the fact that the $r$th cumulant is (up to the constant factor $i^{-r}$) the $r$th derivative at $u = 0$ of the logarithm of the characteristic function,

$$\kappa_r[X] = \frac{1}{i^r}\left[\frac{d^r}{du^r}\log\phi_X(u)\right]_{u=0},$$

and the log-characteristic function is additive over independent summands, since the characteristic function is multiplicative.

The moments and central moments of a random variable may be expressed in terms of its cumulants. In particular, the first cumulant and the mean, the second cumulant and the variance, and the third cumulant and the third central moment, respectively, are equal. Consequently, the means, variances, and third central moments are additive over independent summands; more precisely,

$$E[S_n] = \sum_{k=1}^{n} E[X_k], \qquad \operatorname{Var}[S_n] = \sum_{k=1}^{n} \operatorname{Var}[X_k], \qquad \mu_3[S_n] = \sum_{k=1}^{n} \mu_3[X_k], \tag{4.4}$$

where, for any random variable $X$, we define $\mu_3[X] = E\!\left[(X - E[X])^3\right]$; (4.4) may, of course, also be proved directly.
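The additivity of means, variances, and third central moments over independent summands can be verified directly for discrete random variables. The probability laws below are arbitrary illustrative choices.

```python
from itertools import product

def moments(pmf):
    """Mean, variance, and third central moment of a discrete pmf {value: prob}."""
    mean = sum(p * x for x, p in pmf.items())
    var = sum(p * (x - mean) ** 2 for x, p in pmf.items())
    mu3 = sum(p * (x - mean) ** 3 for x, p in pmf.items())
    return mean, var, mu3

X = {0: 0.2, 1: 0.5, 4: 0.3}   # arbitrary discrete law
Y = {-1: 0.6, 2: 0.4}          # arbitrary discrete law

# Pmf of the independent sum X + Y.
S = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    S[x + y] = S.get(x + y, 0.0) + px * py

# Mean, variance, and third central moment are each additive.
for a, b, s in zip(moments(X), moments(Y), moments(S)):
    assert abs((a + b) - s) < 1e-12
```

Note that fourth and higher central moments are not additive in this way; only the cumulants are.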

Exercises

4.1. Prove Theorem 4A.

4.2. Find the probability laws corresponding to the following characteristic functions: (i) , (ii) , (iii) , (iv) .

4.3. Let $X_1, X_2, \ldots$ be a sequence of independent random variables, each uniformly distributed over the interval 0 to 1. Let $S_n = X_1 + X_2 + \cdots + X_n$. Show that for any real number $x$, such that $0 \le x \le n + 1$,

$$f_{S_{n+1}}(x) = \int_{x-1}^{x} f_{S_n}(y)\,dy;$$

hence prove by mathematical induction that

$$f_{S_n}(x) = \frac{1}{(n-1)!}\sum_{j=0}^{\lfloor x\rfloor} (-1)^j \binom{n}{j} (x - j)^{n-1}, \qquad 0 \le x \le n.$$
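The closed form that the induction in Exercise 4.3 leads to is the density of the Irwin–Hall distribution. As a quick numerical sanity check (illustrative, with $n = 3$ chosen arbitrarily), the formula integrates to 1 over $[0, n]$:

```python
from math import comb, factorial, floor

def irwin_hall_pdf(x, n):
    """Density of the sum of n independent uniform(0,1) random variables."""
    if x < 0 or x > n:
        return 0.0
    return sum((-1) ** j * comb(n, j) * (x - j) ** (n - 1)
               for j in range(floor(x) + 1)) / factorial(n - 1)

# Midpoint-rule integration of the density over [0, n] for n = 3.
n, steps = 3, 300000
h = n / steps
integral = h * sum(irwin_hall_pdf((k + 0.5) * h, n) for k in range(steps))
assert abs(integral - 1.0) < 1e-6
```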

4.4. Let $X_1, X_2, \ldots$ be a sequence of independent random variables, each normally distributed with mean 0 and variance 1. Let $S_n = X_1^2 + X_2^2 + \cdots + X_n^2$. Show that for any real number $x > 0$ and integer $n \ge 1$

$$f_{S_{n+1}}(x) = \int_{0}^{x} f_{S_n}(x - y)\,\frac{e^{-y/2}}{\sqrt{2\pi y}}\,dy.$$

Prove that $f_{S_n}(x) = \dfrac{x^{(n/2)-1} e^{-x/2}}{2^{n/2}\,\Gamma(n/2)}$ for $x > 0$; hence deduce that $S_n$ has a $\chi^2$ distribution with $n$ degrees of freedom.

4.5. Let $X_1, X_2, \ldots, X_n$ be independent random variables, each normally distributed with mean $m$ and variance 1. Let $S_n = X_1^2 + X_2^2 + \cdots + X_n^2$.

(i) Find the cumulants of $S_n$.

(ii) Let $T = aY$ for suitable constants $a$ and $\nu$, in which $Y$ is a random variable obeying a $\chi^2$ distribution with $\nu$ degrees of freedom. Determine $a$ and $\nu$ so that $S_n$ and $T$ have the same means and variances. Hint: Show that each $X_k^2$ has the characteristic function

$$\phi_{X_k^2}(u) = (1 - 2iu)^{-1/2}\exp\!\left(\frac{ium^2}{1 - 2iu}\right).$$

Answer

(i) The $r$th cumulant of $S_n$ is $\kappa_r[S_n] = 2^{r-1}(r-1)!\,n(1 + rm^2)$;

(ii) $a = \dfrac{1 + 2m^2}{1 + m^2}$, $\nu = \dfrac{n(1 + m^2)^2}{1 + 2m^2}$.
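As a consistency check of these answers (a sketch assuming the standard cumulants of the noncentral $\chi^2$ law; the values of $n$ and $m$ below are arbitrary), the first two cumulants give mean $n(1+m^2)$ and variance $2n(1+2m^2)$ for $S_n$, which should equal the mean $a\nu$ and variance $2a^2\nu$ of $aY$ with $Y \sim \chi^2_\nu$:

```python
# Verify that the constants a and nu matching the moments of S_n do so exactly.
for n in (1, 5, 20):
    for m in (0.0, 0.5, 2.0):
        mean_S = n * (1 + m * m)           # kappa_1[S_n]
        var_S = 2 * n * (1 + 2 * m * m)    # kappa_2[S_n]
        a = (1 + 2 * m * m) / (1 + m * m)
        nu = n * (1 + m * m) ** 2 / (1 + 2 * m * m)
        assert abs(a * nu - mean_S) < 1e-12        # E[aY] = a * nu
        assert abs(2 * a * a * nu - var_S) < 1e-12  # Var[aY] = 2 * a^2 * nu
```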