The Characteristic Function of a Random Variable Specifies Its Probability Law

In this section we give various inversion formulas for the distribution function $F_X(\cdot)$, probability mass function $p_X(\cdot)$, and probability density function $f_X(\cdot)$ of a random variable $X$ in terms of its characteristic function $\varphi_X(\cdot)$. As a consequence of these formulas, it follows that to describe the probability law of a random variable it suffices to specify its characteristic function.

We first prove a theorem that gives, in terms of characteristic functions, an explicit formula for $E[g(X)]$ for a fairly large class of functions $g(\cdot)$.

Theorem 3A. Let $g(\cdot)$ be a bounded Borel function of a real variable that at every point $x$ possesses a limit from the right, $g(x+0)$, and a limit from the left, $g(x-0)$. Let

$$g^*(x) = \tfrac{1}{2}\,[\,g(x+0) + g(x-0)\,] \tag{3.1}$$

be the arithmetic mean of these limits. Assume further that $g(\cdot)$ is absolutely integrable; that is,

$$\int_{-\infty}^{\infty} |g(x)|\,dx < \infty. \tag{3.2}$$

Define $G(\cdot)$ as the Fourier integral (or transform) of $g(\cdot)$; that is, for every real number $u$

$$G(u) = \int_{-\infty}^{\infty} e^{-iux}\,g(x)\,dx. \tag{3.3}$$

Then, for any random variable $X$ the expectation $E[g^*(X)]$ may be expressed in terms of the characteristic function $\varphi_X(\cdot)$:

$$E[g^*(X)] = \lim_{U\to\infty} \frac{1}{2\pi}\int_{-U}^{U}\left(1 - \frac{|u|}{U}\right) G(u)\,\varphi_X(u)\,du. \tag{3.4}$$

The proof of this important theorem is given in section 5. In this section we discuss its consequences.

If the product $G(u)\,\varphi_X(u)$ is absolutely integrable, that is,

$$\int_{-\infty}^{\infty} |G(u)\,\varphi_X(u)|\,du < \infty, \tag{3.5}$$

then (3.4) may be written

$$E[g^*(X)] = \frac{1}{2\pi}\int_{-\infty}^{\infty} G(u)\,\varphi_X(u)\,du. \tag{3.6}$$

Without imposing the condition (3.5), it is incorrect to write (3.6). Indeed, in order even to write (3.6), the integral on the right-hand side of (3.6) must exist; this is equivalent to (3.5) being true.
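As a numerical sketch (not part of the text, and assuming NumPy is available), one can check the unconditional form $E[g(X)] = \frac{1}{2\pi}\int G(u)\varphi_X(u)\,du$ in a case where the product is absolutely integrable: take $g(x) = e^{-|x|}$, whose transform under the convention $G(u) = \int e^{-iux} g(x)\,dx$ is $G(u) = 2/(1+u^2)$, and let $X$ be standard normal with $\varphi_X(u) = e^{-u^2/2}$.

```python
import numpy as np

# Sketch: verify E[g(X)] = (1/2pi) * int G(u) * phi(u) du for
#   g(x) = exp(-|x|),  G(u) = 2/(1 + u**2),  phi(u) = exp(-u**2/2)  (standard normal).
# Both sides are computed by midpoint-rule quadrature on a symmetric grid.
U, N = 50.0, 500_000
du = 2 * U / N
u = -U + (np.arange(N) + 0.5) * du

# Frequency-side integral: (1/2pi) * int G(u) phi(u) du
lhs_freq = np.real(np.sum((2 / (1 + u**2)) * np.exp(-u**2 / 2)) * du) / (2 * np.pi)

# Direct expectation: int g(x) * normal density(x) dx, on the same grid
x = u
direct = np.sum(np.exp(-np.abs(x)) * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)) * du
```

Both quantities approximate $E[e^{-|X|}]$, so they should agree to quadrature accuracy.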

We next take for $g(\cdot)$ a function defined as follows, for some finite numbers $a$ and $b$ (with $a < b$):

$$g(x) = \begin{cases} 1 & \text{if } a \le x \le b \\ 0 & \text{otherwise.} \end{cases} \tag{3.7}$$

The function defined by (3.7) fulfills the hypotheses of theorem 3A; it is bounded, absolutely integrable, and possesses right-hand and left-hand limits at any point $x$. Further, $g^*(x) = g(x)$ for every $x \neq a, b$, while $g^*(a) = g^*(b) = \tfrac{1}{2}$. Now, if $a$ and $b$ are points at which the distribution function $F_X(\cdot)$ is continuous, then

$$E[g^*(X)] = P[a \le X \le b] = F_X(b) - F_X(a). \tag{3.8}$$

Further,

$$G(u) = \int_a^b e^{-iux}\,dx = \frac{e^{-iua} - e^{-iub}}{iu}. \tag{3.9}$$

Consequently, with this choice of function $g(\cdot)$, theorem 3A yields an expression for the distribution function of a random variable in terms of its characteristic function.
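The closed form for the transform of the indicator of an interval is easy to confirm numerically; the sketch below (assuming NumPy, and the convention $G(u) = \int e^{-iux} g(x)\,dx$) compares $(e^{-iua} - e^{-iub})/(iu)$ against direct quadrature.

```python
import numpy as np

# Sketch: the transform of the indicator of [a, b],
#   G(u) = integral_a^b exp(-i*u*x) dx,
# should equal the closed form (exp(-i*u*a) - exp(-i*u*b)) / (i*u).
a, b, u = -1.0, 2.0, 0.7

closed = (np.exp(-1j * u * a) - np.exp(-1j * u * b)) / (1j * u)

# Midpoint-rule quadrature over [a, b]
N = 200_000
dx = (b - a) / N
xm = a + (np.arange(N) + 0.5) * dx
numeric = np.sum(np.exp(-1j * u * xm)) * dx
```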

Theorem 3B. If $a$ and $b$, where $a < b$, are finite real numbers at which the distribution function $F_X(\cdot)$ is continuous, then

$$F_X(b) - F_X(a) = \lim_{U\to\infty} \frac{1}{2\pi}\int_{-U}^{U}\left(1-\frac{|u|}{U}\right)\frac{e^{-iua} - e^{-iub}}{iu}\,\varphi_X(u)\,du. \tag{3.10}$$

Equation (3.10) constitutes an inversion formula, whereby, from a knowledge of the characteristic function $\varphi_X(\cdot)$, a knowledge of the distribution function $F_X(\cdot)$ may be obtained.

An explicit inversion formula for $F_X(\cdot)$ in terms of $\varphi_X(\cdot)$ may be written in various ways. Since $\lim_{a\to-\infty} F_X(a) = 0$, we determine from (3.10) that at any point $b$ where $F_X(\cdot)$ is continuous

$$F_X(b) = \lim_{a\to-\infty}\;\lim_{U\to\infty}\;\frac{1}{2\pi}\int_{-U}^{U}\left(1-\frac{|u|}{U}\right)\frac{e^{-iua}-e^{-iub}}{iu}\,\varphi_X(u)\,du. \tag{3.11}$$

The limit as $a\to-\infty$ is taken over the set of points $a$ that are continuity points of $F_X(\cdot)$.

A more useful inversion formula, the proof of which is given in section 5, is the following: at any point $b$ where $F_X(\cdot)$ is continuous,

$$F_X(b) = \frac{1}{2} - \frac{1}{\pi}\int_0^{\infty} \operatorname{Im}\!\left[\frac{e^{-iub}\,\varphi_X(u)}{u}\right]du. \tag{3.12}$$

The integral in (3.12) is an improper Riemann integral, defined as

$$\int_0^{\infty} = \lim_{\varepsilon\to 0+,\; U\to\infty}\;\int_{\varepsilon}^{U}.$$

Equations (3.11) and (3.12) lead immediately to the uniqueness theorem, which states that there is a one-to-one correspondence between distribution functions and characteristic functions: two characteristic functions that are equal at all points (or equal at all except a countable number of points) are the characteristic functions of the same distribution function, and two distribution functions that are equal at all except a countable number of points give rise to the same characteristic function.
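A numerical sketch (not from the text; assumes NumPy) of an explicit inversion formula of this kind, $F_X(b) = \tfrac12 - \tfrac{1}{\pi}\int_0^\infty \operatorname{Im}[e^{-iub}\varphi_X(u)]/u\,du$, applied to the standard normal law, whose characteristic function is $\varphi_X(u) = e^{-u^2/2}$:

```python
import numpy as np
from math import erf, sqrt

# Sketch: evaluate F(b) = 1/2 - (1/pi) * int_0^inf Im[exp(-i*u*b)*phi(u)]/u du
# for the standard normal, phi(u) = exp(-u**2/2), by midpoint quadrature.
b = 1.0
N, U = 200_000, 40.0           # truncation point; exp(-u**2/2) is negligible beyond it
du = U / N
u = (np.arange(N) + 0.5) * du  # midpoint grid avoids the u = 0 endpoint

phi = np.exp(-u**2 / 2)
F_b = 0.5 - (1.0 / np.pi) * np.sum(np.imag(np.exp(-1j * u * b) * phi) / u) * du

# Reference value from the error-function representation of the normal cdf
ref = 0.5 * (1 + erf(b / sqrt(2)))
```

The computed `F_b` should match the normal distribution function at $b = 1$ to quadrature accuracy.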

We may express the probability mass function $p_X(\cdot)$ of the random variable $X$ in terms of its characteristic function; for any real number $x$

$$p_X(x) = P[X = x] = \lim_{U\to\infty} \frac{1}{2U}\int_{-U}^{U} e^{-iux}\,\varphi_X(u)\,du. \tag{3.13}$$

The proof of (3.13) is given in section 5.
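A numerical sketch (not from the text; assumes NumPy) of the mass-function inversion $p_X(x) = \lim_{U\to\infty}\frac{1}{2U}\int_{-U}^{U} e^{-iux}\varphi_X(u)\,du$, applied to a Poisson variable, whose characteristic function is $\varphi_X(u) = \exp[\lambda(e^{iu}-1)]$:

```python
import numpy as np
from math import exp, factorial, pi

# Sketch: approximate p(x) = (1/(2U)) * int_{-U}^{U} exp(-i*u*x) * phi(u) du
# for Poisson(lam), phi(u) = exp(lam*(exp(iu) - 1)), with a large U.
lam, x = 1.0, 2
U = 50 * pi                    # multiple of the period 2*pi of the integrand
N = 100_000
du = 2 * U / N
u = -U + (np.arange(N) + 0.5) * du

phi = np.exp(lam * (np.exp(1j * u) - 1))
p_x = np.real(np.sum(np.exp(-1j * u * x) * phi) * du / (2 * U))

# Exact Poisson mass at x
ref = exp(-lam) * lam**x / factorial(x)
```

Because the integrand here is periodic, truncating at a multiple of the period makes the average converge quickly to $P[X = 2] = e^{-1}/2$.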

It is possible to give a criterion in terms of characteristic functions that a random variable has an absolutely continuous probability law.¹ If the characteristic function $\varphi_X(\cdot)$ is absolutely integrable, that is,

$$\int_{-\infty}^{\infty} |\varphi_X(u)|\,du < \infty, \tag{3.14}$$

then the random variable $X$ obeys the absolutely continuous probability law specified by the probability density function $f_X(\cdot)$ given, for any real number $x$, by

$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iux}\,\varphi_X(u)\,du. \tag{3.15}$$

One expresses (3.15) in words by saying that $f_X(\cdot)$ is the Fourier transform, or Fourier integral, of $\varphi_X(\cdot)$.
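A numerical sketch (not from the text; assumes NumPy) of the density inversion $f_X(x) = \frac{1}{2\pi}\int e^{-iux}\varphi_X(u)\,du$ for the Cauchy law, whose characteristic function $e^{-|u|}$ is absolutely integrable:

```python
import numpy as np

# Sketch: invert phi(u) = exp(-|u|) numerically; the result should be the
# Cauchy density f(x) = 1/(pi*(1 + x**2)).
x = 0.5
U, N = 60.0, 600_000           # exp(-|u|) is negligible beyond |u| = 60
du = 2 * U / N
u = -U + (np.arange(N) + 0.5) * du

f_x = np.real(np.sum(np.exp(-1j * u * x) * np.exp(-np.abs(u))) * du) / (2 * np.pi)

ref = 1 / (np.pi * (1 + x**2))
```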

The proof of (3.15) follows immediately from the fact that at any continuity points $a$ and $b$ of $F_X(\cdot)$

$$F_X(b) - F_X(a) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \frac{e^{-iua} - e^{-iub}}{iu}\,\varphi_X(u)\,du. \tag{3.16}$$

Equation (3.16) follows from (3.6) in the same way that (3.10) followed from (3.4). It may be proved from (3.16) that (i) $F_X(\cdot)$ is continuous at every point $x$, (ii) $F_X'(x)$ exists at every real number $x$ and is given by (3.15), and (iii) for any numbers $a$ and $b$, $F_X(b) - F_X(a) = \int_a^b F_X'(x)\,dx$. From these facts it follows that the probability law of $X$ is specified by the probability density function $f_X(x) = F_X'(x)$ and that $f_X(\cdot)$ is given by (3.15).

The inversion formula (3.15) provides a powerful method of calculating Fourier transforms and characteristic functions. Thus, for example, from a knowledge that

$$\varphi_X(u) = e^{-|u|}, \tag{3.17}$$

where $\varphi_X(\cdot)$ is the characteristic function of the Cauchy probability law, whose probability density function is defined by

$$f_X(x) = \frac{1}{\pi}\,\frac{1}{1+x^2}, \tag{3.18}$$

it follows by (3.15) that

$$\frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iux}\,e^{-|u|}\,du = \frac{1}{\pi}\,\frac{1}{1+x^2}. \tag{3.19}$$

Similarly, from

$$\int_{-\infty}^{\infty} e^{iux}\,\tfrac{1}{2}e^{-|x|}\,dx = \frac{1}{1+u^2} \tag{3.20}$$

it follows that

$$\frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iux}\,\frac{1}{1+u^2}\,du = \tfrac{1}{2}e^{-|x|}. \tag{3.21}$$

We note finally the following important formulas concerning sums of independent random variables, convolution of distribution functions, and products of characteristic functions. Let $X_1$ and $X_2$ be two independent random variables, with respective distribution functions $F_1(\cdot)$ and $F_2(\cdot)$ and respective characteristic functions $\varphi_1(\cdot)$ and $\varphi_2(\cdot)$. It may be proved (see section 9 of Chapter 7) that the distribution function $F(\cdot)$ of the sum $X = X_1 + X_2$ for any real number $x$ is given by

$$F(x) = \int_{-\infty}^{\infty} F_1(x-y)\,dF_2(y). \tag{3.22}$$

On the other hand, it is clear that the characteristic function $\varphi(\cdot)$ of the sum $X = X_1 + X_2$ for any real number $u$ is given by

$$\varphi(u) = \varphi_1(u)\,\varphi_2(u), \tag{3.23}$$

since, by the independence of $X_1$ and $X_2$, $E[e^{iu(X_1+X_2)}] = E[e^{iuX_1}]\,E[e^{iuX_2}]$. The distribution function $F(\cdot)$, given by (3.22), is said to be the convolution of the distribution functions $F_1(\cdot)$ and $F_2(\cdot)$; in symbols, one writes $F = F_1 * F_2$.
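The product rule for characteristic functions of sums can be illustrated by simulation; a minimal sketch (not from the text; assumes NumPy), using two independent exponential variables with mean 1, each with characteristic function $1/(1-iu)$:

```python
import numpy as np

# Sketch: Monte Carlo check that the characteristic function of a sum of two
# independent Exp(1) variables equals the product of their characteristic
# functions, phi(u) = 1/(1 - i*u) for each summand.
rng = np.random.default_rng(0)
n = 2_000_000
x1 = rng.exponential(1.0, n)
x2 = rng.exponential(1.0, n)
u = 0.8

emp = np.mean(np.exp(1j * u * (x1 + x2)))   # empirical cf of the sum
prod = (1 / (1 - 1j * u)) ** 2              # product of the two cfs
```

The empirical average should agree with the product up to Monte Carlo error of order $n^{-1/2}$.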

Exercises

3.1. Verify (3.17), (3.19), (3.20), and (3.21).

3.2. Prove that if $f_1(\cdot)$ and $f_2(\cdot)$ are probability density functions, whose corresponding characteristic functions $\varphi_1(\cdot)$ and $\varphi_2(\cdot)$ are absolutely integrable, then for every real number $x$

$$\int_{-\infty}^{\infty} f_1(y)\,f_2(x+y)\,dy = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iux}\,\overline{\varphi_1(u)}\,\varphi_2(u)\,du. \tag{3.24}$$

3.3. Use (3.15), (3.17), and (3.24) to prove that, with $f(\cdot)$ denoting the uniform probability density function on the interval 0 to 1,

$$\int_{-\infty}^{\infty} f(y)\,f(x+y)\,dy = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iux}\left(\frac{\sin(u/2)}{u/2}\right)^{2} du. \tag{3.25}$$

Evaluate the integral on the right-hand side of (3.25).

 

Answer

$$\int_{-\infty}^{\infty} f(y)\,f(x+y)\,dy = \begin{cases} 1+x & \text{for } -1 \le x \le 0 \\ 1-x & \text{for } 0 < x \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

 

3.4. Let $X$ be uniformly distributed over the interval 0 to $2\pi$. Let $Y = \cos X$. Show directly that the probability density function of $Y$ for any real number $y$ is given by

$$f_Y(y) = \begin{cases} \dfrac{1}{\pi\sqrt{1-y^2}} & \text{for } |y| < 1 \\ 0 & \text{otherwise.} \end{cases} \tag{3.26}$$

The characteristic function of $Y$ may be written

$$\varphi_Y(u) = J_0(u) = \frac{1}{2\pi}\int_0^{2\pi} e^{iu\cos\theta}\,d\theta, \tag{3.27}$$

in which $J_0(\cdot)$ is the Bessel function of order 0, defined for our purposes by the integral in (3.27). Is it true or false that

$$f_Y(y) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-iuy}\,J_0(u)\,du\,? \tag{3.28}$$
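As a quick sanity check outside the exercise (assuming NumPy and SciPy are available), the integral representation $J_0(u) = \frac{1}{2\pi}\int_0^{2\pi} e^{iu\cos\theta}\,d\theta$ quoted in (3.27) can be compared with SciPy's Bessel function of order 0:

```python
import numpy as np
from scipy.special import j0

# Sketch: compare the integral representation
#   (1/2pi) * int_0^{2pi} exp(i*u*cos(theta)) dtheta
# against scipy.special.j0. Midpoint quadrature over a full period of a smooth
# periodic integrand is extremely accurate.
u = 1.7
N = 200_000
theta = (np.arange(N) + 0.5) * (2 * np.pi / N)
integral = np.real(np.mean(np.exp(1j * u * np.cos(theta))))
```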

3.5. The image interference distribution. The amplitude of a signal received at a distance from a transmitter may fluctuate because the signal is both directly received and reflected (reflected either from the ionosphere or the ocean floor, depending on whether it is being transmitted through the air or the ocean). Assume that the amplitude of the direct signal is a constant $a$ and the amplitude of the reflected signal is a constant $b$, but that the phase difference $\Theta$ between the two signals changes randomly and is uniformly distributed over the interval 0 to $2\pi$. The squared amplitude $R^2$ of the received signal is then given by $R^2 = a^2 + b^2 + 2ab\cos\Theta$. Assuming these facts, show that the characteristic function of $R^2$ is given by

$$\varphi_{R^2}(u) = e^{iu(a^2+b^2)}\,J_0(2abu). \tag{3.29}$$

Use this result and the preceding exercise to deduce the probability density function of $R^2$.

 

Answer

$$f_{R^2}(x) = \begin{cases} \dfrac{1}{\pi\sqrt{4a^2b^2 - (x - a^2 - b^2)^2}} & \text{for } |x - a^2 - b^2| < 2ab \\ 0 & \text{otherwise.} \end{cases}$$

 


  1. In this section we use the terminology “an absolutely continuous probability law” for what has previously been called in this book “a continuous probability law”. This is to call the reader’s attention to the fact that in advanced probability theory it is customary to use the expression “absolutely continuous” rather than “continuous”. A continuous probability law is then defined as one corresponding to a continuous distribution function.