Moment-Generating Functions

The evaluation of expectations requires summation and integration, operations for which completely routine methods are not available. We now discuss a method of evaluating the moments of a probability law which, when available, requires the performance of only one summation or integration, after which all the moments of the probability law can be obtained by routine differentiation.

The moment-generating function of a probability law is a function $\psi(\cdot)$, defined for all real numbers $t$ by
$$\psi(t) = E[e^{tX}].$$
In words, $\psi(t)$ is the expectation of the exponential function $e^{tX}$.

In the case of a discrete probability law, specified by a probability mass function $p(\cdot)$, the moment-generating function is given by
$$\psi(t) = \sum_{x:\,p(x) > 0} e^{tx}\, p(x).$$

In the case of a continuous probability law, specified by a probability density function $f(\cdot)$, the moment-generating function is given by
$$\psi(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx.$$

Since, for fixed $t$, the integrand $e^{tx} f(x)$ is a positive function of $x$, it follows that $\psi(t)$ is well defined, being either finite or infinite. We say that a probability law possesses a moment-generating function if there exists a positive number $\delta$ such that $\psi(t)$ is finite for $|t| < \delta$. It may then be shown that all moments of the probability law exist and may be expressed in terms of the successive derivatives at $t = 0$ of the moment-generating function [see (3.5)]. We have already shown that there are probability laws without finite means. Consequently, probability laws that do not possess moment-generating functions also exist. It is seen in Chapter 6 that for every probability law one can define a function, called the characteristic function, which always exists and can be used in the same way as a moment-generating function to obtain those moments that do exist.

If a moment-generating function $\psi(t)$ exists for $|t| < \delta$ (for some $\delta > 0$), then one may form its successive derivatives by repeatedly differentiating under the integral or summation sign. Consequently, we obtain
$$\psi^{(n)}(t) = E[X^n e^{tX}], \qquad n = 1, 2, \ldots.$$
Letting $t = 0$, we obtain
$$\psi^{(n)}(0) = E[X^n] = m_n. \tag{3.5}$$
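Relation (3.5) lends itself to a quick numerical check. The following sketch (in Python; the law of a fair die is an arbitrary illustrative choice, not taken from the text) forms $\psi(t)$ by direct summation and approximates $\psi'(0)$ and $\psi''(0)$ by central finite differences:

```python
import math

# Moment-generating function of the law of a fair die:
# psi(t) = (1/6) * sum_{k=1}^{6} e^{t k}
def psi(t):
    return sum(math.exp(t * k) for k in range(1, 7)) / 6

h = 1e-4  # step for the finite-difference approximations
m1 = (psi(h) - psi(-h)) / (2 * h)            # approximates psi'(0)  = E[X]
m2 = (psi(h) - 2 * psi(0) + psi(-h)) / h**2  # approximates psi''(0) = E[X^2]

# The exact moments of a fair die are E[X] = 7/2 and E[X^2] = 91/6.
assert abs(m1 - 7 / 2) < 1e-6
assert abs(m2 - 91 / 6) < 1e-3
```

The same two-sided differences recover higher derivatives, and hence higher moments, at the cost of a more delicate choice of step size.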

If the moment-generating function is finite for $|t| < \delta$ (for some $\delta > 0$), then it possesses a power-series expansion, valid for $|t| < \delta$:
$$\psi(t) = \sum_{n=0}^{\infty} m_n \frac{t^n}{n!}. \tag{3.6}$$

To prove (3.6), use the definition of $\psi(t)$ and the fact that
$$e^{tx} = \sum_{n=0}^{\infty} \frac{(tx)^n}{n!}.$$

In view of (3.6), if one can readily obtain the power-series expansion of $\psi(t)$, then one can readily obtain the $n$th moment $m_n$ for any integer $n$, since $m_n$ is the coefficient of $t^n/n!$ in the power-series expansion of $\psi(t)$.

Example 3A. The Bernoulli probability law with parameter $p$ has a moment-generating function, finite for all $t$,
$$\psi(t) = pe^t + q, \qquad q = 1 - p,$$
with derivatives
$$\psi'(t) = \psi''(t) = pe^t,$$
so that the mean is $\psi'(0) = p$ and the variance is $\psi''(0) - [\psi'(0)]^2 = p - p^2 = pq$.

Example 3B. The binomial probability law with parameters $n$ and $p$ has a moment-generating function, finite for all $t$,
$$\psi(t) = (pe^t + q)^n, \qquad q = 1 - p,$$
with derivatives
$$\psi'(t) = n(pe^t + q)^{n-1} pe^t, \qquad \psi''(t) = n(n-1)(pe^t + q)^{n-2}(pe^t)^2 + n(pe^t + q)^{n-1} pe^t,$$
so that the mean is $\psi'(0) = np$ and the variance is $\psi''(0) - [\psi'(0)]^2 = n(n-1)p^2 + np - n^2p^2 = npq$.
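Example 3B admits a direct numerical sanity check. The Python sketch below (the values $n = 10$, $p = 0.3$, and the evaluation point $t = 0.7$ are arbitrary) verifies that the closed form $(pe^t + q)^n$ agrees with the defining sum over the binomial probability mass function, and that finite differences at $t = 0$ return the mean $np$ and variance $npq$:

```python
import math

n, p = 10, 0.3  # arbitrary illustrative parameters
q = 1 - p

def mgf_closed(t):  # closed form (p e^t + q)^n
    return (p * math.exp(t) + q) ** n

def mgf_sum(t):     # defining sum over the binomial pmf
    return sum(math.exp(t * x) * math.comb(n, x) * p**x * q**(n - x)
               for x in range(n + 1))

# The two expressions agree at an arbitrary point.
assert abs(mgf_closed(0.7) - mgf_sum(0.7)) < 1e-9

# Finite differences at t = 0 recover the mean np and the variance npq.
h = 1e-4
mean = (mgf_closed(h) - mgf_closed(-h)) / (2 * h)
second = (mgf_closed(h) - 2 * mgf_closed(0) + mgf_closed(-h)) / h**2
assert abs(mean - n * p) < 1e-3
assert abs(second - mean**2 - n * p * q) < 1e-3
```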

Probability Law | Parameters | Probability Mass Function | Mean | Variance
Bernoulli | $0 \le p \le 1$; $q = 1 - p$ | $p(x) = p^x q^{1-x}$, $x = 0, 1$ | $p$ | $pq$
Binomial | $n = 1, 2, \ldots$; $0 \le p \le 1$ | $p(x) = \binom{n}{x} p^x q^{n-x}$, $x = 0, 1, \ldots, n$ | $np$ | $npq$
Poisson | $\lambda > 0$ | $p(x) = e^{-\lambda} \lambda^x / x!$, $x = 0, 1, \ldots$ | $\lambda$ | $\lambda$
Geometric | $0 < p \le 1$ | $p(x) = pq^{x-1}$, $x = 1, 2, \ldots$ | $1/p$ | $q/p^2$
Negative binomial | $r = 1, 2, \ldots$; $0 < p \le 1$ | $p(x) = \binom{r+x-1}{x} p^r q^x$, $x = 0, 1, \ldots$ | $rq/p$ | $rq/p^2$
Hypergeometric | $N$, $n$, $p$ ($Np$ an integer) | $p(x) = \binom{Np}{x}\binom{Nq}{n-x} \big/ \binom{N}{n}$ | $np$ | $npq\,\dfrac{N-n}{N-1}$
Table 3A. Some Frequently Encountered Discrete Probability Laws with Their Moments and Generating Functions
Probability Law | Moment-Generating Function | Characteristic Function | Third Central Moment | Fourth Central Moment
Bernoulli | $pe^t + q$ | $pe^{it} + q$ | $pq(q - p)$ | $pq(1 - 3pq)$
Binomial | $(pe^t + q)^n$ | $(pe^{it} + q)^n$ | $npq(q - p)$ | $npq(1 - 6pq) + 3n^2p^2q^2$
Poisson | $e^{\lambda(e^t - 1)}$ | $e^{\lambda(e^{it} - 1)}$ | $\lambda$ | $\lambda + 3\lambda^2$
Geometric | $\dfrac{pe^t}{1 - qe^t}$, for $qe^t < 1$ | $\dfrac{pe^{it}}{1 - qe^{it}}$ | $\dfrac{q(1 + q)}{p^3}$ | $\dfrac{q(1 + 7q + q^2)}{p^4}$
Negative binomial | $\left(\dfrac{p}{1 - qe^t}\right)^r$, for $qe^t < 1$ | $\left(\dfrac{p}{1 - qe^{it}}\right)^r$ | $\dfrac{rq(1 + q)}{p^3}$ | $\dfrac{rq(1 + 4q + q^2) + 3r^2q^2}{p^4}$
Hypergeometric | * | * | * | *
* For the moments of the hypergeometric law, see M. G. Kendall, The Advanced Theory of Statistics, Charles Griffin, London, 1948, p. 127.
Table 3A (Continued). Some Frequently Encountered Discrete Probability Laws with Their Moments and Generating Functions
Probability Law | Parameters | Probability Density Function | Mean | Variance
Uniform over interval $a$ to $b$ | $a < b$ | $f(x) = \dfrac{1}{b - a}$ for $a \le x \le b$; $0$ otherwise | $\dfrac{a + b}{2}$ | $\dfrac{(b - a)^2}{12}$
Normal | $m$; $\sigma > 0$ | $f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x - m)^2/2\sigma^2}$ | $m$ | $\sigma^2$
Exponential | $\lambda > 0$ | $f(x) = \lambda e^{-\lambda x}$ for $x \ge 0$; $0$ otherwise | $\dfrac{1}{\lambda}$ | $\dfrac{1}{\lambda^2}$
Gamma | $r > 0$; $\lambda > 0$ | $f(x) = \dfrac{\lambda^r}{\Gamma(r)}\, x^{r-1} e^{-\lambda x}$ for $x \ge 0$; $0$ otherwise | $\dfrac{r}{\lambda}$ | $\dfrac{r}{\lambda^2}$
Table 3B. Some Frequently Encountered Continuous Probability Laws with Their Moments and Generating Functions
Probability Law | Moment-Generating Function | Characteristic Function | Third Central Moment | Fourth Central Moment
Uniform over interval $a$ to $b$ | $\dfrac{e^{tb} - e^{ta}}{t(b - a)}$ | $\dfrac{e^{itb} - e^{ita}}{it(b - a)}$ | $0$ | $\dfrac{(b - a)^4}{80}$
Normal | $e^{mt + \sigma^2 t^2/2}$ | $e^{imt - \sigma^2 t^2/2}$ | $0$ | $3\sigma^4$
Exponential | $\dfrac{\lambda}{\lambda - t}$, for $t < \lambda$ | $\dfrac{\lambda}{\lambda - it}$ | $\dfrac{2}{\lambda^3}$ | $\dfrac{9}{\lambda^4}$
Gamma | $\left(\dfrac{\lambda}{\lambda - t}\right)^r$, for $t < \lambda$ | $\left(\dfrac{\lambda}{\lambda - it}\right)^r$ | $\dfrac{2r}{\lambda^3}$ | $\dfrac{3r(r + 2)}{\lambda^4}$
Table 3B (Continued). Some Frequently Encountered Continuous Probability Laws with Their Moments and Generating Functions

Example 3C. The Poisson probability law with parameter $\lambda$ has a moment-generating function, finite for all $t$,
$$\psi(t) = e^{\lambda(e^t - 1)},$$
with derivatives
$$\psi'(t) = \lambda e^t \psi(t), \qquad \psi''(t) = (\lambda e^t + \lambda^2 e^{2t})\, \psi(t).$$
Consequently, $E[X] = \psi'(0) = \lambda$ and $E[X^2] = \psi''(0) = \lambda + \lambda^2$, so that the variance $\sigma^2 = E[X^2] - (E[X])^2 = \lambda$. Thus for the Poisson probability law the mean and the variance are equal.
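The computation in Example 3C can be confirmed numerically. In the Python sketch below, $\lambda = 2.5$ is an arbitrary illustrative value; finite differences of the closed-form $\psi$ at $t = 0$ recover the first two moments:

```python
import math

lam = 2.5  # arbitrary illustrative parameter

def psi(t):  # Poisson mgf: exp(lam * (e^t - 1)), finite for every real t
    return math.exp(lam * (math.exp(t) - 1))

h = 1e-4
m1 = (psi(h) - psi(-h)) / (2 * h)            # psi'(0)  = E[X]
m2 = (psi(h) - 2 * psi(0) + psi(-h)) / h**2  # psi''(0) = E[X^2]
variance = m2 - m1**2

assert abs(m1 - lam) < 1e-3        # mean equals lam
assert abs(variance - lam) < 1e-3  # variance also equals lam
```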

Example 3D. The geometric probability law with parameter $p$ has a moment-generating function, finite for all $t$ such that $qe^t < 1$,
$$\psi(t) = \frac{pe^t}{1 - qe^t}. \tag{3.14}$$
From (3.14) one may show that the mean and variance of the geometric probability law are given by
$$E[X] = \frac{1}{p}, \qquad \sigma^2 = \frac{q}{p^2}.$$
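The mean $1/p$ and variance $q/p^2$ can also be checked directly from the probability mass function $p(x) = pq^{x-1}$ by summing the defining series far enough that the tail is negligible (a Python sketch; $p = 0.25$ is an arbitrary illustrative value):

```python
p = 0.25  # arbitrary illustrative parameter
q = 1 - p

# Partial sums of the defining series; the tail beyond x = 400 is negligible.
m1 = sum(x * p * q ** (x - 1) for x in range(1, 400))      # E[X]
m2 = sum(x * x * p * q ** (x - 1) for x in range(1, 400))  # E[X^2]

assert abs(m1 - 1 / p) < 1e-9             # mean 1/p
assert abs(m2 - m1**2 - q / p**2) < 1e-9  # variance q/p^2
```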

Example 3E. The normal probability law with mean $m$ and variance $\sigma^2$ has a moment-generating function, finite for all $t$,
$$\psi(t) = e^{mt + \sigma^2 t^2/2}. \tag{3.16}$$
From (3.16) one may show that the central moments of the normal probability law are given by
$$E[(X - m)^{2k-1}] = 0, \qquad E[(X - m)^{2k}] = \frac{(2k)!}{2^k\, k!}\, \sigma^{2k}, \qquad k = 1, 2, \ldots. \tag{3.17}$$
An alternate method of deriving (3.17) is by use of (2.22) in Chapter 4.
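Formula (3.17) can be verified against the normal density by numerical integration. In the Python sketch below, $m = 1$ and $\sigma = 2$ are arbitrary values, and a midpoint rule over the interval $m \pm 8\sigma$ stands in for the exact integral:

```python
import math

m, sigma = 1.0, 2.0  # arbitrary illustrative mean and standard deviation

def density(x):  # normal density with mean m and variance sigma^2
    return math.exp(-((x - m) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def central_moment(n, steps=50000, width=8.0):
    # Midpoint rule for E[(X - m)^n] over m +/- width*sigma;
    # the truncated tails are negligible at this width.
    a = m - width * sigma
    h = 2 * width * sigma / steps
    return sum((a + (i + 0.5) * h - m) ** n * density(a + (i + 0.5) * h)
               for i in range(steps)) * h

# Even central moments match (2k)! sigma^{2k} / (2^k k!); the odd ones
# vanish by the symmetry of the density about m.
for k in (1, 2, 3):
    exact = math.factorial(2 * k) * sigma ** (2 * k) / (2**k * math.factorial(k))
    assert abs(central_moment(2 * k) - exact) < 1e-4 * exact
```

For $k = 2$ this reproduces the familiar fourth central moment $3\sigma^4$ listed in Table 3B.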

Example 3F. The exponential probability law with parameter $\lambda$ has a moment-generating function, finite for $t < \lambda$,
$$\psi(t) = \frac{\lambda}{\lambda - t}. \tag{3.18}$$
One may show from (3.18) that for the exponential probability law the mean and the standard deviation are equal, both being equal to the reciprocal $1/\lambda$ of the parameter $\lambda$.
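A numerical check of Example 3F (a Python sketch; $\lambda = 2$ is an arbitrary illustrative value): the first two moments are computed by a midpoint rule applied to $\int_0^\infty x^n \lambda e^{-\lambda x}\,dx$, and the mean and standard deviation both come out to $1/\lambda$.

```python
import math

lam = 2.0  # arbitrary illustrative parameter

def moment(n, steps=100000, upper=20.0):
    # Midpoint rule for E[X^n] = integral_0^infinity of x^n lam e^{-lam x};
    # the tail beyond `upper` is negligible for lam = 2.
    h = upper / steps
    return sum(((i + 0.5) * h) ** n * lam * math.exp(-lam * (i + 0.5) * h)
               for i in range(steps)) * h

mean = moment(1)
std = math.sqrt(moment(2) - mean**2)
assert abs(mean - 1 / lam) < 1e-6  # mean is 1/lam
assert abs(std - 1 / lam) < 1e-6   # standard deviation is also 1/lam
```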

Example 3G. The lifetime of a radioactive atom. It is shown in section 4 of Chapter 6 that the time between emissions of particles by a radioactive atom obeys an exponential probability law with some parameter $\lambda$. By example 3F, the mean time between emissions is $1/\lambda$. The time between emissions is called the lifetime of the atom. The half life $T$ of the atom is defined as the time such that the probability is $\tfrac{1}{2}$ that the lifetime will be greater than $T$. Since the probability that the lifetime will be greater than a given number $t$ is $e^{-\lambda t}$, it follows that $T$ is the solution of $e^{-\lambda T} = \tfrac{1}{2}$, or $T = (1/\lambda)\log 2$. In words, the half life $T$ is equal to the mean lifetime $1/\lambda$ multiplied by $\log 2 \doteq 0.693$.
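The half-life relation can be confirmed without using the closed form, by solving $e^{-\lambda T} = \tfrac{1}{2}$ numerically; in this Python sketch the emission rate $\lambda = 0.3$ is an arbitrary illustrative value.

```python
import math

lam = 0.3  # arbitrary illustrative emission rate; any lam > 0 works

# Solve e^{-lam * T} = 1/2 for T by bisection, without the closed form.
lo, hi = 0.0, 100.0
for _ in range(100):
    mid = (lo + hi) / 2
    if math.exp(-lam * mid) > 0.5:
        lo = mid
    else:
        hi = mid
half_life = (lo + hi) / 2

# The solution agrees with T = (1/lam) * log 2: the mean times log 2.
assert abs(half_life - math.log(2) / lam) < 1e-9
```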

Theoretical Exercises

3.1. Generating function of moments about a point. Define the moment-generating function of a probability law about a point $a$ as the function $\psi_a(\cdot)$, defined for all real numbers $t$ by $\psi_a(t) = E[e^{t(X-a)}]$. Show that $\psi_a(t)$ may be obtained from $\psi(t)$ by $\psi_a(t) = e^{-at}\psi(t)$. The $n$th moment of the probability law about the point $a$ is given by $E[(X - a)^n] = \psi_a^{(n)}(0)$ and may be read off as the coefficient of $t^n/n!$ in the power-series expansion of $\psi_a(t)$.

3.2. The factorial moment-generating function $\psi_F(\cdot)$ of a probability law is defined, for all real numbers $t$ such that $E[\,|t|^X]$ is finite, by $\psi_F(t) = E[t^X]$. Its $n$th derivative evaluated at $t = 1$,
$$\psi_F^{(n)}(1) = E[X(X-1)\cdots(X-n+1)],$$
is called the $n$th factorial moment of the probability law. From a knowledge of the first $n$ factorial moments of a probability law one may obtain a knowledge of the first $n$ moments of the probability law, and conversely. Thus, for example,
$$E[X^2] = E[X(X-1)] + E[X]. \tag{3.19}$$

Equation (3.19) was implicitly used in calculating certain second moments and variances in section 2. Show that the first $n$ moments of two distinct probability laws coincide if and only if their first $n$ factorial moments coincide.
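As an illustration of (3.19), the Python sketch below uses the Poisson law (with the arbitrary parameter $\lambda = 1.7$), whose factorial moments have the simple form $E[X(X-1)\cdots(X-n+1)] = \lambda^n$, and recovers $E[X^2]$ from the first two factorial moments:

```python
import math

lam = 1.7   # arbitrary illustrative Poisson parameter
terms = 80  # enough terms for the truncated sums to converge

def pmf(x):  # Poisson probability mass function
    return math.exp(-lam) * lam**x / math.factorial(x)

fact1 = sum(x * pmf(x) for x in range(terms))            # E[X] = lam
fact2 = sum(x * (x - 1) * pmf(x) for x in range(terms))  # E[X(X-1)] = lam^2
m2 = fact2 + fact1                                       # equation (3.19)

assert abs(fact1 - lam) < 1e-9
assert abs(fact2 - lam**2) < 1e-9
assert abs(m2 - (lam + lam**2)) < 1e-9  # E[X^2] for the Poisson law
```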

Hint : Consult M. Kendall, The Advanced Theory of Statistics , Vol. I, Griffin, London, 1948, p. 58.

3.3. The factorial moment-generating function of the probability law of the number of matches in the matching problem. The number of matches obtained by distributing, 1 to an urn, $N$ balls, numbered 1 to $N$, among $N$ urns, numbered 1 to $N$, has a probability law specified by the probability mass function
$$p(x) = \frac{1}{x!} \sum_{k=0}^{N-x} \frac{(-1)^k}{k!}, \qquad x = 0, 1, \ldots, N.$$
Show that the corresponding moment-generating function may be written
$$\psi(t) = \sum_{x=0}^{N} e^{tx}\, p(x) = \sum_{k=0}^{N} \frac{(e^t - 1)^k}{k!}.$$
Consequently, the factorial moment-generating function of the number of matches may be written
$$E[t^X] = \sum_{k=0}^{N} \frac{(t-1)^k}{k!}. \tag{3.22}$$
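Equation (3.22) can be checked by brute force for a small $N$, enumerating all permutations and counting matches (fixed points); in the Python sketch below, $N = 5$ and the evaluation point $t = 1.3$ are arbitrary choices.

```python
import math
from itertools import permutations

N, t = 5, 1.3  # arbitrary small case and evaluation point

# Empirical E[t^X]: average t^(number of matches) over all N! assignments.
perms = list(permutations(range(N)))
empirical = sum(t ** sum(p[i] == i for i in range(N)) for p in perms) / len(perms)

# Closed form (3.22): sum_{k=0}^{N} (t-1)^k / k!
closed = sum((t - 1) ** k / math.factorial(k) for k in range(N + 1))

assert abs(empirical - closed) < 1e-12
```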

3.4. The first $N$ moments of the number of matches in the problem of matching $N$ balls in $N$ urns coincide with the first $N$ moments of the Poisson probability law with parameter $\lambda = 1$. Show that the factorial moment-generating function of the Poisson law with parameter $\lambda$ is given by
$$E[t^X] = e^{\lambda(t-1)}. \tag{3.23}$$
By comparing (3.22) and (3.23) with $\lambda = 1$, it follows that the first $N$ factorial moments, and consequently the first $N$ moments, of the probability law of the number of matches and of the Poisson probability law with parameter 1 coincide.
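The coincidence of the first $N$ moments can itself be verified by enumeration (a Python sketch, with the arbitrary choice $N = 4$): the $r$th moments of the number of matches and of the Poisson law with parameter 1 agree for $r = 1, \ldots, N$.

```python
import math
from itertools import permutations

N = 4  # arbitrary small case; the first N moments should coincide
perms = list(permutations(range(N)))

def match_moment(r):
    # E[X^r] for the number of matches, by enumerating all N! assignments
    return sum(sum(p[i] == i for i in range(N)) ** r for p in perms) / len(perms)

def poisson1_moment(r, terms=60):
    # E[X^r] for the Poisson law with parameter 1, by truncated summation
    return sum(x**r * math.exp(-1) / math.factorial(x) for x in range(terms))

for r in range(1, N + 1):
    assert abs(match_moment(r) - poisson1_moment(r)) < 1e-9
```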

Exercises

Compute the moment-generating function, mean, and variance of the probability law specified by the probability density function, probability mass function, or distribution function given.

3.1 .

 

Answer

(i) ; (ii) .

 

3.2 .

3.3 .

 

Answer

(i) ; (ii) .

 

3.4 .

3.5 . Find the mean, variance, third central moment, and fourth central moment of the number of matches when (i) 4 balls are distributed in 4 urns, 1 to an urn, (ii) 3 balls are distributed in 3 urns, 1 to an urn.

 

Answer

(i) mean $1$, variance $1$, third central moment $1$, fourth central moment $4$; (ii) mean $1$, variance $1$, third central moment $1$, fourth central moment $3$.

 

3.6 . Find the factorial moment-generating function of the (i) binomial, (ii) Poisson, (iii) geometric probability laws and use it to obtain their means, variances, and third and fourth central moments.