Steven R. Dunbar
Department of Mathematics
203 Avery Hall
University of Nebraska-Lincoln
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Stochastic Processes and
Advanced Mathematical Finance

__________________________________________________________________________

Moment Generating Functions

_______________________________________________________________________

Note: These pages are prepared with MathJax. MathJax is an open source JavaScript display engine for mathematics that works in all browsers. See http://mathjax.org for details on supported browsers, accessibility, copy-and-paste, and other features.

_______________________________________________________________________________________________

Rating

Mathematically Mature: may contain mathematics beyond calculus with proofs.

_______________________________________________________________________________________________

Section Starter Question

Give some examples of transform methods in mathematics, science or engineering that you have seen or used and explain why transform methods are useful.

_______________________________________________________________________________________________

Key Concepts

  1. The moment generating function converts problems about probabilities and expectations into problems from calculus about function values and derivatives.
  2. The value of the nth derivative of the moment generating function evaluated at 0 is the value of the nth moment of X.
  3. The sum of independent normal random variables is again a normal random variable whose mean is the sum of the means, and whose variance is the sum of the variances.

__________________________________________________________________________

Vocabulary

  1. The nth moment of the random variable X with pdf f(x) is \( \mathbb{E}\left[ X^n \right] = \int_{-\infty}^{\infty} x^n f(x)\,dx \) (provided this integral converges absolutely).
  2. The moment generating function \( \phi_X(t) \) is defined by
    \[ \phi_X(t) = \mathbb{E}\left[ e^{tX} \right] = \begin{cases} \sum_i e^{t x_i} p(x_i) & \text{if } X \text{ is discrete,} \\ \int_{-\infty}^{\infty} e^{tx} f(x)\,dx & \text{if } X \text{ is continuous,} \end{cases} \]

    for all values t for which the sum or integral converges.

__________________________________________________________________________

Mathematical Ideas

Transform Methods

We need some tools to aid in proving theorems about random variables. In this section we develop a tool called the moment generating function, which converts problems about probabilities and expectations into problems from calculus about function values and derivatives. Moment generating functions are one of the large class of transforms in mathematics that turn a difficult problem in one domain into a manageable problem in another domain. Other examples are Laplace transforms, Fourier transforms, Z-transforms, generating functions, and even logarithms.

The general method can be expressed schematically in Figure 1.


Figure 1: Block diagram of transform methods.

Expectation of Independent Random Variables

Lemma 1. If X and Y are independent random variables, then for any functions g and h:

\[ \mathbb{E}\left[ g(X) h(Y) \right] = \mathbb{E}\left[ g(X) \right] \mathbb{E}\left[ h(Y) \right]. \]

Proof. To make the proof definite, suppose that X and Y are jointly continuous with joint probability density function \( f(x,y) \). Then:

\[ \begin{aligned} \mathbb{E}\left[ g(X)h(Y) \right] &= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x) h(y) f(x,y)\,dx\,dy \\ &= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x) h(y) f_X(x) f_Y(y)\,dx\,dy \\ &= \int_{-\infty}^{\infty} g(x) f_X(x)\,dx \int_{-\infty}^{\infty} h(y) f_Y(y)\,dy \\ &= \mathbb{E}\left[ g(X) \right] \mathbb{E}\left[ h(Y) \right]. \end{aligned} \]

The second equality uses independence to factor the joint density as \( f(x,y) = f_X(x) f_Y(y) \).

Remark. In words, the expectation of the product of independent random variables is the product of the expectations.
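The lemma is easy to check numerically. The following short Python sketch (the distributions and the functions g and h are arbitrary choices made only for this illustration) estimates both sides of the identity by Monte Carlo simulation; the two estimates should agree up to sampling error.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10**6

    # Independent samples of X and Y; the particular distributions are arbitrary.
    x = rng.uniform(0.0, 1.0, n)
    y = rng.normal(0.0, 1.0, n)

    g = lambda u: u**2          # g(X) = X^2
    h = lambda v: np.cos(v)     # h(Y) = cos(Y)

    lhs = np.mean(g(x) * h(y))            # Monte Carlo estimate of E[g(X) h(Y)]
    rhs = np.mean(g(x)) * np.mean(h(y))   # estimate of E[g(X)] E[h(Y)]
    print(lhs, rhs)                       # should agree up to Monte Carlo error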

The Moment Generating Function

The moment generating function \( \phi_X(t) \) is defined by

\[ \phi_X(t) = \mathbb{E}\left[ e^{tX} \right] = \begin{cases} \sum_i e^{t x_i} p(x_i) & \text{if } X \text{ is discrete,} \\ \int_{-\infty}^{\infty} e^{tx} f(x)\,dx & \text{if } X \text{ is continuous,} \end{cases} \]

for all values t for which the sum or integral converges.

Example. The degenerate probability distribution has all the probability concentrated at a single point. That is, the degenerate random variable is a discrete random variable exhibiting certainty of outcome. If X is a degenerate random variable, then X = μ with probability 1 and X takes any other value with probability 0. The moment generating function of the degenerate random variable is particularly simple:

\[ \phi_X(t) = \sum_{x_i} e^{t x_i} p(x_i) = e^{t\mu} \cdot 1 = e^{\mu t}. \]
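As another illustration of the defining integral, here is a short symbolic computation in Python with sympy. The exponential distribution with rate \( \lambda \) used here does not appear elsewhere in this section; it is only a convenient continuous example, and the printed result may come wrapped in a convergence condition such as \( t < \lambda \).

    import sympy as sp

    x, t = sp.symbols('x t', real=True)
    lam = sp.symbols('lam', positive=True)   # rate parameter, illustration only

    # Density of an exponential(lam) random variable on [0, oo)
    f = lam * sp.exp(-lam * x)

    # phi_X(t) = integral of e^{t x} f(x) dx, which converges for t < lam
    phi = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo))
    print(sp.simplify(phi))   # expect lam/(lam - t), possibly inside a Piecewise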

If the moments of order k exist for \( 0 \le k \le k_0 \), then the moment generating function is continuously differentiable up to order \( k_0 \) at t = 0. Assuming that all operations can be interchanged, the moments of X can be generated from \( \phi_X(t) \) by repeated differentiation:

\[ \begin{aligned} \phi_X'(t) &= \frac{d}{dt} \mathbb{E}\left[ e^{tX} \right] = \frac{d}{dt} \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx \\ &= \int_{-\infty}^{\infty} \frac{d}{dt}\, e^{tx} f_X(x)\,dx = \int_{-\infty}^{\infty} x e^{tx} f_X(x)\,dx \\ &= \mathbb{E}\left[ X e^{tX} \right]. \end{aligned} \]

Then

\[ \phi_X'(0) = \mathbb{E}\left[ X \right]. \]

Likewise

\[ \begin{aligned} \phi_X''(t) &= \frac{d}{dt} \phi_X'(t) = \frac{d}{dt} \int_{-\infty}^{\infty} x e^{tx} f_X(x)\,dx \\ &= \int_{-\infty}^{\infty} x \frac{d}{dt}\, e^{tx} f_X(x)\,dx = \int_{-\infty}^{\infty} x^2 e^{tx} f_X(x)\,dx \\ &= \mathbb{E}\left[ X^2 e^{tX} \right]. \end{aligned} \]

Then

\[ \phi_X''(0) = \mathbb{E}\left[ X^2 \right]. \]

Continuing in this way:

\[ \phi_X^{(n)}(0) = \mathbb{E}\left[ X^n \right]. \]

Remark. In words, the value of the nth derivative of the moment generating function evaluated at 0 is the value of the nth moment of X.
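Continuing the illustrative exponential example from above, with the standard closed form \( \lambda/(\lambda - t) \) taken as known, a few lines of sympy confirm that differentiating the moment generating function at 0 produces the moments.

    import sympy as sp

    t = sp.symbols('t', real=True)
    lam = sp.symbols('lam', positive=True)   # rate of the illustrative exponential distribution

    phi = lam / (lam - t)    # moment generating function of exponential(lam), valid for t < lam

    first_moment = sp.diff(phi, t, 1).subs(t, 0)    # phi'(0)  = E[X]
    second_moment = sp.diff(phi, t, 2).subs(t, 0)   # phi''(0) = E[X^2]

    print(sp.simplify(first_moment))                      # 1/lam
    print(sp.simplify(second_moment))                     # 2/lam**2
    print(sp.simplify(second_moment - first_moment**2))   # Var[X] = 1/lam**2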

Theorem 2. If X and Y are independent random variables with moment generating functions \( \phi_X(t) \) and \( \phi_Y(t) \) respectively, then \( \phi_{X+Y}(t) \), the moment generating function of X + Y, is given by \( \phi_X(t)\phi_Y(t) \). In words, the moment generating function of a sum of independent random variables is the product of the individual moment generating functions.

Proof. Using the lemma on independence above:

\[ \phi_{X+Y}(t) = \mathbb{E}\left[ e^{t(X+Y)} \right] = \mathbb{E}\left[ e^{tX} e^{tY} \right] = \mathbb{E}\left[ e^{tX} \right] \mathbb{E}\left[ e^{tY} \right] = \phi_X(t)\,\phi_Y(t). \]
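A numerical sanity check of the theorem: the sketch below (with distributions and parameter values chosen arbitrarily for illustration) estimates the moment generating function of the sum and the product of the individual moment generating functions at a few values of t; the two columns of output should agree up to Monte Carlo error.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10**6

    # Two independent samples; the distributions are arbitrary illustrative choices.
    x = rng.normal(1.0, 2.0, n)
    y = rng.normal(-0.5, 1.0, n)

    for t in (-0.2, 0.1, 0.3):
        mgf_sum = np.mean(np.exp(t * (x + y)))                     # estimate of phi_{X+Y}(t)
        product = np.mean(np.exp(t * x)) * np.mean(np.exp(t * y))  # phi_X(t) * phi_Y(t)
        print(t, mgf_sum, product)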

Theorem 3. If the moment generating function is defined in a neighborhood of t = 0, then the moment generating function uniquely determines the probability distribution. That is, there is a one-to-one correspondence between the moment generating function and the distribution function of a random variable, when the moment generating function is defined and finite.

Proof. This proof is too sophisticated for the mathematical level we have now. □

The moment generating function of a normal random variable

Theorem 4. If \( Z \sim N(\mu, \sigma^2) \), then \( \phi_Z(t) = \exp\left( \mu t + \sigma^2 t^2/2 \right) \).

Proof.

\[ \begin{aligned} \phi_Z(t) &= \mathbb{E}\left[ e^{tZ} \right] = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} e^{tx} e^{-(x-\mu)^2/(2\sigma^2)}\,dx \\ &= \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \exp\left( -\frac{x^2 - 2\mu x + \mu^2 - 2\sigma^2 t x}{2\sigma^2} \right) dx. \end{aligned} \]

By completing the square:

\[ \begin{aligned} x^2 - 2\mu x + \mu^2 - 2\sigma^2 t x &= x^2 - 2(\mu + \sigma^2 t)x + \mu^2 \\ &= \left( x - (\mu + \sigma^2 t) \right)^2 - (\mu + \sigma^2 t)^2 + \mu^2 \\ &= \left( x - (\mu + \sigma^2 t) \right)^2 - \sigma^4 t^2 - 2\mu\sigma^2 t. \end{aligned} \]

So, returning to the calculation of the m.g.f.:

\[ \begin{aligned} \phi_Z(t) &= \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \exp\left( -\frac{\left( x - (\mu + \sigma^2 t) \right)^2 - \sigma^4 t^2 - 2\mu\sigma^2 t}{2\sigma^2} \right) dx \\ &= \exp\left( \frac{\sigma^4 t^2 + 2\mu\sigma^2 t}{2\sigma^2} \right) \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \exp\left( -\frac{\left( x - (\mu + \sigma^2 t) \right)^2}{2\sigma^2} \right) dx \\ &= \exp\left( \frac{\sigma^4 t^2 + 2\mu\sigma^2 t}{2\sigma^2} \right) \\ &= \exp\left( \mu t + \frac{\sigma^2 t^2}{2} \right). \end{aligned} \]

The third equality uses the fact that the remaining integral is the total integral of a \( N(\mu + \sigma^2 t, \sigma^2) \) density, so it equals \( \sqrt{2\pi\sigma^2} \) and cancels the factor in front.
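For an independent check of this integral, the following sympy sketch (assuming sympy can evaluate the Gaussian integral once \( \sigma \) is declared positive) reproduces the closed form.

    import sympy as sp

    x, t, mu = sp.symbols('x t mu', real=True)
    sigma = sp.symbols('sigma', positive=True)

    # N(mu, sigma^2) density
    density = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / sp.sqrt(2 * sp.pi * sigma**2)

    # phi_Z(t) = E[e^{t Z}] by direct integration
    phi = sp.integrate(sp.exp(t * x) * density, (x, -sp.oo, sp.oo))
    print(sp.simplify(phi))   # expect exp(mu*t + sigma**2*t**2/2)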

Theorem 5. If \( Z_1 \sim N(\mu_1, \sigma_1^2) \) and \( Z_2 \sim N(\mu_2, \sigma_2^2) \) and \( Z_1 \) and \( Z_2 \) are independent, then \( Z_1 + Z_2 \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2) \). In words, the sum of independent normal random variables is again a normal random variable whose mean is the sum of the means, and whose variance is the sum of the variances.

Proof. We compute the moment generating function of the sum using our theorem about sums of independent random variables. Then we recognize the result as the moment generating function of the appropriate normal random variable.

\[ \phi_{Z_1+Z_2}(t) = \phi_{Z_1}(t)\,\phi_{Z_2}(t) = \exp\left( \mu_1 t + \sigma_1^2 t^2/2 \right) \exp\left( \mu_2 t + \sigma_2^2 t^2/2 \right) = \exp\left( (\mu_1 + \mu_2) t + (\sigma_1^2 + \sigma_2^2) t^2/2 \right). \]

An alternative visual proof that the sum of independent normal random variables is again a normal random variable using only calculus is available at The Sum of Independent Normal Random Variables is Normal.
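A quick simulation is also consistent with the theorem. In the sketch below the parameter values are arbitrary; the sample mean and variance of \( Z_1 + Z_2 \) should be close to \( \mu_1 + \mu_2 \) and \( \sigma_1^2 + \sigma_2^2 \), and the empirical moment generating function at a test point should be close to the normal closed form.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 10**6

    mu1, sigma1 = 1.0, 2.0     # arbitrary illustrative parameters
    mu2, sigma2 = -3.0, 0.5

    z = rng.normal(mu1, sigma1, n) + rng.normal(mu2, sigma2, n)

    print(z.mean(), mu1 + mu2)                     # sample mean vs mu1 + mu2
    print(z.var(), sigma1**2 + sigma2**2)          # sample variance vs sigma1^2 + sigma2^2

    t = 0.25
    print(np.mean(np.exp(t * z)),
          np.exp((mu1 + mu2) * t + (sigma1**2 + sigma2**2) * t**2 / 2))   # empirical vs exact MGF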

Sources

This section is adapted from: Introduction to Probability Models, by Sheldon Ross.

_______________________________________________________________________________________________

Problems to Work for Understanding

  1. Calculate the moment generating function of a random variable X having a uniform distribution on [0, 1]. Use this to obtain \( \mathbb{E}\left[ X \right] \) and \( \operatorname{Var}\left[ X \right] \).
  2. Calculate the moment generating function of a discrete random variable X having a geometric distribution. Use this to obtain \( \mathbb{E}\left[ X \right] \) and \( \operatorname{Var}\left[ X \right] \).
  3. Calculate the moment generating function of a Bernoulli random variable having value 1 with probability p and value 0 with probability \( 1 - p \). From this, derive the moment generating function of a binomial random variable with parameters n and p.

__________________________________________________________________________

Reading Suggestion:

References

[1]   Sheldon M. Ross. Introduction to Probability Models. Academic Press, 9th edition, 2006.

[2]   S. R. S. Varadhan. Probability Theory. Number 7 in Courant Lecture Notes. American Mathematical Society, 2001.

__________________________________________________________________________

Outside Readings and Links:

  1. Generating Functions in Virtual Laboratories in Probability.
  2. Moment Generating Functions in MathWorld.com.

__________________________________________________________________________

I check all the information on each page for correctness and typographical errors. Nevertheless, some errors may occur and I would be grateful if you would alert me to such errors. I make every reasonable effort to present current and accurate information for public use, however I do not guarantee the accuracy or timeliness of information on this website. Your use of the information from this website is strictly voluntary and at your risk.

I have checked the links to external sites for usefulness. Links to external websites are provided as a convenience. I do not endorse, control, monitor, or guarantee the information contained in any external website. I don’t guarantee that the links are active at all times. Use the links here with the same caution as you would all information on the Internet. This website reflects the thoughts, interests and opinions of its author. They do not explicitly represent official positions or policies of my employer.

Information on this website is subject to change without notice.

Steve Dunbar’s Home Page, http://www.math.unl.edu/~sdunbar1

Email to Steve Dunbar, sdunbar1 at unl dot edu

Last modified: Processed from LaTeX source on July 21, 2016