Steven R. Dunbar
Department of Mathematics
203 Avery Hall
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Stochastic Processes

__________________________________________________________________________

Moment Generating Functions

_______________________________________________________________________


### Rating

Mathematically Mature: may contain mathematics beyond calculus with proofs.

_______________________________________________________________________________________________

### Section Starter Question

Give some examples of transform methods in mathematics, science, or engineering that you have seen or used, and explain why transform methods are useful.

_______________________________________________________________________________________________

### Key Concepts

1. The moment generating function converts problems about probabilities and expectations into problems from calculus about function values and derivatives.
2. The value of the $n$th derivative of the moment generating function evaluated at $0$ is the value of the $n$th moment of $X$.
3. The sum of independent normal random variables is again a normal random variable whose mean is the sum of the means, and whose variance is the sum of the variances.

__________________________________________________________________________

### Vocabulary

1. The $n$th moment of the random variable $X$ with pdf $f\left(x\right)$ is $𝔼\left[{X}^{n}\right]={\int }_{x}{x}^{n}f\left(x\right)\phantom{\rule{0.3em}{0ex}}dx$ (provided this integral converges absolutely).
2. The moment generating function ${\varphi }_{X}\left(t\right)$ is defined by

${\varphi }_{X}\left(t\right)=𝔼\left[{e}^{tX}\right]={\int }_{x}{e}^{tx}f\left(x\right)\phantom{\rule{0.3em}{0ex}}dx$

for all values $t$ for which the integral converges.

__________________________________________________________________________

### Mathematical Ideas

#### Transform Methods

We need some tools to aid in proving theorems about random variables. In this section we develop a tool called the moment generating function, which converts problems about probabilities and expectations into problems from calculus about function values and derivatives. Moment generating functions belong to the large class of transforms in mathematics that turn a difficult problem in one domain into a manageable problem in another domain. Other examples are Laplace transforms, Fourier transforms, Z-transforms, generating functions, and even logarithms.

The general method can be expressed schematically in Figure 1.
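As a toy instance of the transform scheme (not part of the original notes), logarithms turn a hard operation in one domain (multiplication) into an easy one in another (addition): transform, operate, then invert the transform. A short Python sketch:

```python
import math

# Transform scheme with logarithms: multiply two numbers by
# transforming (log), operating in the easy domain (add), and
# inverting the transform (exp).
a, b = 12.0, 35.0

log_sum = math.log(a) + math.log(b)   # operate in the transform domain
product = math.exp(log_sum)           # invert the transform

assert abs(product - a * b) < 1e-9
```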

#### Expectation of Independent Random Variables

Lemma 1. If $X$ and $Y$ are independent random variables, then for any functions $g$ and $h$:

$𝔼\left[g\left(X\right)h\left(Y\right)\right]=𝔼\left[g\left(X\right)\right]𝔼\left[h\left(Y\right)\right]$

Proof. To make the proof definite, suppose that $X$ and $Y$ are jointly continuous with joint probability density function $f\left(x,y\right)$. Since $X$ and $Y$ are independent, the joint density factors as $f\left(x,y\right)={f}_{X}\left(x\right){f}_{Y}\left(y\right)$. Then:

$\begin{array}{llll}\hfill 𝔼\left[g\left(X\right)h\left(Y\right)\right]& ={\iint }_{\left(x,y\right)}g\left(x\right)h\left(y\right)f\left(x,y\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{0.3em}{0ex}}dy\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{x}{\int }_{y}g\left(x\right)h\left(y\right){f}_{X}\left(x\right){f}_{Y}\left(y\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{0.3em}{0ex}}dy\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{x}g\left(x\right){f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx{\int }_{y}h\left(y\right){f}_{Y}\left(y\right)\phantom{\rule{0.3em}{0ex}}dy\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =𝔼\left[g\left(X\right)\right]𝔼\left[h\left(Y\right)\right].\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

Remark. In words, the expectation of the product of independent random variables is the product of the expectations.
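The lemma is easy to check numerically. The following Python sketch (not part of the original notes; the choices of distributions and of $g$, $h$ are illustrative) draws independent samples with $X\sim \mathrm{Uniform}\left(0,1\right)$ and $Y\sim \mathrm{Exponential}\left(1\right)$, takes $g\left(x\right)={x}^{2}$ and $h\left(y\right)={e}^{-y}$, and compares the Monte Carlo estimates of $𝔼\left[g\left(X\right)h\left(Y\right)\right]$ and $𝔼\left[g\left(X\right)\right]𝔼\left[h\left(Y\right)\right]$, both of which should be near the exact value $\left(1∕3\right)\left(1∕2\right)=1∕6$.

```python
import math
import random
import statistics

# Monte Carlo check that, for independent X and Y,
# E[g(X) h(Y)] = E[g(X)] E[h(Y)].
random.seed(0)
n = 200_000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]   # X ~ Uniform(0,1)
ys = [random.expovariate(1.0) for _ in range(n)]    # Y ~ Exponential(1)

g = lambda x: x * x            # E[g(X)] = 1/3
h = lambda y: math.exp(-y)     # E[h(Y)] = 1/2

lhs = statistics.fmean(g(x) * h(y) for x, y in zip(xs, ys))
rhs = statistics.fmean(map(g, xs)) * statistics.fmean(map(h, ys))

assert abs(lhs - rhs) < 0.01
assert abs(lhs - 1.0 / 6.0) < 0.01
```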

#### The Moment Generating Function

The moment generating function ${\varphi }_{X}\left(t\right)$ is defined by

${\varphi }_{X}\left(t\right)=𝔼\left[{e}^{tX}\right]={\int }_{x}{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx$

for all values $t$ for which the integral converges.

Example. The degenerate probability distribution has all the probability concentrated at a single point. That is, the degenerate random variable is a discrete random variable exhibiting certainty of outcome. If $X$ is a degenerate random variable, then $X=\mu$ with probability $1$ and $X$ is any other value with probability $0$. The moment generating function of the degenerate random variable is particularly simple:

${\varphi }_{X}\left(t\right)=\sum _{{x}_{i}=\mu }{e}^{{x}_{i}t}={e}^{\mu t}.$

If the moments of order $k$ exist for $0\le k\le {k}_{0}$, then the moment generating function is continuously differentiable up to order ${k}_{0}$ at $t=0$. Assuming that differentiation and integration can be interchanged, the moments of $X$ can be generated from ${\varphi }_{X}\left(t\right)$ by repeated differentiation:

$\begin{array}{llll}\hfill {\varphi }_{X}^{\prime }\left(t\right)& =\frac{d}{dt}𝔼\left[{e}^{tX}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{d}{dt}{\int }_{x}{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{x}\frac{d}{dt}{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{x}x{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =𝔼\left[X{e}^{tX}\right].\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

Then

${\varphi }_{X}^{\prime }\left(0\right)=𝔼\left[X\right].$

Likewise

$\begin{array}{llll}\hfill {\varphi }_{X}^{″}\left(t\right)& =\frac{d}{dt}{\varphi }_{X}^{\prime }\left(t\right)\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{d}{dt}{\int }_{x}x{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{x}x\frac{d}{dt}{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{x}{x}^{2}{e}^{tx}{f}_{X}\left(x\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =𝔼\left[{X}^{2}{e}^{tX}\right].\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

Then

${\varphi }_{X}^{″}\left(0\right)=𝔼\left[{X}^{2}\right].$

Continuing in this way:

${\varphi }_{X}^{\left(n\right)}\left(0\right)=𝔼\left[{X}^{n}\right].$

Remark. In words, the value of the $n$th derivative of the moment generating function evaluated at $0$ is the value of the $n$th moment of $X$.
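A reader can check this remark numerically with finite differences. The following Python sketch (not part of the original notes) uses the m.g.f. of the standard normal, ${\varphi }_{X}\left(t\right)={e}^{{t}^{2}∕2}$ (a special case of Theorem 4 below), for which ${\varphi }_{X}^{\prime }\left(0\right)=𝔼\left[X\right]=0$ and ${\varphi }_{X}^{″}\left(0\right)=𝔼\left[{X}^{2}\right]=1$.

```python
import math

# Finite-difference check that derivatives of the m.g.f. at t = 0
# give the moments, for the standard normal m.g.f. phi(t) = exp(t^2/2).
def phi(t):
    return math.exp(t * t / 2.0)

h = 1e-3
first = (phi(h) - phi(-h)) / (2.0 * h)                  # approximates phi'(0) = E[X] = 0
second = (phi(h) - 2.0 * phi(0.0) + phi(-h)) / (h * h)  # approximates phi''(0) = E[X^2] = 1

assert abs(first - 0.0) < 1e-5
assert abs(second - 1.0) < 1e-5
```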

Theorem 2. If $X$ and $Y$ are independent random variables with moment generating functions ${\varphi }_{X}\left(t\right)$ and ${\varphi }_{Y}\left(t\right)$ respectively, then ${\varphi }_{X+Y}\left(t\right)$, the moment generating function of $X+Y$, is given by ${\varphi }_{X}\left(t\right){\varphi }_{Y}\left(t\right)$. In words, the moment generating function of a sum of independent random variables is the product of the individual moment generating functions.

Proof. Using the lemma on independence above:

$\begin{array}{llll}\hfill {\varphi }_{X+Y}\left(t\right)& =𝔼\left[{e}^{t\left(X+Y\right)}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =𝔼\left[{e}^{tX}{e}^{tY}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =𝔼\left[{e}^{tX}\right]𝔼\left[{e}^{tY}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\varphi }_{X}\left(t\right){\varphi }_{Y}\left(t\right).\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$
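Theorem 2 can be verified exactly for discrete random variables. The Python sketch below (not part of the original notes; the two-dice example is illustrative) builds the distribution of the sum of two independent fair dice and checks that its m.g.f. equals the product of the individual m.g.f.s.

```python
import math
from collections import Counter

# Exact check of Theorem 2 for two independent fair dice.
def mgf(pmf, t):
    """M.g.f. of a discrete distribution given as {value: probability}."""
    return sum(p * math.exp(t * x) for x, p in pmf.items())

die = {k: 1.0 / 6.0 for k in range(1, 7)}

# Distribution of the sum of two independent rolls.
sum_pmf = Counter()
for a, pa in die.items():
    for b, pb in die.items():
        sum_pmf[a + b] += pa * pb

t = 0.7
assert abs(mgf(sum_pmf, t) - mgf(die, t) * mgf(die, t)) < 1e-9
```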

Theorem 3. If the moment generating function is defined in a neighborhood of $t=0$, then the moment generating function uniquely determines the probability distribution. That is, there is a one-to-one correspondence between the moment generating function and the distribution function of a random variable, when the moment generating function is defined and finite.

Proof. The proof is beyond the mathematical level of these notes, so we omit it. □

#### The moment generating function of a normal random variable

Theorem 4. If $Z\sim N\left(\mu ,{\sigma }^{2}\right)$, then ${\varphi }_{Z}\left(t\right)=exp\left(\mu t+{\sigma }^{2}{t}^{2}∕2\right)$.

Proof.

$\begin{array}{llll}\hfill {\varphi }_{Z}\left(t\right)& =𝔼\left[{e}^{tZ}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{1}{\sqrt{2\pi {\sigma }^{2}}}{\int }_{-\infty }^{\infty }{e}^{tx}{e}^{-{\left(x-\mu \right)}^{2}∕\left(2{\sigma }^{2}\right)}\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{1}{\sqrt{2\pi {\sigma }^{2}}}{\int }_{-\infty }^{\infty }exp\left(\frac{-\left({x}^{2}-2\mu x+{\mu }^{2}-2{\sigma }^{2}tx\right)}{2{\sigma }^{2}}\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

By completing the square:

$\begin{array}{llll}\hfill {x}^{2}-2\mu x+{\mu }^{2}-2{\sigma }^{2}tx& ={x}^{2}-2\left(\mu +{\sigma }^{2}t\right)x+{\mu }^{2}\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\left(x-\left(\mu +{\sigma }^{2}t\right)\right)}^{2}-{\left(\mu +{\sigma }^{2}t\right)}^{2}+{\mu }^{2}\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\left(x-\left(\mu +{\sigma }^{2}t\right)\right)}^{2}-{\sigma }^{4}{t}^{2}-2\mu {\sigma }^{2}t.\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

So, returning to the calculation of the m.g.f.:

$\begin{array}{llll}\hfill {\varphi }_{Z}\left(t\right)& =\frac{1}{\sqrt{2\pi {\sigma }^{2}}}{\int }_{-\infty }^{\infty }exp\left(\frac{-\left({\left(x-\left(\mu +{\sigma }^{2}t\right)\right)}^{2}-{\sigma }^{4}{t}^{2}-2\mu {\sigma }^{2}t\right)}{2{\sigma }^{2}}\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{1}{\sqrt{2\pi {\sigma }^{2}}}exp\left(\frac{{\sigma }^{4}{t}^{2}+2\mu {\sigma }^{2}t}{2{\sigma }^{2}}\right){\int }_{-\infty }^{\infty }exp\left(\frac{-{\left(x-\left(\mu +{\sigma }^{2}t\right)\right)}^{2}}{2{\sigma }^{2}}\right)\phantom{\rule{0.3em}{0ex}}dx\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =exp\left(\frac{{\sigma }^{4}{t}^{2}+2\mu {\sigma }^{2}t}{2{\sigma }^{2}}\right)\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =exp\left(\mu t+{\sigma }^{2}{t}^{2}∕2\right)\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$
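As a numerical sanity check on Theorem 4 (not part of the original notes; the parameter values are illustrative), the Python sketch below integrates ${e}^{tx}$ against the $N\left(\mu ,{\sigma }^{2}\right)$ density with a midpoint rule on a wide truncated range and compares the result with the closed form $exp\left(\mu t+{\sigma }^{2}{t}^{2}∕2\right)$.

```python
import math

# Numerical check of the normal m.g.f.: integrate e^{tx} against the
# N(mu, sigma^2) density and compare with exp(mu t + sigma^2 t^2 / 2).
mu, sigma, t = 0.5, 1.3, 0.4

def pdf(x):
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / math.sqrt(2.0 * math.pi * sigma ** 2)

# Midpoint rule on [-20, 20]; the tails outside are negligible.
step = 1e-3
lo, hi = -20.0, 20.0
n = int((hi - lo) / step)
integral = step * sum(
    math.exp(t * (lo + (i + 0.5) * step)) * pdf(lo + (i + 0.5) * step)
    for i in range(n)
)

closed_form = math.exp(mu * t + sigma ** 2 * t ** 2 / 2.0)
assert abs(integral - closed_form) < 1e-5
```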

Theorem 5. If ${Z}_{1}\sim N\left({\mu }_{1},{\sigma }_{1}^{2}\right)$, and ${Z}_{2}\sim N\left({\mu }_{2},{\sigma }_{2}^{2}\right)$ and ${Z}_{1}$ and ${Z}_{2}$ are independent, then ${Z}_{1}+{Z}_{2}\sim N\left({\mu }_{1}+{\mu }_{2},{\sigma }_{1}^{2}+{\sigma }_{2}^{2}\right)$. In words, the sum of independent normal random variables is again a normal random variable whose mean is the sum of the means, and whose variance is the sum of the variances.

Proof. We compute the moment generating function of the sum using our theorem about sums of independent random variables. Then we recognize the result as the moment generating function of the appropriate normal random variable.

$\begin{array}{llll}\hfill {\varphi }_{{Z}_{1}+{Z}_{2}}\left(t\right)& ={\varphi }_{{Z}_{1}}\left(t\right){\varphi }_{{Z}_{2}}\left(t\right)\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =exp\left({\mu }_{1}t+{\sigma }_{1}^{2}{t}^{2}∕2\right)exp\left({\mu }_{2}t+{\sigma }_{2}^{2}{t}^{2}∕2\right)\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =exp\left(\left({\mu }_{1}+{\mu }_{2}\right)t+\left({\sigma }_{1}^{2}+{\sigma }_{2}^{2}\right){t}^{2}∕2\right)\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

An alternative visual proof that the sum of independent normal random variables is again a normal random variable using only calculus is available at The Sum of Independent Normal Random Variables is Normal.
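Theorem 5 is also easy to illustrate by simulation. The Python sketch below (not part of the original notes; the parameter values are illustrative) samples ${Z}_{1}+{Z}_{2}$ for independent normals and checks that the sample mean and sample variance are close to ${\mu }_{1}+{\mu }_{2}$ and ${\sigma }_{1}^{2}+{\sigma }_{2}^{2}$.

```python
import random
import statistics

# Monte Carlo illustration that the sum of independent normals is
# normal with mean mu1 + mu2 and variance sigma1^2 + sigma2^2.
random.seed(42)
mu1, sigma1 = 1.0, 2.0
mu2, sigma2 = -0.5, 1.5

samples = [random.gauss(mu1, sigma1) + random.gauss(mu2, sigma2)
           for _ in range(200_000)]

assert abs(statistics.fmean(samples) - (mu1 + mu2)) < 0.05
assert abs(statistics.variance(samples) - (sigma1 ** 2 + sigma2 ** 2)) < 0.1
```

(This only checks the first two moments; the alternative proof linked above, or a histogram of the samples, shows the shape is again normal.)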

#### Sources

This section is adapted from: Introduction to Probability Models by Sheldon Ross [1].

_______________________________________________________________________________________________

### Problems to Work for Understanding

1. Calculate the moment generating function of a random variable $X$ having a uniform distribution on $\left[0,1\right]$. Use this to obtain $𝔼\left[X\right]$ and $Var\left[X\right]$.
2. Calculate the moment generating function of a discrete random variable $X$ having a geometric distribution. Use this to obtain $𝔼\left[X\right]$ and $Var\left[X\right]$.
3. Calculate the moment generating function of a Bernoulli random variable having value $1$ with probability $p$ and value $0$ with probability $1-p$. From this, derive the moment generating function of a binomial random variable with parameters $n$ and $p$.

__________________________________________________________________________

### References

[1]   Sheldon M. Ross. Introduction to Probability Models. Academic Press, 9th edition, 2006.

[2]   S. R. S. Varadhan. Probability Theory. Number 7 in Courant Lecture Notes. American Mathematical Society, 2001.
