Steven R. Dunbar
Department of Mathematics
203 Avery Hall
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Topics in
Probability Theory and Stochastic Processes
Steven R. Dunbar

__________________________________________________________________________

Strong Law of Large Numbers

_______________________________________________________________________

Note: These pages are prepared with MathJax. MathJax is an open source JavaScript display engine for mathematics that works in all browsers. See http://mathjax.org for details on supported browsers, accessibility, copy-and-paste, and other features.

_______________________________________________________________________________________________ ### Study Tip

__________________________________________________________________________ ### Rating

Mathematicians Only: prolonged scenes of intense rigor.

_______________________________________________________________________________________________ ### Section Starter Question

Explain what is meant by the “law of averages” and how it applies to an inﬁnite sequence of coin ﬂips.

_______________________________________________________________________________________________ ### Key Concepts

1. Almost surely
$\underset{n\to \infty }{lim}\frac{{S}_{n}\left(\omega \right)}{n}=p.$

2. Almost surely, the asymptotic proportion of any outcome in an inﬁnite sequence of trials is the probability of that outcome for a single trial.

__________________________________________________________________________ ### Vocabulary

1. Borel’s strong law of large numbers is that almost surely
$\underset{n\to \infty }{lim}\frac{{S}_{n}\left(\omega \right)}{n}=p.$

2. If $b=\left({b}_{1},{b}_{2},\dots ,{b}_{j}\right)$ is a word constructed from the alphabet $\left\{0,1\right\}$ and $s={\sum }_{i=1}^{j}{b}_{i}$ we say that ${p}^{s}{q}^{j-s}$ is the probability of the word $b$.
3. The event “${X}_{n}$ is in ${A}_{n}$ inﬁnitely often”, denoted $\left[{X}_{n}\in {A}_{n}\phantom{\rule{0.3em}{0ex}}\text{i.o.}\right]$, is deﬁned as ${\bigcap }_{n=1}^{\infty }{\bigcup }_{j=n}^{\infty }\left[{X}_{j}\in {A}_{j}\right]$.

__________________________________________________________________________ ### Mathematical Ideas

#### Borel’s Strong Law

This section presents three diﬀerent proofs of Borel’s Strong Law of Large Numbers. The ﬁrst uses the set theory of negligible sets and the Large Deviations Estimate to show that pointwise convergence fails only on a negligible set. The second proof uses a carefully chosen subsequence and elementary estimates. The third proof uses the Borel-Cantelli Lemma and a fourth-moment condition.

##### Proof with the Large Deviations Estimate

Theorem 1 (Borel’s Strong Law of Large Numbers).   Almost surely

$\underset{n\to \infty }{lim}\frac{{S}_{n}\left(\omega \right)}{n}=p.$

Proof. Let ${R}_{n}\left(\omega \right)=\frac{{S}_{n}\left(\omega \right)}{n}-p$. The sequence ${\left({R}_{n}\left(\omega \right)\right)}_{n=1}^{\infty }$ fails to approach $0$ if and only if ${limsup}_{n\to \infty }|{R}_{n}\left(\omega \right)|>0$. That is, ${\left({R}_{n}\left(\omega \right)\right)}_{n=1}^{\infty }$ fails to approach $0$ if and only if there is an $m\ge 1$ depending on $\omega$ such that for each $n\ge 1$ there exists a $k\ge n$ satisfying $|{R}_{k}\left(\omega \right)|>\frac{1}{m}$. Therefore, the set of points $\omega$ in the sample space not satisfying $\underset{n\to \infty }{lim}\frac{{S}_{n}\left(\omega \right)}{n}=p$ is contained in

$\bigcup _{m\ge 1}\bigcap _{n\ge 1}\bigcup _{k\ge n}\left\{\omega \in \Omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}|{R}_{k}\left(\omega \right)|>\frac{1}{m}\right\}.$

The goal is to show that this set is a negligible event. Since a countable union of negligible sets is negligible, it suﬃces to show that

${N}_{m}=\bigcap _{n\ge 1}\bigcup _{k\ge n}\left\{\omega \in \Omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}|{R}_{k}\left(\omega \right)|>\frac{1}{m}\right\}$

is negligible for each $m\ge 1$.

For each $k\ge 1$, let ${A}_{m,k}=\left\{\omega \in \Omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}|{R}_{k}\left(\omega \right)|>\frac{1}{m}\right\}$. By the Large Deviations Estimate, there is a constant $c=c\left(p,m\right)>0$ such that $ℙ\left[{A}_{m,k}\right]\le {\mathrm{e}}^{-ck}$. Since the series ${\sum }_{k\ge 1}{\mathrm{e}}^{-ck}$ converges, for every $𝜖>0$ there is an $n\ge 1$ such that ${\sum }_{k\ge n}{\mathrm{e}}^{-ck}<𝜖$. Because ${N}_{m}\subset {\cup }_{k\ge n}{A}_{m,k}$,

$ℙ\left[{N}_{m}\right]\le ℙ\left[{\cup }_{k\ge n}{A}_{m,k}\right]\le \sum _{k\ge n}ℙ\left[{A}_{m,k}\right]\le 𝜖.$

This proves that each ${N}_{m}$ is a negligible set. □
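The exponential decay in the Large Deviations Estimate can be illustrated numerically. This is a Monte Carlo sketch, not part of the proof; the parameters $p=0.5$, $m=10$, and the number of trials are assumed for illustration only:

```python
import random

# Monte Carlo sketch (illustration only): estimate P[|S_k/k - p| > 1/m]
# for growing k; the estimates should shrink toward 0, consistent with
# the exponential Large Deviations bound P[A_{m,k}] <= exp(-c k).
random.seed(7)
p, m, trials = 0.5, 10, 2000      # assumed illustrative parameters

def tail_prob(k):
    """Fraction of trials with |S_k/k - p| > 1/m."""
    count = 0
    for _ in range(trials):
        s = sum(1 for _ in range(k) if random.random() < p)
        if abs(s / k - p) > 1 / m:
            count += 1
    return count / trials

probs = [tail_prob(k) for k in (100, 400, 1600)]
```

For these parameters the estimated probabilities fall rapidly toward $0$ as $k$ grows, as the bound $ℙ\left[{A}_{m,k}\right]\le {\mathrm{e}}^{-ck}$ predicts.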

##### Proof with subsequences

Here is another proof of Borel’s Strong Law of Large Numbers, adapted from Theorem 1.21, in [1, page 11].

Before the proof, here is a quick deﬁnition.

Deﬁnition. Suppose ${A}_{1}\subseteq {A}_{2}\subseteq \dots$. Then $\underset{n\to \infty }{lim}{A}_{n}={\bigcup }_{n=1}^{\infty }{A}_{n}$. Also, if ${A}_{1}\supseteq {A}_{2}\supseteq \dots$, then $\underset{n\to \infty }{lim}{A}_{n}={\bigcap }_{n=1}^{\infty }{A}_{n}$.

Theorem 2 (Strong Law of Large Numbers).

$ℙ\left[\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}\ne p\right]=0.$

Proof. Break the proof into a sequence of simple steps.

1. The ﬁrst step is the claim that $\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}=p$ if and only if $\underset{m\to \infty }{lim}\frac{{S}_{{m}^{2}}}{{m}^{2}}=p$.

To see this, ﬁrst note that the implication from the limit of the full sequence to the subsequential limit along the squares is immediate. For the other direction, for any $n$, ﬁnd $m$ so that ${m}^{2}\le n<{\left(m+1\right)}^{2}$, or $0\le n-{m}^{2}\le 2m$. Then

$\begin{array}{llll}\hfill \left|\frac{{S}_{n}}{n}-\frac{{S}_{{m}^{2}}}{{m}^{2}}\right|& =\left|\frac{{S}_{n}}{{m}^{2}}-\frac{{S}_{{m}^{2}}}{{m}^{2}}+\left(\frac{1}{n}-\frac{1}{{m}^{2}}\right){S}_{n}\right|\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & \le \frac{|n-{m}^{2}|}{{m}^{2}}+\left|\frac{1}{n}-\frac{1}{{m}^{2}}\right|n\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & \le \frac{2m}{{m}^{2}}+\frac{2}{m}=\frac{4}{m}.\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

The term $\frac{|n-{m}^{2}|}{{m}^{2}}$ at the second line comes from estimating $\frac{|{S}_{n}-{S}_{{m}^{2}}|}{{m}^{2}}$. Since $n\ge {m}^{2}$, the largest the diﬀerence could be would be having each ${\omega }_{i}=1$ for $i>{m}^{2}$, thus the diﬀerence could be at most $n-{m}^{2}$. Then the claim that $\underset{m\to \infty }{lim}\frac{{S}_{{m}^{2}}}{{m}^{2}}=p$ implies $\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}=p$ follows by the triangle inequality.
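The bound in step 1 is deterministic, so it can be checked directly on a simulated flip sequence. The following sketch assumes $p=0.5$ and a sequence length of $5000$ purely for illustration:

```python
import math
import random

# Deterministic check of step 1's estimate (illustration, assumed
# p = 0.5 and N = 5000): for every n, with m = floor(sqrt(n)) so that
# m^2 <= n < (m+1)^2, verify |S_n/n - S_{m^2}/m^2| <= 4/m.
random.seed(1)
p, N = 0.5, 5000
flips = [1 if random.random() < p else 0 for _ in range(N)]

S = [0]                             # S[n] = heads among first n flips
for x in flips:
    S.append(S[-1] + x)

ok = all(
    abs(S[n] / n - S[m * m] / (m * m)) <= 4 / m
    for n in range(1, N + 1)
    for m in [math.isqrt(n)]        # m^2 <= n < (m+1)^2
)
```

Since the estimate holds for every $0$–$1$ sequence, `ok` is true for any simulated sequence, not just this seed.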

2. For ${m}_{0}<{m}_{1}$ deﬁne
${E}_{{m}_{0},{m}_{1}}=\bigcup _{m={m}_{0}}^{{m}_{1}}\left\{\omega :\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖\right\}.$

Note that each of these sets depends only on the coordinates ${\omega }_{1},\dots ,{\omega }_{{m}_{1}^{2}}$, so ${E}_{{m}_{0},{m}_{1}}$ is of ﬁnite type.

3. Use the Chebyshev inequality on each set in the union to obtain
$ℙ\left[\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖\right]<\frac{1}{{𝜖}^{2}}𝔼\left[{\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|}^{2}\right]=\frac{1}{{𝜖}^{2}}\frac{p\left(1-p\right)}{{m}^{2}}.$

4. Thus $\begin{array}{llll}\hfill ℙ\left[{E}_{{m}_{0},{m}_{1}}\right]& \le \sum _{m={m}_{0}}^{{m}_{1}}ℙ\left[\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{p\left(1-p\right)}{{𝜖}^{2}}\sum _{m={m}_{0}}^{{m}_{1}}\frac{1}{{m}^{2}}.\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$
5. Deﬁne
${E}_{{m}_{0}}=\underset{{m}_{1}\to \infty }{lim}{E}_{{m}_{0},{m}_{1}}=\bigcup _{m={m}_{0}}^{\infty }\left\{\omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖\right\}.$

Note that ${E}_{{m}_{0}}$ is the set of $\omega$’s for which $\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖$ at least once for $m\ge {m}_{0}$.

6. The claim is that
$ℙ\left[{E}_{{m}_{0}}\right]=\underset{{m}_{1}\to \infty }{lim}{ℙ}_{{m}_{1}}\left[{E}_{{m}_{0},{m}_{1}}\right]\le \frac{p\left(1-p\right)}{{𝜖}^{2}}\sum _{m={m}_{0}}^{\infty }\frac{1}{{m}^{2}}.$

7. Note that
$\underset{m}{limsup}\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖$

if and only if for any ${m}_{0}$ there exists $m\ge {m}_{0}$ so that $\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖$. That is,

$\left\{\omega :\underset{m}{limsup}\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>𝜖\right\}\subseteq {E}_{{m}_{0}},$

for all ${m}_{0}$. Notice that ${E}_{{m}_{0}}\supseteq {E}_{{m}_{0}+1}$ and so consider ${E}_{𝜖}=\underset{{m}_{0}\to \infty }{lim}{E}_{{m}_{0}}$. Again the claim is that

$ℙ\left[{E}_{𝜖}\right]=\underset{{m}_{0}\to \infty }{lim}ℙ\left[{E}_{{m}_{0}}\right].$

8. Then
$ℙ\left[{E}_{𝜖}\right]\le \underset{{m}_{0}\to \infty }{lim}\left[\frac{p\left(1-p\right)}{{𝜖}^{2}}\sum _{m={m}_{0}}^{\infty }\frac{1}{{m}^{2}}\right]=0.$

9. Let ${E}_{1∕k}$ be the special case of ${E}_{𝜖}$ with $𝜖=\frac{1}{k}$ for a positive integer $k$. Note that
${E}_{1∕k}=\left\{\omega :\underset{m}{limsup}\left|\frac{{S}_{{m}^{2}}}{{m}^{2}}-p\right|>\frac{1}{k}\right\}.$

Let $E=\underset{k}{lim}{E}_{1∕k}$ and then note that $ℙ\left[E\right]=\underset{k}{lim}ℙ\left[{E}_{1∕k}\right]=\underset{k}{lim}0=0$.
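The shrinking bound in step 8 can be seen numerically. This small sketch assumes the illustrative values $p=0.5$ and $𝜖=0.1$, and truncates the tail sum:

```python
# Numeric illustration of step 8 (assumed values p = 0.5, eps = 0.1):
# the quantities p(1-p)/eps^2 * sum_{m >= m0} 1/m^2 bounding P[E_{m0}]
# shrink toward 0 as m0 grows.
p, eps = 0.5, 0.1

def tail_bound(m0, terms=10**5):
    """Truncated tail sum bounding P[E_{m0}]."""
    s = sum(1.0 / (m * m) for m in range(m0, m0 + terms))
    return p * (1 - p) / eps**2 * s

bounds = [tail_bound(m0) for m0 in (10, 100, 1000)]
```

Since the tail of ${\sum }1∕{m}^{2}$ past ${m}_{0}$ is about $1∕{m}_{0}$, the bounds fall roughly tenfold at each step.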

Remark. At steps 6, 7, and 9 a limit is pulled through the probability measure. This is valid if, given a sequence of ﬁnite probability measures ${ℙ}_{n}\left[\cdot \right]$ deﬁned on a ﬁeld of ﬁnite type events ${\mathsc{ℱ}}_{0}$, there exists a well-deﬁned probability measure $ℙ\left[\cdot \right]$ deﬁned on $\mathsc{ℱ}$, the smallest $\sigma$-ﬁeld containing ${\mathsc{ℱ}}_{0}$, agreeing with ${ℙ}_{n}\left[\cdot \right]$ on ${\mathsc{ℱ}}_{0}$. That is true from the following theorem, with proof deferred to another section.

Theorem 3 (Kolmogorov Consistency Theorem).   Given a consistent family of ﬁnite-dimensional distributions ${ℙ}_{n}\left[\cdot \right]$ there exists a unique probability $ℙ\left[\cdot \right]$ on $\left(\Omega ,\Sigma \right)$ such that for every $n$, under the natural projection ${\pi }_{n}\left(\omega \right)=\left({x}_{1},{x}_{2},\dots ,{x}_{n}\right)$ the induced measure $ℙ{\pi }_{n}^{-1}$ is identical to ${ℙ}_{n}$ on ${ℝ}^{n}$.

##### Proof with the Borel-Cantelli Lemma

Lemma 4 (Borel-Cantelli).

The direct half:
For a sequence of events ${A}_{n}$, if
$\sum _{n=1}^{\infty }ℙ\left[{A}_{n}\right]<\infty$

then

$ℙ\left[\omega :\underset{n\to \infty }{lim}{\mathbf{1}}_{{A}_{n}}\left(\omega \right)= 0\right]= 1.$

The converse half:
If the events ${A}_{n}$ are mutually independent, the converse is also true.

Deﬁnition.   Let ${X}_{1},{X}_{2},{X}_{3},\dots ,$ be a sequence of random variables and ${A}_{1},{A}_{2},{A}_{3},\dots$ a sequence of Borel sets in $ℝ$. The event “${X}_{n}$ is in ${A}_{n}$ inﬁnitely often”, denoted $\left[{X}_{n}\in {A}_{n}\phantom{\rule{0.3em}{0ex}}\text{i.o.}\right]$, is deﬁned as ${\bigcap }_{n=1}^{\infty }{\bigcup }_{j=n}^{\infty }\left[{X}_{j}\in {A}_{j}\right]$.

Remark. Note that the complementary event to the event in the Borel-Cantelli Lemma is

$\left[\omega :\underset{n\to \infty }{limsup}{\mathbf{1}}_{{A}_{n}}\left(\omega \right)= 1\right].$

This is exactly the event ${\cap }_{n=1}^{\infty }{\cup }_{j=n}^{\infty }{A}_{j}$, that is, the event that inﬁnitely many of the events occur. Thus the direct half of the Borel-Cantelli Lemma can be expressed as: if ${\sum }_{n=1}^{\infty }ℙ\left[{A}_{n}\right]<\infty$, then $ℙ\left[{\cap }_{n=1}^{\infty }{\cup }_{j=n}^{\infty }{A}_{j}\right]= 0$.

The converse half of the Borel-Cantelli Lemma can be expressed as: if the events ${A}_{n}$ are mutually independent and ${\sum }_{n=1}^{\infty }ℙ\left[{A}_{n}\right]=\infty$, then $ℙ\left[{\cap }_{n=1}^{\infty }{\cup }_{j=n}^{\infty }{A}_{j}\right]= 1$.

Proof.

The direct half:
By monotonicity and countable subadditivity, for every $m\ge 1$

$ℙ\left[{\cap }_{n=1}^{\infty }{\cup }_{j=n}^{\infty }{A}_{j}\right]\le ℙ\left[{\cup }_{j=m}^{\infty }{A}_{j}\right]\le \sum _{j=m}^{\infty }ℙ\left[{A}_{j}\right].$

But obviously, ${\sum }_{n=1}^{\infty }ℙ\left[{A}_{n}\right]< \infty$ implies $\underset{m\to \infty }{lim}{\sum }_{j=m}^{\infty }ℙ\left[{A}_{j}\right]= 0$.

The converse half:
This will be proved in the form: if the events ${A}_{n}$ are mutually independent and ${\sum }_{n=1}^{\infty }ℙ\left[{A}_{n}\right]=\infty$, then $ℙ\left[{\cap }_{n=1}^{\infty }{\cup }_{j=n}^{\infty }{A}_{j}\right]= 1$.

The complement of the set is ${\cup }_{n=1}^{\infty }{\cap }_{j=n}^{\infty }{A}_{j}^{C}$. The goal is to show this is a negligible set. It is enough to show that each set ${\cap }_{j=n}^{\infty }{A}_{j}^{C}$ is negligible. Temporarily ﬁx $n$ and for each $m \ge n$, let ${B}_{m}$ be the event ${B}_{m}={\cap }_{j=n}^{m}{A}_{j}^{C}$, so that ${\cap }_{j=n}^{\infty }{A}_{j}^{C}\subset {B}_{m}$.

Because the events ${\left({A}_{n}\right)}_{n=1}^{\infty }$ are independent

$\begin{array}{c}ℙ\left[{B}_{m}\right]= ℙ\left[{\cap }_{j=n}^{m}{A}_{j}^{C}\right]=\underset{j=n}{\overset{m}{\prod }}ℙ\left[{A}_{j}^{C}\right]=\underset{j=n}{\overset{m}{\prod }}\left(1 - ℙ\left[{A}_{j}\right]\right)\\ \le \underset{j=n}{\overset{m}{\prod }}exp\left(-ℙ\left[{A}_{j}\right]\right)= exp\left(-\underset{j=n}{\overset{m}{\sum }}ℙ\left[{A}_{j}\right]\right).\end{array}$

Since ${\sum }_{j=n}^{\infty }ℙ\left[{A}_{j}\right]=\infty$, the last expression can be made arbitrarily small by choosing suﬃciently large $m$, so by deﬁnition, ${\cap }_{j=n}^{\infty }{A}_{j}^{C}$ is negligible. □
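The key inequality here is $1-x\le {\mathrm{e}}^{-x}$, applied factor by factor. A small numeric sketch, with an arbitrary list of probabilities chosen only for illustration:

```python
import math
import random

# Check of the key inequality in the converse half (illustration):
# since 1 - x <= exp(-x) for all x, any probabilities p_j satisfy
# prod_j (1 - p_j) <= exp(-sum_j p_j); when sum p_j diverges, the
# right side, and hence P[B_m], tends to 0.
random.seed(3)
ps = [random.random() for _ in range(50)]   # arbitrary probabilities

lhs = 1.0
for q in ps:
    lhs *= (1 - q)                          # prod (1 - p_j)
rhs = math.exp(-sum(ps))                    # exp(-sum p_j)
```

Here `lhs <= rhs` for any choice of the probabilities, which is exactly the step that drives $ℙ\left[{B}_{m}\right]$ to $0$.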

Theorem 5 (Strong Law of Large Numbers).   If ${X}_{1},{X}_{2},\dots ,{X}_{n}\dots$ is a sequence of independent identically distributed random variables with $𝔼\left[|{X}_{i}{|}^{4}\right]= C < \infty$, then

$\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}=\underset{n\to \infty }{lim}\frac{{X}_{1}+{X}_{2}+ \cdots +{X}_{n}}{n}= 𝔼\left[{X}_{1}\right]$

with probability $1$.

Proof. We can assume without loss of generality that $𝔼\left[{X}_{i}\right]= 0$, otherwise just consider ${Y}_{i}={X}_{i}- 𝔼\left[{X}_{i}\right]$. Expansion of ${\left({X}_{1}+ \cdots +{X}_{n}\right)}^{4}$, then using the independence and identical distribution assumptions shows

$𝔼\left[{S}_{n}^{4}\right]= n𝔼\left[{X}_{1}^{4}\right]+ 3n\left(n - 1\right)𝔼{\left[{X}_{1}^{2}\right]}^{2}\le nC + 3{n}^{2}{\sigma }^{4}.$

Then adapting the proof of the Markov and Chebyshev inequalities with fourth moments,

$\begin{array}{llll}\hfill ℙ\left[\left|\frac{{S}_{n}}{n}\right|\ge \delta \right]& = ℙ\left[|{S}_{n}{|}^{4}\ge {n}^{4}{\delta }^{4}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & ={\int }_{|{S}_{n}{|}^{4}\ge {n}^{4}{\delta }^{4}}dℙ\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & \le {\int }_{\Omega }\frac{|{S}_{n}{|}^{4}}{{n}^{4}{\delta }^{4}}dℙ\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{𝔼\left[|{S}_{n}{|}^{4}\right]}{{n}^{4}{\delta }^{4}}\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & \le \frac{nC + 3{n}^{2}{\sigma }^{4}}{{n}^{4}{\delta }^{4}}.\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

Then from the comparison theorem for series convergence

$\underset{n=1}{\overset{\infty }{\sum }}ℙ\left[\left|\frac{{S}_{n}}{n}\right|\ge \delta \right]< \infty ,$

and the direct half of the Borel-Cantelli Lemma applies to show that

$ℙ\left[\left|\frac{{S}_{n}}{n}\right|\ge \delta \phantom{\rule{0.3em}{0ex}}\text{inﬁnitely often}\right]= 0\phantom{\rule{1em}{0ex}}\text{for every}\phantom{\rule{0.5em}{0ex}}\delta > 0,$

or equivalently

$\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}= 0$

with probability $1$. □
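The fourth-moment expansion at the start of the proof can be verified exactly for a small case. This sketch assumes centered coin-flip steps ${X}_{i}=\pm \frac{1}{2}$ with $p=\frac{1}{2}$ and $n=6$, chosen only so that all $2^{n}$ outcomes can be enumerated:

```python
from itertools import product

# Exact check of the moment expansion used above (illustration): for
# centered coin-flip steps X_i = +/-1/2 with p = 1/2, enumerate all
# 2^n equally likely outcomes and compare E[S_n^4] against
# n*E[X^4] + 3n(n-1)*E[X^2]^2.
n = 6
EX2, EX4 = 1 / 4, 1 / 16            # moments of X = +/-1/2
predicted = n * EX4 + 3 * n * (n - 1) * EX2 ** 2

exact = sum(sum(signs) ** 4
            for signs in product((-0.5, 0.5), repeat=n)) / 2 ** n
```

Both quantities equal $6∕16+90∕16=6$, confirming the expansion term by term for this case.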

##### The Strong Law for Pairwise Independent Random Variables

Example. Bernstein’s example of pairwise independent random variables that are not mutually independent.

Consider ﬂipping a quarter and a dime. Let ${X}_{1}= 1$ if the quarter is heads, ${X}_{2}= 1$ if the dime is heads, and ${X}_{3}= 1$ if the quarter is the same as the dime. First, $ℙ\left[{X}_{i}= 1\right]=\frac{1}{2}$ for $i = 1, 2, 3$. Next note the following probabilities:

$\begin{array}{llll}\hfill ℙ\left[{X}_{1}= 1 \cap {X}_{2}= 1\right]& = ℙ\left[{X}_{1}= 1\right]ℙ\left[{X}_{2}= 1\right]=\frac{1}{4},\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill ℙ\left[{X}_{1}= 1 \cap {X}_{3}= 1\right]& = ℙ\left[{X}_{1}= 1\right]ℙ\left[{X}_{3}= 1\right]=\frac{1}{4},\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill ℙ\left[{X}_{2}= 1 \cap {X}_{3}= 1\right]& = ℙ\left[{X}_{2}= 1\right]ℙ\left[{X}_{3}= 1\right]=\frac{1}{4},\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

but

$\frac{1}{4}= ℙ\left[{X}_{1}= 1 \cap {X}_{2}= 1 \cap {X}_{3}= 1\right]\ne ℙ\left[{X}_{1}= 1\right]ℙ\left[{X}_{2}= 1\right]ℙ\left[{X}_{3}= 1\right]=\frac{1}{8}.$
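The example can be checked exhaustively, since there are only four equally likely (quarter, dime) outcomes:

```python
from itertools import product

# Exhaustive check of Bernstein's example: the four (quarter, dime)
# outcomes are equally likely; verify that pairwise independence
# holds but mutual independence fails.
outcomes = list(product((0, 1), repeat=2))          # (quarter, dime)

def prob(event):
    """Probability of an event over the 4 equally likely outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

X1 = lambda o: o[0]                                 # quarter heads
X2 = lambda o: o[1]                                 # dime heads
X3 = lambda o: 1 if o[0] == o[1] else 0             # quarter matches dime

pair12 = prob(lambda o: X1(o) == 1 and X2(o) == 1)  # each pair: 1/4
pair13 = prob(lambda o: X1(o) == 1 and X3(o) == 1)
pair23 = prob(lambda o: X2(o) == 1 and X3(o) == 1)
triple = prob(lambda o: X1(o) == 1 and X2(o) == 1 and X3(o) == 1)
```

Each pairwise probability is $\frac{1}{4}$, matching the product of the marginals, while the triple probability is $\frac{1}{4}\ne \frac{1}{8}$.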

Lemma 6. Let ${\left({X}_{n}\right)}_{n\ge 1}$ be a sequence of random variables. If $\underset{n=1}{\overset{\infty }{\sum }}ℙ\left[|{X}_{n}|> 𝜖\right]$ converges for all $𝜖 > 0$, then $\underset{n\to \infty }{lim}{X}_{n}= 0$ almost surely.

Proof.

1. Set ${A}_{n}= \left[|{X}_{n}|> 𝜖\right]$.
2. The direct half of the Borel-Cantelli Lemma implies that for every $𝜖 > 0$ there exists almost surely an ${n}_{0}\left(\omega ,𝜖\right)$ such that $|{X}_{n}\left(\omega \right)|\le 𝜖$ for every $n \ge {n}_{0}\left(\omega ,𝜖\right)$.
3. By considering a countable union of negligible events, we conclude that for every positive integer $m$ there exists almost surely ${n}_{0}\left(\omega ,1∕m\right)$ such that $|{X}_{n}\left(\omega \right)|\le 1∕m$ for all $n \ge {n}_{0}\left(\omega , 1∕m\right)$.
4. This implies that the sequence ${\left({X}_{n}\right)}_{n\ge 1}$ almost surely converges to $0$.

Corollary 1. Let ${\left({X}_{n}\right)}_{n\ge 1}$ be a sequence of random variables. If $\underset{n=1}{\overset{\infty }{\sum }}𝔼\left[|{X}_{n}|\right]$ converges, then the sequence ${\left({X}_{n}\right)}_{n\ge 1}$ converges almost surely to $0$.

Theorem 7 (Pairwise Independent Strong Law). Let ${\left({X}_{n}\right)}_{n\ge 1}$ be a sequence of pairwise independent random variables. Suppose that $𝔼\left[{X}_{n}\right]= 0$, and $\underset{n\ge 1}{sup}𝔼\left[{X}_{n}^{2}\right]< \infty$. Then for ${S}_{n}={\sum }_{i=1}^{n}{X}_{i}$, we have

$\underset{n\to \infty }{lim}ℙ\left[\left|\frac{{S}_{n}}{n}\right|\ge 𝜖\right]= 0$

for all $𝜖 > 0$. Moreover, $\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}= 0$ almost surely.

Proof. Let $M =\underset{n\ge 1}{sup}𝔼\left[{X}_{n}^{2}\right]$ and note that pairwise independence says

$𝔼\left[{X}_{i}{X}_{j}\right]= 𝔼\left[{X}_{i}\right]𝔼\left[{X}_{j}\right]= 0,\phantom{\rule{1em}{0ex}}i\ne j.$

Note that we have

$𝔼\left[{\left(\frac{{S}_{n}}{n}\right)}^{2}\right]=\frac{1}{{n}^{2}}\underset{i=1}{\overset{n}{\sum }}𝔼\left[{X}_{i}^{2}\right]\le \frac{M}{n}.$

The Chebyshev inequality gives us that

$ℙ\left[\left|\frac{{S}_{n}}{n}\right|\ge 𝜖\right]\le \frac{𝔼\left[{\left(\frac{{S}_{n}}{n}\right)}^{2}\right]}{{𝜖}^{2}}\le \frac{1}{{𝜖}^{2}}\frac{M}{n},$

and so $\underset{n\to \infty }{lim}ℙ\left[\left|\frac{{S}_{n}}{n}\right|\ge 𝜖\right]= 0$. Since ${\sum }_{n\ge 1}𝔼\left[{\left(\frac{{S}_{{n}^{2}}}{{n}^{2}}\right)}^{2}\right]\le {\sum }_{n\ge 1}\frac{M}{{n}^{2}}< \infty$, by Corollary 1 $\underset{n\to \infty }{lim}{\left(\frac{{S}_{{n}^{2}}}{{n}^{2}}\right)}^{2}= 0$ almost surely, and hence $\underset{n\to \infty }{lim}\frac{{S}_{{n}^{2}}}{{n}^{2}}= 0$ almost surely. Now let $m = ⌊\sqrt{n}⌋$; i.e., ${m}^{2}\le n <{\left(m + 1\right)}^{2}$. Then $\underset{n\to \infty }{lim}\frac{{S}_{{m}^{2}}}{n}= 0$ almost surely. This gives

$\begin{array}{llll}\hfill 𝔼\left[{\left(\frac{{S}_{n}}{n}-\frac{{S}_{{m}^{2}}}{n}\right)}^{2}\right]& =\frac{1}{{n}^{2}}𝔼\left[{\left(\underset{i={m}^{2}+1}{\overset{n}{\sum }}{X}_{i}\right)}^{2}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{1}{{n}^{2}}\underset{i={m}^{2}+1}{\overset{n}{\sum }}𝔼\left[{X}_{i}^{2}\right]\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & \le \frac{\left(n -{m}^{2}\right)M}{{n}^{2}}\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & \le \frac{2m + 1}{{n}^{2}}M\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\frac{2⌊\sqrt{n}⌋ + 1}{{n}^{2}}M\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & = O\left({n}^{-3∕2}\right).\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

Using Corollary 1 again, since ${\sum }_{n\ge 1}𝔼\left[{\left(\frac{{S}_{n}}{n}-\frac{{S}_{{m}^{2}}}{n}\right)}^{2}\right]<\infty$, it follows that $\frac{{S}_{n}}{n}-\frac{{S}_{{m}^{2}}}{n}\to 0$ almost surely. Combining this with $\underset{n\to \infty }{lim}\frac{{S}_{{m}^{2}}}{n}= 0$ almost surely gives $\underset{n\to \infty }{lim}\frac{{S}_{n}}{n}= 0$ almost surely. □

##### Applications of the Strong Law

The following proposition says that the asymptotic proportion of any outcome in an inﬁnite sequence of trials is “almost surely” the probability of that outcome for a single trial. This is sometimes referred to as the “Law of Averages”.

Proposition 8 (The Law of Averages).   Let ${\left({A}_{n}\right)}_{n=1}^{\infty }$ be a sequence of equiprobable independent random events with probability $p$. The asymptotic empirical probability that these events occur is almost surely $p$; that is,

$\underset{n\to \infty }{lim}\frac{1}{n}\left|\left\{k\phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}1 \le k \le n,\phantom{\rule{0.3em}{0ex}}\omega \in {A}_{k}\right\}\right|= p$

almost surely.

Proof. For each $\omega$ create a sequence $\rho ={\left({\rho }_{n}\right)}_{n=1}^{\infty }$ of $0$’s and $1$’s by setting ${\rho }_{n}= 1$ if $\omega \in {A}_{n}$ and ${\rho }_{n}= 0$ otherwise. The proof is to show

 $\underset{n\to \infty }{lim}\frac{1}{n}\underset{k=1}{\overset{n}{\sum }}{\rho }_{k}= ℙ\left[{\rho }_{i}= 1\right]= p$ (1)

almost surely.

For each $n$ and each $\left({𝜖}_{1},{𝜖}_{2},\dots ,{𝜖}_{n}\right) \in {\left\{0, 1\right\}}^{n}$ let $s ={\sum }_{k=1}^{n}{𝜖}_{k}$; then

$ℙ\left[{\rho }_{k}={𝜖}_{k}, 1 \le k \le n\right]={p}^{s}{\left(1 - p\right)}^{n-s}.$

Now the same proof used to show that $\frac{1}{n}{\sum }_{k=1}^{n}{\omega }_{k}$ almost surely converges to $p$ can be applied to complete the proof that the convergence of (1) is almost sure. □
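The Law of Averages is easy to observe in simulation. This sketch assumes the illustrative values $p=0.3$ and $n={10}^{6}$:

```python
import random

# Simulation of the Law of Averages (illustration; p, n, and the seed
# are assumed): the empirical proportion of occurrences of independent
# probability-p events approaches p.
random.seed(11)
p, n = 0.3, 10**6
hits = sum(1 for _ in range(n) if random.random() < p)
proportion = hits / n
# the standard deviation of the proportion here is about 0.00046,
# so the estimate lands very close to p
```

A single run already lands within a few thousandths of $p$, consistent with the almost sure convergence.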

#### Strong Law for Finite Type Events

Corollary 2. Let $A$ be a ﬁnite type event. For each integer $n \ge 1$ and each $\omega \in \Omega$, let $S\left(A,n,\omega \right)$ be the number of integers $k$ between $1$ and $n$ such that $\left({\omega }_{k},{\omega }_{k+1},{\omega }_{k+2},\dots \phantom{\rule{0.3em}{0ex}}\right) \in A$. Then

$\underset{n\to \infty }{lim}\frac{S\left(A,n,\omega \right)}{n}= ℙ\left[A\right]$

almost surely.

Proof. Since the event $A$ is of ﬁnite type, $A$ depends only on the coordinates with index less than or equal to $m$ for some positive integer $m$. Equivalently, there exists an ${A}^{\prime }\subset {\Omega }_{m}$ such that

$A =\left\{\omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}\left({\omega }_{1},{\omega }_{2},\dots ,{\omega }_{m}\right) \in {A}^{\prime }\right\}.$

For each integer $j$ from $1$ to $m$ consider the sequence ${\left({A}_{j,n}\right)}_{n\ge 0}$ of events deﬁned by

$\begin{array}{llll}\hfill {A}_{j,n}& =\left\{\omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}\left({\omega }_{j+nm},{\omega }_{j+nm+1},{\omega }_{j+nm+2},\dots \phantom{\rule{0.3em}{0ex}}\right) \in A\right\}\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\\ \hfill & =\left\{\omega \phantom{\rule{0.3em}{0ex}}:\phantom{\rule{0.3em}{0ex}}\left({\omega }_{j+nm},{\omega }_{j+nm+1},\dots ,{\omega }_{j+\left(n+1\right)m-1}\right) \in {A}^{\prime }\right\}.\phantom{\rule{2em}{0ex}}& \hfill & \phantom{\rule{2em}{0ex}}\end{array}$

For ﬁxed $j$ and varying $n$, the events ${A}_{j,n}$ are independent and each has probability $ℙ\left[A\right]$. Let $S\left(A,j,n,\omega \right)$ be the number of integers $k$ between $0$ and $n - 1$ such that $\omega \in {A}_{j,k}$. Proposition 8 implies that

$\underset{n\to \infty }{lim}\frac{S\left(A,j,n,\omega \right)}{n}= ℙ\left[A\right]$

almost surely. Also,

$\frac{1}{nm}S\left(A,nm,\omega \right) =\frac{1}{m}\underset{j=1}{\overset{m}{\sum }}\frac{S\left(A,j,n,\omega \right)}{n}.$

The corollary then follows from the inequality

$\frac{1}{\left(n + 1\right)m}S\left(A,nm,\omega \right) \le \frac{1}{nm + k}S\left(A,nm + k,\omega \right) \le \frac{1}{nm}S\left(A, \left(n + 1\right)m,\omega \right)$

which holds for every $k$ from $0$ to $m$. □
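Corollary 2 can be observed in simulation. This sketch assumes the finite type event $A=$ “the next two flips are both heads”, which depends on two coordinates and has probability ${p}^{2}$; the parameters are illustrative:

```python
import random

# Simulation of Corollary 2 (illustration with an assumed finite type
# event): A = "the next two flips are both heads" has probability p^2,
# so its overlapping occurrence frequency should approach p^2.
random.seed(5)
p, n = 0.5, 10**6
flips = [1 if random.random() < p else 0 for _ in range(n)]
count = sum(1 for k in range(n - 1) if flips[k] and flips[k + 1])
frequency = count / (n - 1)
```

For $p=0.5$ the observed frequency settles near ${p}^{2}=0.25$, even though the overlapping occurrences are not independent, exactly as the corollary's block argument handles.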

Deﬁnition. If $b = \left({b}_{1},{b}_{2},\dots ,{b}_{j}\right)$ is a word constructed from the alphabet $\left\{0, 1\right\}$ and $s ={\sum }_{i=1}^{j}{b}_{i}$ we say that ${p}^{s}{q}^{j-s}$ is the probability of the word $b$.

Corollary 3. In the sequence $\omega$, every word $b$ almost surely occurs with asymptotic frequency equal to its probability.

Proof. This is a special case of Corollary 2. □
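Corollary 3 also lends itself to simulation. This sketch assumes the word $b=\left(1,0,1\right)$, which has $s=2$ ones and hence probability ${p}^{2}q$; the parameters are illustrative:

```python
import random

# Simulation of Corollary 3 (illustration with an assumed word): the
# word b = (1, 0, 1) has s = 2 ones, so its probability is
# p^s q^(3-s) = p^2 q; its occurrence frequency should approach that.
random.seed(13)
p = 0.5
q = 1 - p
n = 10**6
flips = [1 if random.random() < p else 0 for _ in range(n)]
count = sum(1 for k in range(n - 2)
            if (flips[k], flips[k + 1], flips[k + 2]) == (1, 0, 1))
frequency = count / (n - 2)
```

For $p=0.5$ the observed frequency settles near ${p}^{2}q=0.125$.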

#### Sources

This proof of the Strong Law for sums of Bernoulli random variables is adapted from: Heads or Tails, by Emmanuel Lesigne, Student Mathematical Library Volume 28, American Mathematical Society, Providence, 2005, Chapter 11.2, pages 82-85. The proof based on subsequential limits and the Borel-Cantelli Lemma and the following remarks are adapted from Probability by Leo Breiman, Addison-Wesley, Reading MA, 1968, Chapters 1 and 3. The proof using the fourth moment and the Borel-Cantelli Lemma is adapted from Varadhan [3].

_______________________________________________________________________________________________ ### Algorithms, Scripts, Simulations

#### Algorithm

Comment Post: Observation that any coin flip sequence selected by a pseudo-random-number generator has the average number of heads converge to the probability of heads, suggesting the Strong Law of Large Numbers.
Comment Post: The average number of heads sampled at an exponential sequence of indices, to suggest the Strong Law of Large Numbers.
1. Set probability of success $p$.
2. Set length of coin-flip sequence $n$.
3. Initialize and fill length $n$ array of coin flips.
4. Use vectorization to sum the cumulative sequence ${S}_{n}$ of heads.
5. Use vectorization to compute an exponential sequence of indices for sampling.
6. Use vectorization to compute the average number of heads along the exponential sequence of indices.
7. Print the averages.

#### Scripts

R
p <- 0.5
n <- 1e+6
coinFlips <- (runif(n) <= p)
S <- cumsum(coinFlips)

tenpows <- 10^(1:6)
cat(S[tenpows]/tenpows, "\n")
Octave
p = 0.5;
n = 1e+6;
coinFlips = rand(1, n) <= p;
S = cumsum(coinFlips);

tenpows = 10 .^ (1:6);
S(tenpows) ./ tenpows
Perl
use PDL;

$p = 0.5;
$n = 1e+6;
$coinFlips = random($n) <= $p;
$S = cumusumover($coinFlips);

$pows = zeros(6)->xlinvals( 1, 6 );
$tenpows = 10**$pows;

print $S($tenpows - 1) / $tenpows, "\n";
Python
import numpy as np

p = 0.5
n = 10**6
coinFlips = np.random.random(n) <= p
S = np.cumsum(coinFlips)

pows = np.arange(1, 6 + 1)    # 6 entries
tenpows = 10**pows

print(S[tenpows - 1] / tenpows)

__________________________________________________________________________ ### Problems to Work for Understanding

__________________________________________________________________________ ### References

   Leo Breiman. Probability. Addison Wesley, 1968.

   Emmanuel Lesigne. Heads or Tails: An Introduction to Limit Theorems in Probability, volume 28 of Student Mathematical Library. American Mathematical Society, 2005.

   S. R. S. Varadhan. Probability Theory. Number 7 in Courant Lecture Notes. American Mathematical Society, 2001.

__________________________________________________________________________ __________________________________________________________________________

I check all the information on each page for correctness and typographical errors. Nevertheless, some errors may occur and I would be grateful if you would alert me to such errors. I make every reasonable eﬀort to present current and accurate information for public use, however I do not guarantee the accuracy or timeliness of information on this website. Your use of the information from this website is strictly voluntary and at your risk.

I have checked the links to external sites for usefulness. Links to external websites are provided as a convenience. I do not endorse, control, monitor, or guarantee the information contained in any external website. I don’t guarantee that the links are active at all times. Use the links here with the same caution as you would all information on the Internet. This website reﬂects the thoughts, interests and opinions of its author. They do not explicitly represent oﬃcial positions or policies of my employer.

Information on this website is subject to change without notice.