

The Law of the Iterated Logarithm


Key Concepts

  1. The Law of the Iterated Logarithm tells us very precisely how far the fortune in a fair coin-tossing game makes excursions from its beginning value.


Vocabulary

  1. The limsup, an abbreviation for limit superior, is a refined and generalized notion of limit, being the largest dependent-variable subsequence limit. That is, among all subsequences of independent-variable values tending to some independent-variable value, usually infinity, there will be corresponding dependent-variable subsequences. Some of these dependent-variable sequences will have limits, and among all of these, the largest is the limsup.
  2. The liminf, an abbreviation for limit inferior, is the analogous least of all dependent-variable subsequence limits.


Mathematical Ideas

This section is adapted from: W. Feller, An Introduction to Probability Theory and Its Applications, Volume I, Chapter III and Chapter VIII, and also E. Lesigne, Heads or Tails: An Introduction to Limit Theorems in Probability, Chapter 12, American Mathematical Society, Student Mathematical Library, Volume 28, 2005.

We again consider the net fortune in the fair coin-tossing game. That is, we consider the sum $S_n = X_1 + \cdots + X_n$ where the summands are independent, identically distributed Bernoulli random variables with $X_i = +1$ with probability $p = 1/2$ and $X_i = -1$ with probability $q = 1 - p = 1/2$. Note that the mean is $\mu = 0$ and the variance is $\sigma^2 = 1$ for each of the summands $X_i$.

The assumption of fairness, p = 1/2 = q simplifies the statements and the proofs, but does not materially alter the truth of the following theorems, after making appropriate adjustments to the mean and the variance.
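
To make the setup concrete, here is a minimal Python sketch (assuming numpy is available; the seed and run length are arbitrary choices) that simulates the fortune $S_n$ in the fair coin-tossing game.

    import numpy as np

    rng = np.random.default_rng(seed=1)   # arbitrary seed, for reproducibility

    n = 2000                              # arbitrary number of flips
    X = rng.choice([1, -1], size=n)       # X_i = +1 or -1, each with probability 1/2
    S = np.cumsum(X)                      # S_k = X_1 + ... + X_k, the running fortune

    print("S_n =", S[-1])                 # net fortune after n flips

The simulations sketched below all build on this same recipe: generate the ±1 steps, then take cumulative sums.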

Some Review and Background

The Strong Law of Large Numbers says that

$$\lim_{n \to \infty} \frac{S_n}{n} = 0$$

with probability 1 in the sample space of all possible coin flips. In a way, this says n is “too strong”, it “condenses out” all variation in the sum Sn.

The Central Limit Theorem applied to this special sequence of coin flips says

$$\lim_{n \to \infty} \frac{S_n}{n^{1/2}} = Z$$

where $Z \sim N(0,1)$ is a standard normal random variable and the limit is interpreted as convergence in distribution. In fact, this implies that for large n about 68% of the points in the sample space of all possible coin flips satisfy

$$\left| \frac{S_n}{n^{1/2}} \right| < 1$$

and about 95% of the points in the sample space of all possible coin flips satisfy

$$\left| \frac{S_n}{n^{1/2}} \right| < 2.$$

In a way, this says $\sqrt{n}$ is “too weak”, it doesn’t condense out enough information.

These two together suggest that “somewhere in between $n$ and $\sqrt{n}$” we might be able to make stronger statements about convergence and the variation in the sequence $S_n$.
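
These percentages are easy to check by simulation. The following Monte Carlo sketch (assuming numpy; the seed, the number of flips, and the number of trials are all arbitrary) estimates the fraction of sample points satisfying each inequality.

    import numpy as np

    rng = np.random.default_rng(seed=2)              # arbitrary seed

    n, trials = 1_000, 5_000                         # arbitrary sizes
    flips = rng.choice([1, -1], size=(trials, n))    # each row: one point in the sample space
    Z = flips.sum(axis=1) / np.sqrt(n)               # the CLT normalization S_n / n^(1/2)

    print(np.mean(np.abs(Z) < 1))                    # about 0.68
    print(np.mean(np.abs(Z) < 2))                    # about 0.95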

In fact, Hausdorff’s estimate tells us:

$$\lim_{n \to \infty} \left| \frac{S_n}{n^{1/2+\epsilon}} \right| < \text{constant}$$

with probability 1 in the sample space of all possible coin flips for all values of $\epsilon > 0$. In a way, this says $n^{1/2+\epsilon}$ is “almost too strong”, it condenses out almost too much information.

Even better, Hardy and Littlewood’s estimate tells us:

$$\lim_{n \to \infty} \left| \frac{S_n}{\sqrt{n \log n}} \right| < \text{constant}$$

with probability 1 in the sample space of all possible coin flips. In a way, this says $\sqrt{n \log n}$ is “still a little too strong”, it still condenses out a little too much information.

Khintchine’s Law of the Iterated Logarithm is “just right.” It tells us very precisely how the sums vary with n. Using a method due to Erdös, it is possible to refine the law even more, but this is probably past the point of diminishing returns.
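
To get a feel for this progression of normalizations, the short sketch below (the choice $\epsilon = 0.05$ is arbitrary) evaluates $n^{1/2+\epsilon}$, $\sqrt{n \log n}$, and $\sqrt{2n \log\log n}$ at a few magnitudes of $n$; for large $n$ the iterated-logarithm normalization is the smallest of the three.

    import numpy as np

    eps = 0.05                                       # arbitrary small epsilon
    for n in [1e2, 1e4, 1e8, 1e16]:
        print(f"n = {n:8.0e}:",
              f"n^(1/2+eps) = {n**(0.5 + eps):10.3e},",
              f"sqrt(n log n) = {np.sqrt(n * np.log(n)):10.3e},",
              f"sqrt(2n loglog n) = {np.sqrt(2 * n * np.log(np.log(n))):10.3e}")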

Like the Central Limit Theorem, the Law of the Iterated Logarithm illustrates in an astonishingly precise way that even completely random sequences obey precise mathematical laws.

The Law of the Iterated Logarithm

Khintchine’s Law of the Iterated Logarithm says that:

There exist infinitely many n such that

$$S_n > (1 - \epsilon) \sqrt{n} \sqrt{2 \log(\log(n))}$$
and furthermore for every n larger than a threshold value N
$$S_n < (1 + \epsilon) \sqrt{n} \sqrt{2 \log(\log(n))}.$$

I have written these in a slightly non-standard way, with the multiplier $\sqrt{2 \log \log n}$ of the standard deviation $\sqrt{n}$ from the Central Limit Theorem, to emphasize the similarity to, and the difference from, the Central Limit Theorem.

Theorem 1 (Law of the Iterated Logarithm) With probability 1 we have:

$$\limsup_{n \to \infty} \frac{S_n}{\sqrt{2n \log \log n}} = 1$$

This means that with probability 1, for any $\epsilon > 0$, only finitely many of the events

$$S_n > (1 + \epsilon) \sqrt{n} \sqrt{2 \log(\log(n))}$$

occur; on the other hand, with probability 1,

$$S_n > (1 - \epsilon) \sqrt{n} \sqrt{2 \log(\log(n))}$$

occurs for infinitely many n.

For reasons of symmetry, for $\epsilon > 0$, the inequality

$$S_n < -(1 + \epsilon) \sqrt{n} \sqrt{2 \log(\log(n))}$$

can only occur for finitely many n; while

$$S_n < -(1 - \epsilon) \sqrt{n} \sqrt{2 \log(\log(n))}$$

must occur for infinitely many n. That is,

$$\liminf_{n \to \infty} \frac{S_n}{\sqrt{2n \log(\log(n))}} = -1$$

with probability 1.

Compare the Law of the Iterated Logarithm to the Central Limit Theorem. The Central Limit Theorem says that $S_n/\sqrt{n}$ is approximately distributed as a $N(0,1)$ random variable for large n. Therefore, for a large but fixed n, there is probability about 1/6 that the value of $S_n/\sqrt{n}$ exceeds the standard deviation 1, that is, $S_n > \sqrt{n}$. For a fixed but large n, with probability about 0.025, $S_n/\sqrt{n}$ exceeds twice the standard deviation, that is, $S_n > 2\sqrt{n}$. The Law of the Iterated Logarithm tells us the more precise information that sooner or later the fortune will satisfy

$$S_n > (1 - \epsilon) \sqrt{2n \log(\log(n))}$$
for any $\epsilon > 0$, and will do so infinitely often! The Law of the Iterated Logarithm does not tell us how long we will have to wait between such repeated crossings, however, and the wait can be very, very long indeed, although it must (with probability 1) eventually occur again.

Moreover, the Law of the Iterated Logarithm tells us in addition that

$$S_n < -(1 - \epsilon) \sqrt{2n \log(\log(n))}$$
must occur for infinitely many n. This means that the fortune must (with probability 1) oscillate back and forth across the net zero axis infinitely often, crossing the upper and lower boundaries:
$$\pm (1 - \epsilon) \sqrt{2n \log(\log(n))}.$$

The statement puts some strength behind an understanding of the long-term swings back and forth in value of a random process. It also implies a form of recurrence, that is, that a random walk must visit every integer value. If a market price behaves like a random walk, then it too can be expected (with certainty, but perhaps with long periods between the swings) to go far above, and far below, the starting price.
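
The oscillation and recurrence can be watched in simulation. The sketch below (assuming numpy; the seed, run length, and $\epsilon$ are arbitrary) counts returns to the origin and the time the walk spends beyond the $(1-\epsilon)$ boundaries. A single finite run can only hint at “infinitely often”, of course.

    import numpy as np

    rng = np.random.default_rng(seed=3)              # arbitrary seed

    n = 10**6                                        # arbitrary run length
    S = np.cumsum(rng.choice([1, -1], size=n))
    k = np.arange(1, n + 1)

    eps = 0.1                                        # arbitrary epsilon
    safe_k = np.maximum(k, 3)                        # keep log(log(k)) defined at the start
    bound = (1 - eps) * np.sqrt(2 * safe_k * np.log(np.log(safe_k)))

    print("returns to the origin:", np.sum(S == 0))
    late = k >= 1000                                 # ignore trivial early crossings
    print("late steps above +bound:", np.sum((S > bound) & late))
    print("late steps below -bound:", np.sum((S < -bound) & late))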

The law of the iterated logarithm for Bernoulli trials stated here is a very special case of a more general theorem first formulated by Kolmogorov in 1929. It is also possible to formulate even stronger and more general theorems!

Illustration

[Figure: Law of the Iterated Logarithm]

Here are 2000 steps of a random walk with $p = q = 1/2$. In this figure, the random walk must return again to cross the blue curves with $(1 - \epsilon) = 0.9$ infinitely many times, but may only cross the red curves with $(1 + \epsilon) = 1.1$ finitely many times.
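
A figure along these lines can be reproduced with the following sketch (assuming numpy and matplotlib; the seed is arbitrary, so each run draws a different walk).

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(seed=4)              # arbitrary seed

    n = 2000
    k = np.arange(1, n + 1)
    S = np.cumsum(rng.choice([1, -1], size=n))

    safe_k = np.maximum(k, 3)                        # keep log(log(k)) defined at the start
    lil = np.sqrt(2 * safe_k * np.log(np.log(safe_k)))

    plt.plot(k, S, color="black", linewidth=0.8)
    plt.plot(k, 0.9 * lil, color="blue")             # (1 - eps) curves: crossed infinitely often
    plt.plot(k, -0.9 * lil, color="blue")
    plt.plot(k, 1.1 * lil, color="red")              # (1 + eps) curves: crossed finitely often
    plt.plot(k, -1.1 * lil, color="red")
    plt.xlabel("n")
    plt.ylabel("S_n")
    plt.show()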


Problems to Work for Understanding

  1. The “multiplier of the standard deviation $\sqrt{n}$” function $\sqrt{2 \log(\log(n))}$ grows very slowly. To understand how very slowly, calculate a table with $n = 10^j$ and $\sqrt{2 \log(\log(n))}$ for $j = 10, 20, 30, \ldots, 100$. (Remember, in mathematical work above calculus, $\log(x)$ is the natural logarithm, base $e$, often written $\ln(x)$ in calculus and below, to distinguish it from the “common” or base-10 logarithm. Be careful, some software and technology cannot properly calculate with magnitudes this large.) A computational sketch appears after this problem list.

    Solution

  2. Consider the sequence
     $$a_n = (-1)^{\lfloor n/2 \rfloor} + \frac{(-1)^n}{n}$$
     for $n = 1, 2, 3, \ldots$. Here $\lfloor x \rfloor$ is the “floor function”, the greatest integer less than or equal to $x$, so $\lfloor 1 \rfloor = 1$, $\lfloor 3/2 \rfloor = 1$, $\lfloor 8/3 \rfloor = 2$, $\lfloor -3/2 \rfloor = -2$, etc. Find
     $$\limsup_{n \to \infty} a_n$$
     and
     $$\liminf_{n \to \infty} a_n.$$
     Does the sequence $a_n$ have a limit?

    Solution
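
As promised after Problem 1, here is one possible computational sketch (plain Python; working with $\log n = j \log 10$ directly avoids ever forming the astronomically large number $n = 10^j$).

    import math

    # Table of the multiplier sqrt(2 log(log(n))) at n = 10^j, j = 10, 20, ..., 100.
    for j in range(10, 101, 10):
        log_n = j * math.log(10.0)                   # natural logarithm of n = 10^j
        value = math.sqrt(2.0 * math.log(log_n))     # sqrt(2 log(log(n)))
        print(f"n = 10^{j:<3d}   sqrt(2 log log n) = {value:.4f}")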


Reading Suggestion:

  1. Chapter VIII.5, An Introduction to Probability Theory and Its Applications, Volume I, Second Edition, by W. Feller, J. Wiley and Sons, pages 191-195.
  2. E. Lesigne, Heads or Tails: An Introduction to Limit Theorems in Probability, Chapter 12, American Mathematical Society, Student Mathematical Library, Volume 28, 2005.

