
## The Law of the Iterated Logarithm

### Key Concepts

1. The law of the iterated logarithm tells very precisely how far the fortune in a fair coin-tossing game makes excursions from the starting value.

### Vocabulary

1. The limsup, abbreviation for limit superior, is a refined and generalized notion of limit: it is the largest of all dependent-variable subsequence limits. That is, among all subsequences of independent-variable values tending to some independent-variable value, usually infinity, each has a corresponding sequence of dependent-variable values. Some of these dependent-variable sequences have limits, and among all these limits, the largest is the limsup.
2. The liminf, abbreviation for limit inferior, is analogously the least of all dependent-variable subsequence limits.

### Mathematical Ideas

This section is adapted from: W. Feller, An Introduction to Probability Theory and Its Applications, Volume I, Chapter III and Chapter VIII, and also E. Lesigne, Heads or Tails: An Introduction to Limit Theorems in Probability, Chapter 12, American Mathematical Society, Student Mathematical Library, Volume 28, 2005.

We again consider the net fortune in the fair coin-tossing game. That is, we consider the sum $S_n$ where the independent, identically distributed random variables in the sum $S_n = X_1 + \cdots + X_n$ are the Bernoulli random variables $X_i = +1$ with probability $p = 1/2$ and $X_i = -1$ with probability $q = 1 - p = 1/2$. Note that the mean is $\mu = 0$ and the variance is $\sigma^2 = 1$ for each of the summands $X_i$.

The assumption of fairness, p = 1/2 = q simplifies the statements and the proofs, but does not materially alter the truth of the following theorems, after making appropriate adjustments to the mean and the variance.
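This fortune process is easy to simulate. The minimal sketch below (plain Python; the helper name `fortune_path` is illustrative) generates one sample path of the sums $S_1, S_2, \ldots, S_n$:

```python
import random

def fortune_path(n, seed=None):
    """Simulate n steps of the fair coin-tossing fortune:
    each step is +1 or -1 with probability 1/2 each, and
    the running sum S_k is recorded at every step."""
    rng = random.Random(seed)
    s = 0
    path = []
    for _ in range(n):
        s += rng.choice((1, -1))
        path.append(s)
    return path

path = fortune_path(1000, seed=42)
print("net fortune after 1000 flips:", path[-1])
```

Consecutive values of the path differ by exactly 1, reflecting the $\pm 1$ Bernoulli steps.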

#### Some Review and Background

The Strong Law of Large Numbers says that

$$\lim_{n \to \infty} \frac{S_n}{n} = 0$$

with probability 1 in the sample space of all possible coin flips. In a way, this says the scaling $n$ is “too strong”: it “condenses out” all variation in the sum $S_n$.
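A quick simulation illustrates this condensing: the ratio $S_n / n$ shrinks toward 0 as $n$ grows. A minimal sketch (the helper name `sn_over_n` is an assumption):

```python
import random

def sn_over_n(n, seed):
    """Simulate S_n for a fair coin-tossing game and return S_n / n."""
    rng = random.Random(seed)
    s = sum(rng.choice((1, -1)) for _ in range(n))
    return s / n

for n in (100, 10_000, 1_000_000):
    print(n, sn_over_n(n, seed=7))
```

Typical output shrinks roughly like $1/\sqrt{n}$, since $S_n$ itself is typically on the order of $\sqrt{n}$.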

The Central Limit Theorem applied to this special sequence of coin flips says

$$\lim_{n \to \infty} P\!\left( \frac{S_n}{\sqrt{n}} \le x \right) = P(Z \le x)$$

where $Z$ is a $N(0,1)$ normal random variable and the limit is interpreted as convergence in distribution. In fact, this implies that for large $n$ about 68% of the points in the sample space of all possible coin flips satisfy

$$-\sqrt{n} \le S_n \le \sqrt{n}$$

and about 95% of the points in the sample space of all possible coin flips satisfy

$$-2\sqrt{n} \le S_n \le 2\sqrt{n}.$$

In a way, this says the scaling $\sqrt{n}$ is “too weak”: it doesn’t condense out enough information.

These two results together suggest that “somewhere in between” the scalings $n$ and $\sqrt{n}$ we might be able to make stronger statements about convergence and the variation in the sequence $S_n$.

In fact, Hausdorff’s estimate tells us:

$$\frac{S_n}{n^{1/2 + \epsilon}} \to 0$$

with probability 1 in the sample space of all possible coin flips, for all values of $\epsilon > 0$. In a way, this says the scaling $n^{1/2+\epsilon}$ is “almost too strong”: it condenses out almost too much information.

Even better, Hardy and Littlewood’s estimate tells us:

$$S_n = O\!\left(\sqrt{n \log n}\right)$$

with probability 1 in the sample space of all possible coin flips. In a way, this says the scaling $(n \log n)^{1/2}$ is “still a little too strong”: it condenses out a little too much information.

Khintchine’s Law of the Iterated Logarithm is “just right.” It tells us very precisely how the sums vary with n. Using a method due to Erdös, it is possible to refine the law even more, but this is probably past the point of diminishing returns.
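To see how the three scalings compare numerically, the following sketch tabulates $n^{1/2+\epsilon}$ (with the illustrative choice $\epsilon = 0.01$), $\sqrt{n \log n}$, and the iterated-logarithm scaling $\sqrt{2 n \log \log n}$ that appears in Khintchine’s law:

```python
import math

for n in (1e4, 1e8, 1e16):
    hausdorff = n ** 0.51                              # n^(1/2 + eps), eps = 0.01
    hardy_lw = math.sqrt(n * math.log(n))              # sqrt(n log n)
    lil = math.sqrt(2 * n * math.log(math.log(n)))     # sqrt(2 n log log n)
    print(f"n = {n:.0e}:  {hausdorff:.3e}  {hardy_lw:.3e}  {lil:.3e}")
```

For these moderate values of $n$ the iterated-logarithm scaling sits below the Hardy–Littlewood scaling, as the theorems suggest; note that for any fixed $\epsilon > 0$ the power $n^{1/2+\epsilon}$ eventually overtakes both, but only at astronomically large $n$.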

Like the Central Limit Theorem, the Law of the Iterated Logarithm illustrates in an astonishingly precise way that even completely random sequences obey precise mathematical laws.

#### The Law of the Iterated Logarithm

Khintchine’s Law of the Iterated Logarithm says that, for any $\epsilon > 0$:

There exist infinitely many $n$ such that

$$S_n > (1 - \epsilon)\sqrt{2 \log \log n} \cdot \sqrt{n}$$

and furthermore, for every $n$ larger than a threshold value $N$,

$$S_n < (1 + \epsilon)\sqrt{2 \log \log n} \cdot \sqrt{n}.$$

I have written these in a slightly non-standard way, with the factor $\sqrt{2 \log \log n}$ appearing as the multiplier of the standard deviation $\sqrt{n}$ from the Central Limit Theorem, to emphasize the similarity to and the difference from the Central Limit Theorem.

Theorem 1 (Law of the Iterated Logarithm) With probability 1 we have:

$$\limsup_{n \to \infty} \frac{S_n}{\sqrt{2 n \log \log n}} = 1.$$

This means that, with probability 1, for any $\epsilon > 0$ only finitely many of the events

$$S_n > (1 + \epsilon)\sqrt{2 n \log \log n}$$

occur; on the other hand, with probability 1,

$$S_n > (1 - \epsilon)\sqrt{2 n \log \log n}$$

occurs for infinitely many $n$.

For reasons of symmetry, for any $\epsilon > 0$, the inequality

$$S_n < -(1 + \epsilon)\sqrt{2 n \log \log n}$$

can only occur for finitely many $n$; while

$$S_n < -(1 - \epsilon)\sqrt{2 n \log \log n}$$

must occur for infinitely many $n$. That is,

$$\liminf_{n \to \infty} \frac{S_n}{\sqrt{2 n \log \log n}} = -1$$

with probability 1.
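A quick numerical companion to Theorem 1 tracks the running ratio $S_n / \sqrt{2 n \log \log n}$ along one simulated path. For any finite simulation the observed maximum of this ratio is typically below, but on the order of, 1; the convergence in the limsup is very slow, so no simulation pins down the limit. A sketch (the helper name `lil_ratio_max` is an assumption):

```python
import math
import random

def lil_ratio_max(n_max, seed):
    """Simulate one fair coin-tossing path and return the maximum of
    S_n / sqrt(2 n log log n) over 10 <= n <= n_max."""
    rng = random.Random(seed)
    s = 0
    best = float("-inf")
    for n in range(1, n_max + 1):
        s += rng.choice((1, -1))
        if n >= 10:  # log log n needs n > e; start safely at 10
            best = max(best, s / math.sqrt(2 * n * math.log(math.log(n))))
    return best

print(lil_ratio_max(100_000, seed=3))
```

The theorem guarantees that, run forever, this running maximum would exceed $1 - \epsilon$ infinitely often but exceed $1 + \epsilon$ only finitely often.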

Compare the Law of the Iterated Logarithm to the Central Limit Theorem. The Central Limit Theorem says that $S_n/\sqrt{n}$ is approximately distributed as a $N(0, 1)$ random variable for large $n$. Therefore, for a large but fixed $n$, there is probability about 1/6 that the value of $S_n/\sqrt{n}$ exceeds the standard deviation 1, that is, $S_n > \sqrt{n}$. For a fixed but large $n$, with probability about 0.025, $S_n/\sqrt{n}$ exceeds twice the standard deviation, that is, $S_n > 2\sqrt{n}$. The Law of the Iterated Logarithm tells us the more precise information that sooner or later the fortune will exceed

$$(1 - \epsilon)\sqrt{2 \log \log n} \cdot \sqrt{n}$$

for any $\epsilon > 0$, and will do so infinitely often! The Law of the Iterated Logarithm does not tell us how long we will have to wait between such repeated crossings, however, and the wait can be very, very long indeed, although it must (with probability 1) eventually occur again.

Moreover, the Law of the Iterated Logarithm tells us in addition that

$$S_n < -(1 - \epsilon)\sqrt{2 n \log \log n}$$

must occur for infinitely many $n$. This means that the fortune must (with probability 1) oscillate back and forth across the net-zero axis infinitely often, crossing the upper and lower boundaries

$$\pm (1 - \epsilon)\sqrt{2 n \log \log n}.$$

The statement puts some strength behind an understanding of the long-term swings back and forth in value of a random process. It also implies a form of recurrence; that is, a random walk must visit every integer value. If a market price behaves like a random walk, then it too can be expected (with certainty, but perhaps with long periods between the swings) to go far above, and far below, the starting price.

The law of the iterated logarithm for Bernoulli trials stated here is a very special case of a more general theorem first formulated by Kolmogorov in 1929. It is also possible to formulate even stronger and more general theorems!

#### Illustration

Here are 2000 steps of a random walk with $p = q = 1/2$. In this figure, the random walk must return to cross the blue curves infinitely many times, but may cross the red curve only finitely many times.
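The boundary curves in such a figure can be computed directly. A minimal sketch, assuming the blue (inner) curve is $(1 - \epsilon)\sqrt{2 n \log \log n}$ and the red (outer) curve is $(1 + \epsilon)\sqrt{2 n \log \log n}$, with the illustrative choice $\epsilon = 0.1$ and the hypothetical helper name `lil_boundary`:

```python
import math

def lil_boundary(n, eps=0.1):
    """Return the inner (1 - eps) and outer (1 + eps) multiples of
    sqrt(2 n log log n), the LIL boundary curves at step n."""
    base = math.sqrt(2 * n * math.log(math.log(n)))
    return (1 - eps) * base, (1 + eps) * base

for n in (100, 500, 1000, 2000):
    inner, outer = lil_boundary(n)
    print(n, round(inner, 1), round(outer, 1))
```

Reflecting these values across the axis gives the lower pair of boundary curves.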

### Problems to Work for Understanding

1. The “multiplier of the standard deviation” function $\sqrt{2 \log \log n}$ grows very slowly. To understand how very slowly, calculate a table with $n = 10^j$ and $\sqrt{2 \log \log (10^j)}$ for $j = 10, 20, 30, \ldots, 100$. (Remember, in mathematical work above calculus, $\log(x)$ is the natural logarithm, base $e$, often written $\ln(x)$ in calculus and below, to distinguish it from the “common” or base-10 logarithm. Be careful: some software and technology cannot properly calculate with magnitudes this large.)
2. Consider the sequence $a_n$ for $n = 1, 2, 3, \ldots$. Here $\lfloor x \rfloor$ is the “floor function,” the greatest integer less than or equal to $x$, so $\lfloor 1 \rfloor = 1$, $\lfloor 3/2 \rfloor = 1$, $\lfloor 8/3 \rfloor = 2$, $\lfloor -3/2 \rfloor = -2$, etc. Find
$$\limsup_{n \to \infty} a_n$$
and
$$\liminf_{n \to \infty} a_n.$$
Does the sequence $a_n$ have a limit?
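For the first problem, the table can be computed without ever forming $10^j$ itself by using the identity $\log \log (10^j) = \log(j \log 10)$, which sidesteps the magnitude warning in the problem. A sketch:

```python
import math

# Tabulate sqrt(2 log log n) for n = 10^j, j = 10, 20, ..., 100,
# using log(log(10^j)) = log(j * log(10)) so the huge number
# 10^j is never actually formed.
print(f"{'j':>4} {'sqrt(2 log log 10^j)':>22}")
for j in range(10, 101, 10):
    loglog = math.log(j * math.log(10))
    print(f"{j:>4} {math.sqrt(2 * loglog):>22.4f}")
```

The values barely move across ninety orders of magnitude of $n$, which is the point of the exercise.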