This section is adapted from: W. Feller, An Introduction to Probability Theory and Its Applications, Volume I, Chapter III and Chapter VIII, and also E. Lesigne, Heads or Tails: An Introduction to Limit Theorems in Probability, Chapter 12, American Mathematical Society, Student Mathematical Library, Volume 28, 2005.
We again consider the net fortune in the fair coin-tossing game. That is, we consider the sum S_{n} = X_{1} + ⋯ + X_{n}, where the summands are independent, identically distributed Bernoulli random variables with X_{i} = +1 with probability p = 1/2 and X_{i} = -1 with probability q = 1 - p = 1/2. Note that the mean is μ = 0 and the variance is σ^{2} = 1 for each of the summands X_{i}.
The assumption of fairness, p = 1/2 = q, simplifies the statements and the proofs, but does not materially alter the truth of the following theorems, after making appropriate adjustments to the mean and the variance.
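As a concrete illustration, the walk can be simulated directly. The following is a minimal Python sketch, not part of the original text; the function name coin_walk and the fixed seed are choices made here for reproducibility.

```python
import random

def coin_walk(n, seed=0):
    """Return the partial sums S_1, ..., S_n of n fair +1/-1 coin flips."""
    rng = random.Random(seed)
    s, path = 0, []
    for _ in range(n):
        s += rng.choice((-1, 1))  # one Bernoulli step X_i = +1 or -1
        path.append(s)
    return path

path = coin_walk(10)
print(path)  # ten successive values of the net fortune S_1, ..., S_10
```

Each step changes the fortune by exactly ±1, so S_{n} always has the same parity as n.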
The Strong Law of Large Numbers says that

lim_{n→∞} S_{n}/n = 0

with probability 1 in the sample space of all possible coin flips. In a way, this says the scaling n is “too strong”: it “condenses out” all variation in the sum S_{n}.
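A quick numerical check of this statement (a sketch using only Python's standard library; the seed and sample sizes are arbitrary choices): the running average S_{n}/n shrinks toward 0 as n grows.

```python
import random

# Empirical check of the Strong Law of Large Numbers: S_n / n -> 0.
rng = random.Random(1)
s = 0
for n in range(1, 10**5 + 1):
    s += rng.choice((-1, 1))
    if n in (10**2, 10**3, 10**4, 10**5):
        print(n, s / n)  # the running average shrinks toward 0
```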
The Central Limit Theorem applied to this special sequence of coin flips says

S_{n}/√n → Z,

where Z is a standard normal random variable and the limit is interpreted as convergence in distribution. In fact, this implies that for large n about 68% of the points in the sample space of all possible coin flips satisfy

-√n ≤ S_{n} ≤ √n

and about 95% of the points in the sample space of all possible coin flips satisfy

-2√n ≤ S_{n} ≤ 2√n.
In a way, this says the scaling √n is “too weak”: it doesn’t condense out enough information.
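The 68% and 95% figures can be checked empirically. The sketch below (an illustration only; the walk length, number of trials, and seed are arbitrary choices) counts what fraction of simulated walks end within one and within two standard deviations √n of 0.

```python
import math
import random

# Fraction of simulated walks with |S_n| <= sqrt(n) and |S_n| <= 2*sqrt(n).
rng = random.Random(2)
n, trials = 400, 2000
within1 = within2 = 0
for _ in range(trials):
    s = sum(rng.choice((-1, 1)) for _ in range(n))
    within1 += abs(s) <= math.sqrt(n)
    within2 += abs(s) <= 2 * math.sqrt(n)
print(within1 / trials, within2 / trials)  # close to 0.68 and 0.95
```

Because S_{n} is a discrete random variable, the empirical fractions for moderate n run slightly above the continuous 68% and 95% values.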
These two together suggest that “somewhere in between n and √n” we might be able to make stronger statements about convergence and the variation in the sequence S_{n}.
In fact, Hausdorff’s estimate tells us:

S_{n}/n^{(1/2)+ε} → 0 as n → ∞

with probability 1 in the sample space of all possible coin flips, for all values of ε > 0. In a way, this says n^{(1/2)+ε} is “almost too strong”: it condenses out almost too much information.
Even better, Hardy and Littlewood’s estimate tells us:

S_{n} ≤ C (n log n)^{1/2} for some constant C > 0 and all sufficiently large n

with probability 1 in the sample space of all possible coin flips. In a way, this says (n log n)^{1/2} is “still a little too strong”: it condenses out a little too much information.
Khintchine’s Law of the Iterated Logarithm is “just right.” It tells us very precisely how the sums vary with n. Using a method due to Erdős, it is possible to refine the law even more, but this is probably past the point of diminishing returns.
Like the Central Limit Theorem, the Law of the Iterated Logarithm illustrates in an astonishingly precise way that even completely random sequences obey precise mathematical laws.
Khintchine’s Law of the Iterated Logarithm says that:

limsup_{n→∞} S_{n} / (√(2 log log n) · √n) = 1

with probability 1. In particular, for every ε > 0 there exist infinitely many n such that

S_{n} > (1 - ε) √(2 log log n) · √n.
I have written these in a slightly non-standard way, with √(2 log log n) appearing as the multiplier of the standard deviation √n from the Central Limit Theorem, to emphasize the similarity to, and the difference from, the Central Limit Theorem.
Theorem 1 (Law of the Iterated Logarithm) With probability 1 we have:

limsup_{n→∞} S_{n} / √(2 n log log n) = 1.

For reasons of symmetry, for λ < 1, the inequality

S_{n} < -λ √(2 n log log n)

holds for infinitely many n, with probability 1; equivalently, liminf_{n→∞} S_{n} / √(2 n log log n) = -1.
Compare the Law of the Iterated Logarithm to the Central Limit Theorem. The Central Limit Theorem says that S_{n}/√n is approximately distributed as a N(0, 1) random variable for large n. Therefore, for a large but fixed n, there is probability about 1/6 that the value of S_{n}/√n exceeds the standard deviation, 1, that is, S_{n} > √n. For a fixed but large n, with probability about 0.025, S_{n}/√n exceeds twice the standard deviation, 2, that is, S_{n} > 2√n. The Law of the Iterated Logarithm tells us the more precise information that, sooner or later, the fortune will exceed (1 - ε) √(2 log log n) · √n for any ε > 0, and will do so infinitely often.
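The probabilities 1/6 and 0.025 quoted above come from the standard normal tail, which can be evaluated with the complementary error function (a small sketch using only Python's math module):

```python
import math

def normal_tail(x):
    """P(Z > x) for a standard normal random variable Z."""
    return 0.5 * math.erfc(x / math.sqrt(2))

print(normal_tail(1))  # about 0.159, roughly 1/6
print(normal_tail(2))  # about 0.023
```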
Moreover, the Law of the Iterated Logarithm tells us in addition that

S_{n} < -(1 - ε) √(2 log log n) · √n

for infinitely many n, with probability 1, so the fortune swings just as far below zero.
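One can watch the ratio in the Law of the Iterated Logarithm along a simulated walk. The sketch below (the walk length, seed, and the cutoff n ≥ 10 are arbitrary choices; the cutoff avoids the very small values of log log n at small n) tracks the running maximum of S_{n}/√(2 n log log n). The limsup equals 1 with probability 1, but convergence is far too slow to observe at this length; the point is only that the ratio stays within a modest range.

```python
import math
import random

# Running maximum of S_n / sqrt(2 n log log n) along one long walk.
# With probability 1 the limsup of this ratio is exactly 1, but the
# convergence is extremely slow, so a finite simulation only suggests it.
rng = random.Random(3)
s = 0
best = float("-inf")
for n in range(1, 10**5 + 1):
    s += rng.choice((-1, 1))
    if n >= 10:  # keep log log n well-defined and away from 0
        best = max(best, s / math.sqrt(2 * n * math.log(math.log(n))))
print(best)
```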
The statement puts some strength behind an understanding of the long-term swings back and forth in value of a random process. It also implies a form of recurrence, that is, a random walk must visit every integer value. If a market price behaves like a random walk, then it too can be expected (with certainty, but perhaps with long periods between the swings) to go far above, and far below, the starting price.
The law of the iterated logarithm for Bernoulli trials stated here is a very special case of a more general theorem first formulated by Kolmogorov in 1929. It is also possible to formulate even stronger and more general theorems!
Here are 2000 steps of a random walk with p = q = 1/2. In this figure, the random walk must return to cross the blue curves infinitely many times, but may cross the red curve only finitely many times.