Steven R. Dunbar
Department of Mathematics
203 Avery Hall
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Stochastic Processes and

__________________________________________________________________________

The Absolute Excess of Heads over Tails

_______________________________________________________________________

Note: These pages are prepared with MathJax. MathJax is an open source JavaScript display engine for mathematics that works in all browsers. See http://mathjax.org for details on supported browsers, accessibility, copy-and-paste, and other features.

_______________________________________________________________________________________________

### Rating

Mathematically Mature: may contain mathematics beyond calculus with proofs.

_______________________________________________________________________________________________

### Section Starter Question

What does the law of averages have to say about the probability of having a ﬁxed lead of say 20 Heads or more over Tails or 20 Tails or more over Heads at the end of a coin ﬂipping game of some ﬁxed duration? What does the Weak Law of Large Numbers have to say about having a ﬁxed lead? What does the Weak Law have to say about having a proportional lead, say 1%? What does the Central Limit Theorem have to say about the lead?

_______________________________________________________________________________________________

### Key Concepts

1. The probability that the number of heads exceeds the number of tails or the number of tails exceeds the number of heads in a sequence of coin-ﬂips by some ﬁxed amount can be estimated with the Central Limit Theorem and the probability gets close to $1$ as the number of tosses grows large.
2. The probability that the number of heads exceeds the number of tails or the number of tails exceeds the number of heads in a sequence of coin-ﬂips by some ﬁxed proportion can be estimated with the Central Limit Theorem and the probability gets close to $0$ as the number of tosses grows large.

__________________________________________________________________________

### Vocabulary

1. The half-integer correction, also called the continuity correction, arises because the binomial distribution is a discrete distribution, while the standard normal distribution is a continuous distribution.

__________________________________________________________________________

### Mathematical Ideas

#### Introduction

Probability theory generally has two classes of theorems about the results of coin-tossing games and therefore random walks:

1. Those theorems that tell how well-behaved and natural are the outcomes of typical coin-tossing games and random walks. The Weak Law of Large Numbers, the Strong Law of Large Numbers and the Central Limit Theorem fall into this category.
2. Those theorems that tell how strange and unnatural are the outcomes of typical coin-tossing games and random walks. The Arcsine Law and the Law of the Iterated Logarithm are good examples in this category.

In this section we will ask two related questions about the net fortune in a coin-tossing game:

1. What is the probability of an excess of a ﬁxed number of heads over tails or tails over heads at some ﬁxed time in the coin-ﬂipping game?
2. What is the probability that the number of heads exceeds the number of tails or the number of tails exceeds the number of heads by some ﬁxed fraction of the number of tosses?

Using the Central Limit Theorem, we will be able to provide precise answers to each question and then to apply the ideas to interesting questions in gambling and ﬁnance.

#### The Half-Integer Correction to the Central Limit Theorem

Often when using the Central Limit Theorem to approximate a discrete distribution, especially the binomial distribution, we adopt the half-integer correction, also called the continuity correction. The correction arises because the binomial distribution is discrete while the standard normal distribution is continuous. For any integer $s$ and real value $h$ with $0 \le h < 1$ the binomial random variable $S_n$ satisfies $\mathbb{P}_n[|S_n| \le s] = \mathbb{P}_n[|S_n| \le s+h]$, yet the corresponding Central Limit Theorem approximation with the standard normal cumulative distribution function, $\mathbb{P}[|Z| \le (s+h)/\sqrt{n}]$, increases as $h$ increases from $0$ to $1$. It is customary to take $h = 1/2$ to interpolate the difference. This choice is also justified by looking at the approximation of the binomial with the normal, see Figure 1.

Symbolically, the half-integer correction to the Central Limit Theorem is

$$
\begin{aligned}
\mathbb{P}_n[a \le S_n \le b] &\approx \int_{((a-1/2)-np)/\sqrt{npq}}^{((b+1/2)-np)/\sqrt{npq}} \frac{1}{\sqrt{2\pi}} \exp(-u^2/2)\,du \\
&= \mathbb{P}\left[ \frac{(a-1/2)-np}{\sqrt{npq}} \le Z \le \frac{(b+1/2)-np}{\sqrt{npq}} \right]
\end{aligned}
$$

for integers $a$ and $b$.

Figure 1: The half-integer correction: In the figure the binomial probability is $0.7487$. The simple normal approximation is $0.6809$, but with the half-integer correction the normal approximation is $0.7482$.
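To see the effect of the correction on a small concrete case, the sketch below compares the exact binomial probability with the plain and corrected normal approximations. The values $n = 20$, $p = 1/2$, and the interval $8 \le S_n \le 12$ are illustrative choices for this sketch, not the values behind Figure 1.

```python
# Compare the exact binomial probability P[8 <= S_20 <= 12] for
# S_20 ~ Binomial(20, 1/2) with the normal approximation, with and
# without the half-integer (continuity) correction.
# The values n, p, a, b here are illustrative, not those of Figure 1.
from math import sqrt
from scipy.stats import binom, norm

n, p = 20, 0.5
q = 1 - p
a, b = 8, 12

# exact binomial probability of a <= S_n <= b
exact = binom.cdf(b, n, p) - binom.cdf(a - 1, n, p)

mu, sigma = n * p, sqrt(n * p * q)
# plain normal approximation, no correction
plain = norm.cdf((b - mu) / sigma) - norm.cdf((a - mu) / sigma)
# normal approximation with the half-integer correction
corrected = norm.cdf((b + 0.5 - mu) / sigma) - norm.cdf((a - 0.5 - mu) / sigma)

print(exact, plain, corrected)
```

The corrected approximation lands much closer to the exact binomial value, which is the numerical content of Figure 1.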

#### The absolute excess of heads over tails

Consider the sequence of independent random variables ${Y}_{i}$ which take values $1$ with probability $1∕2$ and $-1$ with probability $1∕2$. This is a mathematical model of a fair coin ﬂip game where a $1$ results from “heads” on the $i$th coin toss and a $-1$ results from “tails”. Let ${H}_{n}$ and ${L}_{n}$ be the number of heads and tails respectively in $n$ ﬂips. Then ${T}_{n}={\sum }_{i=1}^{n}{Y}_{i}={H}_{n}-{L}_{n}$ counts the diﬀerence between the number of heads and tails, an excess of heads if positive, and a “negative excess”, i.e. a deﬁcit, if negative. Rather than the clumsy extended phrase “the number of heads exceeds the number of tails or the number of tails exceeds the number of heads” we can say “the absolute excess of heads $|{T}_{n}|$.” The value ${T}_{n}$ also represents the net “winnings”, positive or negative, of a gambler in a fair coin ﬂip game.

Corollary 1. Suppose $Y_i = +1$ with probability $1/2$ and $Y_i = -1$ with probability $1/2$, and let $T_n = \sum_{i=1}^{n} Y_i$. Then for an integer $s$,

$$\mathbb{P}[|T_n| > s] \approx \mathbb{P}\left[|Z| \ge (s + 1/2)/\sqrt{n}\right],$$

where $Z$ is a standard normal random variable with mean $0$ and variance $1$.

Proof. Note that $\mu = \mathbb{E}[Y_i] = 0$ and $\sigma^2 = \mathrm{Var}[Y_i] = 1$. Then

$$
\begin{aligned}
\mathbb{P}_n[|T_n| > s] &= 1 - \mathbb{P}_n[-s \le T_n \le s] \\
&= 1 - \mathbb{P}_n[-s - 1/2 \le T_n \le s + 1/2] \\
&= 1 - \mathbb{P}_n\left[(-s - 1/2)/\sqrt{n} \le T_n/\sqrt{n} \le (s + 1/2)/\sqrt{n}\right] \\
&\approx 1 - \mathbb{P}\left[(-s - 1/2)/\sqrt{n} \le Z \le (s + 1/2)/\sqrt{n}\right] \\
&= \mathbb{P}\left[|Z| \ge (s + 1/2)/\sqrt{n}\right].
\end{aligned}
$$

The crucial step occurs at the approximation, which uses the Central Limit Theorem. More precise statements of the Central Limit Theorem, such as the Berry–Esseen inequality, can turn the approximation into an inequality.

If we take $s$ to be ﬁxed we now have the answer to our ﬁrst question: The probability of an absolute excess of heads over tails greater than a ﬁxed amount in a fair game of duration $n$ approaches $1$ as $n$ increases.

The Central Limit Theorem in the form of the half-integer correction above provides an alternative proof of the Weak Law of Large Numbers for the speciﬁc case of the binomial random variable ${T}_{n}$. In fact,

$$
\begin{aligned}
\mathbb{P}_n\left[\left|\frac{T_n}{n}\right| > \epsilon\right] &\approx \mathbb{P}\left[|Z| \ge \frac{\epsilon n + 1/2}{\sqrt{n}}\right] \\
&= \mathbb{P}\left[|Z| \ge \epsilon\sqrt{n} + \frac{1/2}{\sqrt{n}}\right] \to 0
\end{aligned}
$$

as $n\to \infty$.

Rewriting $\mathbb{P}\left[\left|\frac{T_n}{n}\right| > \epsilon\right] = \mathbb{P}\left[|T_n| > \epsilon n\right]$, this restatement of the Weak Law actually provides the answer to our second question: the probability that the absolute excess of heads over tails is greater than a fixed fraction of the flips in a fair game of duration $n$ approaches $0$ as $n$ increases.
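The speed of this convergence is easy to tabulate. The sketch below evaluates the approximation $\mathbb{P}[|Z| \ge \epsilon\sqrt{n} + (1/2)/\sqrt{n}]$ for a $1\%$ lead ($\epsilon = 0.01$) at a few illustrative values of $n$ (the particular values of $n$ are arbitrary choices for the sketch).

```python
# Approximate P[|T_n/n| > 0.01], the chance of a 1% absolute excess,
# for increasing n, via P[|Z| >= eps*sqrt(n) + (1/2)/sqrt(n)].
from math import sqrt
from scipy.stats import norm

eps = 0.01
probs = []
for n in [500, 5000, 50000]:
    z = eps * sqrt(n) + 0.5 / sqrt(n)
    # two-sided standard normal tail probability
    probs.append(2 * (1 - norm.cdf(z)))

print(probs)
```

The probabilities decrease toward $0$ as $n$ grows, in line with the Weak Law.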

Finally, the half-integer correction gives an estimate on the central probability in a binomial distribution.

Corollary 2.

$$\mathbb{P}_n[T_n = 0] \approx \mathbb{P}\left[|Z| < (1/2)/\sqrt{n}\right] \to 0$$

as $n\to \infty$.

We can estimate this further:

$$
\begin{aligned}
\mathbb{P}\left[|Z| < (1/2)/\sqrt{n}\right] &= \frac{1}{\sqrt{2\pi}} \int_{-1/(2\sqrt{n})}^{1/(2\sqrt{n})} e^{-u^2/2}\,du \\
&= \frac{1}{\sqrt{2\pi}} \int_{-1/(2\sqrt{n})}^{1/(2\sqrt{n})} \left(1 - \frac{u^2}{2} + \frac{u^4}{8} - \cdots\right) du \\
&= \frac{1}{\sqrt{2\pi}}\frac{1}{\sqrt{n}} - \frac{1}{24\sqrt{2\pi}}\frac{1}{n^{3/2}} + \frac{1}{640\sqrt{2\pi}}\frac{1}{n^{5/2}} - \cdots.
\end{aligned}
$$

So we see that $\mathbb{P}[T_n = 0]$ goes to zero at the rate of $1/\sqrt{n}$.
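The $1/\sqrt{n}$ rate can be checked directly against exact central binomial probabilities: if the probability decays like $1/\sqrt{n}$, then quadrupling $n$ should roughly halve it. A quick sketch (the starting value $n = 100$ is an arbitrary choice):

```python
# P[T_n = 0] = C(n, n/2) * 2^(-n) exactly, for even n.
# A 1/sqrt(n) decay rate predicts that replacing n by 4n
# multiplies the probability by about 1/2.
from math import comb

def central_prob(n):
    # exact probability of an equal number of heads and tails in n flips
    return comb(n, n // 2) / 2**n

n = 100
ratio = central_prob(4 * n) / central_prob(n)
print(ratio)
```

The ratio comes out very close to $1/2$, confirming the $1/\sqrt{n}$ rate.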

##### Illustration 1

What is the probability that the number of heads exceeds the number of tails by more than $20$ or the number of tails exceeds the number of heads by more than $20$ after $500$ tosses of a fair coin? By Corollary 1, this is:

$$\mathbb{P}[|T_n| > 20] \approx \mathbb{P}\left[|Z| \ge 20.5/\sqrt{500}\right] = 0.35925.$$

This is a reasonably large probability, and is larger than many people would expect.
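The corollary's estimate is a single evaluation with any standard normal CDF routine; a minimal sketch using SciPy:

```python
# P[|T_500| > 20] ~ P[|Z| >= 20.5/sqrt(500)], by Corollary 1,
# with the half-integer correction built into the bound 20.5.
from math import sqrt
from scipy.stats import norm

prob = 2 * (1 - norm.cdf(20.5 / sqrt(500)))
print(prob)
```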

Figure 2 is a graph of the probability of at least $s$ excess heads in $500$ tosses of a fair coin.

Figure 2: Probability of the absolute excess of $x$ heads or tails in $500$ tosses.

##### Illustration 2

What is the probability that there is “about the same number of heads as tails” in 500 tosses? Here we interpret “about the same” as within $5$, that is, an absolute diﬀerence of 1% or less of the number of tosses. Note that since $500$ is even the diﬀerence in the number of heads and tails cannot be an odd number, so must be either $0$, $2$ or $4$.

$$\mathbb{P}[|T_{500}| < 5] \approx \mathbb{P}\left[|Z| \le 5.5/\sqrt{500}\right] = 0.19429,$$

so it would be somewhat unusual (in that it occurs in less than 20% of games) to have the number of heads and tails so close.
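This calculation is likewise one normal-CDF evaluation; a minimal sketch:

```python
# P[|T_500| < 5] ~ P[|Z| <= 5.5/sqrt(500)], using the half-integer
# correction with s = 5 as in Corollary 1.
from math import sqrt
from scipy.stats import norm

prob = 2 * norm.cdf(5.5 / sqrt(500)) - 1
print(prob)
```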

##### Illustration 3

Suppose you closely follow a stock recommendation source whose methods use technical analysis. You accept every bit of advice from this source about trading stocks. You choose $10$ stocks to buy, sell or hold every day based on the recommendations. Each day for each stock you will gain or lose money based on the advice. Note that it is possible to gain money even if the advice says the stocks will decrease in value, say by short-selling or using put options. How good can this strategy be? We will make this vague question precise by asking “How good does the information from the technical analysis have to be so that the probability of losing money over a year’s time is $1$ in $10,000$?”

The $10$ stocks over $260$ business days in a year means that there are $2,600$ daily gains or losses. Denote each daily gain or loss as $X_i$; if the advice is correct you will gain $X_i > 0$ and if the advice is wrong you will lose $X_i < 0$. We want the total change $X_{\text{annual}} = \sum_{i=1}^{2600} X_i > 0$ and we will measure that by asking that $\mathbb{P}[X_{\text{annual}} < 0]$ be small. In the terms of this section, we are interested in the complementary probability of an excess of successes over failures.

We assume that the changes are random variables, identically distributed, independent and the moments of all the random variables are ﬁnite. We will make speciﬁc assumptions about the distribution later, for now these assumptions are suﬃcient to apply the Central Limit Theorem. Then the total change ${X}_{\text{annual}}={\sum }_{i=1}^{2600}{X}_{i}$ is approximately normally distributed with mean $\mu =2600\cdot 𝔼\left[{X}_{1}\right]$ and variance ${\sigma }^{2}=2600\cdot Var\left[{X}_{1}\right]$. Note that here again we are using the uncentered and unscaled version of the Central Limit Theorem. In symbols

$$\mathbb{P}\left[a \le \sum_{i=1}^{2600} X_i \le b\right] \approx \frac{1}{\sqrt{2\pi\sigma^2}} \int_a^b \exp\left(-\frac{(u-\mu)^2}{2\sigma^2}\right) du.$$

We are interested in

$$\mathbb{P}\left[\sum_{i=1}^{2600} X_i \le 0\right] \approx \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{0} \exp\left(-\frac{(u-\mu)^2}{2\sigma^2}\right) du.$$

By the change of variables $v=\left(u-\mu \right)∕\sigma$, we can rewrite the probability as

$$\mathbb{P}[X_{\text{annual}} \le 0] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{-\mu/\sigma} \exp(-v^2/2)\,dv = \Phi\left(-\frac{\mu}{\sigma}\right)$$

so that the probability depends only on the ratio $-\mu/\sigma$. We desire that $\Phi(-\mu/\sigma) = \mathbb{P}[X_{\text{annual}} < 0] < 1/10{,}000$. Then we can solve for $-\mu/\sigma \approx -3.7$. Since $\mu = 2600 \cdot \mathbb{E}[X_1]$ and $\sigma^2 = 2600 \cdot \mathrm{Var}[X_1]$, we calculate that to keep the probability of an annual loss this small we must have $\mathbb{E}[X_1]/\sqrt{\mathrm{Var}[X_1]} \approx 3.7/\sqrt{2600} = 0.073$.

Now we consider what the requirement $\mathbb{E}[X_1] = 0.07 \cdot \sqrt{\mathrm{Var}[X_1]}$ means for specific distributions. If we assume that the individual changes $X_i$ are normally distributed with a positive mean, then we can use $\mathbb{E}[X_1]/\sqrt{\mathrm{Var}[X_1]} = 0.07$ to calculate that $\mathbb{P}[X_1 < 0] \approx 0.47210$, or about 47%. Then the probability of a prediction leading to a positive gain is about 53%. Alternatively, if we assume that the individual changes $X_i$ are binomial random variables where the probability of a positive gain is $\mathbb{P}[X_1 = 1] = p$, then $\mathbb{E}[X_1] = 2p - 1$ and $\mathrm{Var}[X_1] = 4p(1-p)$. We can use $2p - 1 = \mathbb{E}[X_1] = 0.07\sqrt{\mathrm{Var}[X_1]} = 0.07 \cdot 2\sqrt{p(1-p)}$ to solve for $p$. The result is $p \approx 0.5349$.
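The numerical steps above are easy to reproduce. The sketch below finds the required ratio from the normal quantile and then solves for $p$ in the binomial case; `scipy.optimize.brentq` is one convenient root finder for the last step, chosen here for illustration.

```python
# Find the ratio mu/sigma needed so that Phi(-mu/sigma) = 1/10000,
# then the per-prediction requirements for the normal and binomial models.
from math import sqrt
from scipy.stats import norm
from scipy.optimize import brentq

z = norm.ppf(1.0 / 10000)      # quantile, about -3.72
ratio = -z / sqrt(2600)        # required E[X_1]/sd(X_1), about 0.073

# Normal model: P[X_1 < 0] when E[X_1]/sd(X_1) = 0.07
p_loss = norm.cdf(-0.07)

# Binomial model: solve 2p - 1 = 0.07 * 2*sqrt(p(1-p)) for p
p = brentq(lambda p: 2 * p - 1 - 0.07 * 2 * sqrt(p * (1 - p)), 0.5, 0.6)

print(ratio, p_loss, p)
```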

In either case, this means that any given piece of advice only has to have a 53% chance of being correct in order to have a perpetual money-making machine. Compare this with the strategy of using a coin flip to provide the advice. Since we don't observe any perpetual money-making machines, we conclude that any advice about stock picking must be less than 53% reliable, or about the same as flipping a coin.

Now suppose that instead we have a computer algorithm predicting stock movements for all publicly traded stocks, of which there are about $2,000$. Suppose further that we wish to restrict the chance of an annual loss to $\mathbb{P}[X_{\text{annual}} < 0] < 10^{-6}$, that is, 1 chance in a million. Then we can repeat the analysis to show that the computer algorithm would only need to have $\mathbb{P}[X_1 < 0] \approx 0.49737$, practically indistinguishable from a coin flip, in order to make money. This provides a statistical argument against the utility of technical analysis for stock price prediction. Not losing money is not sufficient evidence to distinguish ability in stock-picking from coin-flipping.
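Repeating the calculation with $2000 \cdot 260 = 520{,}000$ predictions per year, under the same normal model as above, is a two-line sketch:

```python
# With 2000 stocks over 260 business days there are 520,000 predictions.
# Find P[X_1 < 0] needed so that P[X_annual < 0] = 10^-6.
from math import sqrt
from scipy.stats import norm

trials = 2000 * 260
ratio = -norm.ppf(1e-6) / sqrt(trials)   # required E[X_1]/sd(X_1)
p_loss = norm.cdf(-ratio)                # P[X_1 < 0] under a normal model

print(p_loss)
```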

#### Sources

This section is adapted from the article “Tossing a Fair Coin” by Leonard Lipkin. The discussion of the continuity correction is adapted from The Continuity Correction in the section The Central Limit Theorem in the Virtual Laboratories in Probability and Statistics. The third example in this section is adapted from a presentation by Jonathan Kaplan of D.E. Shaw and Co. in summer 2010.

_______________________________________________________________________________________________

### Algorithms, Scripts, Simulations

#### Algorithm

The experiment flips a coin $n$ times and repeats the experiment $k$ times. Then compute the proportion of experiments in which the absolute excess of heads over tails is more than $s$. Compare this to the theoretical probability from the standard normal distribution.

#### Scripts

R

```r
p <- 0.5
n <- 500
k <- 1000
coinFlips <- array( 0+(runif(n*k) <= p), dim=c(n,k))
     # 0+ coerces Boolean to numeric
winLose <- 2 * coinFlips - 1
# -1 for Tails, +1 for Heads
excessHeads <- apply( winLose, 2, sum)
# -n..n (every other integer) binomial rv sample
# the second argument 2 means columnwise

s <- 20
prob <- sum( 0+(abs(excessHeads) > s) )/k
theoretical <- 2*(1-pnorm( (s+0.5)/sqrt(n), mean=0, sd=1))
cat(sprintf("Empirical probability: %f \n", prob ))
cat(sprintf("Excess Heads estimate: %f \n", theoretical))
```
Octave

```octave
p = 0.5;
n = 500;
k = 1000;

coinFlips = rand(n,k) <= p;
winLose = 2 * coinFlips - 1;
# -1 for Tails, +1 for Heads
excessHeads = sum(winLose);
# -n..n (every other integer) binomial rv sample, size k

s = 20;
prob = sum( (abs(excessHeads) > s) )/k;
theoretical = 2*(1-stdnormal_cdf( (s+0.5)/sqrt(n)));
disp("Empirical probability: "), disp( prob )
disp("Excess Heads estimate: "), disp( theoretical )
```
Perl

```perl
use PDL;
use PDL::NiceSlice;

sub pnorm {
    my ( $x, $sigma, $mu ) = @_;
    $sigma = 1 unless defined($sigma);
    $mu    = 0 unless defined($mu);

    return 0.5 * ( 1 + erf( ( $x - $mu ) / ( sqrt(2) * $sigma ) ) );
}

$p = 0.5;
$n = 500;
$k = 1000;

# note order of dims!!
$coinFlips = random( $k, $n ) <= $p;

# 0..n binomial r.v. sample, size k
# note transpose, PDL likes x (row) direction for implicitly threaded operations
$headsTotal = $coinFlips->transpose->sumover;

# note use of threading
$winLose = 2 * $coinFlips - 1;

# note xchg transpose, PDL likes x (row) direction for implicitly threaded operations
$excessHeads = ( $winLose->xchg( 0, 1 )->sumover );

$s           = 20;
$prob        = ( ( abs($excessHeads) > $s )->sumover ) / $k;
$theoretical = 2 * ( 1 - pnorm( ( $s + ( 1 / 2 ) ) / sqrt($n) ) );

print "Empirical probability: ", $prob, "\n";
print "Excess Heads theoretical estimate: ", $theoretical, "\n";
```
SciPy

```python
import numpy as np
from scipy.stats import norm

p = 0.5
n = 500
k = 1000

coinFlips = np.random.random((n,k)) <= p
# Note Booleans True for Heads and False for Tails
winLose = 2*coinFlips - 1
excessHeads = np.sum(winLose, axis=0)

s = 20
prob = float(np.sum( abs(excessHeads) > s ))/k
theoretical = 2*(1 - norm.cdf( (s+0.5)/np.sqrt(n) ))

print("Empirical probability: ", prob)
print("Excess heads theoretical estimate: ", theoretical)
```

__________________________________________________________________________

### Problems to Work for Understanding

1. For $500$ tosses of a fair coin:
    1. What is the approximate probability that the number of heads is within $10$ of the number of tails, that is, a difference of 2% or less of the number of tosses?
    2. What is the approximate probability that the number of heads is within $20$ of the number of tails, that is, a difference of 4% or less of the number of tosses?
    3. What is the approximate probability that the number of heads is within $25$ of the number of tails, that is, a difference of 5% or less of the number of tosses?
    4. Derive and then graph a simple power function that gives the approximate probability that the number of heads is within $x$ of the number of tails, for $0 \le x \le 500$.
2. For a difference of at most $10$ between the number of heads and tails:
    1. What is the probability that the number of heads is within $10$ of the number of tails, that is, a difference of 1% or less of the number of tosses, in 1000 tosses?
    2. What is the probability that the number of heads is within $10$ of the number of tails, that is, a difference of 0.5% or less of the number of tosses, in 2000 tosses?
    3. What is the probability that the number of heads is within $10$ of the number of tails, that is, a difference of 0.2% or less of the number of tosses, in 5000 tosses?
    4. Derive and then graph a simple power function of $n$ that gives the approximate probability that the number of heads is within $10$ of the number of tails, that is, a difference of $100 \cdot 10/n$% or less of the number of tosses in $n$ tosses.
3. Rates and exact values:
    1. Derive the rate, as a function of $n$, at which the probability that heads exceeds tails by a fixed value $s$ approaches $1$ as $n \to \infty$.
    2. Derive the rate, as a function of $n$, at which the probability that heads exceeds tails by a fixed fraction $\epsilon$ approaches $0$ as $n \to \infty$.
    3. Calculate the exact value of $\mathbb{P}[|T_{500}| < 5]$ in Illustration 2 and compare it to the approximation.

__________________________________________________________________________

### References

   William Feller. An Introduction to Probability Theory and Its Applications, Volume I. John Wiley and Sons, third edition, 1973. QA 273 F3712.

   Emmanuel Lesigne. Heads or Tails: An Introduction to Limit Theorems in Probability, volume 28 of Student Mathematical Library. American Mathematical Society, 2005.

   Leonard Lipkin. Tossing a fair coin. The College Mathematics Journal, 34(2):128–133, March 2003.

__________________________________________________________________________

1. Virtual Laboratories in Probability and Statistics. Search the page for Random Walk Experiment and run the Last Visit to Zero experiment.
2. Virtual Laboratories in Probability and Statistics. Search the page for Ballot Experiment and run the Ballot Experiment.
