Steven R. Dunbar
Department of Mathematics
203 Avery Hall
University of Nebraska-Lincoln
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Topics in
Probability Theory and Stochastic Processes
Steven R. Dunbar

__________________________________________________________________________

Binomial Distribution

_______________________________________________________________________

Note: These pages are prepared with MathJax. MathJax is an open source JavaScript display engine for mathematics that works in all browsers. See http://mathjax.org for details on supported browsers, accessibility, copy-and-paste, and other features.

_______________________________________________________________________________________________

Rating

Mathematically Mature: may contain mathematics beyond calculus with proofs.

_______________________________________________________________________________________________

Section Starter Question

Consider a family with 5 children. What is the probability that all five children are boys? How many children must a couple have for at least a 0.95 probability of at least one girl? What is a proper and general mathematical framework for setting up answers to these and similar questions?

_______________________________________________________________________________________________

Key Concepts

  1. A binomial random variable Sn counts the number of successes in a sequence of n independent trials of an experiment.
  2. A binomial random variable Sn takes only integer values between 0 and n inclusive and
    \[ \mathbb{P}\left[ S_n = k \right] = \binom{n}{k} p^k (1-p)^{n-k} \]

    for \( k = 0, 1, 2, \dots, n \).

  3. The expectation of a binomial random variable with n trials and probability of success p on each trial is:
    \[ \mathbb{E}\left[ S_n \right] = np \]

  4. The variance of a binomial random variable with n trials and probability of success p on each trial is:
    \[ \operatorname{Var}\left[ S_n \right] = npq = np(1-p) \]

__________________________________________________________________________

Vocabulary

  1. An elementary experiment is a physical experiment with two outcomes. An elementary experiment is also called a Bernoulli trial.
  2. A composite experiment consists of repeating an elementary experiment n times.
  3. The sample space, denoted Ωn, is the set of all possible sequences of n 0s and 1s representing all possible outcomes of the composite experiment.
  4. A random variable is a function from the sample space Ωn to the real numbers ℝ.

__________________________________________________________________________

Mathematical Ideas

Sample Space for a Sequence of Experiments

An elementary experiment in this section is a physical experiment with two outcomes. An elementary experiment is also called a Bernoulli trial. Label the outcomes of the elementary experiment 1, occurring with probability p, and 0, occurring with probability q, where p + q = 1. Often we name 1 as success and 0 as failure. For example, a coin toss would be a physical experiment with two outcomes, say with “heads” labeled as success, and “tails” as failure.

A composite experiment consists of repeating an elementary experiment n times. The sample space, denoted Ωn, is the set of all possible sequences of n 0s and 1s representing all possible outcomes of the composite experiment. We denote an element of Ωn as ω = (ω1, …, ωn), where each ωk = 0 or 1. That is, Ωn = {0, 1}ⁿ. We assign a probability measure ℙn on Ωn by multiplying the probabilities of each Bernoulli trial in the composite experiment, according to the principle of independence. Thus, for k = 1, …, n,

\[ \mathbb{P}\left[ \omega_k = 0 \right] = q \qquad\text{and}\qquad \mathbb{P}\left[ \omega_k = 1 \right] = p \]

and inductively, for each \( (e_1, e_2, \dots, e_n) \in \{0,1\}^n \),

\[ \mathbb{P}_{n+1}\left[ \omega_{n+1} = 1 \text{ and } (\omega_1, \dots, \omega_n) = (e_1, \dots, e_n) \right] = \mathbb{P}\left[ \omega_{n+1} = 1 \right] \times \mathbb{P}_n\left[ (\omega_1, \dots, \omega_n) = (e_1, \dots, e_n) \right]. \]

Additionally, let Sn(ω) be the number of 1s in ω ∈ Ωn. Note that \( S_n(\omega) = \sum_{k=1}^{n} \omega_k \). We also say Sn(ω) is the number of successes in the composite experiment. Then

\[ \mathbb{P}_n\left[ \omega \right] = p^{S_n(\omega)} q^{n - S_n(\omega)}. \]

We can also define a unified sample space Ω that is the set of all infinite sequences of 0s and 1s. We sometimes write Ω = \( \{0,1\}^{\infty} \). Then Ωn is the projection of Ω onto the first n entries.

A random variable is a function from a set called the sample space to the real numbers ℝ. For example, as a frequently used special case, for ω ∈ Ω let

\[ X_k(\omega) = \omega_k, \]

then Xk is an indicator random variable taking on the value 1 or 0. Xk (the dependence on the sequence ω is usually suppressed) indicates success or failure at trial k. Then as above,

\[ S_n = \sum_{i=1}^{n} X_i = \sum_{i=1}^{n} \omega_i \]

is a random variable indicating the number of successes in a composite experiment.
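
As a concrete illustration, here is a minimal Octave sketch, in the style of the script later in this section, that simulates one composite experiment by drawing the n indicator variables and summing them. The values of n and p are illustrative.

p = 0.5;                      % probability of success on each trial
n = 20;                       % number of trials in the composite experiment
omega = ( rand(1,n) <= p );   % indicators X_1, ..., X_n for one outcome in Omega_n
Sn = sum(omega);              % S_n, the number of successes
disp(Sn)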

Binomial Probabilities

Proposition 1. The random variable Sn takes only integer values between 0 and n inclusive and

\[ \mathbb{P}_n\left[ S_n = k \right] = \binom{n}{k} p^k q^{n-k}. \]

Remark. The notation ℙn indicates that we are considering a family of probability measures indexed by n on the sample spaces Ωn.

Proof. From the inductive definition

\[ \mathbb{P}\left[ \omega_i = 0 \right] = q \qquad\text{and}\qquad \mathbb{P}\left[ \omega_i = 1 \right] = p \]

and inductively, for each \( (e_1, e_2, \dots, e_n) \in \{0,1\}^n \),

\[ \mathbb{P}_{n+1}\left[ \omega_{n+1} = 1 \text{ and } (\omega_1, \dots, \omega_n) = (e_1, \dots, e_n) \right] = \mathbb{P}\left[ \omega_{n+1} = 1 \right] \times \mathbb{P}_n\left[ (\omega_1, \dots, \omega_n) = (e_1, \dots, e_n) \right], \]

the probability assigned to an ω having k 1s and n − k 0s is \( p^k (1-p)^{n-k} = p^{S_n(\omega)} (1-p)^{n - S_n(\omega)} \). The sample space Ωn has precisely \( \binom{n}{k} \) such points. By the additive property of disjoint probabilities,

\[ \mathbb{P}_n\left[ S_n = k \right] = \binom{n}{k} p^k q^{n-k}, \]

and the proof is complete. □
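
As a numerical check of Proposition 1, a sketch with illustrative values of n and p comparing the formula directly against binopdf, from the Octave statistics package also used in the script later in this section:

n = 10; p = 0.3; k = 0:n;
direct = arrayfun(@(j) nchoosek(n,j), k) .* p.^k .* (1-p).^(n-k);
packaged = binopdf(k, n, p);
disp( max(abs(direct - packaged)) )   % difference is at roundoff level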

Proposition 2. If \( X_1, X_2, \dots, X_n \) are independent, identically distributed random variables with distribution \( \mathbb{P}\left[ X_i = 1 \right] = p \) and \( \mathbb{P}\left[ X_i = 0 \right] = q \), then the sum \( X_1 + \cdots + X_n \) has the distribution of a binomial random variable Sn with parameters n and p.

Proposition 3.

  1. \( \mathbb{E}\left[ S_n \right] = np \)

  2. \( \operatorname{Var}\left[ S_n \right] = npq = np(1-p) \)

Proof. First proof: By the binomial expansion

\[ (p + q)^n = \sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k}. \]

Differentiate with respect to p and multiply both sides of the derivative by p:

\[ np (p + q)^{n-1} = \sum_{k=0}^{n} k \binom{n}{k} p^k q^{n-k}. \]

Now choosing \( q = 1 - p \),

\[ np = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = \mathbb{E}\left[ S_n \right]. \]

For the variance, differentiate the binomial expansion with respect to p twice:

\[ n(n-1)(p + q)^{n-2} = \sum_{k=0}^{n} k(k-1) \binom{n}{k} p^{k-2} q^{n-k}. \]

Multiply by \( p^2 \), substitute \( q = 1 - p \), and expand:

\[ n(n-1)p^2 = \sum_{k=0}^{n} k^2 \binom{n}{k} p^k (1-p)^{n-k} - \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = \mathbb{E}\left[ S_n^2 \right] - \mathbb{E}\left[ S_n \right] \]

Therefore,

\[ \operatorname{Var}\left[ S_n \right] = \mathbb{E}\left[ S_n^2 \right] - \left( \mathbb{E}\left[ S_n \right] \right)^2 = n(n-1)p^2 + np - n^2 p^2 = np(1-p). \]
□

Proof. Second proof: Use that the expectation of a sum is the sum of the expectations, and apply it to Proposition 2 with \( S_n = X_1 + \cdots + X_n \) and \( \mathbb{E}\left[ X_i \right] = p \).

Similarly, use that the variance of a sum of independent random variables is the sum of the variances, applied to \( S_n = X_1 + \cdots + X_n \) with \( \operatorname{Var}\left[ X_i \right] = p(1-p) \). □
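
A quick Monte Carlo check of Proposition 3 in Octave, with illustrative parameters: simulate many copies of Sn as column sums of indicator variables, then compare the sample mean and variance with np and np(1 − p).

p = 0.3; n = 50; trials = 100000;
Sn = sum( rand(n,trials) <= p );    % one copy of S_n per column
disp([ mean(Sn), n*p ])             % sample mean versus np
disp([ var(Sn), n*p*(1-p) ])        % sample variance versus npq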

Examples

Example. The following example appeared in the January 20, 2017 “Riddler” puzzler on the website fivethirtyeight.com.

You and I find ourselves indoors one rainy afternoon, with nothing but some loose change in the couch cushions to entertain us. We decide that we’ll take turns flipping a coin, and that the winner will be whoever flips 10 heads first. The winner gets to keep all the change in the couch! Predictably, an enormous argument erupts: We both want to be the one to go first.

What is the first flipper’s advantage? In other words, what percentage of the time does the first flipper win this game?

First solve an easier version of the puzzle, where the first person to flip a head wins. Let the person who flips first be A, and let PA be the probability that A wins by being the first to obtain a head. Then PA is the sum of the probabilities of the disjoint events that the sequence of flips is H, or TTH, or TTTTH, and so forth:

\[ P_A = \frac{1}{2} + \left( \frac{1}{2} \right)^2 \frac{1}{2} + \left( \frac{1}{2} \right)^4 \frac{1}{2} + \cdots = \frac{1}{2}\left( 1 + \frac{1}{4} + \left( \frac{1}{4} \right)^2 + \cdots \right) = \frac{1}{2} \cdot \frac{1}{1 - \frac{1}{4}} = \frac{1}{2} \cdot \frac{4}{3} = \frac{2}{3}. \]

Another way to do this problem is to use first-step analysis from Markov chain theory. The probability PA of the first player winning is the probability of winning on the first flip, plus the probability that both players lose their first flips, at which point the game essentially starts over:

\[ P_A = \frac{1}{2} + \frac{1}{4} P_A. \]

Solving, \( \frac{3}{4} P_A = \frac{1}{2} \), or

\[ P_A = \frac{1}{2} \cdot \frac{4}{3} = \frac{2}{3}. \]
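
A short simulation sketch in Octave confirms PA = 2/3 for this easier game. It draws each player’s number of tails before a first head by inverse transform sampling of the geometric distribution; the approach and parameter values are illustrative, not part of the original example.

trials = 100000;
% Tails before the first head is geometric with parameter 1/2; generate it
% by inverse transform: floor(log(U)/log(1/2)) for U uniform on (0,1).
tailsA = floor( log(rand(1,trials)) ./ log(1/2) );
tailsB = floor( log(rand(1,trials)) ./ log(1/2) );
% A's first head falls on overall flip 2*tailsA + 1, B's on 2*tailsB + 2,
% so A wins exactly when tailsA <= tailsB.
disp( sum(tailsA <= tailsB) / trials )   % approximately 2/3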

Now extend the same reasoning as in the first approach to the case of the first player to get 10 heads winning. The first case for A to win is to get 9 heads in flips 1, 3, 5, …, 17, the 10th head on flip 19, and for player B to get anywhere from 0 to 9 heads on flips 2, 4, 6, …, 18. These events have probability \( \binom{9}{9} \left( \frac{1}{2} \right)^{9} \cdot \frac{1}{2} \) and cumulative binomial probability \( \sum_{k=0}^{9} \binom{9}{k} \left( \frac{1}{2} \right)^{9} \) respectively. The next disjoint case for A to win is to get 9 heads in flips 1, 3, 5, …, 19, the 10th head on flip 21, and for player B to get anywhere from 0 to 9 heads on flips 2, 4, 6, …, 20. These events have probability \( \binom{10}{9} \left( \frac{1}{2} \right)^{10} \cdot \frac{1}{2} \) and cumulative binomial probability \( \sum_{k=0}^{9} \binom{10}{k} \left( \frac{1}{2} \right)^{10} \) respectively. In general, the disjoint case for A to win is to get 9 heads in flips 1, 3, 5, …, 2j − 1, the 10th head on flip 2j + 1, and for player B to get anywhere from 0 to 9 heads on flips 2, 4, 6, …, 2j. These events have probability \( \binom{j}{9} \left( \frac{1}{2} \right)^{j} \cdot \frac{1}{2} \) and cumulative binomial probability \( \sum_{k=0}^{9} \binom{j}{k} \left( \frac{1}{2} \right)^{j} \) respectively.

Then multiplying the independent probabilities for A and B in each case and adding over all these disjoint cases gives

\[ P_A = \sum_{j=9}^{\infty} \binom{j}{9} \left( \frac{1}{2} \right)^{j+1} \sum_{k=0}^{9} \binom{j}{k} \left( \frac{1}{2} \right)^{j}. \]

There does not seem to be an exact analytic or closed-form expression for this probability, as there is in the case of winning with a single head, so we must approximate it numerically. In the case of winning with 10 heads, \( P_A \approx 0.53278 \).
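
A direct, truncated evaluation of this series in Octave, as a sketch; the script in the next section computes the same value in vectorized form with binopdf and binocdf.

PA = 0;
for j = 9:60                       % terms beyond j = 60 are negligibly small
  headA = nchoosek(j,9) * (1/2)^(j+1);       % A: 9 heads in j flips, then a head
  headB = 0;
  for k = 0:9
    headB = headB + nchoosek(j,k) * (1/2)^j; % B: at most 9 heads in j flips
  endfor
  PA = PA + headA * headB;
endfor
disp(PA)                           % approximately 0.53278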

Sources

This section is adapted from: Heads or Tails, by Emmanuel Lesigne, Student Mathematical Library Volume 28, American Mathematical Society, Providence, 2005, Section 1.2 and Chapter 4 [3]. The example is heavily adapted from the weekly “Riddler” column of January 20, 2017 from the website fivethirtyeight.com.

_______________________________________________________________________________________________

Algorithms, Scripts, Simulations

Algorithm

The following Octave code is inefficient in the sense that it generates far more coin flips than it needs. However, code that captures exactly the number of flips needed on each trial would probably take more lines, so it is easy to be inefficient here.

Scripts

p = 0.5;                     % probability of heads on each flip
n = 500;                     % total flips simulated per trial (ample for 10 heads)
trials = 2000;

victory = 10;                % number of heads needed to win

headsTails = ( rand(n,trials) <= p );
headsTailsA = headsTails(1:2:n, :);    % A flips on the odd-numbered turns
headsTailsB = headsTails(2:2:n, :);    % B flips on the even-numbered turns
totalHeadsA = cumsum( headsTailsA );   % running head counts, one column per trial
totalHeadsB = cumsum( headsTailsB );

winsA = zeros(1,trials);

for j = 1:trials
    % A wins ties since A flips first in each round, hence the <= .
    winsA(1,j) = ( min(find(totalHeadsA(:,j) == victory)) <= min(find(totalHeadsB(:,j) == victory)) );
endfor
empirical = sum(winsA)/trials;

nRange = [9:40];                       % truncation of the infinite series
A = binopdf(9,nRange,1/2) * (1/2);     % A gets 9 heads in j flips, then a head
B = binocdf(9,nRange,1/2);             % B gets at most 9 heads in j flips
analytic = dot(A,B);

disp("The empirical probability is:")
disp(empirical)
disp("The approximation to the analytic probability is:")
disp(analytic)

_______________________________________________________________________________________________

Problems to Work for Understanding

  1. Solve the example problem for the cases of winning with 2, 3, 4, …, 9 heads.
  2. Write a simulation of the coin-flipping game of the example. Experimentally determine the probability of A winning in the cases of winning with 1, 2, 3, …, 10 heads.
  3. Draw a graph of the probability of A winning versus the number of heads required to win.

__________________________________________________________________________

Books

Reading Suggestion:

References

[1]   Leo Breiman. Probability. SIAM, 1992.

[2]   William Feller. An Introduction to Probability Theory and Its Applications, Volume I. John Wiley and Sons, third edition, 1973. QA 273 F3712.

[3]   Emmanuel Lesigne. Heads or Tails: An Introduction to Limit Theorems in Probability, volume 28 of Student Mathematical Library. American Mathematical Society, 2005.

__________________________________________________________________________

Links

Outside Readings and Links:

  1. Virtual Laboratories in Probability and Statistics > Binomial.
  2. Weisstein, Eric W. “Binomial Distribution.” From MathWorld–A Wolfram Web Resource. BinomialDistribution.

__________________________________________________________________________

I check all the information on each page for correctness and typographical errors. Nevertheless, some errors may occur and I would be grateful if you would alert me to such errors. I make every reasonable effort to present current and accurate information for public use, however I do not guarantee the accuracy or timeliness of information on this website. Your use of the information from this website is strictly voluntary and at your risk.

I have checked the links to external sites for usefulness. Links to external websites are provided as a convenience. I do not endorse, control, monitor, or guarantee the information contained in any external website. I don’t guarantee that the links are active at all times. Use the links here with the same caution as you would all information on the Internet. This website reflects the thoughts, interests and opinions of its author. They do not explicitly represent official positions or policies of my employer.

Information on this website is subject to change without notice.

Steve Dunbar’s Home Page, http://www.math.unl.edu/~sdunbar1

Email to Steve Dunbar, sdunbar1 at unl dot edu

Last modified: Processed from LaTeX source on January 30, 2017