Steven R. Dunbar
Department of Mathematics
203 Avery Hall
University of Nebraska-Lincoln
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Topics in
Probability Theory and Stochastic Processes
Steven R. Dunbar

__________________________________________________________________________

Classes of States and Stationary Distributions

_______________________________________________________________________

Note: These pages are prepared with MathJax. MathJax is an open source JavaScript display engine for mathematics that works in all browsers. See http://mathjax.org for details on supported browsers, accessibility, copy-and-paste, and other features.

_______________________________________________________________________________________________

Rating

Mathematically Mature: may contain mathematics beyond calculus with proofs.

_______________________________________________________________________________________________

Section Starter Question

Randomly distribute three balls between two urns, labeled A and B. Each period, select an urn at random, and if it is not empty, remove a ball from that urn and put it in the other urn. How could you determine what fraction of the time urn A is empty? What is that fraction of time? Does this depend on how the balls are initially distributed between the two urns?
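One way to approach the "how" part of the question, before any theory, is direct simulation. Here is a minimal Monte Carlo sketch; Python is our choice of language, since the section itself supplies no scripts.

import random

random.seed(1)
# Randomly distribute three balls: each ball goes to urn A with probability 1/2.
balls_in_A = sum(random.random() < 0.5 for _ in range(3))

empty_count = 0
N = 10**6
for _ in range(N):
    # Select an urn at random; if it is not empty, move one ball to the other urn.
    if random.random() < 0.5:        # urn A selected
        if balls_in_A > 0:
            balls_in_A -= 1
    else:                            # urn B selected
        if balls_in_A < 3:
            balls_in_A += 1
    if balls_in_A == 0:
        empty_count += 1

print(empty_count / N)               # estimated fraction of time urn A is empty

Rerunning with different initial distributions suggests an answer to the last question.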

_______________________________________________________________________________________________

Key Concepts

  1. The states of a Markov chain partition into equivalence classes according to their type: communicating, periodic versus aperiodic, recurrent versus transient, and ergodic.
  2. Every irreducible ergodic Markov chain has a unique non-negative solution of

    \[ \sum_{i} \pi_i P_{ij} = \pi_j, \qquad \sum_{i} \pi_i = 1, \]

    called a stationary distribution.

__________________________________________________________________________

Vocabulary

  1. The states of a Markov chain partition into equivalence classes according to their type: communicating, periodic versus aperiodic, recurrent versus transient, and ergodic.
  2. Any vector $(\pi_j)$ satisfying the equations

    \[ \sum_{i} \pi_i P_{ij} = \pi_j, \qquad \sum_{i} \pi_i = 1 \]

    is a stationary probability distribution of the Markov chain.

__________________________________________________________________________

Mathematical Ideas

Classification of States and Stationary Distributions

Definition. State j is accessible from state i if $P_{ij}^{n} > 0$ for some $n \ge 1$. That is, state j is accessible from state i if and only if, starting in i, it is possible that the process will ever enter state j.

Definition. Two states communicate if state j is accessible from state i and state i is accessible from j.

Remark. Note that it is possible that j is accessible from i but that i and j do not communicate. This would happen if j is an absorbing state accessible from i.

Proposition 1. Being able to communicate is an equivalence relation, so that the states of a Markov chain partition into disjoint classes.

Proof. Left as an exercise. □

Definition. The period of a state i, written as d(i), is the greatest common divisor of all integers $n \ge 1$ for which $P_{ii}^{n} > 0$. If $P_{ii}^{n} = 0$ for all $n \ge 1$, define d(i) = 0.

Proposition 2.

  1. If i communicates with j, then d(i) = d(j).
  2. If state i has period d(i), then there exists an integer N depending on i such that for all integers $n \ge N$, $P_{ii}^{n d(i)} > 0$. That is, return to state i can occur at all sufficiently large multiples of the period d(i).
  3. If $P_{ji}^{m} > 0$, then $P_{ji}^{m + n d(i)} > 0$ for all sufficiently large n.

Proof. Left as an exercise. □

Definition. A Markov chain in which each state has period 1 is aperiodic.

Definition. A Markov chain is irreducible if it has only one communicating class.

Definition. For any state i, let $f_i$ denote the probability that, starting in i, the process will ever reenter state i. State i is recurrent if $f_i = 1$ and transient if $f_i < 1$.

Proposition 3. If state i is recurrent, then the process will reenter i infinitely often.

Proof. Left as an exercise. □

Proposition 4.

  1. State i is recurrent if
    \[ \sum_{n=1}^{\infty} P_{ii}^{n} = \infty. \]

  2. State i is transient if
    \[ \sum_{n=1}^{\infty} P_{ii}^{n} < \infty. \]

Proof. Left as an exercise. □
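As a standard illustration of Proposition 4 (not worked in the original), consider the simple symmetric random walk on the integers, which steps up or down by 1 with probability 1/2 each. Return to state 0 is possible only at even times, and Stirling's formula gives

\[ P_{00}^{2n} = \binom{2n}{n} \frac{1}{2^{2n}} \sim \frac{1}{\sqrt{\pi n}}, \]

so $\sum_{n=1}^{\infty} P_{00}^{n} = \infty$ and state 0 is recurrent.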

Definition. If a state i is recurrent, then it is positive recurrent if, starting in i, the expected time until the process returns to state i is finite.

Proposition 5. In a finite-state Markov chain all recurrent states are positive recurrent.

Proof. Left as an exercise. □

Definition. Positive recurrent and aperiodic states are ergodic.

Theorem 6. For an irreducible ergodic Markov chain, $\lim_{n \to \infty} P_{ij}^{n}$ exists and is independent of i. Furthermore, letting

\[ \pi_j = \lim_{n \to \infty} P_{ij}^{n}, \]

then $\pi_j$ is the unique non-negative solution of

\[ \sum_{i} \pi_i P_{ij} = \pi_j, \qquad \sum_{i} \pi_i = 1. \]

Definition. Any vector $(\pi_j)$ satisfying these equations is a stationary probability distribution of the Markov chain. A Markov chain started according to a stationary distribution will have this distribution for all future times, since $\pi P = \pi$ implies $\pi P^n = \pi$ for every n.

Remark. A limiting distribution is always a stationary distribution, but the converse is not true. A Markov chain may have a stationary distribution but no limiting distribution. For example, the periodic Markov chain whose transition probability matrix is

\[ P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \]

has no limiting distribution, but $\pi = (1/2, 1/2)$ is a stationary distribution. Notice that this Markov chain is not ergodic.

Remark. Note that this theorem says that a probability transition matrix for an irreducible ergodic Markov chain has a left eigenvector with corresponding eigenvalue 1. This is a special case of the more general Perron-Frobenius Theorem.
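To make the eigenvector remark concrete, here is a minimal numerical sketch of finding a stationary distribution as a left eigenvector. It uses Python with NumPy, a tool choice of ours since the original supplies no code, applied to the periodic example above.

import numpy as np

# The periodic two-state chain from the remark above.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Left eigenvectors of P are right eigenvectors of P transpose.
eigenvalues, eigenvectors = np.linalg.eig(P.T)

# Select the eigenvector belonging to the eigenvalue closest to 1 ...
index = np.argmin(np.abs(eigenvalues - 1.0))
pi = np.real(eigenvectors[:, index])

# ... and normalize it into a probability distribution.
pi = pi / pi.sum()

print(pi)         # [0.5 0.5]
print(pi @ P)     # equals pi, confirming pi P = pi

The same few lines work for any finite transition probability matrix, which is useful for the example below.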

Amusing Example of a Stationary Distribution

The following example is adapted from the solution by Lessard, based on a problem originally posed in The Riddler. The interest is partly in the humorous context and partly in the fact that the context obscures that the Markov chain has more states than is at first obvious.

There is a bathroom in your office building that has only one toilet. There is a small sign stuck to the outside of the door that you can slide from “Vacant” to “Occupied” so that no one else will try the door handle (theoretically) when you are inside. Unfortunately, people often forget to slide the sign to “Occupied” when entering, and they often forget to slide it to “Vacant” when exiting.

Assume that 1/3 of bathroom users don’t notice the sign upon entering or exiting. Therefore, whatever the sign reads before their visit, it still reads the same thing during and after their visit. Another 1/3 of the users notice the sign upon entering and make sure that it says “Occupied” as they enter. However, half the time they forget to slide it to “Vacant” when they exit. The remaining 1/3 of the users are very conscientious: They make sure the sign reads “Occupied” when they enter, and then they slide it to “Vacant” when they exit. Finally, assume that the bathroom is occupied exactly half of the time, all day, every day.

Two questions, plus an extra credit question, about this workplace situation:

  1. If you go to the bathroom and see that the sign on the door reads “Occupied,” what is the probability that the bathroom is actually occupied?
  2. If the sign reads “Vacant,” what is the probability that the bathroom actually is vacant?
  3. Extra credit: What happens as the percentage of conscientious bathroom users changes?

The first step is to define the states and transition probabilities. We might think that because the bathroom can be either occupied or vacant, and the sign in front can read either “Vacant” or “Occupied”, there should be four states, one for each possible pair of occupancy state and sign. However, this is not the case. Consider the state “bathroom is occupied and the sign says it’s occupied”. We must distinguish between the cases where the person occupying the bathroom is conscientious (they will definitely slide the sign to “Vacant” when they leave) or not (they will leave the sign as “Occupied” after they leave).

We must therefore add more states to our Markov chain that correspond to the different ways in which the bathroom can be occupied. Recall that there are three types of users: oblivious, forgetful, and conscientious, each of proportion 1/3.

Imagine a sequence of short time steps, say one every minute. Using the assumption that the bathroom is occupied exactly half of the time, all day, every day, we can generalize slightly to assume that the occupancy transition from minute to minute is itself a Markov chain with transition probability matrix

\[ \begin{array}{c|cc} & O & V \\ \hline O & 1/2 & 1/2 \\ V & 1/2 & 1/2 \end{array} \]

Note that the stationary distribution is $(1/2, 1/2)$, which is consistent with the assumption that the bathroom is occupied exactly half of the time, all day, every day.

Now the possible transitions at each minute are:

  1. A Conscientious person is in the bathroom, and it is Occupied and the sign says Occupied. We name this state OCO. Either the person stays, and the state remains OCO with probability 1/2, or the conscientious person leaves, slides the sign, and the state is VV with probability 1/2.
  2. A Forgetful person is in the bathroom, and it is Occupied and the sign says Occupied. We name this state OFO. Either the person stays, and the state remains OFO with probability 1/2; or the forgetful person leaves without changing the sign, and the state is VO, with probability (1/2)(1/2) = 1/4; or the forgetful person leaves, changing the sign, and the state is VV, with probability (1/2)(1/2) = 1/4.
  3. An Oblivious person is in the bathroom, and it is Occupied and the sign says Occupied. We name this state OOO. Either the person stays, and the state remains OOO with probability 1/2, or the oblivious person leaves, and the state is VO with probability 1/2.
  4. An Oblivious person is in the bathroom, and it is Occupied but the sign says Vacant. We name this state OOV. Either the person stays, and the state remains OOV with probability 1/2, or the oblivious person leaves, and the state is VO with probability 1/2.
  5. The bathroom is Vacant, but the sign says Occupied. We name this state VO. With probability 1/2 some person approaches, tries the door despite the sign, determines the true state of vacancy, and enters, dealing with the sign according to their type. Then the transition is to OCO with probability 1/6, to OFO with probability 1/6, and to OOO with probability 1/6. With probability 1/2 the state remains VO.
  6. The bathroom is Vacant, and the sign says Vacant. We name this state VV. With probability 1/2 some person approaches, tries the door, easily determines the true state of vacancy, and enters, dealing with the sign according to their type. Then the transition is to OCO with probability 1/6, to OFO with probability 1/6, and to OOV with probability 1/6. With probability 1/2 the state remains VV.

A state transition diagram is in Figure 1.



Figure 1: The state transition diagram for the bathroom occupancy.

Then, taking the states in the order OCO, OFO, OOO, OOV, VO, VV, the transition probability matrix is

\[ P = \begin{pmatrix}
1/2 & 0 & 0 & 0 & 0 & 1/2 \\
0 & 1/2 & 0 & 0 & 1/4 & 1/4 \\
0 & 0 & 1/2 & 0 & 1/2 & 0 \\
0 & 0 & 0 & 1/2 & 1/2 & 0 \\
1/6 & 1/6 & 1/6 & 0 & 1/2 & 0 \\
1/6 & 1/6 & 0 & 1/6 & 0 & 1/2
\end{pmatrix} \]

The stationary distribution is then the solution of

\[ \pi P = \pi, \qquad \sum_{i} \pi_i = 1, \]

with solution

\[ \pi_{OCO} = \frac{1}{6}, \quad \pi_{OFO} = \frac{1}{6}, \quad \pi_{OOO} = \frac{1}{12}, \quad \pi_{OOV} = \frac{1}{12}, \quad \pi_{VO} = \frac{1}{4}, \quad \pi_{VV} = \frac{1}{4}. \]

Note that the bathroom is still Vacant half the time, although that time is split equally over the two sign states.

\[ P(\text{Vacant} \mid \text{sign says “Vacant”}) = \frac{\pi_{VV}}{\pi_{VV} + \pi_{OOV}} = \frac{1/4}{1/4 + 1/12} = \frac{3}{4} \]

\[ P(\text{Occupied} \mid \text{sign says “Occupied”}) = \frac{\pi_{OCO} + \pi_{OFO} + \pi_{OOO}}{\pi_{OCO} + \pi_{OFO} + \pi_{OOO} + \pi_{VO}} = \frac{5/12}{5/12 + 1/4} = \frac{5}{8}. \]
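As a check on this arithmetic, a short script, again a sketch using NumPy with the state ordering of the matrix above, solves $\pi P = \pi$, $\sum_i \pi_i = 1$ as a linear system and reproduces both conditional probabilities.

import numpy as np

# States ordered as in the matrix above: OCO, OFO, OOO, OOV, VO, VV.
P = np.array([
    [1/2,   0,   0,   0,   0, 1/2],
    [  0, 1/2,   0,   0, 1/4, 1/4],
    [  0,   0, 1/2,   0, 1/2,   0],
    [  0,   0,   0, 1/2, 1/2,   0],
    [1/6, 1/6, 1/6,   0, 1/2,   0],
    [1/6, 1/6,   0, 1/6,   0, 1/2],
])

# Solve pi P = pi with sum(pi) = 1: replace one redundant equation of
# (P^T - I) pi = 0 by the normalization row of all ones.
A = P.T - np.eye(6)
A[-1, :] = 1.0
b = np.zeros(6)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi)    # [1/6, 1/6, 1/12, 1/12, 1/4, 1/4]

OCO, OFO, OOO, OOV, VO, VV = pi
print(VV / (VV + OOV))                              # 0.75  = 3/4
print((OCO + OFO + OOO) / (OCO + OFO + OOO + VO))   # 0.625 = 5/8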

Sources

This section is adapted from: Ross, Introduction to Probability Models; Taylor and Karlin, An Introduction to Stochastic Modeling; Karlin and Taylor, A Second Course in Stochastic Processes; and the solution by Lessard to the problem posed in The Riddler.

_______________________________________________________________________________________________

Algorithms, Scripts, Simulations

Algorithm

Scripts
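The original page lists no scripts here; the following is a minimal simulation sketch in Python (a language choice of ours) that estimates the stationary distribution of the bathroom chain by counting visits along one long sample path.

import random

# States ordered as in the transition probability matrix above.
states = ["OCO", "OFO", "OOO", "OOV", "VO", "VV"]

# Nonzero transition probabilities, row by row, from the matrix P.
P = {
    "OCO": {"OCO": 1/2, "VV": 1/2},
    "OFO": {"OFO": 1/2, "VO": 1/4, "VV": 1/4},
    "OOO": {"OOO": 1/2, "VO": 1/2},
    "OOV": {"OOV": 1/2, "VO": 1/2},
    "VO":  {"OCO": 1/6, "OFO": 1/6, "OOO": 1/6, "VO": 1/2},
    "VV":  {"OCO": 1/6, "OFO": 1/6, "OOV": 1/6, "VV": 1/2},
}

def step(state):
    # Choose the next state according to the current row of P.
    targets = list(P[state].keys())
    weights = list(P[state].values())
    return random.choices(targets, weights)[0]

random.seed(1)
counts = {s: 0 for s in states}
state = "VV"          # an arbitrary initial state
N = 10**6
for _ in range(N):
    state = step(state)
    counts[state] += 1

# Long-run fractions of time in each state; compare with the
# stationary distribution (1/6, 1/6, 1/12, 1/12, 1/4, 1/4).
for s in states:
    print(s, counts[s] / N)

Because the chain is irreducible and ergodic, the empirical fractions converge to the stationary distribution regardless of the initial state, which is exactly the content of Theorem 6.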

__________________________________________________________________________

Problems to Work for Understanding

  1. Provide examples of the classifications of states:
    1. A trivial two-state Markov chain in which neither state is accessible from the other.
    2. A trivial two-state Markov chain in which both states communicate.
    3. A Markov chain in which all states have period 2.
    4. A three-state Markov chain with two absorbing states and one transient state. Which states communicate in this example?
    5. An ergodic Markov chain.
  2. Randomly distribute three balls between two urns, labeled A and B. Each period, select an urn at random, and if it is not empty, remove a ball from that urn and put it in the other urn. Make a Markov chain model of this situation and classify all states. What fraction of time is urn A empty? Does this depend on how the balls are initially distributed between the two urns?
  3. Assume the fractions of oblivious, forgetful, and conscientious users are p, q, and r respectively, with $0 \le p, q, r \le 1$ and $p + q + r = 1$. Solve the bathroom problem under this general assumption.
  4. Assume the fractions of oblivious, forgetful, and conscientious users are p, q, and r respectively, with $0 \le p, q, r \le 1$ and $p + q + r = 1$. Further assume the bathroom is occupied 2/3 of the time, and vacant 1/3 of the time. Solve the bathroom problem under this general assumption.
  5. Assume the fractions of oblivious, forgetful, and conscientious users are p, q, and r respectively, with $0 \le p, q, r \le 1$ and $p + q + r = 1$. Further assume the bathroom is occupied $1 - v$ of the time, and vacant $v$ of the time, where $0 < v < 1$. Solve the bathroom problem under this general assumption. What happens as $v \to 0$ or $v \to 1$?
  6. Assume the fractions of oblivious, forgetful, and conscientious users are p, q, and r respectively, with $0 \le p, q, r \le 1$ and $p + q + r = 1$. Also assume that the oblivious people spend twice as long in the bathroom as the conscientious or forgetful people. Assume the bathroom is vacant $v$ of the time, where $0 < v < 1$. Are all these assumptions consistent? Solve the bathroom problem under these general assumptions.

__________________________________________________________________________

Books

Reading Suggestion:

References

[1]   S. Karlin and H. Taylor. A Second Course in Stochastic Processes. Academic Press, 1981.

[2]   Sheldon M. Ross. Introduction to Probability Models. Elsevier, 6th edition, 1997.

[3]   H. M. Taylor and Samuel Karlin. An Introduction to Stochastic Modeling. Academic Press, third edition, 1998.

__________________________________________________________________________

Links

Outside Readings and Links:

__________________________________________________________________________

I check all the information on each page for correctness and typographical errors. Nevertheless, some errors may occur and I would be grateful if you would alert me to such errors. I make every reasonable effort to present current and accurate information for public use, however I do not guarantee the accuracy or timeliness of information on this website. Your use of the information from this website is strictly voluntary and at your risk.

I have checked the links to external sites for usefulness. Links to external websites are provided as a convenience. I do not endorse, control, monitor, or guarantee the information contained in any external website. I don’t guarantee that the links are active at all times. Use the links here with the same caution as you would all information on the Internet. This website reflects the thoughts, interests and opinions of its author. They do not explicitly represent official positions or policies of my employer.

Information on this website is subject to change without notice.

Steve Dunbar’s Home Page, http://www.math.unl.edu/~sdunbar1

Email to Steve Dunbar, sdunbar1 at unl dot edu

Last modified: Processed from LaTeX source on September 25, 2017