Steven R. Dunbar
Department of Mathematics
203 Avery Hall
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

__________________________________________________________________________

Stochastic Processes

_______________________________________________________________________

Note: These pages are prepared with MathJax. MathJax is an open source JavaScript display engine for mathematics that works in all browsers. See http://mathjax.org for details on supported browsers, accessibility, copy-and-paste, and other features.

_______________________________________________________________________________________________

Rating

Student: contains scenes of mild algebra or calculus that may require guidance.

_______________________________________________________________________________________________

Section Starter Question

Name something that is both random and varies over time. Does the randomness depend on the history of the process or only on its current state?

_______________________________________________________________________________________________

Key Concepts

1. A stochastic process is a sequence or interval of random outcomes, that is, a string of random outcomes that depends on time as well as on the underlying randomness. With the inclusion of a time variable, the rich range of random outcome distributions becomes a great variety of stochastic processes. Nevertheless, the most commonly studied types of random processes have connections.
2. Stochastic processes are functions of two variables, the time index and the sample point. As a consequence, stochastic processes are interpreted in several ways. The simplest is to look at the stochastic process at a ﬁxed value of time. The result is a random variable with a probability distribution, just as studied in elementary probability.
3. Another way to look at a stochastic process is to consider the stochastic process as a function of the sample point $\omega$. Each $\omega$ maps to an associated function of time $X\left(t\right)$. This means that one may look at a stochastic process as a mapping from the sample space $\Omega$ to a set of functions. In this interpretation, stochastic processes are a generalization from the random variables of elementary probability theory.

__________________________________________________________________________

Vocabulary

1. A sequence or interval of random outcomes, that is, random outcomes dependent on time, is a stochastic process.
2. Let $J$ be a subset of the non-negative real numbers. Let $\Omega$ be a set, usually called the sample space or probability space. An element $\omega$ of $\Omega$ is a sample point or sample path. Let $S$ be a set of values, often the real numbers, called the state space. A stochastic process is a function $X:J\times \Omega \to S$, that is, a function of both time and the sample point into the state space.
3. The particular stochastic process usually called a simple random walk, ${T}_{n}$, gives the position on the integer number line after taking a step to the right for each head and a step to the left for each tail.
4. A generalization of a Markov chain is a Markov process. In a Markov process, we allow the index set to be either a discrete set of times, such as the integers, or an interval, such as the non-negative reals. Likewise the state space may be either a set of discrete values or an interval, even the whole real line. In mathematical notation, a stochastic process $X\left(t\right)$ is called Markov if for every $n$ and ${t}_{1}<{t}_{2}<\dots <{t}_{n}$ and real number ${x}_{n}$, we have
$ℙ\left[X\left({t}_{n}\right)\le {x}_{n}\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}X\left({t}_{n-1}\right),\dots ,X\left({t}_{1}\right)\right]=ℙ\left[X\left({t}_{n}\right)\le {x}_{n}\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}X\left({t}_{n-1}\right)\right].$

Many of the models in this text will naturally be Markov processes because of the intuitive appeal of this “memory-less” property.

__________________________________________________________________________

Mathematical Ideas

Deﬁnition and Notations

A sequence or interval of random outcomes, that is, random outcomes dependent on time, is a stochastic process. Stochastic is a synonym for “random.” The word is of Greek origin and means “pertaining to chance” (Greek stokhastikos, skillful in aiming; from stokhastēs, diviner; from stokhazesthai, to guess at, to aim at; and from stokhos, target, aim, guess). The modiﬁer stochastic indicates that a subject is random in some aspect. Stochastic is often used in contrast to “deterministic,” which means that random phenomena are not involved.

More formally, let $J$ be a subset of the non-negative real numbers. Usually $J$ is the nonnegative integers $0,1,2,\dots$ or the nonnegative reals $\left\{t:t\ge 0\right\}$. $J$ is the index set of the process, and we usually refer to $t\in J$ as the time variable. Let $\Omega$ be a set, usually called the sample space or probability space. An element $\omega$ of $\Omega$ is a sample point or sample path. Let $S$ be a set of values, often the real numbers, called the state space. A stochastic process is a function $X:J\times \Omega \to S$, a function of both time and the sample point into the state space.

Because we are usually interested in the probability of sets of sample points that lead to a set of outcomes in the state space and not the individual sample points, the common practice is to suppress the dependence on the sample point. That is, we usually write $X\left(t\right)$ instead of the more complete $X\left(t,\omega \right)$. Furthermore, especially if the time set is discrete, say the nonnegative integers, then we usually write the index variable or time variable as a subscript. Thus ${X}_{n}$ would be the usual notation for a stochastic process indexed by the nonnegative integers and ${X}_{t}$ or $X\left(t\right)$ is a stochastic process indexed by the non-negative reals. Because of the randomness, we can think of a stochastic process as a random sequence if the index set is the nonnegative integers and a random function if the time variable is the nonnegative reals.

Examples

The most fundamental example of a stochastic process is a coin ﬂip sequence. The index set is the set of positive integers, counting the number of the ﬂip. The sample space is the set of all possible inﬁnite coin ﬂip sequences $\Omega =\left\{HHTHTTTHT\dots ,THTHTTHHT\dots ,\dots \right\}$. We take the state space to be the set $\left\{0,1\right\}$ so that ${X}_{n}=1$ if ﬂip $n$ comes up heads, and ${X}_{n}=0$ if the ﬂip comes up tails. Then the coin ﬂip stochastic process can be viewed as the set of all random sequences of 1’s and 0’s. An associated random process takes ${X}_{0}=0$ and ${S}_{n}={\sum }_{j=0}^{n}{X}_{j}$ for $n\ge 0$. Now the state space is the set of nonnegative integers. The stochastic process ${S}_{n}$ counts the number of heads encountered in the ﬂipping sequence up to ﬂip number $n$.
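A short simulation may make the two coin-flip processes concrete. The sketch below is an illustration, not part of the original development; the number of flips is an arbitrary choice. It generates one sample path of ${X}_{n}$ and the head-counting process ${S}_{n}$:

```python
import random

# Illustrative sketch: one sample path of the coin flip process X_n
# and the head-counting process S_n.
random.seed(7)  # fix the "sample point" for reproducibility

n_flips = 20
X = [0]                                                # X_0 = 0 by convention
X += [random.randint(0, 1) for _ in range(n_flips)]    # 1 = heads, 0 = tails

# S_n = X_0 + X_1 + ... + X_n counts the heads up to flip n
S = []
total = 0
for x in X:
    total += x
    S.append(total)

print("flips:", X[1:])
print("head counts:", S)
```

Note that ${S}_{n}$ is nondecreasing and rises by at most one per flip, exactly as a counting process should.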

Alternatively, we can take the same index set, the same probability space of coin ﬂip sequences and deﬁne ${Y}_{n}=1$ if ﬂip $n$ comes up heads, and ${Y}_{n}=-1$ if the ﬂip comes up tails. This is just another way to encode the coin ﬂips now as random sequences of $1$’s and $-1$’s. A more interesting associated random process is to take ${Y}_{0}=0$ and ${T}_{n}={\sum }_{j=0}^{n}{Y}_{j}$ for $n\ge 0$. Now the state space is the set of integers. The stochastic process ${T}_{n}$ gives the position in the integer number line after taking a step to the right for a head, and a step to the left for a tail. This particular stochastic process is usually called a simple random walk. We can generalize random walk by allowing the state space to be the set of points with integer coordinates in two-, three- or higher-dimensional space, called the integer lattice, and using some random device to select the direction at each step.
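The simple random walk ${T}_{n}$ can be sketched the same way. This is an illustrative simulation with an arbitrarily chosen number of steps:

```python
import random

# Illustrative sketch: one sample path of the simple random walk T_n,
# stepping +1 for a head and -1 for a tail.
random.seed(11)

n_steps = 20
Y = [0] + [random.choice([1, -1]) for _ in range(n_steps)]  # Y_0 = 0

# T_n = Y_0 + Y_1 + ... + Y_n is the position on the integer number line
T = []
position = 0
for y in Y:
    position += y
    T.append(position)

print("positions:", T)
```

Each step moves the walker exactly one unit left or right, so consecutive positions always differ by one.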

Markov Chains

A Markov chain is a sequence of random variables ${X}_{j}$ where the index $j$ runs through $0,1,2,\dots$. The sample space is not speciﬁed explicitly, but it involves a sequence of random selections described by their eﬀect on the state space. The state space may be either a ﬁnite or inﬁnite set of discrete states. The deﬁning property of a Markov chain is that

$ℙ\left[{X}_{j}=l\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}{X}_{0}={k}_{0},{X}_{1}={k}_{1},\dots ,{X}_{j-1}={k}_{j-1}\right]=ℙ\left[{X}_{j}=l\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}{X}_{j-1}={k}_{j-1}\right].$

In more detail, the probability of transition from state ${k}_{j-1}$ at time $j-1$ to state $l$ at time $j$ depends only on ${k}_{j-1}$ and $l$, not on the history ${X}_{0}={k}_{0},{X}_{1}={k}_{1},\dots ,{X}_{j-2}={k}_{j-2}$ of how the process got to ${k}_{j-1}$.

A simple random walk is an example of a Markov chain. The states are the integers and, for a fair coin, the transition probabilities are

$ℙ\left[{X}_{j}=k+1\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}{X}_{j-1}=k\right]=1/2=ℙ\left[{X}_{j}=k-1\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}{X}_{j-1}=k\right].$

Another example would be the position of a game piece in the board game Monopoly. The index set is the nonnegative integers listing the plays of the game, with ${X}_{0}$ denoting the starting position at the “Go” corner. The sample space is the set of inﬁnite sequences of rolls of a pair of dice. The state space is the set of 40 real-estate properties and other positions around the board.
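A finite-state Markov chain can be simulated directly from its matrix of transition probabilities. The three-state chain below is hypothetical, with made-up entries chosen only to illustrate the mechanics; the essential point is that the next state is drawn using the current state alone:

```python
import random

# Illustrative sketch with a made-up example: a three-state Markov chain
# simulated from its (hypothetical) transition matrix.
P = {
    "sunny":  {"sunny": 0.6, "cloudy": 0.3, "rainy": 0.1},
    "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
    "rainy":  {"sunny": 0.2, "cloudy": 0.4, "rainy": 0.4},
}

def step(state, rng):
    """Choose the next state using only the current state (the Markov property)."""
    u = rng.random()
    cumulative = 0.0
    for next_state, p in P[state].items():
        cumulative += p
        if u < cumulative:
            return next_state
    return next_state  # guard against floating-point round-off

rng = random.Random(3)
chain = ["sunny"]
for _ in range(10):
    chain.append(step(chain[-1], rng))

print(chain)
```

The function `step` never inspects the earlier history of `chain`, which is precisely the “memory-less” property in the defining equation above.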

Markov chains are an important and useful class of stochastic processes. Markov chains extended to making optimal decisions under uncertainty are “Markov decision processes”. Another extension, used in signal processing and bioinformatics, is the “hidden Markov model”. Mathematicians have extensively studied and classiﬁed Markov chains and their extensions, but we will not examine them carefully in this text.

A generalization of a Markov chain is a Markov process. In a Markov process, we allow the index set to be either a discrete set of times, such as the integers, or an interval, such as the nonnegative reals. Likewise the state space may be either a set of discrete values or an interval, even the whole real line. In mathematical notation, a stochastic process $X\left(t\right)$ is called Markov if for every $n$ and ${t}_{1}<{t}_{2}<\dots <{t}_{n}$ and real number ${x}_{n}$, we have

$ℙ\left[X\left({t}_{n}\right)\le {x}_{n}\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}X\left({t}_{n-1}\right),\dots ,X\left({t}_{1}\right)\right]=ℙ\left[X\left({t}_{n}\right)\le {x}_{n}\phantom{\rule{0.3em}{0ex}}|\phantom{\rule{0.3em}{0ex}}X\left({t}_{n-1}\right)\right].$

Many of the models in this text will naturally be Markov processes because of the intuitive modeling appeal of this “memory-less” property.

Many stochastic processes are naturally expressed as taking place in a discrete state space with a continuous time index. For example, consider radioactive decay, counting the number of atomic decays that have occurred up to time $t$ by using a Geiger counter. The discrete state variable is the number of clicks heard. The mathematical “Poisson process” is an excellent model of this physical process. More generally, instead of radioactive events giving a single daughter particle, imagine a birth event with a random number (distributed according to some probability law) of oﬀspring born at random times. Then the stochastic process measures the population in time. These are “birth processes” and make excellent models in population biology and the physics of cosmic rays. Continue to generalize and imagine that each individual in the population has a random life-span distributed according to some law, then dies. This gives a “birth-and-death process”. In another variation, imagine a disease with a random number of susceptible individuals getting infected, in turn infecting a random number of other individuals in the population, then recovering and becoming immune. The stochastic process counts the number of susceptible, infected and recovered individuals at any time, an “SIR epidemic process”.
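The Poisson process mentioned above can be simulated by using the fact that its waiting times between events are independent exponential random variables; the rate parameter below is a hypothetical choice for illustration:

```python
import random

# Illustrative sketch: count events of a rate-lam Poisson process on [0, t_end]
# by summing independent exponential interarrival times.
lam = 2.0   # hypothetical rate: on average 2 decays per unit time

def poisson_count(t_end, lam, rng):
    """Number of events in [0, t_end] for a rate-lam Poisson process."""
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(lam)   # exponential waiting time to the next event
        if t > t_end:
            return count
        count += 1

rng = random.Random(5)
counts = [poisson_count(10.0, lam, rng) for _ in range(1000)]
mean_count = sum(counts) / len(counts)
print("average count over [0,10]:", mean_count)  # should be near lam * 10 = 20
```

The sample average of the counts is close to $\lambda t$, the mean of the Poisson distribution that governs the count at time $t$.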

In another variation, consider customers arriving at a service counter at random intervals with some speciﬁed distribution, often taken to be an exponential probability distribution with parameter $\lambda$. The customers get served one-by-one, each taking a random service time, again often taken to be exponentially distributed. The state space is the number of customers waiting for service, the queue length at any time. These are called “queuing processes”. Mathematically, these processes can be studied with “compound Poisson processes”.
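A minimal sketch of such a queueing process, assuming exponential interarrival and service times with hypothetical rates, is a discrete-event simulation of a single-server queue:

```python
import random

# Illustrative sketch: a single-server queue with exponential interarrival
# and service times. The rates are hypothetical; lam < mu keeps it stable.
lam, mu = 1.0, 1.5
rng = random.Random(42)

t, queue_length = 0.0, 0
next_arrival = rng.expovariate(lam)
next_departure = float("inf")        # no customer in service yet
lengths = []

for _ in range(10000):               # process 10000 arrival/departure events
    if next_arrival < next_departure:
        t = next_arrival
        queue_length += 1
        if queue_length == 1:        # server was idle; begin service now
            next_departure = t + rng.expovariate(mu)
        next_arrival = t + rng.expovariate(lam)
    else:
        t = next_departure
        queue_length -= 1
        next_departure = t + rng.expovariate(mu) if queue_length > 0 else float("inf")
    lengths.append(queue_length)

print("final queue length:", queue_length)
print("max queue length seen:", max(lengths))
```

The state recorded in `lengths` is the queue length over time, a discrete-state, continuous-time stochastic process.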

Continuous space processes usually take the state space to be the real numbers or some interval of the reals. One example is the magnitude of noise on top of a signal, say a radio message. In practice the magnitude of the noise can be taken to be a random variable taking values in the real numbers and changing in time. Then subtracting oﬀ the known signal leaves a continuous-time, continuous-state-space stochastic process. To mitigate the noise’s eﬀect, engineers model the characteristics of the process. To model the noise means to specify the probability distribution of its random magnitude. A simple model takes the distribution of values to be normal, leading to the class of “Gaussian processes”, including “white noise”.
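As an illustration of this modeling step, the sketch below superimposes Gaussian noise on a known signal and then subtracts the signal off again; the particular signal and noise level are hypothetical choices:

```python
import math
import random

# Illustrative sketch: a known deterministic signal observed with additive
# Gaussian noise, then the signal subtracted off to leave the noise process.
rng = random.Random(4)
n = 200
dt = 0.01

times = [k * dt for k in range(n)]
signal = [math.sin(2 * math.pi * t) for t in times]       # known signal
observed = [s + rng.gauss(0.0, 0.1) for s in signal]      # signal plus noise
noise = [o - s for o, s in zip(observed, signal)]         # recovered noise

mean_noise = sum(noise) / n
print("sample mean of the noise:", mean_noise)            # near 0
```

At each fixed time the recovered noise is a normally distributed random variable, so the noise process is Gaussian.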

Another continuous space and continuous time stochastic process is a model of the motion of particles suspended in a liquid or a gas. The random thermal perturbations in a liquid are responsible for a random walk phenomenon known as “Brownian motion” and also as the “Wiener process”, and the collisions of molecules in a gas are a “random walk” responsible for diﬀusion. In this process, we measure the position of the particle over time, so this is a stochastic process from the nonnegative real numbers to one-, two- or three-dimensional real space. Random walks have fascinating mathematical properties. Scientists make the model more realistic by including the eﬀects of inertia, leading to a more reﬁned form of Brownian motion called the “Ornstein-Uhlenbeck process”.
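A Brownian motion path can be approximated by summing small independent normal increments, since the increment over a time step of length $dt$ is normal with mean $0$ and variance $dt$. A minimal sketch:

```python
import math
import random

# Illustrative sketch: approximate a standard Brownian motion path on [0, 1]
# by summing independent normal increments with variance dt.
rng = random.Random(1)
n = 1000
dt = 1.0 / n

W = [0.0]                                  # W(0) = 0
for _ in range(n):
    W.append(W[-1] + rng.gauss(0.0, math.sqrt(dt)))

print("W(1) =", W[-1])                     # approximately N(0, 1) distributed
```

Plotting `W` against time would show the jagged, nowhere-smooth appearance characteristic of Brownian paths.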

Extending this idea to economics, we will model market prices of ﬁnancial assets such as stocks as a continuous-time, continuous-space process. Random market forces create small but constantly occurring price changes. This results in a stochastic process from a continuous index set representing time to the reals or nonnegative reals representing prices. Reﬁning the model so that prices are nonnegative leads to the stochastic process known as “Geometric Brownian Motion”.
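Geometric Brownian Motion can be simulated from its explicit form $S(t)=S(0)\exp((\mu -{\sigma}^{2}/2)t+\sigma W(t))$ by exponentiating Brownian increments; the initial price, drift and volatility below are hypothetical choices for illustration:

```python
import math
import random

# Illustrative sketch: simulate Geometric Brownian Motion
#   S(t) = S(0) * exp((mu - sigma^2/2) t + sigma W(t))
# over one year with hypothetical parameters.
rng = random.Random(2)
S0, mu, sigma = 100.0, 0.05, 0.2   # hypothetical price, drift, volatility
n = 250                            # roughly one trading day per step
dt = 1.0 / n

S = [S0]
for _ in range(n):
    dW = rng.gauss(0.0, math.sqrt(dt))
    S.append(S[-1] * math.exp((mu - 0.5 * sigma**2) * dt + sigma * dW))

print("price after one year:", S[-1])
```

Because each step multiplies the previous price by an exponential, the simulated prices remain strictly positive, which is the point of the refinement.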

Family of Stochastic Processes

A stochastic process is a sequence or interval of random outcomes, that is to say, a string of random outcomes that depends on time as well as on the underlying randomness. With the inclusion of a time variable, the rich range of random outcome distributions becomes a great variety of stochastic processes. Nevertheless, the most commonly studied types of random processes have connections. Figure 1 shows a tree of this family, along with an indication of the stochastic process types studied in this text.

Ways to Interpret Stochastic Processes

Stochastic processes are functions of two variables, the time index and the sample point. As a consequence, stochastic processes are interpreted in several ways. The simplest is to look at the stochastic process at a ﬁxed value of time. The result is a random variable with a probability distribution, just as studied in elementary probability.

Another way to look at a stochastic process is to consider the stochastic process as a function of the sample point $\omega$. Each $\omega$ maps to an associated function $X\left(t\right)$. This means that one can look at a stochastic process as a mapping from the sample space $\Omega$ to a set of functions. In this interpretation, stochastic processes are a generalization from the random variables of elementary probability theory. In elementary probability theory, random variables are a mapping from a sample space to the real numbers; for stochastic processes the mapping is from a sample space to a space of functions. Now we ask questions like:

• “What is the probability of the set of functions that exceed a ﬁxed value on a ﬁxed time interval?”;
• “What is the probability of the set of functions having a certain limit at inﬁnity?”; and
• “What is the probability of the set of functions that are diﬀerentiable everywhere?”

This is a fruitful way to consider stochastic processes, but it requires sophisticated mathematical tools and careful analysis.

Another way to look at stochastic processes is to ask what happens at special times. For example, consider the time it takes until the function takes on one of two certain values, say $a$ and $b$. Then ask “What is the probability that the stochastic process assumes the value $a$ before it assumes the value $b$?” Note that the time at which each function assumes the value $a$ is diﬀerent; it is a random time. This provides an interaction between the time variable and the sample point through the values of the function. This too is a fruitful way to think about stochastic processes.
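The question of reaching $a$ before $b$ can be explored by simulation. For the symmetric simple random walk started at $0$, the classical gambler’s-ruin formula gives the probability of hitting $a>0$ before $b<0$ as $(0-b)/(a-b)$; the Monte Carlo sketch below, with the hypothetical choices $a=3$ and $b=-2$, agrees with the exact value $2/5$:

```python
import random

# Illustrative sketch: estimate the probability that a symmetric simple
# random walk started at 0 reaches a = 3 before it reaches b = -2.
# Gambler's ruin gives (0 - b) / (a - b) = 2/5 exactly.
a, b = 3, -2
rng = random.Random(9)

def hits_a_first(rng):
    """Run one walk until it first touches a or b; the hitting time is random."""
    position = 0
    while True:
        position += rng.choice([1, -1])
        if position == a:
            return True
        if position == b:
            return False

trials = 20000
estimate = sum(hits_a_first(rng) for _ in range(trials)) / trials
print("estimated probability:", estimate)   # close to 0.4
```

Each trial stops at a different, random, step count, illustrating how the special time depends on the sample path.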

In this text, we will consider each of these approaches with the corresponding questions.

Sources

The material in this section is adapted from many texts on probability theory and stochastic processes, especially the classic texts by S. Karlin and H. Taylor, S. Ross, and W. Feller.

_______________________________________________________________________________________________

Problems to Work for Understanding

__________________________________________________________________________

References

[1]   William Feller. An Introduction to Probability Theory and Its Applications, Volume I. John Wiley and Sons, third edition, 1973.

[2]   Sheldon Ross. A First Course in Probability. Macmillan, 1976.

[3]   Sheldon M. Ross. Introduction to Probability Models. Academic Press, 9th edition, 2006.

[4]   H. M. Taylor and Samuel Karlin. An Introduction to Stochastic Modeling. Academic Press, third edition, 1998.

__________________________________________________________________________

1. Origlio, Vincenzo. “Stochastic.” From MathWorld–A Wolfram Web Resource, created by Eric W. Weisstein.
2. Weisstein, Eric W. “Stochastic Process.” From MathWorld–A Wolfram Web Resource.
3. Weisstein, Eric W. “Markov Chain.” From MathWorld–A Wolfram Web Resource.
4. Weisstein, Eric W. “Markov Process.” From MathWorld–A Wolfram Web Resource.

__________________________________________________________________________
