Encyclopaedia of DesignTheory: Topic Essay

Markov chains and random walks


1. Choosing at random

Suppose I have a fair coin. How can I choose a random Latin square of order 99? A fair coin is a device which can in one step (or toss) produce one bit of information (a 0 or a 1, or informally, "heads" or "tails"), in such a way that the results of different tosses are independent -- this means that each of the 2^n sequences of results produced by n tosses occurs with the same probability, namely 1/2^n.

Given a fair coin, there is a simple algorithm for choosing a random integer x in the range [0, 2^n - 1]: we just toss the coin n times and interpret the sequence of bits as the expansion of x in base 2. What about choosing an integer in the range [0, N-1], where N is arbitrary? We cannot do this with a bounded number of coin tosses if N is not a power of 2, since the probability of any event defined by n coin tosses is a rational number with denominator 2^n. So we have to make a small compromise, as follows. Choose n to be the least integer such that 2^n >= N. Choose an integer in the range [0, 2^n - 1] as before. If it is smaller than N, we accept this result; otherwise we try again, and continue until a result is obtained. It is not hard to show that, if p = N/2^n, then

- the result, when obtained, is uniformly distributed on [0, N-1];
- the expected number of attempts is 1/p;
- the probability that more than k attempts are needed is (1-p)^k.

(The last two statements follow because the number of attempts is a geometric random variable with success probability p.) Since p > ½, the expected number of attempts is less than 2 and the probability of needing a long series of tries is exponentially small.
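The accept-or-retry procedure can be sketched in Python; here the fair coin is simulated by random.getrandbits(1), and the function names are illustrative:

```python
import random

def coin():
    """One toss of a fair coin: returns 0 or 1."""
    return random.getrandbits(1)

def random_below(N):
    """Uniform random integer in [0, N-1] using only fair coin tosses.

    n is the least integer with 2**n >= N; each attempt tosses the coin
    n times and is accepted with probability p = N/2**n > 1/2.
    """
    n = max(1, (N - 1).bit_length())   # least n with 2**n >= N
    while True:
        x = 0
        for _ in range(n):
            x = (x << 1) | coin()      # read the n bits as a base-2 expansion
        if x < N:                      # accept; otherwise try again
            return x
```

For N = 99 this uses n = 7 tosses per attempt, with acceptance probability p = 99/128.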

Now we can choose a random structure of a certain type in some situations. If we can count the structures, then we may suppose there are N altogether; choose a random number x from [0,N-1], skip over the first x structures in the count, and take the next one. If each structure is determined by a sequence of choices, and making these choices uniformly gives the uniform distribution, then we can make the choices at random as long as we know how many choices there are at each stage.
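The skip-and-take idea can be sketched as follows, assuming the structures can be generated one by one in a fixed order; the helper name and the 2-subset example are illustrative:

```python
import random
from itertools import islice, combinations

def random_structure(structures, N):
    """Choose uniformly among N structures produced in a fixed order:
    pick x in [0, N-1], skip the first x structures, take the next one."""
    x = random.randrange(N)
    return next(islice(structures, x, None))

# Example: a random 2-subset of {1,...,5}; there are C(5,2) = 10 of them.
subset = random_structure(combinations(range(1, 6), 2), 10)
```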

For example, how can we choose a random permutation µ of the set {1,...,n}? The image 1µ of 1 can be any of 1,...,n; choose this image at random. Then 2µ can be any of 1,...,n except 1µ; so choose a random number x from 1,...,n-1, add one to x if x >= 1µ, and set 2µ = x. Continuing in this way (at the kth step there are n-k+1 possible images) gives the required random permutation.
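This succession of uniform choices can be sketched in Python; for simplicity the sketch keeps a list of the unused values rather than applying the add-one adjustment, which gives the same distribution:

```python
import random

def random_permutation(n):
    """Random permutation of {1,...,n} by successive uniform choices:
    the image of each point is chosen uniformly among the unused values."""
    unused = list(range(1, n + 1))
    images = []
    for remaining in range(n, 0, -1):
        x = random.randrange(remaining)   # this many choices at this stage
        images.append(unused.pop(x))
    return images                         # images[k-1] is the image of k
```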

Choosing a random graph on n vertices is even easier: simply decide with a single coin toss whether each pair of vertices is joined by an edge or not. (Indeed, the number of graphs is 2^(n(n-1)/2), and counting them is equivalent to choosing one at random.)
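A sketch of this one-toss-per-pair procedure, returning the graph as a list of edges (representation chosen for illustration):

```python
import random
from itertools import combinations

def random_graph(n):
    """Random graph on vertices 1,...,n: one fair coin toss per pair of
    vertices decides whether that pair is joined by an edge."""
    return [pair for pair in combinations(range(1, n + 1), 2)
            if random.getrandbits(1)]
```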

In other cases, e.g. Latin squares, we don't even know how many structures there are, so choosing one at random cannot be done by these methods; we need a new idea.

2. Markov chains and random walks

We consider only Markov chains with a finite number of states. Let S = {s_1,...,s_m} be a finite set of states. Suppose we are given a matrix P = (p_ij) of order m, whose entries are non-negative real numbers satisfying
sum_{j=1}^{m} p_ij = 1 for each i.
The interpretation is that we have a marker on one of the states; at a certain moment it moves to a new state, where the probability of moving from state s_i to state s_j is p_ij. The displayed equation just reflects the fact that the marker is bound to move to some state!
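A single transition of such a chain can be simulated by drawing one uniform random number and scanning the cumulative probabilities of the marker's current row. A sketch (the two-state matrix P below is an arbitrary illustrative example, with states indexed from 0):

```python
import random

def step(P, i):
    """One move of the marker: from state i, choose the next state j
    with probability P[i][j] (each row of P sums to 1)."""
    u = random.random()
    total = 0.0
    for j, p in enumerate(P[i]):
        total += p
        if u < total:
            return j
    return len(P[i]) - 1   # guard against floating-point rounding

# Example: a two-state chain, iterated for ten moves.
P = [[0.9, 0.1],
     [0.5, 0.5]]
state = 0
for _ in range(10):
    state = step(P, state)
```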

We can now iterate this procedure. The marker starts out on some state, possibly chosen at random from an arbitrary probability distribution. At each positive integer time it makes a transition according to the specification above. We are interested in how it behaves in the long term. There are two extremes of behaviour: