Introduction to Probability Modeling – Chapter 04

Markov Chains

Example 4.01 (Forecasting the Weather) Suppose that the chance of rain tomorrow depends on previous weather conditions only through whether or not it is raining today and not on past weather conditions. Suppose also that if it rains today, then it will rain tomorrow with probability α; and if it does not rain today, then it will rain tomorrow with probability β. If we say that the process is in state 0 when it rains and state 1 when it does not rain, then the preceding is a two-state Markov chain whose transition probabilities are given by

P = | α       1 − α |
    | β       1 − β |
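A short Python sketch of this chain; NumPy, the helper name rain_chain, and the illustrative values α = 0.7, β = 0.4 are assumptions for illustration, not part of the example.

    import numpy as np

    def rain_chain(alpha, beta):
        # State 0 = rain, state 1 = no rain; rows index today's state.
        return np.array([[alpha, 1 - alpha],
                         [beta,  1 - beta]])

    P = rain_chain(0.7, 0.4)   # illustrative values only
    print(P.sum(axis=1))       # each row sums to 1, as any transition matrix must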

Example 4.02 (A Communications System) Consider a communications system that transmits the digits 0 and 1. Each digit transmitted must pass through several stages, at each of which there is a probability p that the digit entered will be unchanged when it leaves. Letting Xn denote the digit entering the nth stage, then {Xn, n = 0, 1, . . .} is a two-state Markov chain having a transition probability matrix

P = | p       1 − p |
    | 1 − p   p     |
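As a sketch (assuming NumPy and an arbitrary value p = 0.9), the probability that a 0 entering the system is still a 0 after n stages is the (0, 0) entry of the nth power of this matrix:

    import numpy as np

    def relay_matrix(p):
        # Two-state chain: the digit is left unchanged with probability p.
        return np.array([[p, 1 - p],
                         [1 - p, p]])

    P = relay_matrix(0.9)                        # assumed value of p
    print(np.linalg.matrix_power(P, 5)[0, 0])    # chance a 0 survives 5 stages as a 0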

Example 4.03 On any given day Gary is either cheerful (C), so-so (S), or glum (G). If he is cheerful today, then he will be C, S, or G tomorrow with respective probabilities 0.5, 0.4, 0.1. If he is feeling so-so today, then he will be C, S, or G tomorrow with probabilities 0.3, 0.4, 0.3. If he is glum today, then he will be C, S, or G tomorrow with probabilities 0.2, 0.3, 0.5. Letting Xn denote Gary's mood on the nth day, then {Xn, n ≥ 0} is a three-state Markov chain (state 0 = C, state 1 = S, state 2 = G) with transition probability matrix

P = | 0.5   0.4   0.1 |
    | 0.3   0.4   0.3 |
    | 0.2   0.3   0.5 |
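A brief NumPy sketch of this chain (the three-day horizon is an arbitrary choice for illustration):

    import numpy as np

    # State 0 = cheerful, 1 = so-so, 2 = glum; rows index today's mood.
    P = np.array([[0.5, 0.4, 0.1],
                  [0.3, 0.4, 0.3],
                  [0.2, 0.3, 0.5]])

    # Distribution of Gary's mood three days from now, given he is cheerful today.
    start = np.array([1.0, 0.0, 0.0])
    print(start @ np.linalg.matrix_power(P, 3))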

Example 4.04 (Transforming a Process into a Markov Chain) Suppose that whether or not it rains today depends on previous weather conditions through the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; if it has not rained in the past two days, then it will rain tomorrow with probability 0.2. If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not a Markov chain (why not?). However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. In other words, we can say that the process is in:
• state 0 if it rained both today and yesterday
• state 1 if it rained today but not yesterday
• state 2 if it rained yesterday but not today
• state 3 if it did not rain either yesterday or today
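Under this four-state description, each row of the transition matrix uses the rain probability stated above for the corresponding two-day history; here is a NumPy sketch (the array follows directly from the example, the code around it is illustrative):

    import numpy as np

    # States: 0 = rain today and yesterday, 1 = rain today only,
    #         2 = rain yesterday only,      3 = rain on neither day.
    P = np.array([[0.7, 0.0, 0.3, 0.0],
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.4, 0.0, 0.6],
                  [0.0, 0.2, 0.0, 0.8]])

    print(P.sum(axis=1))   # sanity check: each row sums to 1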

Example 4.05 (A Random Walk Model) A Markov chain whose state space is given by the integers i = 0,±1,±2, . . . is said to be a random walk if, for some number 0 < p < 1, Pi,i+1 = p = 1 − Pi,i−1, i = 0,±1, . . .
The preceding Markov chain is called a random walk for we may think of it as being a model for an individual walking on a straight line who at each point of time either takes one step to the right with probability p or one step to the left with probability 1 − p.
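A minimal simulation sketch in plain Python (the function name random_walk and the chosen parameters are illustrative assumptions):

    import random

    def random_walk(p, n_steps, start=0):
        # One step right (+1) with probability p, one step left (-1) otherwise.
        position = start
        path = [position]
        for _ in range(n_steps):
            position += 1 if random.random() < p else -1
            path.append(position)
        return path

    print(random_walk(0.5, 10))   # one sample path of a symmetric random walk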

Example 4.06 (A Gambling Model) Consider a gambler who, at each play of the game, either wins $1 with probability p or loses $1 with probability 1 − p. If we suppose that our gambler quits playing either when he goes broke or when he attains a fortune of $N, then the gambler's fortune is a Markov chain having transition probabilities

Pi,i+1 = p = 1 − Pi,i−1,  i = 1, 2, . . . , N − 1,
P00 = PNN = 1.

States 0 and N are called absorbing states since once entered they are never left. Note that the preceding is a finite state random walk with absorbing barriers (states 0 and N).
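A Monte Carlo sketch of this model (the function name gamblers_ruin and the trial count are assumptions; for the fair game p = 1/2 the probability of reaching N from fortune i is known to be i/N, which the estimate should approximate):

    import random

    def gamblers_ruin(p, i, N, trials=100_000):
        # Estimate the probability of reaching fortune N before going broke,
        # starting from fortune i, when each play is won with probability p.
        wins = 0
        for _ in range(trials):
            fortune = i
            while 0 < fortune < N:
                fortune += 1 if random.random() < p else -1
            wins += fortune == N
        return wins / trials

    print(gamblers_ruin(0.5, i=3, N=10))   # should be close to 3/10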

Exercise 4.08 Consider Example 4.01, in which the weather is considered as a two-state Markov chain. If α = 0.7 and β = 0.4, calculate the probability that it will rain four days from today given that it is raining today. The transition matrix is

P = | 0.7   0.3 |
    | 0.4   0.6 |
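One way to carry out the computation, sketched with NumPy: the answer is the (0, 0) entry of the fourth power of the matrix above.

    import numpy as np

    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])   # state 0 = rain, state 1 = no rain

    P4 = np.linalg.matrix_power(P, 4)
    print(P4[0, 0])   # probability of rain four days from today: 0.5749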

Example 4.09 Consider Example 4.04. Given that it rained on Monday and Tuesday, what is the probability that it will rain on Thursday? The transition matrix (from Example 4.04, with states 0 through 3 as defined there) is

P = | 0.7   0     0.3   0   |
    | 0.5   0     0.5   0   |
    | 0     0.4   0     0.6 |
    | 0     0.2   0     0.8 |
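A NumPy sketch of the computation: rain on Monday and Tuesday puts the chain in state 0 on Tuesday, and rain on Thursday corresponds to being in state 0 or state 1 two steps later.

    import numpy as np

    P = np.array([[0.7, 0.0, 0.3, 0.0],
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.4, 0.0, 0.6],
                  [0.0, 0.2, 0.0, 0.8]])

    P2 = np.linalg.matrix_power(P, 2)
    print(P2[0, 0] + P2[0, 1])   # 0.49 + 0.12 = 0.61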

Example 4.10 An urn always contains 2 balls, each of which is either red or blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.8 is the same color, and with probability 0.2 is the opposite color, as the ball it replaces. If both balls are initially red, find the probability that the fifth ball selected is red.
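One way to set this up, sketched with NumPy: take the state to be the number of red balls in the urn, so the fifth draw is made from the composition reached after four replacements (this state choice is a natural one, but the code itself is illustrative).

    import numpy as np

    # State = number of red balls in the urn (0, 1, or 2).
    # From state 1 the chosen ball is red or blue with probability 1/2 each,
    # and the replacement keeps that color with probability 0.8.
    P = np.array([[0.8, 0.2, 0.0],
                  [0.1, 0.8, 0.1],
                  [0.0, 0.2, 0.8]])

    # Start in state 2 (both red); the fifth ball is drawn after four replacements.
    dist = np.array([0.0, 0.0, 1.0]) @ np.linalg.matrix_power(P, 4)
    print(dist @ np.array([0.0, 0.5, 1.0]))   # P(fifth ball is red) ≈ 0.7048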