
STA 4001

Stochastic Processes

Fall 2022

Please submit your homework through Blackboard.

1. Consider a Markov chain {Xn , n ≥ 0} on the state space E = {1, 2, 3} with transition probability matrix

given by

P = [3 × 3 transition probability matrix; the entries are not legible in this copy]

Assume that the (initial) distribution of X0 is the uniform distribution on E.

(a) Calculate P(X3  = 2, X1  = 3).

(b) Calculate P(X4 = 1, X3 ≠ 1, X2 ≠ 1 | X1 = 1).

(c) Determine whether a limiting distribution exists for this MC, justifying your answer.  If so, find the limiting distribution.
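Since the entries of P are not legible above, the following minimal sketch uses a hypothetical 3 × 3 stochastic matrix purely to illustrate how probabilities of the type in (a) and the limiting behaviour in (c) can be checked numerically; substitute the actual matrix from your copy of the assignment.

    # Sketch only: the matrix below is a hypothetical placeholder, not the matrix of Problem 1.
    import numpy as np

    P = np.array([[0.2, 0.5, 0.3],
                  [0.4, 0.4, 0.2],
                  [0.1, 0.3, 0.6]])      # replace with the actual transition matrix
    mu0 = np.array([1/3, 1/3, 1/3])      # uniform initial distribution on E = {1, 2, 3}

    # (a) P(X3 = 2, X1 = 3) = P(X1 = 3) * P^2(3, 2); states 1, 2, 3 are indices 0, 1, 2
    P2 = np.linalg.matrix_power(P, 2)
    prob_a = (mu0 @ P)[2] * P2[2, 1]

    # (c) numerical check: if a limiting distribution exists, the rows of P^n approach it
    print(prob_a)
    print(np.linalg.matrix_power(P, 100))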

2. Determine the transient and recurrent classes of the Markov chain with the following transition probability matrix. The state space is S = {0, 1, 2, 3, 4, 5}.

        state    0      1      2      3      4      5
          0      0      0      0      0      0      1
          1      0      0.3    0.7    0      0      0
    P =   2      0      0.1    0.8    0.1    0      0
          3      0      0      0      0.1    0.9    0
          4      0      0      0      0.9    0.1    0
          5      0.2    0      0      0      0      0.8

(a) Write down the transient and recurrent classes.

(b) Suppose the Markov chain starts from state 1. Determine the probability that it will never re-enter state 1.
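As an optional numerical cross-check, the minimal sketch below groups the states of a finite chain into communicating classes and marks a class recurrent exactly when it is closed (the two are equivalent for a finite state space); the matrix is the one displayed above and should be verified against your copy of the assignment.

    # Sketch only: transient/recurrent classification of a finite Markov chain.
    import numpy as np

    P = np.array([
        [0,   0,   0,   0,   0,   1  ],
        [0,   0.3, 0.7, 0,   0,   0  ],
        [0,   0.1, 0.8, 0.1, 0,   0  ],
        [0,   0,   0,   0.1, 0.9, 0  ],
        [0,   0,   0,   0.9, 0.1, 0  ],
        [0.2, 0,   0,   0,   0,   0.8],
    ])
    n = P.shape[0]

    # reach[i, j] = True if state j is accessible from state i (in zero or more steps)
    A = (P > 0).astype(int)
    reach = np.eye(n, dtype=bool)
    for _ in range(n):
        reach = reach | ((reach.astype(int) @ A) > 0)

    # communicating class of i = states that are accessible from i and from which i is accessible
    classes = []
    for i in range(n):
        cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
        if cls not in classes:
            classes.append(cls)

    # a class is recurrent iff it is closed (no one-step transition leaves it)
    for cls in classes:
        closed = all(P[i, j] == 0 for i in cls for j in range(n) if j not in cls)
        print(sorted(cls), "recurrent" if closed else "transient")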

3. Four people sit at a round table for dinner. As an appetizer, they play a game in which a ball is passed from one person to another as follows. At each round of the game, the person holding the ball chooses one of his two neighbors at random and passes him the ball. The four people are numbered 1, 2, 3 and 4 in clockwise order.

(a) Explain briefly why the successive positions of the ball in the game form a Markov chain.

(b) Give the transition matrix of the Markov chain.

(c) Find the communication classes of the chain.

(d) Find the periods of the states of the chain.

(e) Does the chain have limiting probabilities?

(f) Mr. Kwok, the person with the number 4, is now holding the ball. Find the expected number of rounds to be played until the ball goes back to him.
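The game is a symmetric random walk on the cycle {1, 2, 3, 4}. The minimal simulation sketch below can be used to check the answer to (f) empirically; the number of trials is an arbitrary choice.

    # Sketch only: estimate the expected number of rounds until the ball returns to person 4.
    import random

    def rounds_until_return(start=4, trials=100_000):
        total = 0
        for _ in range(trials):
            pos, steps = start, 0
            while True:
                # pass the ball to the clockwise or counterclockwise neighbour with probability 1/2 each
                pos = (pos % 4) + 1 if random.random() < 0.5 else ((pos - 2) % 4) + 1
                steps += 1
                if pos == start:
                    break
            total += steps
        return total / trials

    print(rounds_until_return())   # compare with the answer to part (f)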

4. Consider a reservoir with a total capacity of h units of water. Let Xn represent the amount of water in the reservoir at the end of the n-th day. You are given the following information.

• The daily inputs to the reservoir are independent and discrete random variables. On a given day, P(input is j units) =  , j = 0, 1, 2, . . . .

• Any overflow (when the total amount of water exceeds the capacity of h units) is regarded as a loss.

• Provided the reservoir is not empty, one unit is released at the end of the day.

• The value of Xn for day n is the content of the reservoir after the release at the end of the day. Therefore, the stochastic process {Xn , n = 1, 2, . . .} is a Markov chain with state space {0, 1, 2, . . . , h − 1}.

(a) Explain why the stochastic process is a Markov Chain.

(b)  Show that it is irreducible, aperiodic and positive recurrent.

For the remaining part of this question, assume h = 3.

(c) Determine the one-step transition matrix.

(d) Let τi be the first hitting time of state i. Find

P(τ0  < ∞|X0  = 2).

(e) Find the limiting probability

lim n→∞ P(Xn = 0).

(f) The reservoir initially contains 2 units of water. Find the expected number of days until the reservoir is empty.
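From the bullets above, the update rule can be written as Xn+1 = max(min(Xn + An+1, h) − 1, 0), where An+1 is the input on day n + 1. The minimal sketch below implements one step of this rule; since the input p.m.f. is not reproduced above, sample_input() is a hypothetical placeholder to be replaced by the distribution given in the assignment.

    # Sketch only: one step of the reservoir chain, with capacity h = 3 as in parts (c)-(f).
    import random

    H = 3   # capacity

    def sample_input():
        # hypothetical placeholder: replace by a draw from the given P(input = j), j = 0, 1, 2, ...
        return random.choice([0, 1, 2])

    def next_state(x):
        content = min(x + sample_input(), H)   # any overflow above H is lost
        return max(content - 1, 0)             # release one unit unless the reservoir is empty

    # example: simulate ten days starting from 2 units of water
    x = 2
    for _ in range(10):
        x = next_state(x)
    print(x)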

5. A coin whose probability of showing heads is p ∈ (0, 1) is flipped successively. Let Y1 , Y2 , . . . be the sequence of outcomes. For n ≥ 0, let

Xn  = (Yn+1 , Yn+2 , Yn+3),

that is, the n-th triple of consecutive outcomes of the flips. The sequence (Xn) is a Markov chain on the state space

E = {TTT,TTH,THT,HTT,THH,HTH,HHT,HHH}  :=  {1, 2, 3, 4, 5, 6, 7, 8},

where T and H denote tails and heads, respectively, and the 8 possible states are numbered from 1 to 8 in the order given above.

In all the calculations, we denote q = 1 − p to simplify the formulas.

(a) Suppose we have just seen THH as the last three outcomes. Determine the probability of observing, after two more flips, the triple THH, and the probability of observing the triple HTT.

(b) Determine the transition matrix of the chain.

(c) Determine the communication classes of the chain.

(d) Determine the periods of the states.

(e)  Show that the long run proportions of the eight states in E exist for the chain and determine these proportions.

(f) The triples appearing in the chain overlap at consecutive times: for example, by definition the last two flip outcomes {Yn+2 , Yn+3} in Xn are identical to the first two in Xn+1 . We now consider the non-overlapping triples in the sequence as follows:

Y1 Y2 Y3 , Y4 Y5 Y6 , Y7 Y8 Y9 , . . . .

Clearly, these non-overlapping triples still take values in the eight states of E.

Show that the long run proportions of the eight states in E from these non-overlapping triples still exist and they coincide with those found in (e).
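The claim in (f) can be checked empirically: the minimal sketch below compares the proportions of the eight triples obtained from the overlapping sequence (Xn) with those obtained from the non-overlapping triples Y1Y2Y3, Y4Y5Y6, . . . ; the values of p and of the number of flips are arbitrary illustration choices.

    # Sketch only: empirical long-run proportions of the eight triples, overlapping vs non-overlapping.
    import random
    from collections import Counter

    p, n_flips = 0.3, 300_000   # assumed values for illustration only
    Y = ['H' if random.random() < p else 'T' for _ in range(n_flips)]

    overlapping     = Counter(''.join(Y[k:k + 3]) for k in range(n_flips - 2))
    non_overlapping = Counter(''.join(Y[k:k + 3]) for k in range(0, n_flips - 2, 3))

    for s in ['TTT', 'TTH', 'THT', 'HTT', 'THH', 'HTH', 'HHT', 'HHH']:
        print(s,
              round(overlapping[s] / sum(overlapping.values()), 4),
              round(non_overlapping[s] / sum(non_overlapping.values()), 4))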

6.    (a) Consider the random walk (Xn)n≥0 on Z where the forward and backward probabilities are 0 < p < 1 and q = 1 − p, respectively.

i. What are the transition probabilities of the random walk? What do you know about the transient/recurrent status of the states?

ii. When starting in state i (i ∈ Z) , show that the random walk can be represented as X0  = i and, for all n ≥ 1,

Xn = i + A1 + · · · + An ,                (1)

where (Ai)i≥1 is a sequence of i.i.d. random variables with p.m.f. P(A1 = 1) = p and P(A1 = −1) = q.

iii. Find the limit of Xn as n → ∞.

iv. Derive that for p ≠ q, state i is transient.

(b) Consider a reflected random walk (Wn)n≥0 on N = {0, 1, 2, . . .} (the set of nonnegative integers) defined as follows: W0 = i where i ∈ N and, for n ≥ 0,

Wn+1 = 1{Wn>0}(Wn + An+1) + 1{Wn=0} ,

where the sequence (Ai ) is the same as the one in the representation of the random walk (Xn ) in Equation (1).

i.  Show that (Wn ) is a Markov chain and determine its transition probabilities.

ii.  Suppose both chains (Xn ) and (Wn ) start in the same state i ∈ N. Show that for all n, Wn  ≥ Xn .

iii. When p > 1/2, is state i transient or recurrent for the Markov chain (Wn)?

iv. For p < 1/2, show that state i is recurrent for the Markov chain (Wn) (consider first the case i = 1).

v. For p < 1/2, show that (Wn) is a reversible Markov chain.
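The two walks can be coupled by driving them with the same increments A1, A2, . . ., which is the idea behind (b)(ii). The minimal sketch below simulates both walks from the same increments and checks the inequality Wn ≥ Xn along the path; the values of p and of the starting state are arbitrary illustration choices.

    # Sketch only: couple the random walk (Xn) and the reflected walk (Wn) through common increments.
    import random

    p, i, n_steps = 0.4, 2, 1_000   # assumed values for illustration only
    X, W = i, i
    for _ in range(n_steps):
        A = 1 if random.random() < p else -1
        X = X + A
        W = W + A if W > 0 else 1   # reflected step: from 0 the walk moves to 1, as in the definition
        assert W >= X               # the coupling inequality of (b)(ii)
    print(X, W)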