
Applied Probability

1. Answer true or false to each of the following questions. You must also provide a very brief (1-3 line) justification for each of your answers. You might, for example, wish to refer to a theorem discussed in lectures.

(a) For discrete-time Markov chains, hitting times are always finite with probability 1. [1 mark]

(b) For an irreducible discrete-time Markov chain with a finite state space, hitting probabilities are always 1 for every state. [1 mark]

(c) If $p_{ij}^{(m)} = 0$ for all $m > 0$, then $i$ and $j$ do not communicate. [1 mark]

(d) If a state $i$ has period $d$ and is null recurrent, then $\lim_{n \to \infty} p_{ii}^{(nd)} > 0$. [1 mark]

(e) The discrete-time Markov chain with transition matrix
$$P = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
is irreducible. [1 mark]

(f) A finite-state, irreducible, discrete-time Markov chain may be transient. [1 mark]

(g) For all disjoint sets $A_i \in \mathcal{F}$, where $\mathcal{F}$ is a $\sigma$-algebra, a probability measure has the property
$$P\left(\bigcup_i A_i\right) = \sum_i P(A_i).$$

[1 mark]

(h) For fixed k ≥ 0, the kth time a random process visits a collection of states is a stopping time. [1 mark]

(i) If $(X_n, n \in \mathbb{N})$ is a martingale, then it has the property $E[X_{n+1} - X_n \mid X_0, \ldots, X_n] = 0$. [1 mark]

(j) Let $(X_n, n \in \mathbb{N})$ be a martingale such that $|X_n| < M < \infty$ almost surely for all $n \in \mathbb{N}$, where $M$ is a constant, and let $T$ be a stopping time such that $T < K < \infty$ almost surely, where $K$ is a constant. Then $E[X_T] = E[X_0]$.

[1 mark] [Total: 10 marks]
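The martingale statements in parts (i) and (j) lend themselves to a quick Monte Carlo check. The sketch below assumes a specific bounded martingale, namely a fair ±1 random walk started at 0 and frozen once it hits ±B, together with the bounded stopping time T = min(first hit of ±B, K); the walk and the constants B, K are illustrative choices, not taken from the paper.

```python
import random

# Monte Carlo check of the bounded optional-stopping statement in part (j).
# X_n: a fair +/-1 random walk started at 0 and frozen once it hits +/-B,
# so |X_n| <= B and (X_n) is a bounded martingale.  T = min(first hit of
# +/-B, K) is a bounded stopping time, so E[X_T] should equal E[X_0] = 0.
# B, K and the number of runs are illustrative choices only.

B, K, RUNS = 5, 100, 50_000
random.seed(1)

total = 0.0
for _ in range(RUNS):
    x = 0
    for _ in range(K):
        if abs(x) == B:        # the walk is frozen once it has hit +/-B
            break
        x += random.choice((-1, 1))
    total += x                 # x is now X_T, with T = min(hit time of +/-B, K)

print(f"estimated E[X_T] = {total / RUNS:+.4f}   (optional stopping predicts 0)")
```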

2.  Consider the discrete-time Markov chain, $(X_n, n \in \mathbb{N})$, with state space $S = \{1, 2, 3, 4, 5\}$ and with the transition probability matrix given by

$$P = \begin{pmatrix}
0 & 0.5 & 0 & 0.5 & 0 \\
0.5 & 0 & 0.5 & 0 & 0 \\
0 & 0.8 & 0 & 0.2 & 0 \\
0.5 & 0 & 0.5 & 0 & 0 \\
0 & 0 & 0 & 0.5 & 0.5
\end{pmatrix}.$$

(a) Identify all the communicating class(es) in this discrete-time Markov chain, and state whether they are positive recurrent, null recurrent or transient. [2 marks]

(b) State the period of state 1. [1 mark]

(c) Determine the period(s) of the communicating class(es) of $(X_n, n \in \mathbb{N})$. Justify your answer. [2 marks]

(d) Does any communicating class in this discrete-time Markov chain have a limiting distribution? Justify your answer. [2 marks]

(e) Does any communicating class in this discrete-time Markov chain have a stationary distribution? Justify your answer. [2 marks]

(f) For all $i \in S$, define
$$f_{ii}^{(n)} := P(X_n = i, \; X_\ell \neq i, \; \ell = 1, 2, \ldots, n-1 \mid X_0 = i)$$
and
$$p_{ii}^{(n)} := P(X_n = i \mid X_0 = i).$$

[2 marks]

(g) Describe, in simple terms, the difference between $p_{ii}^{(n)}$ and $f_{ii}^{(n)}$, and state whether $p_{ii}^{(n)} \geq f_{ii}^{(n)}$ or $p_{ii}^{(n)} \leq f_{ii}^{(n)}$.

[2 marks] [Total: 13 marks]
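As a quick numerical cross-check of Question 2, the following Python sketch (using the transition matrix printed above) finds the communicating classes by mutual reachability and estimates the period of state 1 from its possible return times. It is only a sanity check and makes no claim to be the written justification the question asks for.

```python
import numpy as np
from math import gcd

# Class structure and periodicity of the chain in Question 2, using the
# transition matrix printed in the question (states are labelled 1..5).
P = np.array([
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.8, 0.0, 0.2, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.5],
])
n = P.shape[0]

# Reachability: i can reach j iff (I + P)^n has a positive (i, j) entry.
R = np.linalg.matrix_power(np.eye(n) + P, n) > 0
communicate = R & R.T                    # i <-> j iff each reaches the other

classes = []
for i in range(n):
    cls = frozenset(j + 1 for j in range(n) if communicate[i, j])
    if cls not in classes:
        classes.append(cls)
print("communicating classes:", [sorted(c) for c in classes])

# Period of state 1: gcd of the return times k (up to 12) with (P^k)[0, 0] > 0.
returns = [k for k in range(1, 13) if np.linalg.matrix_power(P, k)[0, 0] > 1e-12]
period = 0
for k in returns:
    period = gcd(period, k)
print("period of state 1 (from return times up to 12):", period)
```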

3.  Schnitzel Von Krumm, an adventurous dog, regularly leaves his home and wanders the streets. Schnitzel Von Krumm travels in such a way that after each hour of walking, he is either 1 km further away from home with probability 1/2, or 1 km closer to home with probability 1/2. When he is at home, he stays home with probability 1/2, and leaves to walk for an hour with probability 1/2. The distance he is from home after $n$ hours have passed is modelled by the discrete-time Markov chain $(X_n, n \in \mathbb{N})$ on the state space $S = \{0, 1, 2, \ldots\}$, with non-zero transition probabilities
$$p_{i,i+1} = \tfrac{1}{2}, \quad p_{i,i-1} = \tfrac{1}{2}, \quad \text{for } i \geq 1, \qquad p_{0,0} = \tfrac{1}{2}, \quad p_{0,1} = \tfrac{1}{2}.$$

(a) Let $k_i(A)$ be the expected hitting time on the set of states $A$, given that the process starts in state $i$. It can be shown that $k_i(A)$ satisfies
$$k_i(A) = \begin{cases} 0 & \text{for } i \in A, \\ 1 + \sum_{j \notin A} p_{ij} k_j(A) & \text{for } i \notin A. \end{cases} \qquad (1)$$
There may be multiple solutions to the system of equations (1). What criterion determines the particular solution that we require? (A numerical illustration of (1) on a small example chain follows this question.) [1 mark]

(b) Write down the system of difference equations, and the boundary equation, for the expected time it takes Schnitzel Von Krumm to return home, given he starts i ≥ 0 kilometres away. [3 marks]

(c) Show that the expected time it takes Schnitzel Von Krumm to return home, given he starts i ≥ 0 kilometres away, is given by $k_i(\{0\}) = A + Bi - i^2$, for some constants $A$ and $B$. [6 marks]

(d) Use your answer to Part (a) and the boundary condition to determine the constants A and B, and therefore state the expected time it takes for Schnitzel Von Krumm to return home, given he starts i ≥ 0 kilometres away. [4 marks]

(e) Write down the Global Balance Equations for the Markov chain $(X_n, n \in \mathbb{N})$. DO NOT SOLVE. [3 marks]

(f) For j ≥ 0, let $B = \{0, \ldots, j\}$ and write down the partial balance equations for the Markov chain $(X_n, n \in \mathbb{N})$. Use them to show that no stationary distribution exists. [5 marks]

(g) If we alter the random walk that describes Schnitzel Von Krumm’s distance from home so that state 0 is absorbing, then embedded within the process is a branching process. Write down the offspring distribution of this branching process. [1 mark]

(h) Find the mean of the offspring distribution of the branching process and, hence, state the probability that Schnitzel Von Krumm ever returns home, given he starts i ≥ 1 kilometres away. (A simulation sketch also follows this question.)

[2 marks] [Total: 25 marks]
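The hitting-time system (1) from part (a) can be solved directly on any finite chain, which is a useful way to see what the equations say before tackling the infinite walk in parts (b)-(d). The four-state chain below is a made-up example; its transition probabilities, the target set and the state labels are illustrative assumptions, not anything taken from the question.

```python
import numpy as np

# Illustration of the hitting-time system (1): on a finite chain it is a
# linear system for the states outside A.  States are labelled 0, 1, 2, 3
# and the target set is A = {0}; the chain itself is an arbitrary example.
P = np.array([
    [0.2, 0.8, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0, 0.0],
])

A = {0}
others = [s for s in range(P.shape[0]) if s not in A]     # states outside A

# For i not in A, (1) reads  k_i = 1 + sum_{j not in A} p_ij k_j,
# i.e. (I - Q) k = 1, where Q is the sub-matrix of P on the states outside A.
Q = P[np.ix_(others, others)]
k = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))

for s, val in zip(others, k):
    print(f"expected hitting time of A from state {s}: {val:.4f}")
# For this particular chain the exact values are 5, 8 and 9.
```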
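Parts (g) and (h) can also be explored by simulation. The sketch below assumes the symmetric (probability 1/2) steps described in the question, makes state 0 absorbing, and estimates the probability that the walk reaches home within a finite horizon. Each such estimate is a lower bound on the probability of ever returning home, and the estimates creep upwards as the horizon grows; the starting distances, horizons and run counts are illustrative choices only.

```python
import random

# Monte Carlo estimates related to parts 3(g)-(h): the walk with symmetric
# 1/2-1/2 steps and state 0 made absorbing.  The fraction of paths reaching
# home within `horizon` steps is a lower bound on the probability of ever
# returning home.

random.seed(2)
RUNS = 2_000

def returned_within(start: int, horizon: int) -> bool:
    """Simulate the walk from `start` km and report whether it hits 0 in time."""
    x = start
    for _ in range(horizon):
        if x == 0:
            return True
        x += random.choice((-1, 1))
    return x == 0

for start in (1, 3):
    for horizon in (100, 10_000):
        hits = sum(returned_within(start, horizon) for _ in range(RUNS))
        print(f"start {start} km, horizon {horizon:>6} steps: "
              f"returned home in {hits / RUNS:.3f} of runs")
```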

4. Let $0 < \mu < \infty$ be the mean number of offspring of an individual in a discrete-time branching process, and let $X_n$, $n \in \mathbb{N}$, be the size of the $n$th generation, such that $0 < X_0 < \infty$ almost surely.

(a) Show that $E[X_n] = \mu^n E[X_0]$. [6 marks]

(b) Hence, or otherwise, show that $Z_n := X_n/\mu^n$, $n \in \mathbb{N}$, is a martingale.

[7 marks] [Total: 13 marks]
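The identities in Question 4 are straightforward to check by simulation. The sketch below assumes a Poisson(µ) offspring distribution purely for concreteness (the question assumes only that the mean µ is finite), and compares the sample mean of X_n with µ^n E[X_0]; this is equivalent to checking that Z_n = X_n/µ^n has constant mean.

```python
import numpy as np

# Simulation check of E[X_n] = mu^n E[X_0] (part (a)), and hence of the
# constant mean of Z_n = X_n / mu^n (part (b)).  The Poisson offspring law
# and the constants below are illustrative assumptions only; the identities
# hold for any offspring distribution with mean mu.

rng = np.random.default_rng(0)
mu, x0, runs, generations = 0.8, 5, 50_000, 6

pop = np.full(runs, x0)                 # X_0 = x0 in every simulated run
for n in range(1, generations + 1):
    # A sum of k independent Poisson(mu) variables is Poisson(k * mu), so a
    # whole generation can be drawn in one vectorised call.
    pop = rng.poisson(mu * pop)
    print(f"n={n}:  mean X_n = {pop.mean():7.3f}   "
          f"mu^n E[X_0] = {mu**n * x0:7.3f}   "
          f"mean X_n/mu^n = {pop.mean() / mu**n:.3f}")
```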

5. Let $(X_n, n \in \mathbb{N})$ be a discrete-time Markov chain with state space $S$, transition probabilities $p_{ij} := P(X_{n+1} = j \mid X_n = i)$, and transition probability matrix $P$. The $m$-step transition probabilities are $p_{ij}^{(m)} := P(X_{n+m} = j \mid X_n = i)$, and the $m$-step transition probability matrix, denoted $P^{(m)}$, has elements $[P^{(m)}]_{ij} = p_{ij}^{(m)}$. Prove that the $m$-step transition probability matrix is given by $P^{(m)} = P^m$ for all $m \in \{1, 2, \ldots\}$. [5 marks]
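Question 5 asks for a proof, but the claim $P^{(m)} = P^m$ is easy to check numerically for any particular chain: simulate many m-step paths and compare the empirical transition frequencies with the corresponding row of the matrix power. The three-state chain, the value of m and the sample size below are illustrative assumptions.

```python
import numpy as np

# Numerical check of P^(m) = P^m for an arbitrary small chain: empirical
# m-step transition frequencies out of state 0 are compared with row 0 of
# the matrix power P^m.  The chain and the constants are illustrative.

rng = np.random.default_rng(3)
P = np.array([
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
    [0.5, 0.0, 0.5],
])
m, runs, start = 4, 20_000, 0

counts = np.zeros(P.shape[0])
for _ in range(runs):
    state = start
    for _ in range(m):
        state = rng.choice(P.shape[0], p=P[state])   # one step of the chain
    counts[state] += 1

print("empirical m-step frequencies from state 0:", counts / runs)
print("row 0 of P^m:", np.linalg.matrix_power(P, m)[start])
```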