
Assignment 2: Slice Sampling

10/02/2023

Aims

This assignment, worth 10% of the final module mark, aims to give you hands-on experience of some of the ideas discussed in the module. You will be implementing an MCMC scheme known as Slice Sampling.

Submission details

This assignment must be submitted, via the submission portal on Moodle, by 12 noon on 24th February 2022. You must submit two files:

1. A two-page .pdf file containing your work, the requirements of which are set out in the tasks below. This pdf must be generated using R Markdown or equivalent, so that code can be included in the pdf in a clean way. R Markdown should be simple to use if you code in Python: it is possible to include Python code chunks. The page limit is strict: nothing will be marked after the first two pages, and files that are not pdfs will receive a mark of zero.

2. A single file containing all of the code you used to generate the results (any format).

The usual rules about plagiarism and collusion apply. In particular, your code should be your own; it will be checked for signs of collusion.

Preliminary task: Random Walk Metropolis

In this assignment, we will try to sample from the following (unnormalized) posterior density on ℝ:

π(θ | y) ∝ ∏_{i=1}^{n} 1 / (1 + (yi − θ)²)

where n = 4 and we have (ordered) observations y = (y1, y2, y3, y4) = (−16.6, −14.7, 6.3, 8.4). This corresponds to a Cauchy observation model with a flat uniform prior (p(θ) ≡ 1). This posterior is both heavy-tailed and multimodal, which makes it difficult for simple MCMC methods to sample from. Since the observations are now considered fixed, we will simply write π(θ) instead of π(θ|y) throughout.
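For numerical work it is safer to evaluate this density on the log scale, since the product of four small Cauchy terms can underflow far out in the tails. A minimal Python sketch (the function name is our own choice, not part of the assignment):

```python
import numpy as np

# Observations from the assignment
y = np.array([-16.6, -14.7, 6.3, 8.4])

def log_post(theta):
    """Unnormalized log posterior: sum of Cauchy log-density terms."""
    return -np.sum(np.log1p((y - theta) ** 2))
```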

As a baseline, we will first implement a Random Walk Metropolis–Hastings sampler to target π. As proposal distribution, you can use q(⋅|θ) = N(θ, 1), i.e. a Gaussian proposal centered at the current point.

1. (3 marks) Implement this Metropolis–Hastings algorithm, and show trace plots, on the same figure, of the chain for 5000 iterations started from a) θ0 = −20 and b) θ0 = 0. Comment on the mixing of the chain.
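As an illustration of the structure such a sampler might take (a sketch only, with all names our own; the accept/reject step is done on the log scale for numerical stability):

```python
import numpy as np

y = np.array([-16.6, -14.7, 6.3, 8.4])  # observations from the assignment

def log_post(theta):
    # Unnormalized log posterior (sum of Cauchy log-density terms)
    return -np.sum(np.log1p((y - theta) ** 2))

def rwm(theta0, n_iter, rng=None):
    """Random Walk Metropolis with a N(theta, 1) Gaussian proposal."""
    rng = np.random.default_rng(rng)
    chain = np.empty(n_iter)
    theta, lp = theta0, log_post(theta0)
    for t in range(n_iter):
        prop = theta + rng.standard_normal()      # propose from N(theta, 1)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[t] = theta
    return chain
```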

Main task: slice sampling

Slice Sampling, originally from Neal (2003), is a powerful alternative MCMC method, which works as follows:

Algorithm 1: Slice Sampling

a. Initialise θ0 and choose number of iterations N.

b. For t = 1, 2, … , N:

i. Draw s ∼ unif([0, π(θt−1)]).

ii. Draw θt ∼ unif(Gs), where Gs is the super-level-set:

Gs := {θ : π(θ) ≥ s}.

c. Output the chain (θ1 , θ2 , … , θN).
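Algorithm 1 can be sketched directly in Python, deferring step (b.ii) to a user-supplied routine; `sample_levelset` below is a hypothetical helper that must return a uniform draw from Gs (for instance via Algorithm 2 below):

```python
import numpy as np

y = np.array([-16.6, -14.7, 6.3, 8.4])  # observations from the assignment

def post(theta):
    # Unnormalized posterior density pi(theta)
    return np.prod(1.0 / (1.0 + (y - theta) ** 2))

def slice_sample(theta0, n_iter, sample_levelset, rng=None):
    """Algorithm 1. `sample_levelset(s, rng)` must return a uniform
    draw from G_s = {theta : post(theta) >= s}."""
    rng = np.random.default_rng(rng)
    chain = np.empty(n_iter)
    theta = theta0
    for t in range(n_iter):
        s = rng.uniform(0.0, post(theta))  # step (b.i): draw the level
        theta = sample_levelset(s, rng)    # step (b.ii): uniform on G_s
        chain[t] = theta
    return chain
```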

As you can see, step (b.ii) requires drawing a uniform point from the super-level-set Gs. The easiest way to do this is to use rejection sampling:

Algorithm 2: Rejection sampler for Gs

a. Given level s, construct a set Hs such that Gs ⊂ Hs . Set accept = 0.

b. while accept = 0,

i. Draw θ ∼ unif(Hs ).

ii. if θ ∈ Gs , set accept = 1.

c. Output θ .

The output of Algorithm 2 will be a uniform point from Gs (you do not need to prove this fact).
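A generic version of Algorithm 2 can be written without committing to a particular bounding set; here `lo` and `hi` are assumed endpoints of an interval Hs = [lo, hi] containing Gs (deriving suitable endpoints is part of the task below):

```python
import numpy as np

y = np.array([-16.6, -14.7, 6.3, 8.4])  # observations from the assignment

def post(theta):
    # Unnormalized posterior density pi(theta)
    return np.prod(1.0 / (1.0 + (y - theta) ** 2))

def rejection_sample(s, lo, hi, rng=None):
    """Algorithm 2, assuming G_s is contained in H_s = [lo, hi]."""
    rng = np.random.default_rng(rng)
    while True:                      # loop until accept
        theta = rng.uniform(lo, hi)  # step (b.i): uniform draw from H_s
        if post(theta) >= s:         # step (b.ii): accept if theta in G_s
            return theta
```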

2. (3 marks) Given a super-level-set Gs, by considering the values y1 − √α and yn + √α for an appropriate value of α, or otherwise, give an interval Hs which contains Gs, and hence devise and implement a rejection sampler to draw a uniform point from any given super-level-set Gs. Please display the code used for the rejection sampler.

3. (3 marks) Now implement the full Slice Sampling algorithm to sample from π. Show trace plots for the chain for 5000 iterations started from a) θ0 = −20 and b) θ0 = 0. Compare these with the previous trace plots obtained from Metropolis–Hastings.

4. (1 mark) Now use your slice sampler to give point estimates of the posterior tail probabilities P(θ > 8 ∣ y) and P(θ < −15 ∣ y).
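Once a chain has been generated, such tail probabilities are estimated by the corresponding empirical frequencies; a sketch, with a short placeholder array standing in for your sampler output:

```python
import numpy as np

# Placeholder chain: in practice this is the slice-sampler output
chain = np.array([-16.0, 9.1, 8.6, -15.3, 7.0, -14.9])

p_upper = np.mean(chain > 8)    # empirical estimate of P(theta > 8 | y)
p_lower = np.mean(chain < -15)  # empirical estimate of P(theta < -15 | y)
```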

Neal, R. M. (2003). Slice sampling. The Annals of Statistics, 31(3), 705–767.