MATH6153 Statistical Theory and Linear Models SEMESTER 1 EXAMINATION 2017/18
1. [25 marks]
Suppose that X is a random variable with the standard normal distribution, with probability density function
f_X(x) = (1/√(2π)) exp(−x^2/2), x ∈ R,
and moment generating function m_X(t) = exp(t^2/2).
(a) [6 marks] Use the moment generating function to show that
E(X) = 0, E(X^2) = 1, and E(X^3) = 0.
(b) [5 marks] Let Y = e^(2X). Derive the probability density function of Y.
(c) [5 marks] By considering m_X(t), or otherwise, show that
E(Y) = e^2,
Var(Y) = e^4(e^4 − 1).
(d) [9 marks] Let Z = X^2. Derive the probability density function of Z. What is this distribution called?
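The claims in Question 1 can be checked numerically. The sketch below (not part of the exam) uses Monte Carlo simulation to verify the moments of X, the mean of Y = e^(2X) implied by the mgf, and that Z = X^2 has the moments of a chi-squared distribution with 1 degree of freedom (E(Z) = 1, Var(Z) = 2); all tolerances are loose simulation tolerances.

```python
import numpy as np

# Monte Carlo sanity check of the Question 1 claims (not part of the exam).
rng = np.random.default_rng(0)
x = rng.standard_normal(2_000_000)

# Moments of X from part (a): should be close to 0, 1, 0.
m1, m2, m3 = x.mean(), (x**2).mean(), (x**3).mean()

# Part (c): Y = e^(2X) has E(Y) = m_X(2) = e^2 since m_X(t) = exp(t^2/2).
y = np.exp(2 * x)
ey = y.mean()

# Part (d): Z = X^2 is chi-squared(1), so E(Z) = 1 and Var(Z) = 2.
z = x**2
ez, vz = z.mean(), z.var()
```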
2. [25 marks] Suppose that x_1, . . . , x_n are independent observations of X, an exponentially distributed random variable with p.d.f.
f_X(x) = θe^(−θx), x > 0
where θ is a positive parameter.
(a) [10 marks] Find the Cramér-Rao lower bound for the variance of unbiased
estimators of θ, and hence derive the Cramér-Rao lower bound for the variance of unbiased estimators of φ = e^(−θ), in terms of θ.
(b) [3 marks] Find the maximum likelihood estimator of θ and its asymptotic distribution.
(c) [3 marks] Find the maximum likelihood estimator of φ and its asymptotic distribution (with any parameters expressed in terms of φ).
(d) [7 marks] Suppose that the Bernoulli random variable Y is derived from X by
Y = 1 if X > 1,  Y = 0 if X ≤ 1.
Show that Ȳ, the proportion of observed ones in the sample, is an unbiased
estimator of φ. Find the variance of Ȳ, and show that this variance exceeds the Cramér-Rao lower bound for unbiased estimators of φ.
(e) [2 marks] Show that the Cramér-Rao lower bound for unbiased estimators of λ ≡ θ^(−1) can be attained, and find the corresponding estimator.
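The comparison asked for in part (d) can be illustrated numerically. The sketch below (not part of the exam) assumes the standard results that the Fisher information for an Exp(θ) sample of size n is I(θ) = n/θ^2, so the Cramér-Rao lower bound for unbiased estimators of φ = e^(−θ) is (dφ/dθ)^2 · θ^2/n = θ^2 e^(−2θ)/n, while Ȳ has variance φ(1 − φ)/n; the bound should be strictly exceeded for every θ > 0.

```python
import math

# Sketch (assumed formulas): CRLB for φ = e^(-θ) vs. the variance of Ȳ,
# for an exponential sample of size n. Not part of the exam.
def crlb_phi(theta, n):
    # (dφ/dθ)^2 / (n I_1(θ)) with I_1(θ) = 1/θ^2 gives θ^2 e^(-2θ)/n.
    return theta**2 * math.exp(-2 * theta) / n

def var_ybar(theta, n):
    # Ȳ is a mean of Bernoulli(φ) variables, so Var(Ȳ) = φ(1-φ)/n.
    phi = math.exp(-theta)
    return phi * (1 - phi) / n

n = 50
# The variance of Ȳ should exceed the bound for every θ tested.
gaps_positive = all(var_ybar(t, n) > crlb_phi(t, n) for t in [0.1, 0.5, 1, 2, 5])
```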
3. [25 marks] In a particular set of Bernoulli trials, it is widely believed that the success probability is θ = 3/4. However, an alternative view is that θ = 2/3. In order to test H0 : θ = 3/4
against H1 : θ = 2/3, n independent trials are to be observed. Let θ̂ denote the
proportion of successes in these trials. Assume that Φ(1.645) = 0.95, where Φ(·) denotes the cumulative distribution function of the standard normal distribution.
(a) [7 marks] Show that the Neyman-Pearson approach leads to rejection of H0 in favour of H1 when
θ̂ ≤ k
for some suitable k.
(b) [7 marks] By applying the central limit theorem, write down the large-sample
distributions of θ̂ when H0 is true and when H1 is true.
(c) [5 marks] Based on the large-sample distribution, write down an expression for the probability of the Type I error. Hence find an expression for k in terms of n when the size of the test is α = 0.05.
(d) [6 marks] Write down an expression for the power of the test when H1 is true.
Hence find the value of n so that the test of H0 against H1 has power 0.95.
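The size and power calculations in (c)-(d) can be carried out numerically. The sketch below (not part of the exam) assumes the normal approximations θ̂ ~ N(3/4, (3/16)/n) under H0 and θ̂ ~ N(2/3, (2/9)/n) under H1, a rejection region of the form θ̂ ≤ k, and the given value Φ(1.645) = 0.95.

```python
import math

# Sketch of the size/power calculation for H0: θ=3/4 vs H1: θ=2/3.
# Not part of the exam; formulas assumed from the normal approximation.
z = 1.645
s0 = math.sqrt(3 / 16)   # sqrt(θ0(1-θ0)) under H0
s1 = math.sqrt(2 / 9)    # sqrt(θ1(1-θ1)) under H1

# Size 0.05:  k = 3/4 - z*s0/sqrt(n).  Power 0.95:  k = 2/3 + z*s1/sqrt(n).
# Equating the two expressions gives sqrt(n) = z(s0 + s1)/(3/4 - 2/3).
sqrt_n = z * (s0 + s1) / (3 / 4 - 2 / 3)
n = math.ceil(sqrt_n**2)           # smallest n achieving power >= 0.95
k = 3 / 4 - z * s0 / math.sqrt(n)  # corresponding critical value
```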
4. [25 marks]
(a) [12 marks] Suppose that Y_1, Y_2, . . . , Y_n are independent observations and Y_i is
normally distributed with mean βx_i^c and variance σ^2, for i = 1, . . . , n, where
x_1, . . . , x_n and c are known constants.
(i) [5 marks] Find the least squares estimate of β, denoted by β̂, by minimising
the sum of squares,
S = Σ_{i=1}^{n} (y_i − βx_i^c)^2.
(ii) [4 marks] Show that
E(β̂) = β and Var(β̂) = σ^2 / Σ_{i=1}^{n} x_i^(2c).
(iii) [3 marks] Construct an unbiased estimator for σ2 based on the residual sum of squares and state its distribution.
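The least-squares results in (a)(i)-(ii) can be checked by simulation. The sketch below (not part of the exam) assumes the closed form β̂ = Σ y_i x_i^c / Σ x_i^(2c) obtained by minimising S, with Var(β̂) = σ^2 / Σ x_i^(2c); the particular values of β, σ, c and the x_i are arbitrary choices for illustration.

```python
import numpy as np

# Simulation check of the least-squares estimate for the model
# Y_i ~ N(β x_i^c, σ^2). Not part of the exam; parameter values arbitrary.
rng = np.random.default_rng(1)
beta, sigma, c = 2.0, 0.5, 1.5
x = np.linspace(1.0, 3.0, 20)

# Simulate many samples at once and compute β̂ = Σ y_i x_i^c / Σ x_i^(2c).
reps = 20_000
y = beta * x**c + sigma * rng.standard_normal((reps, x.size))
beta_hat = (y * x**c).sum(axis=1) / (x**(2 * c)).sum()

mean_bhat = beta_hat.mean()                  # should be close to β (unbiased)
var_bhat = beta_hat.var()                    # should match the theoretical variance
var_theory = sigma**2 / (x**(2 * c)).sum()   # σ^2 / Σ x_i^(2c)
```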
(b) [13 marks] Suppose that x_1, . . . , x_n are independent observations of X, an exponentially distributed random variable with p.d.f.
f_X(x) = θe^(−θx), x > 0
where θ is a positive parameter.
(i) [4 marks] Write down the likelihood function of θ. Hence write down a conjugate prior distribution for θ .
(ii) [5 marks] Obtain the posterior distribution of θ under the conjugate prior
distribution. Find the parameters of this posterior distribution and write down expressions for the mean and variance of the posterior distribution.
(iii) [4 marks] State the Bayes estimator of θ under the squared error loss
function. Under what conditions are the maximum likelihood and the Bayes estimator equivalent?
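The Bayesian steps in (b)(i)-(iii) can be sketched numerically. The sketch below (not part of the exam) assumes a Gamma(a, b) conjugate prior π(θ) ∝ θ^(a−1) e^(−bθ) with hypothetical hyperparameters; combined with the likelihood L(θ) = θ^n exp(−θ Σx_i), the posterior is Gamma(a + n, b + Σx_i), whose mean is the Bayes estimator under squared error loss, and which approaches the MLE n/Σx_i as a, b → 0.

```python
# Conjugate Gamma-exponential update (not part of the exam; the data and
# hyperparameters a, b are hypothetical illustrative choices).
def posterior(a, b, xs):
    n, s = len(xs), sum(xs)
    a_post, b_post = a + n, b + s     # Gamma(a + n, b + Σx_i) posterior
    mean = a_post / b_post            # Bayes estimator under squared error loss
    var = a_post / b_post**2          # posterior variance of a Gamma(α, β)
    return a_post, b_post, mean, var

xs = [0.5, 1.2, 0.3, 2.0]
a_post, b_post, bayes_est, post_var = posterior(2.0, 1.0, xs)

# As a, b -> 0 the Bayes estimator (a+n)/(b+Σx_i) tends to the MLE n/Σx_i.
mle = len(xs) / sum(xs)
limit_est = posterior(1e-12, 1e-12, xs)[2]
```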