STATS 310/732 Introduction to Statistical Inference SEMESTER 1, 2020
Posted: 2022-06-02
STATISTICS
Introduction to Statistical Inference
Foundations of Statistical Inference
1. Let the random variables X1, X2 have the joint density function

   f(x1, x2) = 6x1,   x1 > 0, x2 > 0, x1 + x2 < 1.

   (a) Find E(X2 | X1 = x1). [7 marks]

   (b) Find the joint density function of Y1 = X1X2 and Y2 = X2 on its support. [7 marks]

   (c) Find the support of the joint distribution of Y1 and Y2. [3 marks]

   [17 marks]
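For Q1(a), a quick Monte Carlo sanity check is possible. Note the closed form used below, E(X2 | X1 = x1) = (1 − x1)/2, is my own derivation (given X1 = x1, X2 is uniform on (0, 1 − x1)), not an official solution:

```python
import numpy as np

# Sample from f(x1, x2) = 6*x1 on the triangle x1>0, x2>0, x1+x2<1
# by rejection from the unit square (density bound 6, so accept w.p. x1).
rng = np.random.default_rng(0)
n = 2_000_000
x1 = rng.uniform(0, 1, n)
x2 = rng.uniform(0, 1, n)
u = rng.uniform(0, 1, n)
keep = (x1 + x2 < 1) & (u < x1)
x1, x2 = x1[keep], x2[keep]

# Condition (approximately) on X1 near 0.3 and compare with (1 - 0.3)/2 = 0.35.
sel = np.abs(x1 - 0.3) < 0.01
print(x2[sel].mean())  # should be close to 0.35
```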
2. Let X1 ,X2 ∼ N(µ,σ2 ) independently. Show that E(|X1 − X2 |) = 2σ/√π .
[6 marks]
Answer: Since X1 − X2 ∼ N(0, 2σ²), its density is (2σ√π)⁻¹ e^{−t²/(4σ²)}, so

E(|X1 − X2|) = ∫_{−∞}^{∞} |t| · (1/(2σ√π)) e^{−t²/(4σ²)} dt
             = (1/(σ√π)) ∫₀^∞ t e^{−t²/(4σ²)} dt
             = (2σ²/(σ√π)) ∫₀^∞ e^{−u} du      (u = t²/(4σ²), du = t dt/(2σ²))
             = 2σ/√π.
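The identity E(|X1 − X2|) = 2σ/√π can also be checked numerically; µ and σ below are arbitrary illustrative values:

```python
import numpy as np

# Monte Carlo check of Q2: E|X1 - X2| = 2*sigma/sqrt(pi)
# for independent X1, X2 ~ N(mu, sigma^2).
rng = np.random.default_rng(1)
mu, sigma, n = 3.0, 2.0, 1_000_000
x1 = rng.normal(mu, sigma, n)
x2 = rng.normal(mu, sigma, n)
est = np.abs(x1 - x2).mean()
print(est, 2 * sigma / np.pi**0.5)  # the two values should agree closely
```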
3. Let Xn ∼ Gamma(n, λ) for n = 1, 2, . . . and λ > 0. With λ fixed, find the limiting distribution of Yn as n → ∞.
[6 marks]
4. Let X1, X2, . . . , Xn be a random sample drawn independently from a distribution with probability density function

   f(x; θ) = 2θx exp(−θx²),   x > 0, θ > 0.

   (a) Find the method-of-moments estimator θ̂1 of θ.

   (b) Find the maximum likelihood estimator θ̂2 of θ.

   (c) Find the Fisher information. [5 marks]

   (d) Are θ̂1 and θ̂2 the minimum variance unbiased estimators of θ? Justify your answer.
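A numerical sketch for Q4. The closed forms below are my own derivations, not official solutions: under this density X² ∼ Exponential(rate θ), so the MLE is θ̂2 = n/Σxᵢ², and from E(X) = √(π/θ)/2 the method-of-moments estimator is θ̂1 = π/(4x̄²):

```python
import numpy as np

# Simulate from f(x; theta) = 2*theta*x*exp(-theta*x^2) by noting that
# X = sqrt(E) with E ~ Exponential(rate = theta) has exactly this density.
rng = np.random.default_rng(2)
theta, n = 1.7, 500_000
x = np.sqrt(rng.exponential(1 / theta, n))

theta_mle = n / np.sum(x**2)          # MLE (my derivation)
theta_mom = np.pi / (4 * x.mean()**2) # method of moments (my derivation)
print(theta_mle, theta_mom)  # both should be close to 1.7
```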
5. Let X1, X2, . . . , Xn ∼ N(θ1, θ3) and Y1, Y2, . . . , Ym ∼ N(θ2, θ3) independently. Denote θ = (θ1, θ2, θ3)ᵀ.

   (a) Write down the expression for the log-likelihood function ℓ(θ). [5 marks]

   (b) Find the maximum likelihood estimator θ̂ of θ. (You do not need to perform the second-derivative check.) [5 marks]

   (c) Find the Fisher information. [5 marks]

   (d) Consider using −2 log(LR) to test H0 : θ1 = θ2 against H1 : θ1 ≠ θ2. Find the critical region that is used to reject H0 at the significance level of α. [5 marks]

   [20 marks]
6. Let X be a random variable with density function
f(x;θ) = θe−θx, x > 0, θ > 0.
To test H0 : θ = 1 against H1 : θ ≠ 1, an independent sample of size n was taken from the distribution. Show that the uniformly most powerful test does not exist in this case.
[10 marks]
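A hedged sketch of one standard route for Q6 (my reasoning, not the model answer): the Neyman–Pearson lemma gives the most powerful test against each simple alternative, and the two sides of H1 demand different rejection regions.

```latex
% Likelihood ratio against a simple alternative \theta' \neq 1:
\frac{L(\theta')}{L(1)}
  = (\theta')^{n}\exp\!\Big\{-(\theta'-1)\sum_{i=1}^{n} x_i\Big\}.
% For \theta' > 1 this is decreasing in \sum_i x_i, so the MP test
% rejects for \sum_i x_i \le c; for \theta' < 1 it is increasing, so
% the MP test rejects for \sum_i x_i \ge c'.  No single test is most
% powerful against both sides, hence no UMP test exists.
```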
7. Let Y1 ,Y2 ,Y3 be independent normally-distributed random variables with
E(Y1 ) = β1 − 2β2 , E(Y2 ) = β2 , E(Y3 ) = β1 + 2β2
and a common variance σ 2 .
(a) Find the least squares estimator β̂ of β = (β1, β2)ᵀ. [5 marks]
θ = d1 Y1 + d2 Y2 + d3 Y3
[5 marks]
[5 marks] [15 marks]
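For Q7(a), the design matrix has rows (1, −2), (0, 1), (1, 2), so XᵀX = diag(2, 9) and (my own closed form, not the official answer) β̂1 = (Y1 + Y3)/2, β̂2 = (−2Y1 + Y2 + 2Y3)/9. A quick numerical check:

```python
import numpy as np

# Least squares for E(Y1) = b1 - 2*b2, E(Y2) = b2, E(Y3) = b1 + 2*b2.
X = np.array([[1.0, -2.0], [0.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 2.0, 3.0])  # illustrative observations

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # normal equations
closed = np.array([(y[0] + y[2]) / 2,
                   (-2*y[0] + y[1] + 2*y[2]) / 9])
print(beta_hat, closed)  # the two should match
```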
8. Consider the linear regression model

   y = Xβ + ε.

   Let x be an arbitrary vector in the column space of the design matrix X, and r = y − Xβ̂, where β̂ is the least squares estimator of β. Using algebra (not geometry), show that x and r are orthogonal, i.e. xᵀr = 0.
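The exam asks for an algebraic proof; the following merely demonstrates the fact numerically on arbitrary data:

```python
import numpy as np

# The least squares residual r = y - X @ beta_hat is orthogonal to every
# vector X @ a in the column space of X.
rng = np.random.default_rng(3)
X = rng.normal(size=(10, 3))
y = rng.normal(size=10)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
r = y - X @ beta_hat
a = rng.normal(size=3)
print((X @ a) @ r)  # effectively zero (floating-point noise)
```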
9. For a decision-making problem, the following table gives R(di, θj), the risk of each candidate decision rule di for each state of nature θj:

          θ1   θ2
   d1      2    5
   d2      1    9
   d3      2    3
   d4      1    7
   d5      2    3
   d6      1    7
   d7      2    1
   d8      1    5
(a) Find the minimax rule(s). Justify your answer.
[5 marks]
(b) If a Bayesian has a prior belief that P(Θ = θ1) = 0.4 and P(Θ = θ2) = 0.6, will the Bayesian prefer d1 to d8? Justify your answer.
[5 marks] [10 marks]
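A small computation for Q9. The risk values below are my reading of the garbled source table, pairing each row's digits as (R(d, θ1), R(d, θ2)) for d1 through d8; treat them as an assumption:

```python
# Assumed risks (theta1, theta2) per rule -- my reading of the source table.
risks = {
    "d1": (2, 5), "d2": (1, 9), "d3": (2, 3), "d4": (1, 7),
    "d5": (2, 3), "d6": (1, 7), "d7": (2, 1), "d8": (1, 5),
}

# (a) Minimax: minimise the worst-case (maximum) risk over rules.
worst = {d: max(r) for d, r in risks.items()}
minimax = min(worst, key=worst.get)

# (b) Bayes risk under the prior P(theta1) = 0.4, P(theta2) = 0.6.
bayes = {d: 0.4 * r[0] + 0.6 * r[1] for d, r in risks.items()}
print(minimax, round(bayes["d1"], 2), round(bayes["d8"], 2))  # d7 3.8 3.4
```

Under these assumed risks the Bayesian would prefer d8 (Bayes risk 3.4) to d1 (3.8).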
10. Let θ̂ be the Bayes estimator of some parameter θ ∈ {1, 2, 3} with respect to the prior π = (1/2, 1/3, 1/6)ᵀ. Can we say that this Bayes estimator must be admissible? Justify your answer.
[5 marks]
11. Let the loss function be

   L(θ, a) = c1(θ − a) if a ≤ θ, and c2(a − θ) if a > θ,

   where c1, c2 > 0 are constants. Show that the Bayes estimator of θ is some α-quantile of the posterior distribution, and find the expression for α.
[5 marks]
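A numerical check of Q11. Both the asymmetric linear loss form above and the value α = c1/(c1 + c2) are my statement of the standard result, not the official solution; the stand-in "posterior" is an arbitrary normal sample:

```python
import numpy as np

# Minimise the expected loss E[L(theta, a)] over a grid of actions a and
# compare the minimiser with the c1/(c1+c2)-quantile of the sample.
rng = np.random.default_rng(4)
c1, c2 = 3.0, 1.0
theta = rng.normal(size=100_000)  # stand-in "posterior" sample

grid = np.linspace(-2, 2, 201)
loss = [np.mean(np.where(a <= theta, c1 * (theta - a), c2 * (a - theta)))
        for a in grid]
a_star = grid[int(np.argmin(loss))]
print(a_star, np.quantile(theta, c1 / (c1 + c2)))  # both near the 0.75-quantile
```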