STAT2004J – Linear Modelling
Tutorial 2
Question 1. The observations $y_1, y_2, y_3$ were taken on the random variables $Y_1, Y_2, Y_3$, where
$$Y_1 = \theta + \varepsilon_1, \qquad Y_2 = 2\theta - \phi + \varepsilon_2, \qquad Y_3 = \theta + 2\phi + \varepsilon_3,$$
with $E(\varepsilon_i) = 0$, $\operatorname{Var}(\varepsilon_i) = \sigma^2$ $(i = 1, 2, 3)$ and $\operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0$ $(i \neq j)$.
Find the least squares estimates of $\theta$ and $\phi$.
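Writing the three equations as $\mathbf{y} = X\boldsymbol{\beta} + \boldsymbol{\varepsilon}$ with $\boldsymbol{\beta} = (\theta, \phi)^\top$ reduces the problem to solving the normal equations. The sketch below is a minimal numerical check of a hand-derived answer; the observed values in `y` are hypothetical and numpy is assumed available.

```python
# Minimal numerical check for Question 1. The design matrix rows hold the
# coefficients of (theta, phi) in Y1, Y2, Y3; the y values are hypothetical.
import numpy as np

X = np.array([[1.0,  0.0],
              [2.0, -1.0],
              [1.0,  2.0]])
y = np.array([1.2, 0.7, 3.1])  # made-up observations, for illustration only

# Least squares solves the normal equations (X^T X) b = X^T y.
theta_hat, phi_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(theta_hat, phi_hat)

# Cross-check against numpy's generic least squares solver.
assert np.allclose([theta_hat, phi_hat], np.linalg.lstsq(X, y, rcond=None)[0])
```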
Question 2. Given a random sample of size $n$ with values of the response variable $y_1, y_2, \ldots, y_n$, find the least squares estimator of the parameter $\mu$ in the model
$$y_i = \mu + \varepsilon_i,$$
where $E(\varepsilon_i) = 0$, $\operatorname{Var}(\varepsilon_i) = \sigma^2$ and the $\varepsilon_i$ are uncorrelated.
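Once an estimator has been derived by hand, it can be checked numerically by fitting the intercept-only model with a generic least squares routine. The sketch below uses hypothetical data.

```python
# Check for Question 2: fit the intercept-only model y_i = mu + eps_i with a
# generic least squares solver, then compare with your hand-derived estimator.
import numpy as np

y = np.array([2.0, 3.5, 1.8, 2.9, 3.3])   # hypothetical sample
X = np.ones((y.size, 1))                   # design matrix: a column of ones

mu_hat = np.linalg.lstsq(X, y, rcond=None)[0][0]
print(mu_hat)  # compare with the formula you derived
```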
Question 3. Consider the simple linear regression model
$$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad i = 1, \ldots, n,$$
with $E(\varepsilon_i) = 0$, $\operatorname{Var}(\varepsilon_i) = \sigma^2$ and the $\varepsilon_i$ uncorrelated.
(a) If $\beta_1$ is known, find the least squares estimator of $\beta_0$.
(b) If $\beta_0$ is known, find the least squares estimator of $\beta_1$.
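In both parts only one parameter is free, so the residual sum of squares is a function of a single variable and a crude grid search can confirm a hand-derived estimator. The sketch below assumes hypothetical data and illustrative parameter values.

```python
# Grid-search check for Question 3: minimise the residual sum of squares
# over the single free parameter. All data and parameter values are made up.
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.1, size=x.size)  # hypothetical sample

# (a) beta1 known: minimise RSS over beta0 alone.
b1_known = 0.5
b0_grid = np.linspace(0.0, 4.0, 4001)
rss_a = (((y - b1_known * x)[None, :] - b0_grid[:, None]) ** 2).sum(axis=1)
print("beta0_hat =", b0_grid[rss_a.argmin()])

# (b) beta0 known: minimise RSS over beta1 alone.
b0_known = 2.0
b1_grid = np.linspace(-1.0, 2.0, 3001)
rss_b = ((y - b0_known - b1_grid[:, None] * x) ** 2).sum(axis=1)
print("beta1_hat =", b1_grid[rss_b.argmin()])
```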
Question 4. A linear regression model may be written as either
$$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad i = 1, \ldots, n,$$
or
$$Y_i = \alpha_0 + \alpha_1 (X_i - \bar{X}) + \varepsilon_i, \qquad i = 1, \ldots, n,$$
under the same model assumptions.
(a) Find the relationship between the $\alpha$s and $\beta$s.
(b) Use the method of least squares to estimate $\alpha_0$ and $\alpha_1$.
(c) Find the variance and covariance of the estimators $\hat{\alpha}_0$ and $\hat{\alpha}_1$.
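Fitting both parameterisations to the same data is a quick way to check an answer to part (a); the data below are hypothetical. The final print also exposes the orthogonality of the centred columns, which is the key fact behind part (c).

```python
# Numerical comparison of the two parameterisations in Question 4.
# The data are hypothetical; compare beta_hat and alpha_hat to check part (a).
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 20)
y = 1.0 + 0.3 * x + rng.normal(0.0, 0.2, size=x.size)

X_raw = np.column_stack([np.ones_like(x), x])              # (1, X_i)
X_cen = np.column_stack([np.ones_like(x), x - x.mean()])   # (1, X_i - X_bar)

beta_hat = np.linalg.lstsq(X_raw, y, rcond=None)[0]
alpha_hat = np.linalg.lstsq(X_cen, y, rcond=None)[0]
print("beta_hat :", beta_hat)
print("alpha_hat:", alpha_hat)

# Both parameterisations describe the same model: fitted values coincide.
assert np.allclose(X_raw @ beta_hat, X_cen @ alpha_hat)

# The centred columns are orthogonal, so X_cen^T X_cen is diagonal -- the
# fact that drives the covariance result in part (c).
print(X_cen.T @ X_cen)
```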
Question 5. Let $\hat{\beta}_0$ and $\hat{\beta}_1$ be the OLS estimators for a simple linear regression model with parameters $\beta_0$ and $\beta_1$. Also let $(\tilde{\beta}_0, \tilde{\beta}_1)$ be any arbitrary unbiased linear estimators for $(\beta_0, \beta_1)$. Apply the Gauss–Markov theorem to show the following:
(a) $\hat{\beta}_0$ is the best unbiased linear estimator for $\beta_0$:
$$\operatorname{Var}(\hat{\beta}_0) = E\big[(\hat{\beta}_0 - \beta_0)^2\big] \le E\big[(\tilde{\beta}_0 - \beta_0)^2\big] = \operatorname{Var}(\tilde{\beta}_0).$$
(b) $\hat{\beta}_1$ is the best unbiased linear estimator for $\beta_1$:
$$\operatorname{Var}(\hat{\beta}_1) = E\big[(\hat{\beta}_1 - \beta_1)^2\big] \le E\big[(\tilde{\beta}_1 - \beta_1)^2\big] = \operatorname{Var}(\tilde{\beta}_1).$$
(c)
$$\operatorname{Var}(\hat{\beta}_0) + \operatorname{Var}(\hat{\beta}_1) = E\big[(\hat{\beta}_0 - \beta_0)^2 + (\hat{\beta}_1 - \beta_1)^2\big] \le E\big[(\tilde{\beta}_0 - \beta_0)^2 + (\tilde{\beta}_1 - \beta_1)^2\big] = \operatorname{Var}(\tilde{\beta}_0) + \operatorname{Var}(\tilde{\beta}_1).$$
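The theorem is a statement about variances, so it can be illustrated by simulation: generate many samples, compute the OLS slope and some other linear unbiased slope estimator on each, and compare empirical variances. The sketch below uses the two-endpoint slope $(Y_n - Y_1)/(X_n - X_1)$, which is linear in the $Y_i$ and unbiased; all simulation settings are hypothetical.

```python
# Monte Carlo illustration of Gauss-Markov (Question 5): the OLS slope should
# have smaller variance than any other linear unbiased slope estimator, such
# as the two-endpoint slope used here. All settings are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 10)
beta0, beta1, sigma = 1.0, 2.0, 0.5
X = np.column_stack([np.ones_like(x), x])

ols_slopes, endpoint_slopes = [], []
for _ in range(10000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    ols_slopes.append(np.linalg.lstsq(X, y, rcond=None)[0][1])
    endpoint_slopes.append((y[-1] - y[0]) / (x[-1] - x[0]))

# Both estimators centre on the true slope, but OLS has smaller variance.
print("mean:", np.mean(ols_slopes), np.mean(endpoint_slopes))
print("var :", np.var(ols_slopes), np.var(endpoint_slopes))
```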