
Econ 410 Practice Problems

Day 6

The following assumptions are known as the Simple Classical Linear Regression Model:

SLR.1 (Linear in Parameters) The population model (in other words, the true model) can be written as:

y = β0 + β1 x + u

where β0  and β1  are the population intercept and slope parameters, respectively, and u is the unobservable random error.

SLR.2 (Random Sampling) We have a simple random sample of size n, {(xi , yi ) : i = 1, 2, . . . , n}, following the population model defined in SLR.1.

SLR.3 (No Perfect Collinearity) The sample outcomes of x, namely {xi : i = 1, . . . , n}, are not all the same value.

SLR.4 (Zero Conditional Mean) The error term (u) has an expected value of zero given any value of the explanatory variable. In other words, E(u|x) = 0.

SLR.5 (Homoskedasticity) The error term (u) has the same variance given any value of the explanatory variable. In other words, Var(u|x) = σ².
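As a concrete illustration (not part of the original handout), the sketch below generates one sample satisfying SLR.1–SLR.5. The parameter values β0 = 1 and β1 = 2, the sample size, and the distributions of x and u are all hypothetical choices made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                      # sample size (SLR.2: a simple random sample)
beta0, beta1 = 1.0, 2.0      # hypothetical population parameters (SLR.1)

x = rng.uniform(0, 10, size=n)   # the x values are not all the same (SLR.3)
u = rng.normal(0, 1, size=n)     # E(u|x) = 0 (SLR.4) and Var(u|x) = 1 (SLR.5)
y = beta0 + beta1 * x + u        # the population model of SLR.1
```

Any data-generating process of this form satisfies all five assumptions; only the specific numbers are illustrative.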

Today we proved that, when assumptions SLR.1 through SLR.4 are satisfied, the OLS estimators are unbiased:

1.  SLR.1 – SLR.4 ⇒ the OLS estimators of β0 and β1 are unbiased
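This result can also be checked numerically. The Monte Carlo sketch below (with illustrative values β0 = 1 and β1 = 2, and arbitrary choices of sample size and distributions) repeatedly draws samples satisfying SLR.1–SLR.4 and averages the OLS slope estimates; the average should be close to the true β1:

```python
import numpy as np

rng = np.random.default_rng(42)
beta0, beta1 = 1.0, 2.0      # hypothetical true parameters
n, reps = 50, 2000           # sample size per draw, number of replications

slopes = []
for _ in range(reps):
    x = rng.uniform(0, 10, size=n)
    u = rng.normal(0, 1, size=n)      # E(u|x) = 0, so SLR.4 holds
    y = beta0 + beta1 * x + u         # population model (SLR.1)
    # OLS slope: sample covariance of (x, y) over sample variance of x
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    slopes.append(b1)

print(np.mean(slopes))   # close to 2.0: the estimator is centered on beta1
```

Each individual estimate varies from sample to sample, but the average across many replications settles near β1, which is what unbiasedness means.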

Exercise 1: Given this result, if SLR.4 is violated, can we conclude the OLS slope estimator will be biased?

Exercise 2: The proof for this unbiasedness result can be found in the box below.  Please skim the proof. Where in this proof did we make use of the SLR.1 assumption? What about SLR.2-SLR.4?

Review from Video 6.4: Proof that OLS slope estimator is unbiased
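Since the box itself is not reproduced here, the following is a standard sketch of the argument, using the notation of SLR.1–SLR.4:

```latex
% Substituting the population model y_i = \beta_0 + \beta_1 x_i + u_i
% (SLR.1, applied to each sampled observation via SLR.2):
\hat{\beta}_1
  = \frac{\sum_{i=1}^{n}(x_i-\bar{x})\,y_i}{\sum_{i=1}^{n}(x_i-\bar{x})^2}
  = \beta_1 + \frac{\sum_{i=1}^{n}(x_i-\bar{x})\,u_i}{\sum_{i=1}^{n}(x_i-\bar{x})^2}.
% SLR.3 guarantees the denominator is strictly positive, so the ratio
% is well defined. Conditioning on the sample x values and using
% E(u_i \mid x) = 0 (SLR.4):
E\!\left(\hat{\beta}_1 \,\middle|\, x_1,\dots,x_n\right)
  = \beta_1
  + \frac{\sum_{i=1}^{n}(x_i-\bar{x})\,E(u_i \mid x_1,\dots,x_n)}
         {\sum_{i=1}^{n}(x_i-\bar{x})^2}
  = \beta_1 .
```

By the law of iterated expectations, E(β̂1) = β1 unconditionally as well.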


Exercise 3: When we perform OLS on a simple regression, which of the following expressions are equal to zero, and why?


● E(u) and E(û)

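As a numerical aid for this exercise, the sketch below fits OLS to one simulated sample (the model and parameter values are illustrative) and checks the sample analogues: the OLS first-order conditions force the residuals to satisfy Σûi = 0 and Σxiûi = 0 in every sample, so the sample average of û is exactly zero:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = rng.uniform(0, 5, size=n)
y = 1.0 + 2.0 * x + rng.normal(0, 1, size=n)   # illustrative model

# OLS estimates of the intercept and slope
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

resid = y - (b0 + b1 * x)     # fitted residuals u-hat
print(np.sum(resid))          # zero up to floating-point error
print(np.sum(x * resid))      # zero up to floating-point error
```

Note the distinction the exercise is after: these sums are zero by construction in any sample, whereas statements about E(u) require the population assumptions.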