ECONOMICS 351* -- NOTE 4                                                            M.G. Abbott

ECON 351* -- NOTE 4: Statistical Properties of the OLS Coefficient Estimators

1. Introduction

We derived in Note 2 the OLS (Ordinary Least Squares) estimators $\hat{\beta}_j$ (j = 0, 1) of the regression coefficients $\beta_j$ (j = 0, 1) in the simple linear regression model given by the population regression equation, or PRE,

    $Y_i = \beta_0 + \beta_1 X_i + u_i$    (i = 1, ..., N)    ... (1)

where $u_i$ is an iid random error term. The OLS sample regression equation (SRE) corresponding to PRE (1) is

    $Y_i = \hat{\beta}_0 + \hat{\beta}_1 X_i + \hat{u}_i$    (i = 1, ..., N)    ... (2)

where $\hat{\beta}_0$ and $\hat{\beta}_1$ are the OLS coefficient estimators given by the formulas

    $\hat{\beta}_1 = \frac{\sum_i x_i y_i}{\sum_i x_i^2}$    ... (3)

    $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$    ... (4)

where $x_i \equiv X_i - \bar{X}$, $y_i \equiv Y_i - \bar{Y}$, $\bar{X} = \sum_i X_i / N$, and $\bar{Y} = \sum_i Y_i / N$.

Why Use the OLS Coefficient Estimators?

The reason we use these OLS coefficient estimators is that, under assumptions A1-A8 of the classical linear regression model, they have several desirable statistical properties. This note examines these desirable statistical properties primarily in terms of the OLS slope coefficient estimator $\hat{\beta}_1$; the same properties apply to the intercept coefficient estimator $\hat{\beta}_0$.

2. Statistical Properties of the OLS Slope Coefficient Estimator

PROPERTY 1: Linearity of $\hat{\beta}_1$

The OLS coefficient estimator $\hat{\beta}_1$ can be written as a linear function of the sample values of Y, the $Y_i$ (i = 1, ..., N).

Proof: Start with formula (3) for $\hat{\beta}_1$:

    $\hat{\beta}_1 = \frac{\sum_i x_i y_i}{\sum_i x_i^2}$
    $= \frac{\sum_i x_i (Y_i - \bar{Y})}{\sum_i x_i^2}$
    $= \frac{\sum_i x_i Y_i}{\sum_i x_i^2} - \frac{\bar{Y} \sum_i x_i}{\sum_i x_i^2}$
    $= \frac{\sum_i x_i Y_i}{\sum_i x_i^2}$    because $\sum_i x_i = 0$.

• Defining the observation weights $k_i = x_i / \sum_i x_i^2$ for i = 1, ..., N, we can re-write the last expression above for $\hat{\beta}_1$ as:

    $\hat{\beta}_1 = \sum_i k_i Y_i$    where $k_i \equiv \frac{x_i}{\sum_i x_i^2}$ (i = 1, ..., N)    ... (P1)

• Note that formula (3) and the definition of the weights $k_i$ imply that $\hat{\beta}_1$ is also a linear function of the $y_i$'s, such that $\hat{\beta}_1 = \sum_i k_i y_i$.

Result: The OLS slope coefficient estimator $\hat{\beta}_1$ is a linear function of the sample values $Y_i$ or $y_i$ (i = 1, ..., N), where the coefficient of $Y_i$ or $y_i$ is $k_i$.

Properties of the Weights $k_i$

In order to establish the remaining properties of $\hat{\beta}_1$, it is necessary to know the arithmetic properties of the weights $k_i$.

[K1] $\sum_i k_i = 0$, i.e., the weights $k_i$ sum to zero.

    $\sum_i k_i = \sum_i \frac{x_i}{\sum_i x_i^2} = \frac{1}{\sum_i x_i^2} \sum_i x_i = 0$, because $\sum_i x_i = 0$.

[K2] $\sum_i k_i^2 = \frac{1}{\sum_i x_i^2}$.

    $\sum_i k_i^2 = \sum_i \left( \frac{x_i}{\sum_i x_i^2} \right)^2 = \frac{\sum_i x_i^2}{\left( \sum_i x_i^2 \right)^2} = \frac{1}{\sum_i x_i^2}$.

[K3] $\sum_i k_i x_i = \sum_i k_i X_i$.

    $\sum_i k_i x_i = \sum_i k_i (X_i - \bar{X}) = \sum_i k_i X_i - \bar{X} \sum_i k_i = \sum_i k_i X_i$    since $\sum_i k_i = 0$ by [K1] above.

[K4] $\sum_i k_i x_i = 1$.

    $\sum_i k_i x_i = \sum_i \left( \frac{x_i}{\sum_i x_i^2} \right) x_i = \frac{\sum_i x_i^2}{\sum_i x_i^2} = 1$.

Implication: $\sum_i k_i X_i = 1$, by [K3] and [K4].
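The weight properties [K1]-[K4] are easy to confirm numerically. The following is a minimal sketch in Python (not part of the original note); the regressor values are arbitrary illustrative numbers.

```python
import numpy as np

# Numerical check of the weight properties [K1]-[K4].
# The regressor values below are purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(10.0, 3.0, size=25)            # sample values X_i of the regressor
x = X - X.mean()                              # deviations x_i = X_i - X-bar
k = x / np.sum(x**2)                          # OLS weights k_i = x_i / sum_i x_i^2

print(np.isclose(k.sum(), 0.0))                       # [K1] sum_i k_i = 0
print(np.isclose(np.sum(k**2), 1.0 / np.sum(x**2)))   # [K2] sum_i k_i^2 = 1 / sum_i x_i^2
print(np.isclose(np.sum(k * x), np.sum(k * X)))       # [K3] sum_i k_i x_i = sum_i k_i X_i
print(np.isclose(np.sum(k * x), 1.0))                 # [K4] sum_i k_i x_i = 1
```

Each check should print True for any sample in which the X_i are not all equal.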
PROPERTY 2: Unbiasedness of $\hat{\beta}_1$ and $\hat{\beta}_0$

The OLS coefficient estimator $\hat{\beta}_1$ is unbiased, meaning that $E(\hat{\beta}_1) = \beta_1$. The OLS coefficient estimator $\hat{\beta}_0$ is unbiased, meaning that $E(\hat{\beta}_0) = \beta_0$.

• Definition of unbiasedness: The coefficient estimator $\hat{\beta}_1$ is unbiased if and only if $E(\hat{\beta}_1) = \beta_1$; i.e., its mean or expectation is equal to the true coefficient $\beta_1$.

Proof of unbiasedness of $\hat{\beta}_1$: Start with the formula $\hat{\beta}_1 = \sum_i k_i Y_i$.

1. Since assumption A1 states that the PRE is $Y_i = \beta_0 + \beta_1 X_i + u_i$,

    $\hat{\beta}_1 = \sum_i k_i Y_i$
    $= \sum_i k_i (\beta_0 + \beta_1 X_i + u_i)$    since $Y_i = \beta_0 + \beta_1 X_i + u_i$ by A1
    $= \beta_0 \sum_i k_i + \beta_1 \sum_i k_i X_i + \sum_i k_i u_i$
    $= \beta_1 + \sum_i k_i u_i$,    since $\sum_i k_i = 0$ and $\sum_i k_i X_i = 1$.

2. Now take expectations of the above expression for $\hat{\beta}_1$, conditional on the sample values $\{X_i: i = 1, \ldots, N\}$ of the regressor X. Conditioning on the sample values of the regressor X means that the $k_i$ are treated as nonrandom, since the $k_i$ are functions only of the $X_i$.

    $E(\hat{\beta}_1) = E(\beta_1) + E\left[ \sum_i k_i u_i \right]$
    $= \beta_1 + \sum_i k_i E(u_i \mid X_i)$    since $\beta_1$ is a constant and the $k_i$ are nonrandom
    $= \beta_1 + \sum_i k_i \cdot 0$    since $E(u_i \mid X_i) = 0$ by assumption A2
    $= \beta_1$.

Result: The OLS slope coefficient estimator $\hat{\beta}_1$ is an unbiased estimator of the slope coefficient $\beta_1$: that is,

    $E(\hat{\beta}_1) = \beta_1$.    ... (P2)

Proof of unbiasedness of $\hat{\beta}_0$: Start with the formula $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$.

1. Average the PRE $Y_i = \beta_0 + \beta_1 X_i + u_i$ across i:

    $\sum_{i=1}^{N} Y_i = N \beta_0 + \beta_1 \sum_{i=1}^{N} X_i + \sum_{i=1}^{N} u_i$    (sum the PRE over the N observations)

    $\frac{\sum_{i=1}^{N} Y_i}{N} = \beta_0 + \beta_1 \frac{\sum_{i=1}^{N} X_i}{N} + \frac{\sum_{i=1}^{N} u_i}{N}$    (divide by N)

    $\bar{Y} = \beta_0 + \beta_1 \bar{X} + \bar{u}$    where $\bar{Y} = \sum_i Y_i / N$, $\bar{X} = \sum_i X_i / N$, and $\bar{u} = \sum_i u_i / N$.

2. Substitute the above expression for $\bar{Y}$ into the formula $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$:

    $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$
    $= \beta_0 + \beta_1 \bar{X} + \bar{u} - \hat{\beta}_1 \bar{X}$    since $\bar{Y} = \beta_0 + \beta_1 \bar{X} + \bar{u}$
    $= \beta_0 + (\beta_1 - \hat{\beta}_1) \bar{X} + \bar{u}$.

3. Now take the expectation of $\hat{\beta}_0$ conditional on the sample values $\{X_i: i = 1, \ldots, N\}$ of the regressor X. Conditioning on the $X_i$ means that $\bar{X}$ is treated as nonrandom in taking expectations, since $\bar{X}$ is a function only of the $X_i$.

    $E(\hat{\beta}_0) = E(\beta_0) + E[(\beta_1 - \hat{\beta}_1) \bar{X}] + E(\bar{u})$
    $= \beta_0 + \bar{X} E(\beta_1 - \hat{\beta}_1) + E(\bar{u})$    since $\beta_0$ is a constant
    $= \beta_0 + \bar{X} E(\beta_1 - \hat{\beta}_1)$    since $E(\bar{u}) = 0$ by assumptions A2 and A5
    $= \beta_0 + \bar{X} [E(\beta_1) - E(\hat{\beta}_1)]$
    $= \beta_0 + \bar{X} (\beta_1 - \beta_1)$    since $E(\beta_1) = \beta_1$ and $E(\hat{\beta}_1) = \beta_1$
    $= \beta_0$.

Result: The OLS intercept coefficient estimator $\hat{\beta}_0$ is an unbiased estimator of the intercept coefficient $\beta_0$: that is,

    $E(\hat{\beta}_0) = \beta_0$.    ... (P2)
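Unbiasedness is a statement about the sampling distribution of the estimators: holding the regressor values fixed and redrawing the errors many times, the average of the OLS estimates should be close to the true coefficients. The Monte Carlo sketch below (not part of the original note) illustrates this; the true values beta0 = 2, beta1 = 0.5, the error standard deviation, and the sample size are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo illustration of unbiasedness (P2) under assumptions A1-A2:
# the PRE generates Y, and the errors have zero conditional mean.
rng = np.random.default_rng(1)
beta0, beta1, sigma, N, reps = 2.0, 0.5, 1.0, 50, 20_000   # illustrative values

X = rng.uniform(0.0, 10.0, size=N)        # regressor values, held fixed across replications
x = X - X.mean()

b1_draws = np.empty(reps)
b0_draws = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=N)    # iid errors with E(u_i | X_i) = 0
    Y = beta0 + beta1 * X + u             # the PRE (1)
    b1 = np.sum(x * (Y - Y.mean())) / np.sum(x**2)   # formula (3)
    b0 = Y.mean() - b1 * X.mean()                    # formula (4)
    b1_draws[r] = b1
    b0_draws[r] = b0

# The simulated means of the estimates should be close to beta1 and beta0.
print(b1_draws.mean(), "vs true beta1 =", beta1)
print(b0_draws.mean(), "vs true beta0 =", beta0)
```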
PROPERTY 3: Variance of $\hat{\beta}_1$

• Definition: The variance of the OLS slope coefficient estimator $\hat{\beta}_1$ is defined as

    $\mathrm{Var}(\hat{\beta}_1) \equiv E\{[\hat{\beta}_1 - E(\hat{\beta}_1)]^2\}$.

• Derivation of an expression for $\mathrm{Var}(\hat{\beta}_1)$:

1. Since $\hat{\beta}_1$ is an unbiased estimator of $\beta_1$, $E(\hat{\beta}_1) = \beta_1$. The variance of $\hat{\beta}_1$ can therefore be written as

    $\mathrm{Var}(\hat{\beta}_1) = E\{[\hat{\beta}_1 - \beta_1]^2\}$.

2. From part (1) of the unbiasedness proofs above, the term $[\hat{\beta}_1 - \beta_1]$, which is called the sampling error of $\hat{\beta}_1$, is given by

    $[\hat{\beta}_1 - \beta_1] = \sum_i k_i u_i$.

3. The square of the sampling error is therefore

    $[\hat{\beta}_1 - \beta_1]^2 = \left( \sum_i k_i u_i \right)^2$.

4. Since the square of a sum is equal to the sum of the squares plus twice the sum of the cross products,

    $[\hat{\beta}_1 - \beta_1]^2 = \left( \sum_i k_i u_i \right)^2 = \sum_{i=1}^{N} k_i^2 u_i^2 + 2 \sum\sum_{i<s} k_i k_s u_i u_s$.

For example, if the summation involved only three terms, the square of the sum would be

    $\left( \sum_{i=1}^{3} k_i u_i \right)^2 = (k_1 u_1 + k_2 u_2 + k_3 u_3)^2$
    $= k_1^2 u_1^2 + k_2^2 u_2^2 + k_3^2 u_3^2 + 2 k_1 k_2 u_1 u_2 + 2 k_1 k_3 u_1 u_3 + 2 k_2 k_3 u_2 u_3$.

5. Now use assumptions A3 and A4 of the classical linear regression model (CLRM):

    (A3) $\mathrm{Var}(u_i \mid X_i) = E(u_i^2 \mid X_i) = \sigma^2 > 0$ for all i = 1, ..., N;
    (A4) $\mathrm{Cov}(u_i, u_s \mid X_i, X_s) = E(u_i u_s \mid X_i, X_s) = 0$ for all $i \neq s$.

6. We take expectations conditional on the sample values of the regressor X:

    $E[(\hat{\beta}_1 - \beta_1)^2] = \sum_{i=1}^{N} k_i^2 E(u_i^2 \mid X_i) + 2 \sum\sum_{i<s} k_i k_s E(u_i u_s \mid X_i, X_s)$
    $= \sum_{i=1}^{N} k_i^2 E(u_i^2 \mid X_i)$    since $E(u_i u_s \mid X_i, X_s) = 0$ for $i \neq s$ by (A4)
    $= \sum_{i=1}^{N} k_i^2 \sigma^2$    since $E(u_i^2 \mid X_i) = \sigma^2$ for all i by (A3)
    $= \frac{\sigma^2}{\sum_i x_i^2}$    since $\sum_i k_i^2 = \frac{1}{\sum_i x_i^2}$ by [K2].

Result: The variance of the OLS slope coefficient estimator $\hat{\beta}_1$ is

    $\mathrm{Var}(\hat{\beta}_1) = \frac{\sigma^2}{\sum_i x_i^2} = \frac{\sigma^2}{\sum_i (X_i - \bar{X})^2} = \frac{\sigma^2}{\mathrm{TSS}_X}$    where $\mathrm{TSS}_X = \sum_i x_i^2$.    ... (P3)

The standard error of $\hat{\beta}_1$ is the square root of the variance: i.e.,

    $\mathrm{se}(\hat{\beta}_1) = \sqrt{\mathrm{Var}(\hat{\beta}_1)} = \left( \frac{\sigma^2}{\sum_i x_i^2} \right)^{1/2} = \frac{\sigma}{\sqrt{\sum_i x_i^2}} = \frac{\sigma}{\sqrt{\mathrm{TSS}_X}}$.

PROPERTY 4: Variance of $\hat{\beta}_0$ (given without proof)

Result: The variance of the OLS intercept coefficient estimator $\hat{\beta}_0$ is

    $\mathrm{Var}(\hat{\beta}_0) = \frac{\sigma^2 \sum_i X_i^2}{N \sum_i x_i^2} = \frac{\sigma^2 \sum_i X_i^2}{N \sum_i (X_i - \bar{X})^2}$.    ... (P4)

The standard error of $\hat{\beta}_0$ is the square root of the variance: i.e.,

    $\mathrm{se}(\hat{\beta}_0) = \sqrt{\mathrm{Var}(\hat{\beta}_0)} = \left( \frac{\sigma^2 \sum_i X_i^2}{N \sum_i x_i^2} \right)^{1/2}$.
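As a small worked complement (not part of the original note), formulas (P3) and (P4) can be evaluated directly from the regressor values once $\sigma^2$ is known; in practice $\sigma^2$ is unknown and must be estimated. The function name and the sample values below are hypothetical illustrations.

```python
import numpy as np

# A minimal sketch of formulas (P3) and (P4), assuming the error variance
# sigma^2 is known.  Standard errors are the square roots of the variances.
def ols_coefficient_variances(X, sigma2):
    """Return (Var(beta1_hat), Var(beta0_hat)) implied by (P3) and (P4)."""
    X = np.asarray(X, dtype=float)
    N = X.size
    tss_x = np.sum((X - X.mean())**2)              # TSS_X = sum_i (X_i - X-bar)^2
    var_b1 = sigma2 / tss_x                        # (P3)
    var_b0 = sigma2 * np.sum(X**2) / (N * tss_x)   # (P4)
    return var_b1, var_b0

# Example with illustrative regressor values and sigma^2 = 2.5.
X = np.array([1.0, 2.0, 4.0, 5.0, 8.0])
var_b1, var_b0 = ols_coefficient_variances(X, sigma2=2.5)
print(np.sqrt(var_b1), np.sqrt(var_b0))            # se(beta1_hat), se(beta0_hat)
```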