Lecture 4: Multivariate Regression Model in Matrix Form

Takashi Yamano
Lecture Notes on Advanced Econometrics

In this lecture, we rewrite the multiple regression model in matrix form. A general multiple regression model can be written as

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + u_i \qquad \text{for } i = 1, \ldots, n.$$

In matrix form, we can rewrite this model as

$$\underbrace{\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}}_{n \times 1} = \underbrace{\begin{bmatrix} 1 & x_{11} & x_{12} & \cdots & x_{1k} \\ 1 & x_{21} & x_{22} & \cdots & x_{2k} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_{n1} & x_{n2} & \cdots & x_{nk} \end{bmatrix}}_{n \times (k+1)} \underbrace{\begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix}}_{(k+1) \times 1} + \underbrace{\begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}}_{n \times 1}$$

or, compactly,

$$Y = X\beta + u.$$

We want to estimate $\beta$.

Least Squares Residual Approach in Matrix Form

(Please see Lecture Note A1 for details.)

The strategy in the least squares residual approach is the same as in the bivariate linear regression model. First, we calculate the sum of squared residuals and, second, find a set of estimators that minimizes that sum. Thus, the minimization problem in matrix form is

$$\min_{\beta}\ u'u = (Y - X\beta)'(Y - X\beta).$$

Notice here that $u'u$ is a scalar (a number such as 10,000) because $u'$ is a $1 \times n$ matrix and $u$ is an $n \times 1$ matrix, so the product of these two matrices is a $1 \times 1$ matrix. We can therefore take the first derivative of this objective function in matrix form. First, we expand the product:

$$u'u = (Y' - \beta'X')(Y - X\beta) = Y'Y - \beta'X'Y - Y'X\beta + \beta'X'X\beta = Y'Y - 2\beta'X'Y + \beta'X'X\beta,$$

where the two middle terms combine because $Y'X\beta$ is a scalar and therefore equals its own transpose, $\beta'X'Y$. Then, taking the first derivative with respect to $\beta$, we have

$$\frac{\partial(u'u)}{\partial \beta} = -2X'Y + 2X'X\beta.$$

From the first-order condition (F.O.C.), we have

$$-2X'Y + 2X'X\hat\beta = 0$$
$$X'X\hat\beta = X'Y.$$

Notice that $\beta$ has been replaced with $\hat\beta$ because $\hat\beta$ satisfies the F.O.C. by definition. Multiplying both sides by the inverse matrix $(X'X)^{-1}$, we have

$$\hat\beta = (X'X)^{-1}X'Y. \qquad (1)$$

This is the least squares estimator for the multivariate linear regression model in matrix form. We call it the ordinary least squares (OLS) estimator. Note that the first-order condition can be written in matrix form as

$$X'(Y - X\hat\beta) = 0,$$

that is,

$$\underbrace{\begin{bmatrix} 1 & 1 & \cdots & 1 \\ x_{11} & x_{21} & \cdots & x_{n1} \\ \vdots & \vdots & & \vdots \\ x_{1k} & x_{2k} & \cdots & x_{nk} \end{bmatrix}}_{(k+1) \times n} \underbrace{\begin{bmatrix} y_1 - \hat\beta_0 - \hat\beta_1 x_{11} - \cdots - \hat\beta_k x_{1k} \\ y_2 - \hat\beta_0 - \hat\beta_1 x_{21} - \cdots - \hat\beta_k x_{2k} \\ \vdots \\ y_n - \hat\beta_0 - \hat\beta_1 x_{n1} - \cdots - \hat\beta_k x_{nk} \end{bmatrix}}_{n \times 1} = 0.$$

These are the same $k+1$ first-order conditions we derived in the previous lecture note (on the simple regression model):

$$\sum_{i=1}^{n} (y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \hat\beta_2 x_{i2} - \cdots - \hat\beta_k x_{ik}) = 0$$
$$\sum_{i=1}^{n} x_{i1}(y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \hat\beta_2 x_{i2} - \cdots - \hat\beta_k x_{ik}) = 0$$
$$\vdots$$
$$\sum_{i=1}^{n} x_{ik}(y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \hat\beta_2 x_{i2} - \cdots - \hat\beta_k x_{ik}) = 0.$$

Example 4-1: A bivariate linear regression (k = 1) in matrix form

As an example, let's consider a bivariate model in matrix form. A bivariate model is

$$y_i = \beta_0 + \beta_1 x_i + u_i \qquad \text{for } i = 1, \ldots, n.$$

In matrix form, this is $Y = X\beta + u$:

$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}.$$

From (1), we have

$$\hat\beta = (X'X)^{-1}X'Y. \qquad (2)$$

Let's consider each component in (2). The first is

$$X'X = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \end{bmatrix} \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix} = \begin{bmatrix} n & \sum_{i=1}^{n} x_i \\ \sum_{i=1}^{n} x_i & \sum_{i=1}^{n} x_i^2 \end{bmatrix} = \begin{bmatrix} n & n\bar{x} \\ n\bar{x} & \sum_{i=1}^{n} x_i^2 \end{bmatrix}.$$

This is a 2 x 2 square matrix. Thus, the inverse matrix of $X'X$ is

$$(X'X)^{-1} = \frac{1}{n\sum_{i=1}^{n} x_i^2 - n^2\bar{x}^2} \begin{bmatrix} \sum_{i=1}^{n} x_i^2 & -n\bar{x} \\ -n\bar{x} & n \end{bmatrix} = \frac{1}{n\sum_{i=1}^{n}(x_i - \bar{x})^2} \begin{bmatrix} \sum_{i=1}^{n} x_i^2 & -n\bar{x} \\ -n\bar{x} & n \end{bmatrix},$$

using $\sum_{i=1}^{n} x_i^2 - n\bar{x}^2 = \sum_{i=1}^{n} (x_i - \bar{x})^2$. The second component is

$$X'Y = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \end{bmatrix} \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} \sum_{i=1}^{n} y_i \\ \sum_{i=1}^{n} x_i y_i \end{bmatrix} = \begin{bmatrix} n\bar{y} \\ \sum_{i=1}^{n} x_i y_i \end{bmatrix}.$$

Thus the OLS estimators are

$$\hat\beta = (X'X)^{-1}X'Y = \frac{1}{n\sum_{i=1}^{n}(x_i - \bar{x})^2} \begin{bmatrix} \sum x_i^2 & -n\bar{x} \\ -n\bar{x} & n \end{bmatrix} \begin{bmatrix} n\bar{y} \\ \sum x_i y_i \end{bmatrix}$$

$$= \frac{1}{n\sum_{i=1}^{n}(x_i - \bar{x})^2} \begin{bmatrix} n\bar{y}\sum x_i^2 - n\bar{x}\sum x_i y_i \\ -n^2\bar{x}\bar{y} + n\sum x_i y_i \end{bmatrix} = \frac{1}{\sum_{i=1}^{n}(x_i - \bar{x})^2} \begin{bmatrix} \bar{y}\sum x_i^2 - \bar{x}\sum x_i y_i \\ \sum x_i y_i - n\bar{x}\bar{y} \end{bmatrix}.$$

Adding and subtracting $n\bar{x}^2\bar{y}$ in the first entry, and noting that $\sum x_i y_i - n\bar{x}\bar{y} = \sum (x_i - \bar{x})(y_i - \bar{y})$, this becomes

$$= \frac{1}{\sum_{i=1}^{n}(x_i - \bar{x})^2} \begin{bmatrix} \bar{y}\left(\sum x_i^2 - n\bar{x}^2\right) - \bar{x}\left(\sum x_i y_i - n\bar{x}\bar{y}\right) \\ \sum (x_i - \bar{x})(y_i - \bar{y}) \end{bmatrix} = \begin{bmatrix} \bar{y} - \hat\beta_1 \bar{x} \\[4pt] \dfrac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2} \end{bmatrix} = \begin{bmatrix} \hat\beta_0 \\ \hat\beta_1 \end{bmatrix}.$$

This is what you studied in the previous lecture note.

End of Example 4-1
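Formula (2) is easy to verify numerically. Below is a minimal STATA sketch, not part of the original notes: it simulates a bivariate data set (the variable names x1, u, y, const and the matrix names X, Y, b are mine, chosen for illustration) and builds $\hat\beta = (X'X)^{-1}X'Y$ by hand, with the built-in regress command as a cross-check.

    * Sketch: compute betahat = (X'X)^(-1) X'Y, equation (2), on simulated data
    clear
    set seed 123
    set obs 200
    generate x1 = rnormal()
    generate u  = rnormal()
    generate y  = 1 + 2*x1 + u        // true beta0 = 1, beta1 = 2
    generate const = 1                // column of ones for the intercept
    mkmat y, matrix(Y)                // Y is n x 1
    mkmat x1 const, matrix(X)         // X is n x 2 (slope column first)
    matrix b = syminv(X'*X)*X'*Y      // equation (2)
    matrix list b                     // row x1: slope; row const: intercept
    regress y x1                      // built-in OLS; should match b

The coefficients reported by regress should match b up to rounding, since regress performs exactly this matrix calculation internally.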
Unbiasedness of OLS

In this sub-section, we show the unbiasedness of OLS under the following assumptions.

Assumptions:

E1 (Linear in parameters): $Y = X\beta + u$.
E2 (Zero conditional mean): $E(u \mid X) = 0$.
E3 (No perfect collinearity): $X$ has full column rank, $k+1$.

From (2), we know the OLS estimators are

$$\hat\beta = (X'X)^{-1}X'Y.$$

We can replace $Y$ with the population model (E1):

$$\hat\beta = (X'X)^{-1}X'(X\beta + u) = (X'X)^{-1}X'X\beta + (X'X)^{-1}X'u = \beta + (X'X)^{-1}X'u.$$

By taking the expectation on both sides of the equation (conditional on $X$), we have

$$E(\hat\beta) = \beta + (X'X)^{-1}E(X'u).$$

From E2, we have $E(u \mid X) = 0$, so the second term vanishes. Thus,

$$E(\hat\beta) = \beta.$$

Under the assumptions E1-E3, the OLS estimators are unbiased.

The Variance of OLS Estimators

Next, we consider the variance of the estimators.

Assumption:

E4 (Homoskedasticity): $Var(u_i \mid X) = \sigma^2$ and $Cov(u_i, u_j) = 0$ for $i \neq j$; thus $Var(u \mid X) = \sigma^2 I$.

Because of this assumption, we have

$$E(uu') = E\left( \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix} \begin{bmatrix} u_1 & u_2 & \cdots & u_n \end{bmatrix} \right) = \begin{bmatrix} E(u_1u_1) & E(u_1u_2) & \cdots & E(u_1u_n) \\ E(u_2u_1) & E(u_2u_2) & \cdots & E(u_2u_n) \\ \vdots & \vdots & \ddots & \vdots \\ E(u_nu_1) & E(u_nu_2) & \cdots & E(u_nu_n) \end{bmatrix} = \begin{bmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{bmatrix} = \sigma^2 I,$$

where $u$ is $n \times 1$, $u'$ is $1 \times n$, and each matrix on the right-hand side is $n \times n$. Therefore,

$$Var(\hat\beta) = Var[\beta + (X'X)^{-1}X'u] = Var[(X'X)^{-1}X'u]$$
$$= E[(X'X)^{-1}X'uu'X(X'X)^{-1}]$$
$$= (X'X)^{-1}X'E(uu')X(X'X)^{-1}$$
$$= (X'X)^{-1}X'\,\sigma^2 I\,X(X'X)^{-1} \qquad \text{(E4: homoskedasticity)}$$
$$= \sigma^2 (X'X)^{-1}X'X(X'X)^{-1},$$

so that

$$Var(\hat\beta) = \sigma^2 (X'X)^{-1}. \qquad (3)$$

GAUSS-MARKOV Theorem: Under assumptions E1-E4, $\hat\beta$ is the Best Linear Unbiased Estimator (BLUE).
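Equation (3) can also be checked numerically. In practice $\sigma^2$ is unknown, so packages replace it with the estimate $s^2 = \hat{u}'\hat{u}/(n-k-1)$; after estimation, regress stores $s$ as e(rmse). A minimal sketch, continuing the simulated data and the matrix X from the sketch in Example 4-1 (again, the matrix names V and Vhand are mine):

    * Sketch: rebuild Var(betahat) = s^2 (X'X)^(-1), equation (3), and
    * compare it with the variance matrix that -regress- stores in e(V)
    quietly regress y x1
    matrix V = e(V)                          // what regress computed
    matrix Vhand = e(rmse)^2*syminv(X'*X)    // s^2 (X'X)^(-1) by hand
    matrix list V
    matrix list Vhand                        // should match V entry by entry

The standard errors printed by regress are simply the square roots of the diagonal entries of this matrix.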
Example 4-2: Step-by-step regression estimation with STATA

In this sub-section, I would like to show you how the matrix calculations we have studied are used in econometrics packages. Of course, in practice you do not write the matrix programs yourself: econometrics packages already have them built in. The following are matrix calculations in STATA using the data set NFIncomeUganda.dta. Here we want to estimate the following model:

$$\ln(income)_i = \beta_0 + \beta_1\, female_i + \beta_2\, edu_i + \beta_3\, edusq_i + u_i.$$

All the variables are defined in Example 3-1. Descriptive statistics for the variables are here:

. su;

    Variable |       Obs        Mean    Std. Dev.       Min        Max
-------------+--------------------------------------------------------
      female |       648    .2222222    .4160609          0          1
         edu |       648    6.476852    4.198633         -8         19
       edusq |       648    59.55093    63.28897          0        361
   ln_income |       648    12.81736    1.505715   7.600903   16.88356

First, we need to define matrices. In STATA, you can load specific variables (data) into matrices. The command is called mkmat. Here we create a matrix, called y, containing the dependent variable, ln_nfincome, and a matrix, called x, containing the independent variables female, edu, and edusq together with a column of ones (const) for the intercept:

. mkmat ln_nfincome , matrix(y);
. gen const = 1;
. mkmat female edu edusq const , matrix(x);

Then, we create the components $X'X$, $(X'X)^{-1}$, and $X'Y$:

. matrix xx = x'*x;
. mat list xx;

symmetric xx[4,4]
            female        edu      edusq      const
female         144
edu            878      38589
edusq         8408     407073    4889565
const          144       4197      38589        648

. matrix ixx = syminv(xx);
. mat list ixx;

symmetric ixx[4,4]
            female        edu      edusq      const
female    .0090144
edu      .00021374  .00053764
edusq   -.00001238 -.00003259  2.361e-06
const   -.00265043  -.0015892  .00007321  .00806547

Here is $X'Y$:

. matrix xy = x'*y;
. mat list xy;

xy[4,1]
        ln_nfincome
female    1775.6364
edu       55413.766
edusq     519507.74
const     8305.6492

Therefore, the OLS estimators are $(X'X)^{-1}X'Y$.
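A minimal sketch of this final multiplication (the matrix name b is mine; ixx and xy are the matrices defined above), with the built-in regress command as a cross-check:

. matrix b = ixx*xy;
. mat list b;
. regress ln_nfincome female edu edusq;

The column b should contain the four OLS coefficient estimates, and regress should report exactly the same values, since it computes the same $(X'X)^{-1}X'Y$ internally, along with standard errors taken from the diagonal of $s^2(X'X)^{-1}$ in equation (3).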
