Regression with Polynomials and Interactions

Nathaniel E. Helwig
Assistant Professor of Psychology and Statistics
University of Minnesota (Twin Cities)
Updated 04-Jan-2017

Copyright © 2017 by Nathaniel E. Helwig

Outline of Notes

1) Polynomial Regression:
   - Polynomials review
   - Model form
   - Model assumptions
   - Ordinary least squares
   - Orthogonal polynomials
   - Example: MPG vs HP

2) Interactions in Regression:
   - Overview
   - Nominal*Continuous
   - Example #1: Real Estate
   - Example #2: Depression
   - Continuous*Continuous
   - Example #3: Oceanography

Polynomial Regression

Polynomial Function: Definition

Reminder: a polynomial function has the form

f(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \cdots + a_n x^n = \sum_{j=0}^{n} a_j x^j

where a_j \in \mathbb{R} are the coefficients and x is the indeterminate (variable).

Note: x^j is the j-th order polynomial term; x is the first order term, x^2 is the second order term, etc. The degree of a polynomial is its highest order term.

Polynomial Function: Simple Regression

> x=seq(-1,1,length=50)
> y=2+2*(x^2)
> plot(x,y,main="Quadratic")
> qmod=lm(y~x)
> abline(qmod)

> x=seq(-1,1,length=50)
> y=2+2*(x^3)
> plot(x,y,main="Cubic")
> cmod=lm(y~x)
> abline(cmod)

[Figure: scatterplots of the quadratic (left) and cubic (right) functions of x, each overlaid with the fitted simple linear regression line.]

Model Form (scalar)

The polynomial regression model has the form

y_i = b_0 + \sum_{j=1}^{p} b_j x_i^j + e_i

for i \in \{1, \ldots, n\}, where
   - y_i \in \mathbb{R} is the real-valued response for the i-th observation
   - b_0 \in \mathbb{R} is the regression intercept
   - b_j \in \mathbb{R} is the regression slope for the j-th degree polynomial term
   - x_i \in \mathbb{R} is the predictor for the i-th observation
   - e_i \overset{iid}{\sim} N(0, \sigma^2) is a Gaussian error term

Model Form (matrix)

The polynomial regression model has the form

y = Xb + e

or

\begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{pmatrix} =
\begin{pmatrix}
1 & x_1 & x_1^2 & \cdots & x_1^p \\
1 & x_2 & x_2^2 & \cdots & x_2^p \\
1 & x_3 & x_3^2 & \cdots & x_3^p \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n & x_n^2 & \cdots & x_n^p
\end{pmatrix}
\begin{pmatrix} b_0 \\ b_1 \\ b_2 \\ \vdots \\ b_p \end{pmatrix} +
\begin{pmatrix} e_1 \\ e_2 \\ e_3 \\ \vdots \\ e_n \end{pmatrix}

Note that this is still a linear model, even though we have polynomial terms in the design matrix.
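To make the model form concrete, here is a minimal R sketch (not from the slides; the simulated data and object names are illustrative) showing that a cubic polynomial regression is fit with lm() like any other linear model, using the raw polynomial columns as the design matrix:

# Illustrative simulated data (assumed, not from the notes)
set.seed(1)
n <- 100
x <- runif(n, -1, 1)
y <- 2 + x - 3*x^2 + 0.5*x^3 + rnorm(n, sd = 0.25)   # true cubic relationship

# Fit using raw polynomial terms via I()
pmod <- lm(y ~ x + I(x^2) + I(x^3))
coef(pmod)

# Equivalent fit: poly(x, 3, raw = TRUE) builds the same columns [x, x^2, x^3]
pmod2 <- lm(y ~ poly(x, 3, raw = TRUE))
coef(pmod2)

Both calls produce the same estimated coefficients; the only difference is how the columns of the design matrix are labeled.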
PR Model Assumptions (scalar)

The fundamental assumptions of the PR model are:
1. The relationship between X and Y is polynomial
2. x_i and y_i are observed random variables (known constants)
3. e_i \overset{iid}{\sim} N(0, \sigma^2) is an unobserved random variable
4. b_0, b_1, \ldots, b_p are unknown constants
5. (y_i \mid x_i) \overset{ind}{\sim} N(b_0 + \sum_{j=1}^{p} b_j x_i^j, \sigma^2) -- note: homogeneity of variance

Note: the focus is estimation of the polynomial curve.

PR Model Assumptions (matrix)

In matrix terms, the error vector is multivariate normal:

e \sim N(\mathbf{0}_n, \sigma^2 \mathbf{I}_n)

In matrix terms, the response vector is multivariate normal given X:

(y \mid X) \sim N(Xb, \sigma^2 \mathbf{I}_n)

Polynomial Regression: Properties

Some important properties of the PR model include:
1. Need n > p to fit the polynomial regression model
2. Setting p = 1 produces simple linear regression
3. Setting p = 2 is quadratic polynomial regression
4. Setting p = 3 is cubic polynomial regression
5. Rarely set p > 3; use a cubic spline instead

Polynomial Regression: OLS Estimation

The ordinary least squares (OLS) problem is

\min_{b \in \mathbb{R}^{p+1}} \| y - Xb \|^2

where \| \cdot \| denotes the Frobenius norm. The OLS solution has the form

\hat{b} = (X'X)^{-1} X'y

which is the same formula from SLR and MLR!

Fitted Values and Residuals

SCALAR FORM: Fitted values are given by \hat{y}_i = \hat{b}_0 + \sum_{j=1}^{p} \hat{b}_j x_i^j, and residuals are given by \hat{e}_i = y_i - \hat{y}_i.

MATRIX FORM: Fitted values are given by \hat{y} = X\hat{b} = Hy, and residuals are given by \hat{e} = y - \hat{y} = (\mathbf{I}_n - H)y.

Estimated Error Variance (Mean Squared Error)

The estimated error variance is

\hat{\sigma}^2 = SSE/(n - p - 1) = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 / (n - p - 1) = \|(\mathbf{I}_n - H)y\|^2 / (n - p - 1)

which is an unbiased estimate of the error variance \sigma^2. The estimate \hat{\sigma}^2 is the mean squared error (MSE) of the model.

Distribution of Estimator, Fitted Values, and Residuals

Just like in SLR and MLR, the PR assumptions imply that

\hat{b} \sim N(b, \sigma^2 (X'X)^{-1})
\hat{y} \sim N(Xb, \sigma^2 H)
\hat{e} \sim N(\mathbf{0}, \sigma^2 (\mathbf{I}_n - H))

Typically \sigma^2 is unknown, so we use the MSE \hat{\sigma}^2 in practice.

Multicollinearity: Problem

Note that x_i, x_i^2, x_i^3, etc. can be highly correlated with one another, which introduces a multicollinearity problem.

> set.seed(123)
> x = runif(100)*2
> X = cbind(x, xsq=x^2, xcu=x^3)
> cor(X)
            x       xsq       xcu
x   1.0000000 0.9703084 0.9210726
xsq 0.9703084 1.0000000 0.9866033
xcu 0.9210726 0.9866033 1.0000000
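As a check on the formulas above, here is a small sketch (again with illustrative simulated data, not from the slides) that computes the OLS estimate, hat matrix, and MSE by hand and compares them with lm():

# Illustrative simulated cubic data (assumed)
set.seed(1)
n <- 100
x <- runif(n, -1, 1)
y <- 2 + x - 3*x^2 + 0.5*x^3 + rnorm(n, sd = 0.25)

X <- cbind(1, x, x^2, x^3)                  # design matrix with p = 3
bhat <- solve(t(X) %*% X, t(X) %*% y)       # bhat = (X'X)^{-1} X'y
H <- X %*% solve(t(X) %*% X) %*% t(X)       # hat matrix
yhat <- drop(H %*% y)                       # fitted values
ehat <- y - yhat                            # residuals
mse <- sum(ehat^2) / (n - 3 - 1)            # sigma-hat^2 = SSE / (n - p - 1)

# Should agree with lm()
fit <- lm(y ~ x + I(x^2) + I(x^3))
all.equal(as.numeric(bhat), unname(coef(fit)))   # TRUE
all.equal(mse, summary(fit)$sigma^2)             # TRUE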
Multicollinearity: Partial Solution

You could mean-center the x_i terms to reduce multicollinearity.

> set.seed(123)
> x = runif(100)*2
> x = x - mean(x)
> X = cbind(x, xsq=x^2, xcu=x^3)
> cor(X)
             x        xsq        xcu
x   1.00000000 0.03854803 0.91479660
xsq 0.03854803 1.00000000 0.04400704
xcu 0.91479660 0.04400704 1.00000000

But this doesn't fully solve our problem.

Orthogonal Polynomials: Definition

To deal with multicollinearity, define the set of variables

z_0 = a_0
z_1 = a_1 + b_1 x
z_2 = a_2 + b_2 x + c_2 x^2
z_3 = a_3 + b_3 x + c_3 x^2 + d_3 x^3

where the coefficients are chosen so that z_j' z_k = 0 for all j \neq k.

The transformed z_j variables are called orthogonal polynomials.

Orthogonal Polynomials: Orthogonal Projection

The orthogonal projection of a vector v = \{v_i\}_{n \times 1} onto the line spanned by the vector u = \{u_i\}_{n \times 1} is

\mathrm{proj}_u(v) = \frac{\langle v, u \rangle}{\langle u, u \rangle} u

where \langle u, v \rangle = u'v and \langle u, u \rangle = u'u denote the inner products.

http://thejuniverse.org/PUBLIC/LinearAlgebra/LOLA/dotProd/proj.html

Orthogonal Polynomials: Gram-Schmidt

We can use the Gram-Schmidt process to form orthogonal polynomials. Start with a linearly independent design matrix X = [x_0, x_1, x_2, x_3], where x_j = (x_1^j, \ldots, x_n^j)' is the j-th order polynomial vector.

The Gram-Schmidt algorithm forms a columnwise orthogonal matrix Z that spans the same column space as X:

z_0 = x_0
z_1 = x_1 - \mathrm{proj}_{z_0}(x_1)
z_2 = x_2 - \mathrm{proj}_{z_0}(x_2) - \mathrm{proj}_{z_1}(x_2)
z_3 = x_3 - \mathrm{proj}_{z_0}(x_3) - \mathrm{proj}_{z_1}(x_3) - \mathrm{proj}_{z_2}(x_3)

Orthogonal Polynomials: R Functions

Simple R function to orthogonalize an input matrix:

orthog <- function(X, normalize=FALSE){
  np = dim(X)
  Z = matrix(0, np[1], np[2])
  Z[,1] = X[,1]
  for(k in 2:np[2]){
    Z[,k] = X[,k]
    for(j in 1:(k-1)){
      Z[,k] = Z[,k] - Z[,j]*sum(Z[,k]*Z[,j]) / sum(Z[,j]^2)
    }
  }
  if(normalize){
    Z = Z %*% diag(colSums(Z^2)^-0.5)
  }
  Z
}
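A quick usage sketch for the orthog() function above (the simulated data and response are illustrative assumptions): with normalize=TRUE the returned columns are orthonormal, their pairwise correlations are numerically zero, and they span the same column space as the raw polynomial terms, so the fitted values match R's built-in poly():

set.seed(123)
x <- runif(100) * 2
Xraw <- cbind(1, x, x^2, x^3)            # raw (highly correlated) polynomial columns
Z <- orthog(Xraw, normalize = TRUE)      # orthonormal columns via Gram-Schmidt

round(crossprod(Z), 8)                   # Z'Z is (numerically) the identity matrix
cor(Z[, 2:4])                            # off-diagonal correlations are ~0

# Same column space as [1, x, x^2, x^3], so fitted values agree with poly()
y <- 2 + x - 3*x^2 + 0.5*x^3 + rnorm(100, sd = 0.25)   # assumed example response
fit1 <- lm(y ~ Z[, 2:4])
fit2 <- lm(y ~ poly(x, 3))
all.equal(fitted(fit1), fitted(fit2))    # TRUE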
