9 Polynomial Regression

9.1 Example

An experiment in chemical engineering was carried out to obtain information about the effect of temperature on a glass laminate. The following table gives values of a quantity known as "log damping decrement" (y) for ten controlled temperatures (x) in degrees Celsius.

      x      y
    -30  0.053
    -20  0.057
    -10  0.061
      0  0.068
     10  0.072
     20  0.081
     30  0.093
     40  0.105
     50  0.115
     60  0.130

The following output shows the construction of the data frame that contains the data. The plot of the data (Figure 1) suggests that there is a curvilinear relationship between y and x, which could be modelled by a polynomial curve. We begin by fitting a straight line to the data. Note that the function coef provides us with the estimated regression coefficients to more decimal places than does the function summary. The fitted line,

\[ y = 0.070745455 + 0.000850303\,x, \]

explains almost 96% of the variation in the response variable, which looks very impressive, and the F-statistic for the regression is highly significant. However, to check the adequacy of the model, we should inspect the residuals, which we obtain using the function residuals and here store as the vector res1. A plot of the residuals against x (Figure 2) exhibits a very strong pattern, which shows that the straight-line model is inadequate and indicates that we might fit a quadratic to the data.

[Figure 1: Plot of y vs. x]

> x <- seq(-30,60,by=10)
> y <- c(0.053,0.057,0.061,0.068,0.072,0.081,0.093,0.105,0.115,0.130)
> glass <- data.frame(y,x)
> plot(x,y)
> glass1.lm <- lm(y~x, data = glass)
> summary(glass1.lm)

Call: lm(formula = y ~ x, data = glass)

Residuals:
       Min        1Q  Median      3Q      Max
 -0.007248 -0.003127 -0.0005 0.00288 0.008236

Coefficients:
              Value Std. Error t value Pr(>|t|)
(Intercept) 0.0707  0.0020     34.8099  0.0000
x           0.0009  0.0001     13.5573  0.0000

Residual standard error: 0.005697 on 8 degrees of freedom
Multiple R-Squared: 0.9583
F-statistic: 183.8 on 1 and 8 degrees of freedom, the p-value is 8.418e-007

Correlation of Coefficients:
  (Intercept)
x     -0.4629

> coef(glass1.lm)
(Intercept)           x
 0.07074545 0.000850303
> res1 <- residuals(glass1.lm)
> plot(x,res1)

[Figure 2: Plot of res1 vs. x]

The general situation which we consider is the one in which there are n observed pairs of values (x_i, y_i), i = 1, ..., n, of the variables x and y, and we wish to fit a k-th order polynomial relationship of the form

\[ y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \cdots + \beta_k x_i^k + \epsilon_i, \qquad i = 1, \dots, n, \tag{1} \]

for some k with k < n, where the errors εi are assumed to be NID(0, σ²). This is a special case of the multiple linear regression model in which the regressor variables are x, x², ..., x^k.

• Although we are fitting a non-linear relationship, the model (1) is linear in the parameters β0, β1, β2, ..., βk.

The design matrix X corresponding to the above polynomial regression is a special case of the one for multiple linear regression, with

\[ X = \begin{pmatrix} 1 & x_1 & x_1^2 & \dots & x_1^k \\ 1 & x_2 & x_2^2 & \dots & x_2^k \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \dots & x_n^k \end{pmatrix}. \]

In S-PLUS, given the variables x and y, we can define new variables x2, ..., xk and then carry out the multiple regression of y on x, x2, ..., xm for values of m with m ≤ k, so fitting a polynomial of order k or less. What degree of polynomial should we fit? In the output following the sketch below, the variables x2, x3 and x4 are constructed and the data frame is extended to include the newly constructed variables.
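As an aside, the least-squares calculation that lm performs can be reproduced directly from the design matrix. The following sketch (not part of the original S-PLUS session, though it uses only standard functions and runs equally in R) builds X for the straight-line fit and solves the normal equations (X'X)b = X'y:

    # design matrix for the straight-line model: a column of ones and x
    X <- cbind(1, glass$x)
    # solve the normal equations (X'X) b = X'y for the least-squares estimates
    b <- solve(t(X) %*% X, t(X) %*% glass$y)
    b
    # agrees with coef(glass1.lm): 0.07074545 and 0.000850303

For higher-order fits the same calculation applies with columns x², ..., x^k appended to X, which is exactly where the numerical difficulties discussed later (multicollinearity between powers of x) enter.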
Next we consider a multiple linear regression of y on x and x2.

> x2 <- x*x
> x3 <- x2*x
> x4 <- x3*x
> glass <- data.frame(y,x,x2,x3,x4)
> rm(x,x2,x3,x4,y)
> attach(glass)
> glass2.lm <- lm(y~x+x2, data = glass)
> summary(glass2.lm)

Call: lm(formula = y ~ x + x2, data = glass)

Residuals:
       Min         1Q     Median        3Q      Max
 -0.001764 -0.0008682 0.00006894 0.0007739 0.001614

Coefficients:
              Value Std. Error  t value Pr(>|t|)
(Intercept) 0.0666  0.0006     117.9214  0.0000
x           0.0006  0.0000      29.5321  0.0000
x2          0.0000  0.0000      12.3261  0.0000

Residual standard error: 0.001278 on 7 degrees of freedom
Multiple R-Squared: 0.9982
F-statistic: 1902 on 2 and 7 degrees of freedom, the p-value is 2.657e-010

Correlation of Coefficients:
   (Intercept)       x
x       0.2107
x2     -0.5906 -0.7645

> coef(glass2.lm)
(Intercept)            x            x2
 0.06663182 0.0006446212 6.856061e-006
> res2 <- residuals(glass2.lm)
> plot(x,res2)

[Figure 3: Plot of res2 vs. x]

Note that the values of b0 and b1 have changed from what they were for the simple linear regression. We now obtain the fitted equation

\[ y = 0.06663182 + 0.0006446212\,x + 0.000006856061\,x^2. \]

The introduction of the variable x² gives a significant improvement in the fit. Note firstly the increase from the previous regression in the value of R² and the dramatic increase in the value of the F-statistic in the ANOVA. More fundamentally, however, the fact that the t-statistic t = 12.3261 for x2 is highly significant shows that a highly significant improvement in fit is achieved by the inclusion of the variable x². An apparently random scatter of points in the plot of the residuals (res2) against x indicates that we now have an adequate model. Nevertheless, we try a further regression, introducing a cubic term x3, to see if we can improve even further on the fit of the model.

> glass3.lm <- lm(y~x+x2+x3, data = glass)
> summary(glass3.lm)

Call: lm(formula = y ~ x + x2 + x3, data = glass)

Residuals:
       Min         1Q     Median        3Q     Max
 -0.001762 -0.0008706 0.00007343 0.0007716 0.00161

Coefficients:
              Value Std. Error t value Pr(>|t|)
(Intercept) 0.0666  0.0008     87.0416  0.0000
x           0.0006  0.0000     21.0199  0.0000
x2          0.0000  0.0000      5.4096  0.0016
x3          0.0000  0.0000     -0.0078  0.9940

Residual standard error: 0.001381 on 6 degrees of freedom
Multiple R-Squared: 0.9982
F-statistic: 1087 on 3 and 6 degrees of freedom, the p-value is 1.355e-008

Correlation of Coefficients:
   (Intercept)       x      x2
x      -0.2570
x2     -0.7546  0.2853
x3      0.6036 -0.6397 -0.8808

The introduction of the additional regressor variable x3 does not result in a significant improvement in the fit. The value of the t-statistic for x3 is t = -0.0078, which is certainly not significant (p = 0.9940). In fact the value of s² = MSR, the residual mean square, has increased (s = 0.001381), so that the introduction of the x3 term into the model has been counter-productive.

In looking for an appropriate degree of polynomial to fit, we go through a stepwise procedure of a special kind, in which at each stage only the next power term is considered. If we include the regressor variable x^m in the model, then it is natural to include all lower powers of x as well.
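Each such stage amounts to a partial F-test of the larger model against the nested smaller one. As a brief sketch (anova is a standard function in both S-PLUS and R; this comparison is not part of the original session):

    # partial F-test of the cubic model against the nested quadratic model;
    # when a single term is added, the F statistic is the square of its
    # t statistic, so here F = (-0.0078)^2 with p-value 0.9940 as before
    anova(glass2.lm, glass3.lm)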
From the following output, which uses the function cor for correlation, we see that the regressor variables x2, x3 and x4 are particularly highly correlated with each other. This illustrates that problems of multicollinearity will arise and become ever more serious as further powers of x are introduced into the regression.

> cor(glass)
           y         x        x2        x3        x4
 y 1.0000000 0.9789228 0.8770847 0.9288534 0.8378634
 x 0.9789228 1.0000000 0.7644708 0.8560000 0.7312607
x2 0.8770847 0.7644708 1.0000000 0.9479438 0.9589104
x3 0.9288534 0.8560000 0.9479438 1.0000000 0.9732258
x4 0.8378634 0.7312607 0.9589104 0.9732258 1.0000000

9.2 Orthogonal polynomials

A way of dealing with problems of multicollinearity in polynomial regression is to work with what are known as orthogonal polynomials. Consider an alternative version of the model (1),

\[ y_i = \alpha_0 + \alpha_1 P_1(x_i) + \alpha_2 P_2(x_i) + \cdots + \alpha_k P_k(x_i) + \epsilon_i, \qquad i = 1, \dots, n. \tag{2} \]

The regressor variables are functions P1(x), P2(x), ..., Pk(x) of the variable x, where, for r = 1, ..., k, Pr(x) is a polynomial of order r, to be specified more precisely later. We thus have a linear model y = Xα + ε, with design matrix X given by

\[ X = \begin{pmatrix} 1 & P_1(x_1) & P_2(x_1) & \dots & P_k(x_1) \\ 1 & P_1(x_2) & P_2(x_2) & \dots & P_k(x_2) \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & P_1(x_n) & P_2(x_n) & \dots & P_k(x_n) \end{pmatrix}. \]

We choose the polynomial functions P1(x), P2(x), ..., Pk(x) in such a way that the design matrix X is orthogonal, i.e., the columns of X are orthogonal to each other. We thus require that

\[ \sum_{i=1}^{n} P_r(x_i) = 0, \qquad r = 1, \dots, k, \tag{3} \]

and

\[ \sum_{i=1}^{n} P_r(x_i) P_s(x_i) = 0, \qquad r = 2, \dots, k; \; s = 1, \dots, r-1. \tag{4} \]

A set of polynomials P1(x), P2(x), ..., Pk(x) which satisfies the conditions of Equations (3) and (4) is said to be orthogonal.
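In R (and in S-PLUS) such a basis can be generated with the standard function poly, which returns orthogonal polynomial scores. The following sketch (an illustration, not part of the original session) checks conditions (3) and (4) numerically for the glass data and refits the quadratic model in the orthogonal basis:

    # orthogonal polynomial scores of degree 4 at the ten temperatures
    P <- poly(glass$x, degree = 4)
    # condition (3): each column sums to zero (up to rounding error)
    colSums(P)
    # condition (4): distinct columns have zero inner product; poly also
    # scales each column to unit length, so t(P) %*% P is the identity
    round(crossprod(P), 10)
    # the quadratic fit in this basis gives the same fitted values as
    # glass2.lm, but the regressors are no longer collinear
    glass4.lm <- lm(y ~ poly(x, 2), data = glass)

Because the columns of X are now orthogonal, the estimate of each αr is unchanged when higher-order terms are added to or dropped from the model, which is precisely the property that the raw powers x, x², ..., x^k lack.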
