Chapter 6: Curve Fitting

Two types of curve fitting:

• Least-squares regression: given data at discrete values, derive a single curve that represents the general trend of the data. Used when the data exhibit a significant degree of error or noise.

• Interpolation: given data at discrete values, fit a curve or a series of curves that pass directly through each of the points. Used when the data are very precise.

PART I: Least-Squares Regression

1 Simple Linear Regression

Fit a straight line to a set of paired observations $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.

Mathematical expression for the straight line (model):
\[ y = a_0 + a_1 x \]
where $a_0$ is the intercept and $a_1$ is the slope.

Define the residual
\[ e_i = y_{i,\text{measured}} - y_{i,\text{model}} = y_i - (a_0 + a_1 x_i). \]

Criterion for a best fit:
\[ \min_{a_0, a_1} S_r = \min_{a_0, a_1} \sum_{i=1}^n e_i^2 = \min_{a_0, a_1} \sum_{i=1}^n (y_i - a_0 - a_1 x_i)^2 \]

Find $a_0$ and $a_1$ by setting the partial derivatives to zero:
\[ \frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_i) = 0 \qquad (1) \]
\[ \frac{\partial S_r}{\partial a_1} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_i) x_i = 0 \qquad (2) \]

From (1), $\sum_{i=1}^n y_i - \sum_{i=1}^n a_0 - \sum_{i=1}^n a_1 x_i = 0$, or
\[ n a_0 + \Big(\sum_{i=1}^n x_i\Big) a_1 = \sum_{i=1}^n y_i \qquad (3) \]

From (2), $\sum_{i=1}^n x_i y_i - \sum_{i=1}^n a_0 x_i - \sum_{i=1}^n a_1 x_i^2 = 0$, or
\[ \Big(\sum_{i=1}^n x_i\Big) a_0 + \Big(\sum_{i=1}^n x_i^2\Big) a_1 = \sum_{i=1}^n x_i y_i \qquad (4) \]

Equations (3) and (4) are called the normal equations. From (3),
\[ a_0 = \frac{1}{n}\sum_{i=1}^n y_i - \frac{1}{n}\sum_{i=1}^n x_i \, a_1 = \bar{y} - \bar{x} a_1 \]
where $\bar{x} = \frac{1}{n}\sum_{i=1}^n x_i$ and $\bar{y} = \frac{1}{n}\sum_{i=1}^n y_i$.

Substituting $a_0 = \bar{y} - \bar{x} a_1$ into (4),
\[ \sum_{i=1}^n x_i \Big( \frac{1}{n}\sum_{i=1}^n y_i - \frac{1}{n}\sum_{i=1}^n x_i \, a_1 \Big) + \sum_{i=1}^n x_i^2 \, a_1 = \sum_{i=1}^n x_i y_i \]
so
\[ a_1 = \frac{\sum_{i=1}^n x_i y_i - \frac{1}{n} \sum_{i=1}^n x_i \sum_{i=1}^n y_i}{\sum_{i=1}^n x_i^2 - \frac{1}{n} \big( \sum_{i=1}^n x_i \big)^2} = \frac{n \sum_{i=1}^n x_i y_i - \sum_{i=1}^n x_i \sum_{i=1}^n y_i}{n \sum_{i=1}^n x_i^2 - \big( \sum_{i=1}^n x_i \big)^2} \]

Definitions:
\[ S_r = \sum_{i=1}^n e_i^2 = \sum_{i=1}^n (y_i - a_0 - a_1 x_i)^2 \]

Standard error of the estimate (the spread around the regression line):
\[ S_{y/x} = \sqrt{\frac{S_r}{n-2}} \]

Standard deviation of the data points (the spread around the mean value $\bar{y}$):
\[ S_y = \sqrt{\frac{S_t}{n-1}} = \sqrt{\frac{\sum_{i=1}^n (y_i - \bar{y})^2}{n-1}} \]
where $S_t = \sum_{i=1}^n (y_i - \bar{y})^2$.

Correlation coefficient (the improvement, or error reduction, gained by describing the data with a straight line rather than with an average value):
\[ r = \sqrt{\frac{S_t - S_r}{S_t}} \]

Other criteria for regression: (a) $\min \sum e_i$, (b) $\min \sum |e_i|$, and (c) $\min \max e_i$.

[Figure 1: (a) spread of the data around the mean of the dependent variable; (b) spread of the data around the best-fit line.]
[Figure: illustration of linear regression with (a) small and (b) large residual errors.]

Example: fit a straight line to

x:  1    2    3    4    5    6    7
y:  0.5  2.5  2.0  4.0  3.5  6.0  5.5

$\sum x_i = 1 + 2 + \ldots + 7 = 28$
$\sum y_i = 0.5 + 2.5 + \ldots + 5.5 = 24$
$\sum x_i^2 = 1^2 + 2^2 + \ldots + 7^2 = 140$
$\sum x_i y_i = 1 \times 0.5 + 2 \times 2.5 + \ldots + 7 \times 5.5 = 119.5$

\[ a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - (\sum x_i)^2} = \frac{7 \times 119.5 - 28 \times 24}{7 \times 140 - 28^2} = 0.8393 \]
\[ a_0 = \bar{y} - \bar{x} a_1 = \frac{1}{7} \times 24 - 0.8393 \times \frac{1}{7} \times 28 = 0.07143 \]

Model: $y = 0.07143 + 0.8393 x$.

Residuals $e_i = y_i - (a_0 + a_1 x_i)$:
$e_1 = 0.5 - 0.07143 - 0.8393 \times 1 = -0.410$
$e_2 = 2.5 - 0.07143 - 0.8393 \times 2 = 0.750$
$e_3 = 2.0 - 0.07143 - 0.8393 \times 3 = -0.589$
$\ldots$
$e_7 = 5.5 - 0.07143 - 0.8393 \times 7 = -0.446$

\[ S_r = (-0.410)^2 + 0.750^2 + (-0.589)^2 + \ldots + (-0.446)^2 = 2.9911 \]
\[ S_t = \sum_{i=1}^n (y_i - \bar{y})^2 = 22.714 \]

Standard deviation of the data points: $S_y = \sqrt{S_t/(n-1)} = \sqrt{22.714/6} = 1.946$.
Standard error of the estimate: $S_{y/x} = \sqrt{S_r/(n-2)} = \sqrt{2.9911/(7-2)} = 0.774$.
Note that $S_{y/x} < S_y$ and $S_r < S_t$.
Correlation coefficient: $r = \sqrt{(S_t - S_r)/S_t} = \sqrt{(22.714 - 2.9911)/22.714} = 0.932$.
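The notes prescribe no software, but the worked example is easy to check numerically. The following is a minimal Python sketch (using NumPy, which is my assumption, not something the notes specify) that reproduces $a_0$, $a_1$, $S_r$, $S_t$, $S_y$, $S_{y/x}$, and $r$ for the example data:

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
y = np.array([0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5])
n = len(x)

# Slope and intercept from the normal equations (3) and (4).
a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
a0 = y.mean() - x.mean() * a1

e = y - (a0 + a1 * x)            # residuals about the fitted line
Sr = np.sum(e**2)                # sum of squared residuals
St = np.sum((y - y.mean())**2)   # spread about the mean
Sy = np.sqrt(St / (n - 1))       # standard deviation of the data points
Syx = np.sqrt(Sr / (n - 2))      # standard error of the estimate
r = np.sqrt((St - Sr) / St)      # correlation coefficient

print(f"a0 = {a0:.5f}, a1 = {a1:.4f}")   # a0 = 0.07143, a1 = 0.8393
print(f"Sr = {Sr:.4f}, St = {St:.3f}")   # Sr = 2.9911, St = 22.714
# Sy = 1.946, Sy/x = 0.7734 (the notes round this to 0.774), r = 0.932
print(f"Sy = {Sy:.3f}, Sy/x = {Syx:.4f}, r = {r:.3f}")
```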
2 Polynomial Regression

Given data $(x_i, y_i)$, $i = 1, 2, \ldots, n$, fit a second-order polynomial
\[ y = a_0 + a_1 x + a_2 x^2 \]
\[ e_i = y_{i,\text{measured}} - y_{i,\text{model}} = y_i - (a_0 + a_1 x_i + a_2 x_i^2) \]

Criterion for a best fit:
\[ \min_{a_0, a_1, a_2} S_r = \min_{a_0, a_1, a_2} \sum_{i=1}^n e_i^2 = \min_{a_0, a_1, a_2} \sum_{i=1}^n (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2 \]

Find $a_0$, $a_1$, and $a_2$:
\[ \frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_i - a_2 x_i^2) = 0 \qquad (1) \]
\[ \frac{\partial S_r}{\partial a_1} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_i - a_2 x_i^2) x_i = 0 \qquad (2) \]
\[ \frac{\partial S_r}{\partial a_2} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_i - a_2 x_i^2) x_i^2 = 0 \qquad (3) \]

From (1):
\[ n a_0 + \Big(\sum x_i\Big) a_1 + \Big(\sum x_i^2\Big) a_2 = \sum y_i \qquad (1') \]
From (2):
\[ \Big(\sum x_i\Big) a_0 + \Big(\sum x_i^2\Big) a_1 + \Big(\sum x_i^3\Big) a_2 = \sum x_i y_i \qquad (2') \]
From (3):
\[ \Big(\sum x_i^2\Big) a_0 + \Big(\sum x_i^3\Big) a_1 + \Big(\sum x_i^4\Big) a_2 = \sum x_i^2 y_i \qquad (3') \]

Comments:

• Determining a least-squares second-order polynomial is equivalent to solving a system of 3 simultaneous linear equations.

• In general, fitting an m-th-order polynomial
\[ y = a_0 + a_1 x + a_2 x^2 + \ldots + a_m x^m \]
by least-squares regression is equivalent to solving a system of $(m+1)$ simultaneous linear equations.

Standard error: $S_{y/x} = \sqrt{S_r / (n - (m+1))}$.

3 Multiple Linear Regression

Multiple linear regression is used when $y$ is a linear function of two or more independent variables.

Model: $y = a_0 + a_1 x_1 + a_2 x_2$. Given data $(x_{1i}, x_{2i}, y_i)$, $i = 1, 2, \ldots, n$,
\[ e_i = y_{i,\text{measured}} - y_{i,\text{model}} \]
\[ S_r = \sum_{i=1}^n e_i^2 = \sum_{i=1}^n (y_i - a_0 - a_1 x_{1i} - a_2 x_{2i})^2 \]

Find $a_0$, $a_1$, and $a_2$ to minimize $S_r$:
\[ \frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_{1i} - a_2 x_{2i}) = 0 \qquad (1) \]
\[ \frac{\partial S_r}{\partial a_1} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_{1i} - a_2 x_{2i}) x_{1i} = 0 \qquad (2) \]
\[ \frac{\partial S_r}{\partial a_2} = -2 \sum_{i=1}^n (y_i - a_0 - a_1 x_{1i} - a_2 x_{2i}) x_{2i} = 0 \qquad (3) \]

From (1): $n a_0 + \sum x_{1i} \, a_1 + \sum x_{2i} \, a_2 = \sum y_i$ \quad (1')
From (2): $\sum x_{1i} \, a_0 + \sum x_{1i}^2 \, a_1 + \sum x_{1i} x_{2i} \, a_2 = \sum x_{1i} y_i$ \quad (2')
From (3): $\sum x_{2i} \, a_0 + \sum x_{1i} x_{2i} \, a_1 + \sum x_{2i}^2 \, a_2 = \sum x_{2i} y_i$ \quad (3')

In matrix form:
\[
\begin{bmatrix}
n & \sum x_{1i} & \sum x_{2i} \\
\sum x_{1i} & \sum x_{1i}^2 & \sum x_{1i} x_{2i} \\
\sum x_{2i} & \sum x_{1i} x_{2i} & \sum x_{2i}^2
\end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}
=
\begin{bmatrix} \sum y_i \\ \sum x_{1i} y_i \\ \sum x_{2i} y_i \end{bmatrix}
\]

Standard error: $S_{y/x} = \sqrt{S_r / (n - (m+1))}$.

4 General Linear Least Squares

Model:
\[ y = a_0 Z_0 + a_1 Z_1 + a_2 Z_2 + \ldots + a_m Z_m \]
where $Z_0, Z_1, \ldots, Z_m$ are $(m+1)$ different functions.

Special cases:

• Simple linear LSR: $Z_0 = 1$, $Z_1 = x$, $Z_i = 0$ for $i \ge 2$.
• Polynomial LSR: $Z_i = x^i$ ($Z_0 = 1$, $Z_1 = x$, $Z_2 = x^2$, ...).
• Multiple linear LSR: $Z_0 = 1$, $Z_i = x_i$ for $i \ge 1$.

"Linear" refers to the model's dependence on its parameters, the $a_i$; the functions $Z_i$ themselves can be highly nonlinear.

\[ S_r = \sum_{i=1}^n e_i^2 = \sum_{i=1}^n (y_{i,\text{measured}} - y_{i,\text{model}})^2 \]

Given data $(Z_{0i}, Z_{1i}, \ldots, Z_{mi}, y_i)$, $i = 1, 2, \ldots, n$,
\[ S_r = \sum_{i=1}^n \Big( y_i - \sum_{j=0}^m a_j Z_{ji} \Big)^2 \]

Find $a_j$, $j = 0, 1, 2, \ldots, m$, to minimize $S_r$:
\[ \frac{\partial S_r}{\partial a_k} = -2 \sum_{i=1}^n \Big( y_i - \sum_{j=0}^m a_j Z_{ji} \Big) Z_{ki} = 0 \]
\[ \sum_{i=1}^n y_i Z_{ki} = \sum_{i=1}^n \sum_{j=0}^m Z_{ki} Z_{ji} a_j, \qquad k = 0, 1, \ldots, m \]
\[ \sum_{j=0}^m \Big( \sum_{i=1}^n Z_{ki} Z_{ji} \Big) a_j = \sum_{i=1}^n y_i Z_{ki} \]

In matrix form:
\[ Z^T Z A = Z^T Y \]
where
\[
Z =
\begin{bmatrix}
Z_{01} & Z_{11} & \ldots & Z_{m1} \\
Z_{02} & Z_{12} & \ldots & Z_{m2} \\
\vdots & & & \vdots \\
Z_{0n} & Z_{1n} & \ldots & Z_{mn}
\end{bmatrix}
\]
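The normal equations $Z^T Z A = Z^T Y$ translate directly into code. The sketch below is illustrative only: the helper name general_lls and the sample data are hypothetical, and NumPy's linear solver stands in for whatever elimination method one prefers. Building the design matrix $Z$ with columns $1, x, x^2$ recovers the polynomial regression of Section 2; columns $1, x_1, x_2$ would give the multiple linear regression of Section 3.

```python
import numpy as np

def general_lls(Z, y):
    """Least-squares coefficients A from the normal equations Z^T Z A = Z^T y."""
    return np.linalg.solve(Z.T @ Z, Z.T @ y)

# Hypothetical data, fit with a quadratic y = a0 + a1*x + a2*x^2,
# i.e. basis functions Z0 = 1, Z1 = x, Z2 = x^2.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

Z = np.column_stack([np.ones_like(x), x, x**2])  # design matrix, one row per point
a = general_lls(Z, y)                            # [a0, a1, a2]

Sr = float(np.sum((y - Z @ a) ** 2))             # residual sum of squares
m = Z.shape[1] - 1                               # number of basis functions minus 1
Syx = np.sqrt(Sr / (len(x) - (m + 1)))           # standard error of the estimate
print(a, Syx)
```

In practice np.linalg.lstsq(Z, y) is numerically safer than forming $Z^T Z$ explicitly, but the direct form above mirrors the derivation in this section.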
PART II: Polynomial Interpolation

Given $(n+1)$ data points $(x_i, y_i)$, $i = 0, 1, 2, \ldots, n$, there is one and only one polynomial of order $n$ that passes through all of the points.

5 Newton's Divided-Difference Interpolating Polynomials

Linear Interpolation

Given $(x_0, y_0)$ and $(x_1, y_1)$, equate the slope of the interpolating line over $[x_0, x]$ with its slope over $[x_0, x_1]$:
\[ \frac{y_1 - y_0}{x_1 - x_0} = \frac{f_1(x) - y_0}{x - x_0} \]
Solving for $f_1(x)$ gives the linear-interpolation formula
\[ f_1(x) = y_0 + \frac{y_1 - y_0}{x_1 - x_0} (x - x_0) \]
$f_1(x)$ is a first-order interpolation. A smaller interval, i.e., $|x_1 - x_0|$ closer to zero, leads to a better approximation.
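The two-point formula fits in a few lines of code. The sketch below is a hypothetical illustration (the function name and the $\ln 2$ test values are mine, not from the notes); it also demonstrates the closing remark that shrinking the bracketing interval improves the estimate:

```python
import math

def linear_interp(x0, y0, x1, y1, x):
    """First-order interpolating polynomial f1(x) through (x0, y0) and (x1, y1)."""
    slope = (y1 - y0) / (x1 - x0)   # first finite divided difference
    return y0 + slope * (x - x0)

# Illustrative check: estimate ln 2 (true value ~0.6931) from two brackets.
print(linear_interp(1.0, 0.0, 6.0, math.log(6.0), 2.0))  # wide bracket: ~0.3583
print(linear_interp(1.0, 0.0, 4.0, math.log(4.0), 2.0))  # narrower bracket: ~0.4621
```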
