Chapter 13  Curve Fitting and Correlation

This chapter will be concerned primarily with two separate but closely interrelated processes: (1) the fitting of experimental data to mathematical forms that describe their behavior, and (2) the correlation between different experimental data to assess how closely different variables are interdependent.

The fitting of experimental data to a mathematical equation is called regression. Regression may be characterized by different adjectives according to the mathematical form being used for the fit and the number of variables. For example, linear regression involves using a straight-line or linear equation for the fit. As another example, multiple regression involves a function of more than one independent variable.

Linear Regression

Assume n points, with each point having values of both an independent variable x and a dependent variable y.

Preliminary Computations

For a least-squares fit, the preliminary computations are the sums $\sum x_i$, $\sum y_i$, $\sum x_i y_i$, and $\sum x_i^2$ taken over the n data points.

Best-Fitting Straight Line

The best-fitting straight line is the line y = ax + b whose coefficients minimize the sum of the squared deviations between the line and the data points.
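Assuming the slides used the standard least-squares formulation, the slope a and intercept b follow from the preliminary sums as

$$
a = \frac{n\sum_{i=1}^{n} x_i y_i - \sum_{i=1}^{n} x_i \sum_{i=1}^{n} y_i}{n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}, \qquad
b = \frac{\sum_{i=1}^{n} y_i - a\sum_{i=1}^{n} x_i}{n}.
$$

MATLAB's polyfit function, introduced later in the chapter, computes this same least-squares solution for the degree-1 case.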
Example 13-1. Find the best-fitting straight-line equation for the data shown below (ten points, with x = 0, 1, ..., 9). Carrying out the preliminary computations manually yields the line y = 1.9721x + 4.1455, which can be plotted against the data:

   x = 0:9;
   yapp = 1.9721*x + 4.1455;
   y = ...;   % the 10 values of y
   plot(x, yapp, x, y, 'o')

[Figure: best-fit straight line and the actual data points]
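As a sketch of that manual computation, the formulas above can be evaluated directly with MATLAB's sum function. Example 13-1's ten y values are not listed in the text, so the data below are purely illustrative:

   x = 0:9;
   y = [4.2 6.1 8.0 9.9 12.1 14.0 15.8 18.0 19.9 22.0];  % illustrative data
   n = length(x);
   a = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);  % slope
   b = (sum(y) - a*sum(x)) / n;                                   % intercept
   yapp = a*x + b;
   plot(x, yapp, x, y, 'o')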
MATLAB General Polynomial Fit

   x = [x1 x2 x3 ... xn];
   y = [y1 y2 y3 ... yn];
   p = polyfit(x, y, m)
   yapp = polyval(p, x)
   plot(x, yapp, x, y, 'o')

Here m is the degree of the fitted polynomial, and polyfit returns the coefficients in descending powers of x.
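The template returns only the coefficients; to judge how good a fit actually is, polyfit's optional second output and a simple maximum-error check can be used. A minimal runnable sketch with hypothetical noisy data:

   x = 0:9;
   y = 2*x + 4 + 0.5*randn(1, 10);  % hypothetical noisy linear data
   [p, S] = polyfit(x, y, 1);       % S.normr is the norm of the residuals
   yapp = polyval(p, x);
   max_error = max(abs(y - yapp))   % largest pointwise deviation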
Example 13-2. Rework Example 13-1 using MATLAB.

   x = 0:9;
   y = ...;   % the 10 values of y
   p = polyfit(x, y, 1)
   p =
       1.9721    4.1455

These are the same values obtained manually in Example 13-1.

Example 13-3. For the data of the previous two examples, obtain a 2nd-degree fit. Assume that the vectors x and y are still in memory.

   p = polyfit(x, y, 2)
   p =
       0.0011    1.9619    4.1591
   yapp2 = polyval(p, x);
   plot(x, yapp2, x, y, 'o')

[Figure: 2nd-degree fit and the actual data points]
Example 13-4. Determine several polynomial fits for the function below.

   t = -1:0.05:1;
   y = sin(pi*t);
   plot(t, y)

[Figure: plot of y = sin(pi*t) for -1 <= t <= 1]

(a) m = 1

   p1 = polyfit(t, y, 1)
   p1 =
       0.8854    0.0000
   yapp1 = polyval(p1, t);
   plot(t, yapp1, t, y, 'o')

[Figure: first-degree fit and the sine function]
8、854 0.0000 yapp1 = polyval(p1, t); plot(t, yapp1, t, y, o)The results are shown on the next slide.,18,19,Example 13-4. Continuation.,(b) m = 2 p2 = polyfit(t, y, 2) p2 =0.0000 0.8854 -0.0000The polynomial is the same as for m = 1. This is due to the fact that the sine function is an odd function and
(c) m = 3

   p3 = polyfit(t, y, 3)
   p3 =
      -2.8139   -0.0000    2.6568    0.0000
   yapp3 = polyval(p3, t);
   plot(t, yapp3, t, y, 'o')

[Figure: third-degree fit and the sine function]

A fit for m = 4 would be the same as for m = 3.
Example 13-4. Continuation.

(d) m = 5

   p5 = polyfit(t, y, 5)
   p5 =
       1.6982    0.0000   -4.7880   -0.0000    3.0990    0.0000
   yapp5 = polyval(p5, t);
   plot(t, yapp5, t, y, 'o')

[Figure: fifth-degree fit and the sine function]
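A short loop can quantify what the plots show, namely how the approximation improves with degree. This sketch recomputes the fits of Example 13-4 and prints the maximum pointwise error for each odd degree:

   t = -1:0.05:1;
   y = sin(pi*t);
   for m = [1 3 5]
       p = polyfit(t, y, m);
       fprintf('m = %d: max error = %.4f\n', m, max(abs(y - polyval(p, t))));
   end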
Example 13-5. For the data below, obtain a 2nd-degree fit for the temperature T as a function of the distance x.

   x = 0:5;
   T = [71 76 86 100 118 140];
   p = polyfit(x, T, 2)
   p =
       2.0893    3.4107   70.8214

Example 13-5. Continuation.

   x1 = 0:0.1:5;
   T1 = polyval(p, x1);
   plot(x1, T1, x, T, 'o')

[Figure: 2nd-degree fit of T versus x with the measured points]
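One practical use of the fitted polynomial is estimating T between measurement points. A minimal sketch; the evaluation point x = 2.5 is an arbitrary choice for illustration:

   x = 0:5;
   T = [71 76 86 100 118 140];
   p = polyfit(x, T, 2);
   T_est = polyval(p, 2.5)   % estimated temperature at x = 2.5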
Multiple Linear Regression

In multiple linear regression, the dependent variable y is fitted as a linear function of m independent variables x1, x2, ..., xm.

MATLAB Procedure for Linear Regression

1. Form m column vectors, each of length k, representing the independent variables:

   x1 = [x11 x12 x13 ... x1k]';
   x2 = [x21 x22 x23 ... x2k]';
   ...
   xm = [xm1 xm2 xm3 ... xmk]';

2. Form a column vector of length k representing the dependent variable y:

   y = [y1 y2 y3 ... yk]';

3. Form a rectangular matrix X of size k by m+1 as follows:

   X = [ones(size(x1)) x1 x2 ... xm];

4. Determine a column vector a of length m+1 by the command that follows:

   a = X\y

5. The best-fit linear multiple regression formula is then given by

   Y = X*a;

6. The maximum difference between the actual data and the formula is

   Error_Maximum = max(abs(Y - y))
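A minimal end-to-end sketch of the six steps, with m = 2 independent variables and k = 5 observations; the numbers are hypothetical and purely illustrative:

   x1 = [1; 2; 3; 4; 5];               % first independent variable
   x2 = [2; 1; 4; 3; 6];               % second independent variable
   y  = [7.1; 6.1; 13.2; 11.0; 18.1];  % dependent variable
   X  = [ones(size(x1)) x1 x2];        % k-by-(m+1) matrix (step 3)
   a  = X\y;                           % least-squares coefficients (step 4)
   Y  = X*a;                           % best-fit values (step 5)
   Error_Maximum = max(abs(Y - y))     % worst-case deviation (step 6)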
Correlation

Correlation assesses how closely two different variables are interdependent.

Correlation Coefficient

The degree of correlation between two variables x and y is measured by the correlation coefficient C(x, y), defined below.
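Assuming the slide used the standard sample correlation coefficient, the definition is

$$
C(x, y) = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}},
$$

where $\bar{x}$ and $\bar{y}$ are the means of the two variables. In MATLAB, R = corrcoef(x, y) returns a 2-by-2 matrix whose off-diagonal entry R(1,2) is this coefficient.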
Implications of Correlation Coefficient

1. If C(x, y) = 1, the two variables are totally correlated in a positive sense.
2. If C(x, y) = -1, the two variables are totally correlated in a negative sense.
3. If C(x, y) = 0, the two variables are said to be uncorrelated.

One Final Note

Correlation does not necessarily imply causation!