
Page 1:

BSM510 Numerical Analysis

General Linear Least-Squares and Nonlinear Regression

Prof. Manar Mohaisen
Department of EEC Engineering

Korea University of Technology and Education (KUT)

Page 2:

Review of Precedent Lecture

Statistics review

Linear Least-Squares regression

Linearization of nonlinear models


Page 3:

Lecture Content

Polynomial Regression

Multiple Linear Regression

Nonlinear Regression


Page 4:

Linear Least-Squares Regression: Review

Square error

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i\right)^2$$

♦ Differentiate with respect to the unknowns

$$\frac{\partial S_r}{\partial a_0} = -2\sum \left(y_i - a_0 - a_1 x_i\right), \qquad \frac{\partial S_r}{\partial a_1} = -2\sum \left(y_i - a_0 - a_1 x_i\right) x_i$$

Setting these two equations to zero, we get the following system of equations:

$$\begin{bmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \end{bmatrix}$$

Using any of the methods we learned,

$$a_1 = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$$
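A minimal Matlab sketch of these closed-form formulas (not part of the original slides; the example data vectors are borrowed from the Matlab example later in the lecture):

% Closed-form linear least-squares fit (sketch; x and y are any data column vectors)
x = [0 1 2 3 4 5]';
y = [2.1 7.7 13.6 27.2 40.9 61.1]';
n  = length(x);
a1 = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);
a0 = mean(y) - a1*mean(x);
% coefficient of determination r^2 = (St - Sr)/St
y_fit = a0 + a1*x;
Sr = sum( (y - y_fit).^2 );
St = sum( (y - mean(y)).^2 );
r2 = (St - Sr)/St;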

Page 5:

Nonlinear Regression: Necessity

Example
♦ The data exhibit nonlinear patterns
♦ Linear least-squares regression

Coefficient of determination:

$$r^2 = \frac{S_t - S_r}{S_t} = 0.4380$$

[Figure: the data points with the fitted linear regression line, y versus x]

♦ A solution: Polynomial regression

$$y = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m + e$$

Page 6:

Polynomial Regression

Extension of the linear least-squares method
♦ 2nd-order polynomial extension

$$y = a_0 + a_1 x + a_2 x^2 + e$$

♦ As in the linear case, we need to find the unknowns ($a_0$, $a_1$, and $a_2$). The square error is defined by

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)^2$$

$S_r$ is differentiated with respect to each of the unknowns:

$$\frac{\partial S_r}{\partial a_0} = -2\sum \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)$$

$$\frac{\partial S_r}{\partial a_1} = -2\sum x_i \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)$$

$$\frac{\partial S_r}{\partial a_2} = -2\sum x_i^2 \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)$$

Setting these derivatives to zero gives

$$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$$
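Equivalently (a connecting remark, not on the original slide), the same normal equations can be written compactly with the matrix Z used in the Matlab scripts later in the lecture, where row i of Z is [1, x_i, x_i^2]:

$$Z^{T} Z\,\mathbf{a} = Z^{T}\mathbf{y}, \qquad Z = \begin{bmatrix} 1 & x_1 & x_1^2 \\ \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 \end{bmatrix}, \qquad \mathbf{a} = \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}$$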

Page 7:

Polynomial Regression

Example: Linear vs. Polynomial regression

x:  -5  -4  -3  -2  -1   0   1   2   3
y: -20 -19  -9  -2   0   3   0  -3 -12

♦ Linear regression

$$a_1 = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2} = 1.85, \qquad a_0 = \bar{y} - a_1 \bar{x} = -5.2611$$

$$y = -5.2611 + 1.85x, \qquad r^2 = \frac{S_t - S_r}{S_t} = 0.3224$$

♦ 2nd-order polynomial regression

$$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix} \;\rightarrow\; \begin{bmatrix} 9 & -9 & 69 \\ -9 & 69 & -189 \\ 69 & -189 & 1077 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} -64 \\ 175 \\ -1063 \end{bmatrix}$$

$$\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 1.1844 \\ -0.4249 \\ -1.1374 \end{bmatrix}, \qquad r^2 = \frac{S_t - S_r}{S_t} = 0.9481$$
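As a quick check (a sketch, not from the slides), the 3-by-3 system printed above can be solved in Matlab with the backslash operator:

% Solve the 2nd-order normal equations printed above
A = [9 -9 69; -9 69 -189; 69 -189 1077];
b = [-64; 175; -1063];
a = A\b;        % approximately [1.1844; -0.4249; -1.1374]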

Page 8:

Polynomial Regression

Example: Linear vs. Polynomial regression (contd.)

linear: $y = -5.2611 + 1.85x$, $\quad r^2 = 0.3224$
2nd-order polynomial: $y = 1.1844 - 0.4249x - 1.1374x^2$, $\quad r^2 = 0.9481$

[Figure: the data points with the linear and 2nd-order polynomial fits, y versus x]

Page 9:

Polynomial Regression

2nd-order polynomial regression using Matlab

% file: 2nd order polynomial regression
x = [0 1 2 3 4 5]';
y = [2.1 7.7 13.6 27.2 40.9 61.1]';

% Create the matrix Z
Z = [ones(size(x)) x x.^2];

% STEP 2: Z'*Z is the coefficients matrix
a = (Z'*Z)\(Z'*y);

% The fitting: a0 + a1*x + a2*x^2
y_1 = Z*a;

% find r2
Sr = sum( (y - y_1).^2 );
St = sum( (y - mean(y)).^2 );
r2 = (St - Sr) ./ St;


Page 10:

Polynomial Regression

2nd-order polynomial regression using Matlab
♦ Using the polyfit function

% file: 2nd order polynomial regression
x = [0 1 2 3 4 5]';
y = [2.1 7.7 13.6 27.2 40.9 61.1]';

% Create the matrix Z
Z = [ones(size(x)) x x.^2];

% STEP 2: Find a
a = polyfit(x, y, 2);

% The fitting: a0 + a1*x + a2*x^2
% (polyfit returns the coefficients in descending powers of x,
%  so reverse them to match the column order of Z)
y_1 = Z*a(end:-1:1)';

% find r2
Sr = sum( (y - y_1).^2 );
St = sum( (y - mean(y)).^2 );
r2 = (St - Sr) ./ St;


Page 11:

Multiple Linear Regression

y is a linear function of two or more variables
♦ Example: y depends on two variables ($x_1$ and $x_2$)

$$y = a_0 + a_1 x_1 + a_2 x_2 + e$$

♦ Square error

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i}\right)^2$$

♦ The unknowns are obtained as follows

$$\frac{\partial S_r}{\partial a_0} = -2\sum \left(y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i}\right)$$

$$\frac{\partial S_r}{\partial a_1} = -2\sum x_{1,i} \left(y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i}\right)$$

$$\frac{\partial S_r}{\partial a_2} = -2\sum x_{2,i} \left(y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i}\right)$$

Setting the derivatives to zero gives

$$\begin{bmatrix} n & \sum x_{1,i} & \sum x_{2,i} \\ \sum x_{1,i} & \sum x_{1,i}^2 & \sum x_{1,i} x_{2,i} \\ \sum x_{2,i} & \sum x_{1,i} x_{2,i} & \sum x_{2,i}^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_{1,i} y_i \\ \sum x_{2,i} y_i \end{bmatrix}$$

Page 12:

Multiple Linear Regression

Example:

$$y = a_0 + a_1 x_1 + a_2 x_2 + e$$

x1:  0   2   2.5  1   4   7
x2:  0   1   2    3   6   2
y:   5  10   9    0   3  27

♦ Solution: Find the unknowns

$$\begin{bmatrix} n & \sum x_{1,i} & \sum x_{2,i} \\ \sum x_{1,i} & \sum x_{1,i}^2 & \sum x_{1,i} x_{2,i} \\ \sum x_{2,i} & \sum x_{1,i} x_{2,i} & \sum x_{2,i}^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_{1,i} y_i \\ \sum x_{2,i} y_i \end{bmatrix} \;\rightarrow\; \begin{bmatrix} 6 & 16.5 & 14 \\ 16.5 & 76.25 & 48 \\ 14 & 48 & 54 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 54 \\ 243.5 \\ 100 \end{bmatrix}$$

$$\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 5 \\ 4 \\ -3 \end{bmatrix}$$
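A minimal Matlab check of this example (not part of the original slide), using the same Z-matrix approach as in the polynomial-regression script:

% Multiple linear regression for the example data
x1 = [0 2 2.5 1 4 7]';
x2 = [0 1 2 3 6 2]';
y  = [5 10 9 0 3 27]';
Z  = [ones(size(x1)) x1 x2];   % columns: 1, x1, x2
a  = (Z'*Z)\(Z'*y);            % gives a = [5; 4; -3], i.e. y = 5 + 4*x1 - 3*x2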


Page 13:

Nonlinear Regression

Nonlinear Data
♦ In several applications, the following nonlinear model is defined

$$y = a_0\left(1 - e^{-a_1 x}\right) + e$$

Then the objective function to be minimized is given by

$$f(a_0, a_1) = \sum_{i=1}^{n} \left[y_i - a_0\left(1 - e^{-a_1 x_i}\right)\right]^2$$

An optimization algorithm is used to find the unknowns ($a_0$ and $a_1$).

% file: Nonlinear fitting
function f = fSSR(a, xm, ym)
yp = a(1)*xm.^a(2);          % model: a(1)*x^a(2)
f = sum( (ym - yp).^2 );     % sum of squared residuals

% in command line
x = [10:10:80];
y = [25 70 380 550 610 1220 830 1450];

% finding a0 and a1
a = fminsearch(@fSSR, [1, 1], [], x, y);
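A small variation (a stylistic note, not from the slides): the same minimization can be written with an anonymous objective function, which avoids passing x and y through fminsearch's extra arguments:

% Equivalent call with an anonymous objective function
a = fminsearch(@(p) sum((y - p(1)*x.^p(2)).^2), [1 1]);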

Page 14:

Lecture Summary

Polynomial Regression

Multiple Linear Regression

Nonlinear Regression
