MA2213 Lecture 7: Optimization

Transcript of MA2213 Lecture 7 Optimization. Topics: The Best Approximation Problem (pages 159-165), Chebyshev Polynomials (pages 165-171), Finding the Minimum of a Function.

Page 1:

MA2213 Lecture 7

Optimization

Page 2:

Topics

The Best Approximation Problem pages 159-165

Chebyshev Polynomials pages 165-171

Finding the Minimum of a Function

Method of Steepest Descent

Constrained Minimization

Gradient of a Function

http://en.wikipedia.org/wiki/Optimization_(mathematics)

http://www.mat.univie.ac.at/~neum/glopt/applications.html

Page 3:

What is “argmin” ?

$\min_{x\in[0,1]} x(x-1) = -\tfrac{1}{4}$

$\operatorname{argmin}_{x\in[0,1]} x(x-1) = \tfrac{1}{2}$

$\operatorname{argmin}_{x\in[0,1]} x(1-x) = \{0,1\}$
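A quick numerical illustration (my own sketch, not from the slide) of the difference between min and argmin, using a grid over [0,1]:

x = linspace(0,1,1001);        % grid on [0,1]
f = x.*(x-1);                  % f(x) = x(x-1)
[fmin,idx] = min(f);
fmin                           % approximately -1/4, the min
x(idx)                         % approximately 1/2, the argmin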

Page 4:

Optimization Problems

Least Squares: given $(x_1,y_1),\ldots,(x_n,y_n)$, or $y \in C([a,b])$, and a subspace $V \subset C([a,b])$, compute
$\operatorname{argmin}_{f\in V} \sum_{j=1}^{n} (f(x_j) - y_j)^2$
or
$\operatorname{argmin}_{f\in V} \int_a^b (f(x) - y(x))^2\,dx$
respectively.

Spline Interpolation: given $a \le x_1 < x_2 < \cdots < x_n \le b$, compute
$\operatorname{argmin}_{s\in V} \int_a^b (s''(x))^2\,dx$
where
$V = \{f \in C^2([a,b]) : f(x_i) = y_i,\ i = 1,\ldots,n\}.$

The least squares equations (page 179) are derived using differentiation.

The spline equations (pages 149-151) are derived similarly.
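As a concrete instance of the discrete least squares problem with V the linear polynomials, here is a short MATLAB sketch of my own (the data points are made up):

x = [0 1 2 3]'; y = [1 2 2 4]';   % hypothetical data (x_j, y_j)
V = [ones(size(x)) x];            % basis functions 1, x evaluated at the x_j
c = V \ y                         % coefficients of f(x) = c(1) + c(2)x that
                                  % minimize sum_j (f(x_j) - y_j)^2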

Page 5:

The Best Approximation Problem (p. 159)

Definition For $f \in C([a,b])$ and integer $n \ge 0$,
$\rho_n(f) = \min_{p\in P_n}\left[\max_{a\le x\le b} |f(x) - p(x)|\right]$
where
$P_n = \{\text{polynomials of degree} \le n\}.$

Definition The best approximation problem is to compute
$\operatorname{argmin}_{p\in P_n}\left[\max_{a\le x\le b} |f(x) - p(x)|\right].$

Best approximation (pages 159-165) is more complicated than least squares approximation.

Page 6:

Best Approximation Examples

$m_n = \operatorname{argmin}_{p\in P_n}\left[\max_{-1\le x\le 1} |e^x - p(x)|\right]$

$m_0 = (e^1 + e^{-1})/2 = 1.5431, \qquad \max_{-1\le x\le 1} |e^x - m_0| = e - 1.5431 = 1.1752$

$m_1(x) = 1.1752x + 1.2643, \qquad \max_{-1\le x\le 1} |e^x - m_1(x)| = 0.279$
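These numbers are easy to confirm numerically on a fine grid; a small check of my own:

xs = linspace(-1,1,10001);
m0 = (exp(1) + exp(-1))/2;                  % 1.5431
max(abs(exp(xs) - m0))                      % approximately 1.1752
max(abs(exp(xs) - (1.1752*xs + 1.2643)))    % approximately 0.279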

Page 7:

Best Approximation Degree 0

Page 8:

Best Approx. Error Degree 0

Page 9:

Best Approximation Degree 1

Page 10:

Best Approx. Error Degree 1

Page 11:

Properties of Best Approximation

1. Best approximation gives much smaller error than Taylor approximation.

2. Best approximation error tends to be dispersed over the interval rather than concentrated at the ends.

3. Best approximation error is oscillatory: it changes sign at least n+1 times in the interval, and the sizes of the oscillations are equal.

Figures 4.13 and 4.14 on page 162, which display the error for the degree 3 Taylor approximation (at x = 0) and the error for the best approximation of degree 3 over the interval [-1,1] for exp(x), together with the figures in the preceding slides, support these assertions from pages 162-163.

Page 12:

Theoretical Foundations

Theorem 1 (Weierstrass Approximation Theorem, 1885). If $f \in C([a,b])$ and $\epsilon > 0$ then there exists a polynomial $p$ such that
$|f(x) - p(x)| \le \epsilon, \quad x \in [a,b].$

Proof Weierstrass's original proof used properties of solutions of a partial differential equation called the heat equation. A modern, more constructive proof based on Bernstein polynomials is given on pages 320-323 of Kincaid and Cheney's Numerical Analysis: Mathematics of Scientific Computing, Brooks Cole, 2002.

Corollary For $f \in C([a,b])$,
$\lim_{n\to\infty} \rho_n(f) = \lim_{n\to\infty} \min_{p\in P_n}\left[\max_{a\le x\le b} |f(x) - p(x)|\right] = 0.$

Page 13:

Accuracy of Best Approximation

If $f \in C^{n+1}([a,b])$ then
$\rho_n(f) = \min_{p\in P_n}\left[\max_{a\le x\le b} |f(x) - p(x)|\right]$
satisfies
$\rho_n(f) \le \dfrac{[(b-a)/2]^{n+1}}{2^n\,(n+1)!}\,\max_{a\le x\le b} |f^{(n+1)}(x)|.$

Table 4.6 on page 163 compares this upper bound with computed values of $\rho_n(e^x)$, $n = 1,2,3,4,5,6,7$, and shows that the bound is about 2.5 times larger.
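A short script of my own that tabulates this bound (as reconstructed above) for f(x) = e^x on [-1,1], where max |f^(n+1)| = e; its values come out roughly 2.5 times the computed values of rho_n(e^x):

a = -1; b = 1;
for n = 1:7
  bound = ((b-a)/2)^(n+1) / (2^n * factorial(n+1)) * exp(1);
  fprintf('n = %d   bound = %.3e\n', n, bound)
end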

Page 14:

Theoretical Foundations

Theorem 2 (Chebyshev's Alternation Theorem, 1859). If $f \in C([a,b])$ and $n \ge 0$ then
$p = \operatorname{argmin}_{q\in P_n}\left[\max_{a\le x\le b} |f(x) - q(x)|\right]$
iff there exist points $x_0 < x_1 < \cdots < x_n < x_{n+1}$ in $[a,b]$ such that
$f(x_k) - p(x_k) = c\,(-1)^k\,\|f - p\|_\infty, \quad 0 \le k \le n+1,$
where $|c| = 1$ and $\|f - p\|_\infty = \max_{a\le x\le b} |f(x) - p(x)|$.

Proof Kincaid and Cheney, page 416.

Page 15:

Sample Problem

In Example 4.4.1 on page 160 the author states that the function $m_1(x) = 1.1752x + 1.2643$ is the best linear approximation (equivalently stated, the minimax polynomial of degree 1) to $e^x$ on $[-1,1]$:
$m_1 = \operatorname{argmin}_{p\in P_1}\left[\max_{-1\le x\le 1} |e^x - p(x)|\right].$

Problem. Use Theorem 2 to prove this statement.

Solution It suffices to find points $x_0 < x_1 < x_2$ in $[-1,1]$ such that
$|e^{x_j} - m_1(x_j)| = \max_{-1\le x\le 1} |e^x - m_1(x)|, \quad j = 0,1,2,$
and the sequence $e^{x_j} - m_1(x_j)$, $j = 0,1,2$, changes sign twice.

Page 16:

Sample Problem

Step 1. Compute the set $\operatorname{argmax}_{-1<x<1} |e^x - m_1(x)|$.

Observe that if $x = y \in (-1,1)$ and $|e^x - m_1(x)|$ has a maximum at $x = y$, then $e^x - m_1(x)$ has either a maximum or a minimum at $x = y$, so
$\dfrac{d}{dx}\left(e^x - 1.1752x - 1.2643\right)\Big|_{x=y} = e^y - 1.1752 = 0,$
therefore
$y = \log_e(1.1752) = 0.1614$
is the only point in $(-1,1)$ where $|e^x - m_1(x)|$ can have a maximum.

Question Can this set be empty?

Page 17:

Sample Problem

Step 2. Observe that
$\operatorname{argmax}_{-1\le x\le 1} |e^x - m_1(x)| \subset \{-1,\ 0.1614,\ 1\}.$
Equivalently stated: by Step 1, $|e^x - m_1(x)|$ might have a maximum at $x = 0.1614$ and/or at the endpoints $x = \pm 1$, so the maximum MUST occur at 1, 2, or all 3 of these points!

Step 3. Compute
$e^{-1} - m_1(-1) = 0.2788$
$e^{0.1614} - m_1(0.1614) = -0.2788$
$e^{1} - m_1(1) = 0.2788$

Step 4. Choose the sequence $x_0 = -1$, $x_1 = 0.1614$, $x_2 = 1$. The three values in Step 3 have equal magnitude $\|e^x - m_1\|_\infty = 0.2788$ and alternate in sign, so Theorem 2 implies that $m_1$ is the best approximation in $P_1$.
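A one-line numerical confirmation of my own that the error at the three chosen points equals the maximum error and alternates in sign:

m1 = @(x) 1.1752*x + 1.2643;
xj = [-1 0.1614 1];
exp(xj) - m1(xj)               % [0.2788 -0.2788 0.2788]
xs = linspace(-1,1,10001);
max(abs(exp(xs) - m1(xs)))     % 0.2788, the same magnitude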

Page 18:

Remez Exchange Algorithm, described in pages 416-419 of Kincaid and Cheney, is based on Theorem 2. Invented by Evgeny Yakovlevich Remez in 1934, it is a powerful computational algorithm that has vast applications in the design of engineering systems, such as the tuning filters that allow your TV and mobile telephone to tune in to the program of your choice, or to listen (only) to the person who calls you. A minimal sketch of the exchange idea follows the links below.

http://en.wikipedia.org/wiki/Remez_algorithm

http://www.eepatents.com/receiver/Spec.html#D1

http://comelec.enst.fr/~rioul/publis/199302rioulduhamel.pdf
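To make the exchange idea concrete, here is a minimal MATLAB sketch of my own (not the general algorithm of Kincaid and Cheney) for the degree 1 minimax approximation of e^x on [-1,1]; it uses the fact, from Step 1 of the sample problem, that the interior extremum of the error can be located analytically for this particular function:

f = @(x) exp(x);
x = [-1; 0; 1];                 % initial reference points x_0 < x_1 < x_2
for iter = 1:5
  % solve a0 + a1*x(k) + (-1)^(k-1)*E = f(x(k)), k = 1,2,3,
  % i.e. the polynomial whose error equioscillates on the reference set
  M = [ones(3,1) x [1; -1; 1]];
  sol = M \ f(x);
  a0 = sol(1); a1 = sol(2); E = sol(3);
  % the error e(x) = f(x) - a0 - a1*x has e'(x) = exp(x) - a1, so its
  % only interior extremum is at x = log(a1); exchange it into the set
  x = [-1; log(a1); 1];
end
[a0 a1 abs(E)]                  % approximately [1.2643 1.1752 0.2788]

Each pass solves for the equioscillating polynomial on the current reference set and then replaces the interior reference point by the new extremum of the error; here it reproduces m_1 and the error 0.2788.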

Page 19:

Chebyshev Polynomials

Definition The Chebyshev polynomials $T_0, T_1, T_2, \ldots$ are defined by the equation
$T_n(\cos\theta) = \cos(n\theta), \quad n = 0, 1, 2, \ldots$

Remark Clearly
$T_0(x) = 1, \quad T_1(x) = x, \quad T_2(x) = 2x^2 - 1;$
however, it is NOT obvious that there EXISTS a polynomial that satisfies the equation above for EVERY nonnegative integer $n$!

Page 20:

Triple Recursion Relation derived on pages 167-168:
$T_{n+1}(x) = 2x\,T_n(x) - T_{n-1}(x), \quad n \ge 1$

Result 1. $T_3(x) = 4x^3 - 3x$, $T_4(x) = 8x^4 - 8x^2 + 1$, $T_5(x) = 16x^5 - 20x^3 + 5x$

Result 2. $T_n(x) = 2^{n-1}x^n + \text{lower degree terms}$

Result 3. $T_n(-x) = (-1)^n\,T_n(x)$
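A quick numerical check of my own that the recursion reproduces Result 1 and the defining property T_n(cos θ) = cos(nθ):

theta = linspace(0,pi,9); x = cos(theta);
T0 = ones(size(x)); T1 = x;          % T_0 and T_1
for n = 1:4
  T2 = 2*x.*T1 - T0;                 % T_{n+1} = 2x T_n - T_{n-1}
  T0 = T1; T1 = T2;                  % shift; after the loop T1 holds T_5
end
max(abs(T1 - (16*x.^5 - 20*x.^3 + 5*x)))   % ~ 0, matching Result 1
max(abs(T1 - cos(5*theta)))                % ~ 0, matching the definition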

Page 21:

Euler and the Binomial Expansion

Euler's formula $e^{i\theta} = \cos\theta + i\sin\theta$ gives
$2\cos(n\theta) = e^{in\theta} + e^{-in\theta} = [\cos\theta + i\sin\theta]^n + [\cos\theta - i\sin\theta]^n$
$= \sum_{k=0}^{n} \binom{n}{k} (\cos\theta)^{n-k} (i\sin\theta)^k + \sum_{k=0}^{n} \binom{n}{k} (\cos\theta)^{n-k} (-i\sin\theta)^k$
$= 2 \sum_{j=0}^{\lfloor n/2\rfloor} \binom{n}{2j} (\cos\theta)^{n-2j} (-1)^j (1 - \cos^2\theta)^j$
$= 2\,T_n(\cos\theta)$

Since the last sum is a polynomial in $\cos\theta$ of degree $n$, this proves that the polynomial $T_n$ exists.

Page 22:

Gradients

http://en.wikipedia.org/wiki/Gradient

Definition
$(\nabla F)(x_1,\ldots,x_n) = \left[\dfrac{\partial F}{\partial x_1}, \dfrac{\partial F}{\partial x_2}, \ldots, \dfrac{\partial F}{\partial x_n}\right]^T$

Examples
$\nabla\left(x_1^2 + x_2^2\right) = [2x_1,\ 2x_2]^T$
$\nabla\left(3x_1 + 7x_2\right) = [3,\ 7]^T$
$\nabla F = Ax - b$ for $F : R^n \to R$ defined by $F(x_1,\ldots,x_n) = \tfrac{1}{2}x^T A x - b^T x$, where $A \in R^{n\times n}$ is symmetric, $b \in R^n$, and $x = [x_1,\ldots,x_n]^T$.
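A small finite difference check of my own that the gradient of the quadratic example really is Ax - b (the A, b, and test point are made up):

A = [2 1; 1 3]; b = [1; -1];         % hypothetical symmetric A and b
F = @(x) 0.5*x'*A*x - b'*x;
x = [0.7; -0.2]; h = 1e-6;
g = zeros(2,1);
for j = 1:2
  e = zeros(2,1); e(j) = h;
  g(j) = (F(x+e) - F(x-e))/(2*h);    % centered difference for dF/dx_j
end
[g  A*x-b]                           % the two columns agree to roundoff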

Page 23:

Geometric Meaning

Result If $F : R^n \to R$, $x \in R^n$, and $u \in R^n$ is a unit vector ($u^T u = 1$) then
$\dfrac{d}{dt} F(x + tu)\Big|_{t=0} = \sum_{j=1}^{n} u_j \dfrac{\partial F}{\partial x_j}(x) = u^T\,(\nabla F)(x).$

This has a maximum value when $u = (\nabla F)(x)/\|(\nabla F)(x)\|_2$, and it equals $\|(\nabla F)(x)\|_2$. Therefore, the gradient of $F$ at $x$ is a vector in whose direction $F$ has steepest ascent (or increase) and whose magnitude equals the rate of increase.

Question: What is the direction of steepest descent?

Page 24:

Minima and Maxima

Theorem (Calculus) If $F : R^n \to R$ has a minimal or a maximal value $F(y)$ then $(\nabla F)(y) = 0$.

Example If $F(x_1,x_2) = 1 + (x_1 - 1)^2 + 3x_2^2$ then
$(\nabla F)(x_1,x_2) = [2x_1 - 2,\ 6x_2]^T$
and $\min F(x_1,x_2) = F(1,0)$, so $(\nabla F)(1,0) = [0,\ 0]^T$.

Remark The function $G : R^2 \to R$ defined by $G(x_1,x_2) = x_1^2 - x_2^2$ satisfies $(\nabla G)(0,0) = 0$; however $G$ has no maxima and no minima.

Page 25:

Linear Equations and Optimization

Theorem If $P \in R^{n\times n}$ is symmetric and positive definite, then for every $b \in R^n$ the function $F : R^n \to R$ defined by
$F(x_1,\ldots,x_n) = \tfrac{1}{2}x^T P x - b^T x$
satisfies the following three properties:

1. $\lim_{\|x\|\to\infty} F(x_1,\ldots,x_n) = \infty$

2. $F$ has a minimum value

3. $y = \operatorname{argmin} F$ satisfies $(\nabla F)(y) = 0$, i.e. $Py = b$, therefore it is unique.

Proof Let $c = \min_{\|x\|=1} x^T P x$. Since $P$ is pos. def., $c > 0$, and then $x^T P x \ge c\,\|x\|^2$ implies $\lim_{\|x\|\to\infty} F(x) = \infty$.

Page 26:

Linear Equations and Optimization

Therefore there exists a number $r > 0$ such that
$\|x\| \ge r \Rightarrow F(x) \ge F(0).$
Since the set $B_r = \{x \in R^n : \|x\| \le r\}$ is bounded and closed, there exists $y \in B_r$ such that
$F(y) = \min_{x\in R^n} F(x).$
Therefore, by the preceding calculus theorem, $(\nabla F)(y) = 0$. Furthermore, since
$F(x) = \tfrac{1}{2}x^T P x - b^T x \Rightarrow (\nabla F)(x) = Px - b,$
it follows that $0 = Py - b \Rightarrow y = P^{-1}b$.

Page 27:

Application to Least Squares Geometry

Theorem Given $m \ge n > 0$, a matrix $B \in R^{m\times n}$ with $\operatorname{rank}(B) = n$ (or equivalently, with $B^T B$ nonsingular), and $y \in R^m$, the following conditions are equivalent:

(i) The function $F(x) = (Bx - y)^T (Bx - y)$, $x \in R^n$, has a minimum value at $x = c$;

(ii) $c = (B^T B)^{-1} B^T y$;

(iii) $Bc - y \perp \operatorname{span}\{\text{columns of } B\}$; this is read as: $Bc - y$ is orthogonal (or perpendicular) to the subspace of $R^m$ spanned by the column vectors of $B$.

Page 28:

Application to Least Squares Geometry

Proof (i) iff (ii): First observe that
$F(x) = (Bx - y)^T (Bx - y) = 2\left[\tfrac{1}{2}x^T P x - b^T x\right] + y^T y$
where $P = B^T B$ and $b = B^T y$, and that $P = B^T B$ is symmetric and positive definite. If $F(x)$ has a minimum value at $x = c$, then the preceding theorem implies that
$c = P^{-1}b = (B^T B)^{-1} B^T y.$

(ii) iff (iii):
$B^T(Bc - y) = 0$ iff $c = (B^T B)^{-1} B^T y$, and $B^T(Bc - y) = 0$ iff $Bc - y \perp \operatorname{span}\{\text{columns of } B\}$.

This proof that (ii) iff (iii) was emailed to me by Fu Xiang.
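The equivalence of (ii) and (iii) can be observed directly in MATLAB; a sketch of my own with made-up data:

B = [1 0; 1 1; 1 2]; y = [0; 1; 3];  % hypothetical overdetermined system
c = (B'*B) \ (B'*y);                 % condition (ii)
B'*(B*c - y)                         % ~ 0, i.e. Bc - y is orthogonal to
                                     % every column of B, condition (iii)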

Page 29:

Steepest Descent Method of Cauchy (1847) is a numerical algorithm to solve the following problem: given $F : R^n \to R$, compute $y = \operatorname{argmin} F$.

1. Start with $y_1 \in R^n$, and for $k = 1{:}N$ do the following:

2. Compute $d_k = -(\nabla F)(y_k)$

3. Compute $t_k = \operatorname{argmin}_t F(y_k + t\,d_k)$

4. Compute $y_{k+1} = y_k + t_k\,d_k$

Reference: pages 440-441 of Numerical Methods by Dahlquist, G. and Björck, Å., Prentice-Hall, 1974.

Page 30:

Application of Steepest Descent to minimize the previous function
$F(x) = \tfrac{1}{2}x^T A x - b^T x, \qquad (\nabla F)(x) = Ax - b.$

1. Start with $y_1$, and for $k = 1{:}N$ do the following:

2. Compute $d_k = -(\nabla F)(y_k) = b - A y_k$

3. Compute $t_k = \operatorname{argmin}_t F(y_k + t\,d_k)$: setting
$\dfrac{d}{dt} F(y_k + t\,d_k)\Big|_{t=t_k} = d_k^T\left(A(y_k + t_k d_k) - b\right) = 0$
gives
$t_k = d_k^T (b - A y_k) / (d_k^T A d_k)$

4. Compute $y_{k+1} = y_k + t_k\,d_k$

Page 31:

MATLAB CODE

function [A,b,y,er] = steepdesc(N,y1)
% function [A,b,y,er] = steepdesc(N,y1)
A = [1 1;1 2];
b = [2 3]';
% evaluate F(x) = .5*x'*A*x - b'*x on a 21 x 21 grid over [0,2] x [0,2]
dx = 1/10;
for i = 1:21
  for j = 1:21
    x = [(i-1)*dx (j-1)*dx]';
    F(i,j) = .5*x'*A*x - b'*x;
  end
end
X = ones(21,1)*(0:.1:2);
Y = X';
[FX,FY] = gradient(F);
contour(X,Y,F,20)
hold on
quiver(X,Y,FX,FY);
% steepest descent iteration
y(:,1) = y1;
for k = 1:N
  yk = y(:,k);
  dk = b - A*yk;                    % dk = -grad F(yk)
  tk = dk'*(b-A*yk)/(dk'*A*dk);     % exact line search
  y(:,k+1) = yk + tk*dk;
  er(k) = norm(A*y(:,k+1)-b);       % residual norm
end
plot(y(1,:),y(2,:),'ro')
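For example (my own illustration), the call [A,b,y,er] = steepdesc(20,[0 0]') produces iterates y(:,k) that approach the solution A\b = [1 1]' of Ay = b, with er recording the residual norms.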

Page 32:

Graphics of Steepest Descent

Page 33:

Constrained Optimization

Problem Minimize $F : R^n \to R$ subject to a constraint $c(x) = 0$, where $c : R^n \to R^m$.

The Lagrange multiplier method computes $y \in R^n$, $\lambda \in R^m$ that solve the $n$ equations
$(\nabla F)(y) = \sum_{j=1}^{m} \lambda_j\,(\nabla c_j)(y)$
and the $m$ equations
$c(y) = 0.$

This will generally result in a nonlinear system of equations, the topic discussed in Lecture 9.

http://en.wikipedia.org/wiki/Lagrange_multiplier

http://www.slimy.com/~steuard/teaching/tutorials/Lagrange.html

Page 34:

Examples

1. Minimize $F(x_1,x_2) = x_1^2 + x_2^2$ subject to the constraint $x_1 + x_2 - 1 = 0$. The method of Lagrange multipliers gives
$(\nabla F)(y_1,y_2) = 2[y_1\ y_2]^T = \lambda\,[1\ 1]^T$ and $y_1 + y_2 - 1 = 0,$
so $y_1 = y_2 = 1/2$ (i.e. $y = \tfrac{1}{2}[1\ 1]^T$) and $\lambda = 1$.

2. Maximize $F(x) = x^T A x$, where $A \in R^{n\times n}$ is symmetric and positive definite, subject to the constraint $x^T x - 1 = 0$. This gives
$(\nabla F)(y) = 2Ay = 2\lambda y,$
hence $y$ is an eigenvector of $A$ and $\lambda = y^T A y = F(y)$. Therefore $\lambda > 0$ and $\lambda$ is the largest eigenvalue of $A$.
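A quick check of my own of Example 2: over random unit vectors x, the value x'*A*x never exceeds the largest eigenvalue of A (the matrix here is made up):

A = [2 1; 1 3];                   % hypothetical symmetric positive definite A
lmax = max(eig(A));
vals = zeros(1,1000);
for k = 1:1000
  x = randn(2,1); x = x/norm(x);  % random unit vector
  vals(k) = x'*A*x;
end
[max(vals) lmax]                  % max(vals) <= lmax, and nearly equal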

Page 35:

Homework Due Tutorial 4 (Week 9, 15-19 Oct)

1. Do problem 7 on page 165. Suggestion: practice by doing problem 2 on page 164 and problem 5 on page 165 since these problems are similar and have solutions on pages 538-539. Do NOT hand in solutions for your practice problems.

2. Do problem 10 on pages 170-171. Suggestion: study the discussion of the minimum size property on pages 168-169. Then practice by doing problem 3 on page 169. Do NOT hand in solutions for your practice problems.

Extra Credit: Compute
$\operatorname{argmin}_{p\in P_{n-1}}\left[\max_{-1\le x\le 1} |x^n - p(x)|\right].$

Suggestion: THINK about Theorem 2 and problem 3 on page 169.

Page 36:

Homework Due Tutorial 4 (Week 9, 15-19 Oct)

3. The trapezoid method for integrating a function $f \in C([a,b])$ using $n$ equal length subintervals can be shown to give an estimate having the form
$T(n) = I + a_1 n^{-2} + a_2 n^{-4} + a_3 n^{-6} + \cdots$
where $I = \int_a^b f(x)\,dx$ and the sequence $a_1, a_2, a_3, \ldots$ depends on $f$.

(a) Show that for any $f$,
$S(2n) = \tfrac{4}{3}\,T(2n) - \tfrac{1}{3}\,T(n),$
where $S(2n)$ is the estimate for the integral obtained using Simpson's method with $2n$ equal length subintervals.

(b) Use this fact together with the form of $T(n)$ above to prove that there exists a sequence $b_1, b_2, b_3, \ldots$ with
$S(n) = I + b_1 n^{-4} + b_2 n^{-6} + \cdots$

(c) Compute constants $r_1, r_2, r_3$ so that there exists a sequence $c_1, c_2, \ldots$ with
$r_1 T(n) + r_2 T(2n) + r_3 T(4n) = I + c_1 n^{-6} + c_2 n^{-8} + \cdots$

Page 37:

Homework Due Lab 4 (Week 10, 22-26 October)

4. Consider the equations for the 9 variables inside the array

$\begin{array}{ccccc} & x_{01} & x_{02} & x_{03} & \\ x_{10} & x_{11} & x_{12} & x_{13} & x_{14} \\ x_{20} & x_{21} & x_{22} & x_{23} & x_{24} \\ x_{30} & x_{31} & x_{32} & x_{33} & x_{34} \\ & x_{41} & x_{42} & x_{43} & \end{array}$

where the border entries are given values, the interior entries are the 9 variables, and
$x_{i-1,j} + x_{i+1,j} + x_{i,j-1} + x_{i,j+1} - 4x_{i,j} = 0, \quad i,j = 1,2,3.$

(a) Write these equations as $Ax = b$, $A \in R^{9\times 9}$, $b \in R^9$, then solve using Gauss elimination and display the solution in the array.

(b) Compute the Jacobi iteration matrix $B \in R^{9\times 9}$ and $\|B\|_\infty$.

(c) Write a MATLAB program to implement the Jacobi method for an $(n+2)\times(n+2)$ array without computing a sparse matrix $A$; a sketch of the idea follows.
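A minimal sketch of my own of the kind of matrix-free Jacobi sweep that part (c) asks for (the boundary values used here are hypothetical):

n = 3;
x = zeros(n+2,n+2);
x(1,:) = 1;                        % hypothetical given boundary values
for sweep = 1:200                  % Jacobi iterations
  xold = x;
  for i = 2:n+1
    for j = 2:n+1
      % at the solution 4*x(i,j) equals the sum of the four neighbors
      x(i,j) = (xold(i-1,j) + xold(i+1,j) + xold(i,j-1) + xold(i,j+1))/4;
    end
  end
end
x                                  % interior entries approximate the solution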

Page 38:

Homework Due Tutorial 4 (Week 9, 15-19 Oct)

5. Consider the equation $Ax = b$ where $A \in R^{n\times n}$, $b \in R^n$, and

$A = \begin{bmatrix} 2 & -1 & 0 & \cdots & 0 \\ -1 & 2 & -1 & & 0 \\ 0 & -1 & 2 & \ddots & \vdots \\ \vdots & & \ddots & \ddots & -1 \\ 0 & 0 & \cdots & -1 & 2 \end{bmatrix}$

(a) Prove that the vectors
$v_m = [\sin(mh),\ \sin(2mh),\ \sin(3mh),\ \ldots,\ \sin(nmh)]^T,$
where $h = \pi/(n+1)$ and $m = 1,\ldots,n$, are eigenvectors of $A$, and compute their eigenvalues.

(b) Prove that the Jacobi method for this matrix converges by showing that the spectral radius of the iteration matrix is < 1.
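A numerical sanity check of my own for part (a), with n = 5 and one mode m (the Rayleigh quotient recovers the eigenvalue without giving away its formula):

n = 5; h = pi/(n+1);
A = 2*eye(n) - diag(ones(n-1,1),1) - diag(ones(n-1,1),-1);
m = 2;
v = sin((1:n)'*m*h);              % candidate eigenvector
lambda = (v'*A*v)/(v'*v);         % Rayleigh quotient
norm(A*v - lambda*v)              % ~ 0, so v is an eigenvector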

Page 39:

Homework Due Lab 4 (Week 10, 22-26 October)

1. (a) Modify the computer code developed for Lab 3 to compute polynomials that interpolate the function 1/(1+x*x) on the interval [-5,5] based on N = 4, 8, 16, and 32 nodes located at the points x(j) = 5cos((2j-1)pi/(2N)), j = 1,...,N. (b) Compare the results with the results you obtained in Lab 3 using uniform nodes. (c) Plot the functions
$w(x) = (x - x(1))(x - x(2))\cdots(x - x(N))$
both for the case where the nodes x(j) are uniform and where they are chosen as above. (d) Show that x(j)/5 are the zeros of a Chebyshev polynomial, then derive a formula for w(x) and use this formula to explain why the use of the nonuniform nodes x(j) above gives a smaller interpolation error than the use of uniform nodes.
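A one-line check of my own that the x(j)/5 in part (d) are zeros of T_N, using T_N(t) = cos(N arccos t) from the Chebyshev polynomial slide:

N = 8; j = 1:N;
xj = 5*cos((2*j-1)*pi/(2*N));
max(abs(cos(N*acos(xj/5))))       % ~ 0 (roundoff)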

Page 40:

Homework Due Lab 4 (Week 10, 22-26 October)

2. (a) Write computer code to compute trapezoidal approximations I(n) for
$I = \int_0^4 \dfrac{dx}{1 + x^2} = \tan^{-1}(4) = 1.325817663668033$
and run this code to compute approximations I(n) and associated errors for n = 2, 4, 8, 16, 32, 64 and 128 intervals. (b) Use the (Romberg) formula that you developed in Tutorial 4 to combine I(n), I(2n), and I(4n) for n = 2, 4, 8, 16, 32 to develop more accurate approximations R(n). Compute the ratios of consecutive errors (I-I(2n))/(I-I(n)) and (I-R(2n))/(I-R(n)) for n = 2, 4, 8, 16, present them in a table and discuss them (I denotes the exact integral). (c) Compute approximations to the integral in (a) using Gauss quadrature with n = 1, 2, 3, 4, and present the errors in a table and compare them to the errors obtained in (a), (b) above.

Page 41:

Homework Due Lab 5 (Week 12, 5-9 November)

3. (a) Use the MATLAB program from Problem 4(c) of the homework due Tutorial 4 to compute, for n = 50, internal variables $x_{i,j}$ of the $(n+2)\times(n+2)$ array (whose border entries are given values) that satisfy the inequalities
$|4x_{i,j} - x_{i-1,j} - x_{i+1,j} - x_{i,j-1} - x_{i,j+1}| \le 10^{-4}, \quad 1 \le i,j \le n.$

(b) Display the solution using the MATLAB mesh and contour commands.

(c) Find a polynomial $P$ of two variables so that the exact solution satisfies $x_{i,j} = P(i,j)$, and use it to compute and display the error.