CS B553: Algorithms for Optimization and Learning


UNIVARIATE OPTIMIZATION


KEY IDEAS

- Critical points
- Direct methods: exhaustive search, golden section search
- Root finding algorithms: bisection [more next time]
- Local vs. global optimization
- Analyzing errors, convergence rates

Figure 1: a univariate function f(x) with local maxima, local minima, and an inflection point.

Figure 2a: f(x) restricted to an interval [a, b].

Figure 2b: find the critical points and apply the second derivative test.


Figure 2c: the global minimum on [a, b] must be one of these points: a critical point of f or one of the endpoints a, b.
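To make this concrete, here is a minimal Python sketch (the quartic objective and the interval are my own illustrative choices, not from the slides) that uses sympy to find the critical points, classify them with the second derivative test, and compare against the endpoints:

```python
import sympy as sp

# Illustrative objective (not from the slides): a quartic on [a, b] = [0, 3]
x = sp.symbols("x", real=True)
f = x**4 - 4*x**3 + 4*x**2 + 1
a, b = 0, 3

f1, f2 = sp.diff(f, x), sp.diff(f, x, 2)

# Critical points: real solutions of f'(x) = 0 inside [a, b]
critical = [c for c in sp.solve(sp.Eq(f1, 0), x) if c.is_real and a <= c <= b]

# Second derivative test classifies each critical point
for c in critical:
    curv = f2.subs(x, c)
    kind = "min" if curv > 0 else ("max" if curv < 0 else "inconclusive")
    print(f"x = {c}: f = {f.subs(x, c)}, local {kind}")

# The global minimum is attained at a critical point or an endpoint
candidates = critical + [a, b]
x_star = min(candidates, key=lambda c: f.subs(x, c))
print("global minimum at x =", x_star, "with f =", f.subs(x, x_star))
```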

Exhaustive grid search

Figure 3: evaluate f on an evenly spaced grid over [a, b].


Two types of errors

Figure 4: for an estimate xt of the true minimizer x*, the geometric error is |xt - x*| and the analytical error is |f(xt) - f(x*)|.

Does exhaustive grid search achieve e/2 geometric error, where e is the grid spacing?

Not necessarily for multimodal objective functions: the grid point with the lowest value may lie near a different local minimum, far from x*, so the geometric error can be large.

LIPSCHITZ CONTINUITY

|f(x) - f(y)| <= K |x - y|

Figure 5: the graph of f is bounded by cones of slope +K and -K through any point.

Figure 6: exhaustive grid search achieves Ke/2 analytical error in the worst case: some grid point lies within e/2 of the minimizer x*, so by Lipschitz continuity its value is within K e/2 of f(x*).
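The idea rendered as a short Python sketch (function name and the multimodal test objective are my own illustrative choices):

```python
import math

def grid_search(f, a, b, n):
    """Exhaustive grid search: evaluate f at n+1 evenly spaced points on
    [a, b] and return the best one. The grid spacing is e = (b - a) / n;
    if f is K-Lipschitz, the worst-case analytical error is K*e/2."""
    best_x, best_f = a, f(a)
    for i in range(1, n + 1):
        xi = a + (b - a) * i / n
        fi = f(xi)
        if fi < best_f:
            best_x, best_f = xi, fi
    return best_x, best_f

# Illustrative multimodal objective (not from the slides)
f = lambda x: math.sin(3 * x) + 0.5 * x
print(grid_search(f, 0.0, 4.0, 1000))
```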

Golden section search

Figure 7a: maintain a bracket [a, b] together with an intermediate point m satisfying f(m) < f(a), f(b).

Figure 7b: evaluate a new point c between a and m. If f(c) < f(m), the next bracket is candidate 1, [a, m], with interior point c; otherwise it is candidate 2, [c, b], with interior point m.


Optimal choice of c: based on the golden ratio. Choose c so that (c - a)/(m - c) = φ, where φ = (1 + sqrt(5))/2 ≈ 1.618 is the golden ratio. The bracket is then reduced by a factor of φ - 1 ≈ 0.618 at each step.
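A runnable sketch of golden section search (my own Python rendering of the scheme above; the standard implementation caches one interior point per iteration so only one new evaluation of f is needed per step):

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio, ~1.618
INV_PHI = 1 / PHI             # bracket reduction factor per step, ~0.618

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b]. Each iteration shrinks the
    bracket by the factor 1/phi = phi - 1 (linear convergence)."""
    # Two interior points placed symmetrically at the golden ratio
    c = b - INV_PHI * (b - a)
    d = a + INV_PHI * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:            # minimum lies in [a, d]; old c becomes new d
            b, d, fd = d, c, fc
            c = b - INV_PHI * (b - a)
            fc = f(c)
        else:                  # minimum lies in [c, b]; old d becomes new c
            a, c, fc = c, d, fd
            d = a + INV_PHI * (b - a)
            fd = f(d)
    return (a + b) / 2

print(golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0))  # ~2.0
```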

NOTES

- Exhaustive search is a global optimization method: its error bound is for finding the true optimum.
- GSS is a local optimization method: its error bound holds only for finding a local minimum.
- The convergence rate is linear: |x_{n+1} - x*| <= c |x_n - x*| for a constant c < 1, with x_n the sequence of bracket midpoints.

Root finding: find the x-value where f'(x) crosses 0.

Figure 8: f(x) and its derivative f'(x).

Bisection, applied to g(x) = f'(x)

Figure 9a: maintain a bracket [a, b] with the invariant sign(g(a)) != sign(g(b)).

At each iteration, evaluate the midpoint m = (a + b)/2 and keep the half of the bracket over which g changes sign, so the invariant is preserved (Figure 9).

Linear convergence: the bracket size is reduced by a factor of 0.5 at each iteration.
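A minimal Python sketch of bisection (my own rendering; the example g, bracket, and tolerance are illustrative):

```python
def bisection(g, a, b, tol=1e-10):
    """Find a zero of g in [a, b], assuming sign(g(a)) != sign(g(b)).
    The bracket halves each iteration (linear convergence, rate 0.5)."""
    ga, gb = g(a), g(b)
    assert ga * gb < 0, "bracket must satisfy the sign-change invariant"
    while b - a > tol:
        m = (a + b) / 2
        gm = g(m)
        if gm == 0:
            return m
        if ga * gm < 0:        # sign change in [a, m]
            b, gb = m, gm
        else:                  # sign change in [m, b]
            a, ga = m, gm
    return (a + b) / 2

# Example: minimize f(x) = (x - 2)**2 by finding the zero of g = f'
print(bisection(lambda x: 2 * (x - 2), 0.0, 5.0))  # ~2.0
```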

NEXT TIME

- Root finding methods with superlinear convergence
- Practical issues