Transcript of lecture slides: Introduction to Design Optimization, 16.90, 12 May 2014

Page 1

Introduction to Design Optimization

16.90 12 May, 2014

1 Willcox, 16.90, Spring 2014

Page 2

Today’s Topics

• Unconstrained optimization algorithms (cont.)
• Computing gradients
• The 1D search in an optimization algorithm
• Surrogate models
• Least squares fitting of a response surface

2

Page 3

Design Optimization Problem Statement

The design problem may be formulated as a Nonlinear Programming (NLP) problem:

min J(x, p)

s.t. g(x, p) ≤ 0
     h(x, p) = 0
     x_i^{LB} ≤ x_i ≤ x_i^{UB}, i = 1, ..., n

where

x = [x_1 x_2 ... x_n]^T is the vector of n design variables,
J(x) = [J_1(x) J_2(x) ... J_z(x)]^T is the vector of z objectives,
g(x) = [g_1(x) g_2(x) ... g_{m1}(x)]^T are the m1 inequality constraints,
h(x) = [h_1(x) h_2(x) ... h_{m2}(x)]^T are the m2 equality constraints.

3

Page 4

Gradient-Based Optimization Process

1. Initialize: x^0, q = 0
2. Calculate J(x^q)
3. Calculate search direction S^q
4. q = q + 1
5. Perform 1-D search: x^q = x^{q-1} + α_q S^q
6. Converged? If no, return to step 2; if yes, done.

4

Page 5

Unconstrained Problems: Gradient-Based Optimization Methods

• First-Order Methods – use gradient information to calculate S
  – steepest descent method
  – conjugate gradient method
  – quasi-Newton methods
• Second-Order Methods – use gradients and Hessian to calculate S
  – Newton method
• Often, a constrained problem can be cast as an unconstrained problem and these techniques used.

5

Page 6

Steepest Descent

S^q = -∇J(x^{q-1})

-∇J(x) is the direction of maximum decrease of J at x.

Algorithm: choose x^0, set x = x^0
repeat until converged:
    S = -∇J(x)
    choose α to minimize J(x + αS)
    x = x + αS

• doesn’t use any information from previous iterations
• converges slowly
• α is chosen with a 1-D search (interpolation or golden section)
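The loop above can be sketched in Python. This is an illustrative sketch, not course code: it assumes a quadratic objective J(x) = ½xᵀQx − bᵀx, for which the 1-D search has the closed-form step α = (SᵀS)/(SᵀQS); the matrix Q and vector b below are made-up example data.

```python
import numpy as np

def steepest_descent_quadratic(Q, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent on J(x) = 0.5 x^T Q x - b^T x (Q symmetric pos. def.).
    For a quadratic, the exact 1-D minimizer along S is alpha = S.S / S.QS."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        S = b - Q @ x                    # S = -grad J(x), steepest-descent direction
        if np.linalg.norm(S) < tol:      # converged: gradient is (numerically) zero
            break
        alpha = (S @ S) / (S @ Q @ S)    # exact line search for the quadratic
        x = x + alpha * S
    return x

# Made-up 2-variable example; the minimizer solves Q x = b, i.e. x* = [1, -2]
Q = np.array([[2.0, 0.0], [0.0, 20.0]])
b = np.array([2.0, -40.0])
x_star = steepest_descent_quadratic(Q, b, [0.0, 0.0])
```

The eigenvalue ratio of Q (here 10) controls how badly the iterates zig-zag toward x*, which is the slow convergence the slide mentions.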

6

Page 7

Conjugate Gradient

S^1 = -∇J(x^0)
S^q = -∇J(x^{q-1}) + β^q S^{q-1}

β^q = ‖∇J(x^{q-1})‖² / ‖∇J(x^{q-2})‖²

• search directions are now conjugate
• directions S^j and S^k are conjugate if (S^j)^T H S^k = 0 (also called H-orthogonal)
• makes use of information from previous iterations without having to store a matrix
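The update above can be sketched as follows (Fletcher-Reeves form of β). This is an illustrative sketch assuming an exact line search is available; here it is supplied in closed form for a quadratic test function, and Q, b, and the starting point are made-up data.

```python
import numpy as np

def conjugate_gradient(grad, x0, line_search, tol=1e-8, max_iter=200):
    """Fletcher-Reeves nonlinear CG:
    S^1 = -grad J(x^0);  S^q = -grad J(x^{q-1}) + beta^q S^{q-1},
    with beta^q = |grad J(x^{q-1})|^2 / |grad J(x^{q-2})|^2."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    S = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + line_search(x, S) * S
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # ratio of squared gradient norms
        S = -g_new + beta * S              # new conjugate direction
        g = g_new
    return x

# Quadratic test: J = 0.5 x^T Q x - b^T x, so grad J = Qx - b and the exact
# step along S is alpha = -(grad.S)/(S.QS); CG then finds x* in n = 2 iterations.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: Q @ x - b
exact_ls = lambda x, S: -(grad(x) @ S) / (S @ Q @ S)
x_star = conjugate_gradient(grad, np.zeros(2), exact_ls)
```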

7

Page 8

Geometric Interpretation

[Figures: iteration paths of steepest descent (left) and conjugate gradient (right)]

Figures from “Optimal Design in Multidisciplinary Systems,” AIAA Professional Development Short Course Notes, September 2002.

8

Page 9

Newton’s Method

Taylor series:

J(x) ≈ J(x^0) + ∇J(x^0)^T (x - x^0) + (1/2)(x - x^0)^T H(x^0)(x - x^0)

where H(x^0) is the Hessian of J evaluated at x^0.

Differentiate:

∇J(x) ≈ ∇J(x^0) + H(x^0)(x - x^0)

At the optimum, ∇J(x*) = 0, so:

∇J(x^0) + H(x^0) Δx = 0

Δx = -H(x^0)^{-1} ∇J(x^0)
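The final line is a single Newton step. A minimal sketch in Python (solving H Δx = -∇J rather than forming H^{-1}; Q and b below are made-up data):

```python
import numpy as np

def newton_step(grad, hess, x):
    """One Newton update: solve H(x) dx = -grad J(x) (cheaper and more
    stable than forming H^{-1} explicitly)."""
    dx = np.linalg.solve(hess(x), -grad(x))
    return x + dx

# Quadratic J(x) = 0.5 x^T Q x - b^T x: grad J = Qx - b, H = Q (constant),
# so one step from any x^0 reaches the exact minimizer x* = Q^{-1} b = [0.2, 0.4].
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x1 = newton_step(lambda x: Q @ x - b, lambda x: Q, np.array([5.0, -7.0]))
```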

9

Page 10

Newton’s Method

S = -H(x^0)^{-1} ∇J(x^0)

• if J(x) is quadratic, method gives exact solution in one iteration

• if J(x) not quadratic, perform Taylor series about new point and repeat until converged

• a very efficient technique if started near the solution

• H is not usually available analytically, and finite differencing it is too expensive (n × n matrix)

• H can be singular if J is linear in a design variable

10

Page 11

Quasi-Newton

S^q = -A^q ∇J(x^{q-1})

• Also known as variable metric methods
• Objective and gradient information is used to create an approximation to the inverse of the Hessian
• A approaches H^{-1} during optimization of quadratic functions
• Convergence is similar to second-order methods (strictly 1st order)
• Initially A = I, so S^1 is the steepest descent direction; then A^{q+1} = A^q + D^q, where D is a symmetric update matrix:
  D^q = fn(x^q - x^{q-1}, ∇J(x^q) - ∇J(x^{q-1}), A^q)
• Various methods to determine D, e.g., Davidon-Fletcher-Powell (DFP), Broyden-Fletcher-Goldfarb-Shanno (BFGS)
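As an illustration (not the course's own code), here is the BFGS form of the update, written directly as the new inverse-Hessian approximation; the vectors s and y below are made-up difference data. Every BFGS update satisfies the secant condition A_new y = s, which makes a handy sanity check.

```python
import numpy as np

def bfgs_update(A, s, y):
    """BFGS update of the inverse-Hessian approximation A, where
    s = x^q - x^{q-1} and y = grad J(x^q) - grad J(x^{q-1})."""
    rho = 1.0 / (y @ s)                  # requires y.s > 0 (curvature condition)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ A @ (I - rho * np.outer(y, s)) \
           + rho * np.outer(s, s)

# Illustrative step/gradient differences; A starts at the identity, so the
# first search direction is steepest descent, as the slide notes.
s = np.array([0.3, -0.1])
y = np.array([1.2, 0.4])
A_new = bfgs_update(np.eye(2), s, y)
```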

11

Page 12

Computing Gradients Using Finite Difference Approximation
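The slide itself carries no detail, so here is a sketch of one common scheme, a central-difference approximation; the test function J and step size h are illustrative choices.

```python
import numpy as np

def fd_gradient(J, x, h=1e-6):
    """Central differences: dJ/dx_i ~ (J(x + h e_i) - J(x - h e_i)) / (2h).
    Costs 2n objective evaluations for n design variables; error is O(h^2)."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (J(x + e) - J(x - e)) / (2.0 * h)
    return g

# Check against the analytic gradient of J = x1^2 + 3 x1 x2:
# grad J = [2 x1 + 3 x2, 3 x1], which is [8, 3] at the point (1, 2)
J = lambda x: x[0]**2 + 3.0 * x[0] * x[1]
g = fd_gradient(J, [1.0, 2.0])
```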

12

Page 13

The 1D Search

13
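Again the slide gives no detail, so here is a sketch of one of the 1-D search options mentioned earlier (golden section); the bracketing interval and scalar test function are illustrative.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden-section search for the minimizer of a unimodal f on [a, b].
    Each iteration shrinks the bracket by a factor of ~0.618 and reuses
    one of the two interior function evaluations."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:            # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                  # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# The 1-D search along a descent direction reduces to minimizing a scalar
# function of the step alpha; this example has its minimum at alpha = 0.7.
alpha_star = golden_section(lambda a: (a - 0.7) ** 2, 0.0, 2.0)
```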

Page 14

Surrogate Models (from Choi et al.)

• Data-fit models
• Simplified physics models
• Projection-based reduced models

• Exploit problem structure
• Embody underlying physics

Page 15

Data Fit Methods

• Sample the simulation at some number of design points
  – Use DOE methods (e.g., Latin hypercube) to select the points

• Fit a surrogate model using the sampled information

• Surrogate may be global (e.g., quadratic response surface) or local (e.g., Kriging interpolation)

• Surrogate may be updated adaptively by adding sample points based on surrogate performance (e.g., Efficient Global Optimization, EGO)

15

Page 16

Polynomial Response Surface Method

• Surrogate model is a local or global polynomial model

• Can be of any order – Most often quadratic; higher order requires many samples

• Advantages: simple to implement, visualize, and understand; the optimum of the response surface is easy to find

• Disadvantages: may be too simple; doesn’t capture multimodal functions well

16

Page 17

Global Polynomial Response Surface

• Fit objective function with a polynomial

• e.g., quadratic approximation to a function of n design variables x1, x2, …, xn

• Coefficients determined using a least squares fit to available data

• Update model by including a new function evaluation then doing least squares fit to compute the new coefficients

17

Page 18

Fitting a Polynomial Response Surface
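The fitting step can be sketched as an ordinary least-squares problem. A quadratic surface in two design variables has six coefficients; with N ≥ 6 samples, stack one row of basis terms [1, x1, x2, x1², x1x2, x2²] per sample and solve the overdetermined system. The sample data below are synthetic.

```python
import numpy as np

def quadratic_basis(X):
    """Rows of basis terms [1, x1, x2, x1^2, x1*x2, x2^2] for 2-variable samples."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])

def fit_quadratic_rs(X, J):
    """Least-squares coefficients a of the response surface J~(x) = basis(x) . a."""
    coeffs, *_ = np.linalg.lstsq(quadratic_basis(X), J, rcond=None)
    return coeffs

# Synthetic check: sample a known quadratic at 20 random points and
# confirm the fit recovers its coefficients exactly (no noise).
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(20, 2))
true_coeffs = np.array([1.0, -2.0, 0.5, 3.0, 1.0, 2.0])
J = quadratic_basis(X) @ true_coeffs
coeffs = fit_quadratic_rs(X, J)
```

Updating the model with a new function evaluation, as the previous slide describes, just appends a row to the basis matrix and re-solves the same least-squares problem.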

Page 19

Polynomial Response Surface Method

Matlab demo: Peaks function

19

Page 20

Summary

• From this lecture and the online notes you should:
  – Have an understanding of how a design problem can be posed as an optimization problem
  – Have a basic understanding of the steps in gradient-based unconstrained optimization algorithms
  – Be able to estimate gradient and Hessian information using finite difference approximation
  – Understand how to construct a polynomial response surface using least-squares regression and how to measure the quality of fit

20

Page 21

MIT OpenCourseWare
http://ocw.mit.edu

16.90 Computational Methods in Aerospace Engineering
Spring 2014

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.