Nonlinear Programming (NLP) – Modeling Examples

Nonlinear Programming 1
MIT and James Orlin © 2003

Linear Programming Model

Maximize   c1 x1 + c2 x2 + ... + cn xn

subject to

a11 x1 + a12 x2 + ... + a1n xn ≤ b1
a21 x1 + a22 x2 + ... + a2n xn ≤ b2
...
am1 x1 + am2 x2 + ... + amn xn ≤ bm

x1, x2, ..., xn ≥ 0

ASSUMPTIONS:

Proportionality Assumption

– Objective function
– Constraints

Additivity Assumption
– Objective function
– Constraints
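The standard form above maps directly onto off-the-shelf LP solvers. As a rough sketch (the data below are hypothetical, not from the slides), scipy.optimize.linprog minimizes, so the maximization is handled by negating c:

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([3.0, 5.0])           # hypothetical objective coefficients c1, c2
A = np.array([[1.0, 0.0],          # hypothetical constraint coefficients a_ij
              [0.0, 2.0],
              [3.0, 2.0]])
b = np.array([4.0, 12.0, 18.0])    # hypothetical right-hand sides b_i

# maximize c'x  <=>  minimize (-c)'x;  x >= 0 is linprog's default bound
res = linprog(-c, A_ub=A, b_ub=b, method="highs")
print(res.x, -res.fun)             # optimal x and the maximized objective value
```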

What is a non-linear program?

maximize 3 sin x + xy + y³ - 3z + log z

subject to  x² + y² = 1
            x + 4z ≤ 2
            z ≥ 0

A non-linear program is permitted to have non-linear constraints or objectives.

A linear program is a special case of non-linear programming!
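A rough sketch (not part of the slides) of how such a problem is handed to a numerical solver: the same NLP written for scipy.optimize.minimize with SLSQP. Since log z requires z > 0, a small positive lower bound stands in for z ≥ 0.

```python
import numpy as np
from scipy.optimize import minimize

def neg_objective(v):
    x, y, z = v
    # maximize 3 sin x + xy + y^3 - 3z + log z  ->  minimize its negative
    return -(3 * np.sin(x) + x * y + y ** 3 - 3 * z + np.log(z))

cons = [{"type": "eq",   "fun": lambda v: v[0] ** 2 + v[1] ** 2 - 1},  # x^2 + y^2 = 1
        {"type": "ineq", "fun": lambda v: 2 - (v[0] + 4 * v[2])}]      # x + 4z <= 2

res = minimize(neg_objective, x0=np.array([0.5, 0.5, 0.1]), method="SLSQP",
               bounds=[(None, None), (None, None), (1e-6, None)],      # z > 0 so log z is defined
               constraints=cons)
print(res.x, -res.fun)
```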

Nonlinear Programs (NLP)

Nonlinear objective function f(x) and/or Nonlinear constraints gi(x).

Today: we will present several types of non-linear programs.

Let x = (x1, x2, ..., xn)

Max f(x)

subject to  gi(x) ≤ bi,   i = 1, 2, ..., m

Unconstrained Facility Location

[Figure: customers A, B, C, D plotted in the (x, y) plane (axes 0 to 16), with daily demands in parentheses, and the unknown warehouse location P.]

Loc.         Dem.
A: (8, 2)     19
B: (3, 10)     7
C: (8, 15)     2
D: (14, 13)    5
P: ?

This is the warehouse location problem with a single warehouse that can be located anywhere in the plane. Distances are “Euclidean.”

Costs proportional to distance; known daily demands

An NLP

d(P,A) = √((x - 8)² + (y - 2)²)
…
d(P,D) = √((x - 14)² + (y - 13)²)

minimize 19 d(P,A) + … + 5 d(P,D)
subject to: P is unconstrained
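A rough sketch of this unconstrained model in code (not part of the slides; Nelder-Mead is used because the distance objective has kinks at the customer points, and the starting point is arbitrary):

```python
import numpy as np
from scipy.optimize import minimize

pts = np.array([[8, 2], [3, 10], [8, 15], [14, 13]], dtype=float)  # A, B, C, D
dem = np.array([19, 7, 2, 5], dtype=float)                         # daily demands

def total_cost(p):
    # demand-weighted Euclidean distance from P = (x, y) to every customer
    return np.sum(dem * np.linalg.norm(pts - p, axis=1))

res = minimize(total_cost, x0=np.array([8.0, 8.0]), method="Nelder-Mead")
print(res.x, res.fun)   # best location P and its total weighted distance
```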

Here are the objective values for 55 different locations.

[Figure: objective value (vertical axis, 0 to 350) plotted against y, one curve for each of x = 0, 2, 4, 6, 8, 10, 12.]

Facility Location. What happens if P must be within a specified region?

[Figure: the same map of customers A (19), B (7), C (2), D (5) and the location P in the (x, y) plane.]

The model

Minimize 19 √((x - 8)² + (y - 2)²) + … + 5 √((x - 14)² + (y - 13)²)

Subject to  x ≤ 7
            5 ≤ y ≤ 11
            x + y ≤ 24
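A rough sketch of the constrained version (same objective, with the region expressed through bounds and a linear inequality constraint; the inequality directions follow the reconstruction above):

```python
import numpy as np
from scipy.optimize import minimize

pts = np.array([[8, 2], [3, 10], [8, 15], [14, 13]], dtype=float)  # A, B, C, D
dem = np.array([19, 7, 2, 5], dtype=float)

def total_cost(p):
    return np.sum(dem * np.linalg.norm(pts - p, axis=1))

bounds = [(None, 7), (5, 11)]                                   # x <= 7,  5 <= y <= 11
cons = [{"type": "ineq", "fun": lambda p: 24 - (p[0] + p[1])}]  # x + y <= 24

res = minimize(total_cost, x0=np.array([5.0, 8.0]), method="SLSQP",
               bounds=bounds, constraints=cons)
print(res.x, res.fun)
```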

0-1 integer programs as NLPs

minimize Σj cj xj

subject to Σj aij xj = bi for all i

xj is 0 or 1 for all j

is “nearly” equivalent to

minimize Σj cj xj + 10⁶ Σj xj (1 - xj)

subject to Σj aij xj = bi for all i

0 ≤ xj ≤ 1 for all j
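A rough sketch of why the penalty works: xj (1 - xj) is zero exactly when xj is 0 or 1 and positive for any fractional value, so a large multiplier pushes solutions toward binary points (at the price of making the relaxed problem non-convex, which is what the next slide warns about).

```python
import numpy as np

M = 1e6  # the 10^6 penalty weight from the slide

def penalty(x):
    # zero exactly when every x_j is 0 or 1, large otherwise
    return M * np.sum(x * (1 - x))

print(penalty(np.array([1.0, 0.0, 1.0])))  # 0.0      -> binary point, no penalty
print(penalty(np.array([0.5, 0.0, 1.0])))  # 250000.0 -> fractional point, heavily penalized
```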

Some comments on non-linear models

The fact that non-linear models can model so much is perhaps a bad sign.
– How can we solve non-linear programs if we have trouble with integer programs?
– Recall, in solving integer programs we use techniques that rely on integrality.

Fact: some non-linear models can be solved, and some are WAY too difficult to solve. More on this later.

Variant of exercise from Bertsimas and Freund

Buy a machine and keep it for t years, and then sell it (0 ≤ t ≤ 10).
– all values are measured in $ million
– Cost of machine = 1.5
– Revenue = 4(1 - 0.75^t)
– Salvage value = 1/(1 + t)

Machine values

[Figure: revenue, salvage value, and total (in millions of dollars) plotted against time from 0 to 10 years.]

How long should we keep the machine?

Work with your partner: how long should we keep the machine, and why?
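A rough sketch of answering this numerically, assuming the revenue formula of the machine example is the exponential 4(1 - 0.75^t):

```python
from scipy.optimize import minimize_scalar

def total_value(t):
    revenue = 4 * (1 - 0.75 ** t)   # assumed exponential form of the slide's revenue
    salvage = 1 / (1 + t)
    return revenue + salvage - 1.5  # all in $ million

# maximize over 0 <= t <= 10 by minimizing the negative
res = minimize_scalar(lambda t: -total_value(t), bounds=(0, 10), method="bounded")
print(res.x, total_value(res.x))    # best time to sell and the value achieved
```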

Non-linearities Because of Time

Discount rates
Decreasing value of equipment over time
– wear and tear, improvements in technology
Tax implications (Depreciation)
Salvage value

Secondary focus of the previous model(s): Finding the right model can be subtle

Non-linearities in Pricing

The price of an item may depend on the number sold
– quantity discounts for a small seller
– price elasticity for a monopolist

Complex interactions because of substitutions:
– Lowering the price of GM automobiles will decrease the demand for the competitors’ automobiles

Non-linearities because of congestion

The time it takes to go from MIT to Harvard by car depends non-linearly on the congestion.

As congestion approaches its limit, the traffic sometimes comes to a near halt.

Non-linearities because of “penalties”

Consider any linear equality constraint:

e.g., 3x1 + 5x2 + 4x3 = 17

Suppose it is a “soft” constraint and we permit solutions violating it. We can then write:

3x1 + 5x2 + 4x3 - y = 17

And we may include a term of -10y² in the objective function.

– This adds flexibility to the solution while discouraging violation of our “goals.”
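A rough sketch of the penalty term, written as a function that would be added to a maximization objective:

```python
def soft_penalty(x1, x2, x3):
    y = 3 * x1 + 5 * x2 + 4 * x3 - 17  # deviation from the "goal" 3x1 + 5x2 + 4x3 = 17
    return -10 * y ** 2                # the -10y^2 term added to the objective

print(soft_penalty(1, 2, 1))  # goal met exactly (3 + 10 + 4 = 17) -> no penalty
print(soft_penalty(2, 2, 1))  # deviation of 3 -> penalty term of -90
```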

Portfolio Optimization

In the following slides, we will show how to model portfolio optimization as NLPs

The key concept is that risk can be modeled using non-linear equations

Since this is one of the most famous applications of non-linear programming, we cover it in much more detail

Risk vs. Return

In finance, one trades off risk and return. For a given rate of return, one wants to minimize risk.

For a given rate of risk, one wants to maximize return.

Return is modeled as expected value. Risk is modeled as variance (or standard deviation).

Expectations Add

Suppose that X and Y are random variables.

E(X + Y) = E(X) + E(Y)

Interpretation:
– Suppose that the expected return in one year for Stock 1 is 9%.
– Suppose that the expected return in one year for Stock 2 is 10%.
– If you put $100 in Stock 1, and $200 in Stock 2, your expected return is $9 + $20 = $29.

Variances do not add (at least not simply)

Suppose that X and Y are random variables.

Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)

Example. The risk of investing in “umbrellas” and “sunglasses” is less than the risk of either investment by itself.

In general:

Var(X1 + X2 + … + Xn) = Σi Var(Xi) + 2 Σi<j Cov(Xi, Xj)

Reducing risk

Diversification is a method of reducing risk, even when investments are positively correlated (which they often are).

If only two investments are made, then the risk reduction depends on the covariance.

Portfolio Selection (cont’d)

Two Methods are commonly used:

– Min Risk
  s.t. Expected Return ≥ Bound

– Max Expected Return - θ (Risk)
  where θ reflects the tradeoff between return and risk.

Portfolio Selection Example

There are 3 candidate assets for our portfolio, X, Y and Z. The expected returns are 30%, 20% and 8% respectively (if possible we would like at least a 12% return). Suppose the covariance matrix is:

       X     Y     Z
X      3     1     0.5
Y      1     2     0.4
Z      0.5   0.4   1

What are the variables?

Let X, Y, Z be the percentage of the portfolio in each asset.

Portfolio Selection Example

Min   3X² + 2Y² + Z² + 2XY + XZ + 0.8YZ

s.t.  1.3X + 1.2Y + 1.08Z ≥ 1.12
      X + Y + Z = 1
      X ≥ 0, Y ≥ 0, Z ≥ 0

Max   1.3X + 1.2Y + 1.08Z - θ (3X² + 2Y² + Z² + 2XY + XZ + 0.8YZ)

s.t.  X + Y + Z = 1
      X ≥ 0, Y ≥ 0, Z ≥ 0
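A rough sketch of solving the min-risk model numerically, using the covariance matrix and returns as reconstructed above (SLSQP handles the quadratic objective and the linear constraints):

```python
import numpy as np
from scipy.optimize import minimize

C = np.array([[3.0, 1.0, 0.5],      # covariance matrix of X, Y, Z
              [1.0, 2.0, 0.4],
              [0.5, 0.4, 1.0]])
r = np.array([1.30, 1.20, 1.08])    # gross returns of X, Y, Z

def risk(w):
    return w @ C @ w                # portfolio variance, the objective to minimize

cons = [{"type": "ineq", "fun": lambda w: r @ w - 1.12},   # expected return >= 12%
        {"type": "eq",   "fun": lambda w: np.sum(w) - 1}]  # weights sum to 1

res = minimize(risk, x0=np.array([1/3, 1/3, 1/3]), method="SLSQP",
               bounds=[(0, None)] * 3, constraints=cons)
print(np.round(res.x, 3), res.fun)  # optimal weights for X, Y, Z and the variance
```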

More on Portfolio Selection

There can be institutional constraints as well, especially for mutual funds.

No more than 15% in the energy sector
Between 20% and 25% in high growth
At most 3% in any one firm
etc.

We end up with a large non-linear program.

The unconstrained version becomes the “CAPM model” in finance.


Regression

Estimate for Midterm = x * HW3 + y

Midterm = x * HW3 + y + residual

x     y
0.6   40

HW3   Estimate   Midterm 1   Residual   Residual squared
91    94.6       89           -5.6       31.36
80    88         97.5          9.5       90.25
61    76.6       58.5        -18.1      327.61
88    92.8       92           -0.8        0.64
86    91.6       93.5          1.9        3.61
56    73.6       87           13.4      179.56
60    76         99           23        529
87    92.2       85           -7.2       51.84
50    70         67           -3          9

sum of squares 1222.87

Find the best linear fit for estimating the midterm grade from the homework grades

Writing regression as an NLP

Minimize Σj (rj)²

subject to

r1 = (91x + y) – 89

r2 = (80x + y) – 97.5

r3 = (61x + y) – 58.5

…

r9 = (50x + y) – 67

Minimize Σj (rj)²

subject to

rj = Hj x + y – Mj for each j

In an optimization framework, one can constrain coefficients.
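A rough sketch of solving this least-squares problem directly with the data from the earlier table (linear least squares has a closed-form solution, so a general NLP solver is not needed here):

```python
import numpy as np

hw  = np.array([91, 80, 61, 88, 86, 56, 60, 87, 50], dtype=float)   # HW3 grades
mid = np.array([89, 97.5, 58.5, 92, 93.5, 87, 99, 85, 67])          # midterm 1 grades

A = np.column_stack([hw, np.ones_like(hw)])     # columns for the slope x and intercept y
(x, y), rss, *_ = np.linalg.lstsq(A, mid, rcond=None)
print(x, y, rss)    # best-fit slope, intercept, and sum of squared residuals
```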

Midterm 2 vs Homeworks (2002)

[Figure: scatter plot of midterm 2 grade vs. average of the last 3 homeworks; r² = .082.]

Midterm 1 vs. homework 3 (2001)

[Figure: scatter plot of midterm grade vs. homework 3 grade; r² = .29.]

An application of regression to finance

A famous application in Finance of determining the best linear fit is determining the β of a stock.

CAPM assumes that the return of a stock s in a given time period is

rs = a + β rm + ε (an error term),

rs = return on stock s in the time period

rm = return on market in the time period

β = a 1% increase in the stock market will lead to a β% increase in the return on s (on average)
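A rough sketch of estimating β as the slope of this regression (the return series below are hypothetical):

```python
import numpy as np

market = np.array([-0.20, -0.05, 0.02, 0.10, 0.25, 0.40])  # hypothetical market returns r_m
stock  = np.array([-0.10, -0.02, 0.01, 0.07, 0.16, 0.22])  # hypothetical returns on stock s

beta, a = np.polyfit(market, stock, 1)   # degree-1 fit: slope is beta, intercept is a
print(beta, a)
```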

Regression, and estimating β

Return on Stock A vs. Market Return

[Figure: scatter plot with the market return on the horizontal axis and the stock return on the vertical axis.]

What is the best linear fit for this data? What does one mean by best?

Regression, and estimating β

Return on Stock A vs. Market Return

[Figure: the same scatter plot with the fitted regression line.]

The β value is the slope of the regression line. Here it is around .6 (lower expected gain than the market, and lower risk).

Solving NLP’s by Excel Solver

Summary

Applications of NLP to location problems, portfolio management, regression

Non-linear programming is very general and very hard to solve

The special case of convex minimization NLPs is easier, because a local minimum is a global minimum.