Using Modern Design of Experiments (DOE) Methods to Optimize Processes


Transcript of Using Modern Design of Experiments (DOE) Methods to Optimize Processes

Page 1

Copyright © 2008, SAS Institute Inc. All rights reserved.

INFORMS NYC Chapter, March 17, 2010

Tom Donnelly, PhD, JMP Principal Customer Advocate

[email protected]

Using Modern Design of Experiments (DOE) Methods to Optimize Processes

Page 2

A Fitting Beginning…


1906 – W.S. Gosset, a Guinness chemist

Draw a yeast culture sample

How much yeast is in this culture?

Guess too little: incomplete fermentation; guess too much: bitter beer

He wanted to get it right

Page 3

Summary

Building predictive models of multiple responses allows one to provide management with the knowledge to make better business decisions.

A Design of Experiments (DOE) is a collection of trials built to support a proposed model.

Modern computer-based DOE tools can quickly build a design for your predictive model – and do it for virtually any real-world combination of factor types, additional constraints, and special models.


Page 4

My Background: Design of Experiments (DOE) for 25+ Years

'83-'87 Honeywell, Inc., Engineer

First saw the power of DOE in 1984 – career changing event

'87-'99 ECHIP, Inc., Partner & Technical Director

200+ DOE courses, on-site at 40+ companies - many chemical/food/pharma - requiring mixture/formulation DOE

'99-'05 Peak Process, LLC, Consultant

'05-'08 US Army, Edgewood CB Center, Analyst

DOE with Real data and Modeling & Simulation data

MORS → WSC → MAS → INFORMS

Dec. '08 Joined the SAS Institute Inc., Customer Advocate

Work mostly in DOE and Federal Government domains

– Data Visualization, Data Mining* and their synergy with DOE

– Primarily support DoD sites and National Laboratories

* April 13, NYC, Data Mining with Prof. Dick DeVeaux

Page 5

Projects Using DOE at U.S. Army ECBC – Detection, Decontamination & Protection

JPM Nuclear Biological Chemical Contamination Avoidance (NBCCA) - Whole Systems Live Agent Test (WSLAT) Team support to the Joint Biological Point Detection System (JBPDS)

Agent Fate wind tunnel experiments

Decontamination Sciences Team

• Contact Hazard Residual Hazard Efficacy Agent T&E Integrated Variable Environment (CREATIVE) – real and simulation data

• Modified vaporous hydrogen peroxide (mVHP) decontamination – real data

Smoke and Target Defeat Team

• Pepper spray characterization – real data

• Obscurant material evaluation (with OptiMetrics, Inc.) – simulation data

U.S. Army Independent Laboratory In-house Research (ILIR) on novel experimental designs used with simulations

• Re-analysis of U.S. Air Force Kunsan Focused Effort BWA simulation data

• CB Sim Suite used for sensitivity analysis of atmospheric stability

U.S. Marine Corps Expeditionary Biological Detection (EBD) Advanced Technology Demonstration (ATD)

• Chamber testing of detectors – real data

• CB Sim Suite sensor deployment studies – simulation data

U.S. Navy lead on Joint Expeditionary Collective Protection (JECP)

• Swatch and chamber testing – real data

• Computational Fluid Dynamics (CFD) – simulation data

Page 6

If you present information (not just data!), I highly recommend you read Tufte.


His grand principles include:

– Enforce wise visual comparisons

– Content counts most of all

– Show causality

– Use small multiples (format constancy)

– Use multivariate displays

– Put everything on universal grid

– Give reasons to believe

– Don't de-quantify data

– Complete integration of evidence: words, numbers, images, diagrams

www.edwardtufte.com

Page 7

Plot ALL the Data


[Figure: Leverage Plots for the Response Data 'Counts' for Six Explanatory Variables. Six panels plot Counts leverage residuals (roughly 30 to 90) against each factor: Depth (Front/Rear, P=0.3547), Detector Configuration (1-3-6-8 vs. 2-4-5-7, P=0.0176), Height (Lower/Upper, P<.0001), Width (Left/Right, P<.0001), Day (1/2, P<.0001), and APS Detector # (1-4, P<.0001); each panel lists the least squares means and standard errors for its levels.]

[Diagram: test chamber with WIDTH, HEIGHT, and DEPTH axes; aerosol is injected at one end of the chamber and exhausted at the other, and the chamber width is twice the test volume. Detectors 1-8 are placed at Upper/Lower, Front/Rear, and Left/Right positions and grouped into the two configurations 1-3-6-8 and 2-4-5-7.]

Page 8

Three Sections to Presentation

What is Design of Experiments (DOE)?

The power of predictive modeling

• Show how you can provide management with process knowledge that makes their decision making easier

Using a modern Custom Design tool

• Quickly create a design for a proposed model

• Review the Custom Design creation process, but answer the “why did you do that?” questions

• Make a Complex Custom DOE

− 10 factors, 4 types of factors, additional constraints


Page 9

Classic Definition of DOE

Purposeful control of the inputs (factors) in such a way as to deduce their relationships (if any) with the outputs (responses).


(Noise: uncontrolled factors, e.g., humidity)

Page 10

Here are 4 Controls (inputs) & 2 Responses (outputs) and their empirical relationships (model)


You get this Prediction Profiler as a result of analyzing the data collected for a DOE.

Page 11

Alternative Modern Definition

A DOE is the specific collection of trials run to support a proposed model.

• If the proposed model is simple, e.g. just main or 1st-order effects (x1, x2, x3, etc.), the design is called a screening DOE.

• If the proposed model is more complex, e.g. the model is 2nd order so that it includes two-way interaction terms (x1x2, x1x3, x2x3, etc.) and, in the case of continuous factors, squared terms (x1², x2², x3², etc.), the design is called a response-surface DOE.
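To make the screening vs. response-surface distinction concrete, here is a minimal Python sketch (not from the slides) that enumerates the terms of a full second-order model for k continuous factors; a screening design only needs to support the first-order subset.

```python
# Minimal sketch: enumerate the terms of a full second-order (response-surface) model.
from itertools import combinations

def quadratic_terms(k):
    x = [f"x{i}" for i in range(1, k + 1)]
    main = x                                                # 1st-order (screening) terms
    two_way = [a + "*" + b for a, b in combinations(x, 2)]  # two-way interaction terms
    squares = [xi + "^2" for xi in x]                       # curvature terms (continuous factors)
    return ["1"] + main + two_way + squares

print(len(quadratic_terms(4)), quadratic_terms(4))
# 15 ['1', 'x1', 'x2', 'x3', 'x4', 'x1*x2', ..., 'x3*x4', 'x1^2', 'x2^2', 'x3^2', 'x4^2']
```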


Page 12

Response-Surface DOE in a Nutshell

[Diagram: three design blocks (Block 1, Block 2, Block 3) shown in the x1-x3 factor space. The first-order fit can use data from block 1, 2, or 3 alone; the interaction fit requires data from blocks 1 & 2; the full quadratic fit requires data from all 3 blocks; the extra runs provide lack-of-fit checks.]

Page 13

Expensive Experimentation? Sequential DOE is Often Used

Block 1: y = a0 + a1x1 + a2x2 + a3x3
Run this block 1st to: (i) estimate the main effects*; (ii) use the center point to check for curvature.

Block 2: y = a0 + a1x1 + a2x2 + a3x3 + a12x1x2 + a13x1x3 + a23x2x3
Run this block 2nd to: (i) repeat the main-effects estimate; (ii) check if the process has shifted; (iii) add interaction effects to the model if needed.

Block 3: y = a0 + a1x1 + a2x2 + a3x3 + a12x1x2 + a13x1x3 + a23x2x3 + a11x1² + a22x2² + a33x3²
Run this block 3rd to: (i) repeat the main-effects estimate; (ii) check if the process has shifted; (iii) add curvature effects to the model if needed.

* May be all that are needed with appropriate physics-based scaling; JMP supports non-linear modeling.

[Diagram: the three blocks shown in the x1-x3 factor space.]
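A minimal sketch of this sequential strategy, using synthetic data and plain least squares in Python rather than the talk's JMP workflow; the block structure (two half-fractions with center points, then axial points) follows the slide, everything else is illustrative.

```python
# Minimal sketch: fit progressively richer models as blocks of runs are added.
import numpy as np

rng = np.random.default_rng(1)

def model_matrix(X, order):
    x1, x2, x3 = X.T
    cols = [np.ones(len(X)), x1, x2, x3]             # intercept + main effects
    if order >= 2:
        cols += [x1 * x2, x1 * x3, x2 * x3]          # two-way interactions
    if order >= 3:
        cols += [x1 ** 2, x2 ** 2, x3 ** 2]          # squared (curvature) terms
    return np.column_stack(cols)

def simulate(X):
    # hypothetical "true" process with an interaction and curvature, plus noise
    x1, x2, x3 = X.T
    return 50 + 5 * x1 - 3 * x2 + 2 * x3 + 4 * x1 * x2 + 6 * x1 ** 2 + rng.normal(0, 1, len(X))

# Blocks in coded units: two half-fractions of the 2^3 factorial (each with a center
# point) followed by axial points, mirroring the sequence described on the slide.
block1 = np.array([[-1, -1, -1], [1, 1, -1], [1, -1, 1], [-1, 1, 1], [0, 0, 0]], float)
block2 = np.array([[1, 1, 1], [-1, -1, 1], [-1, 1, -1], [1, -1, -1], [0, 0, 0]], float)
block3 = np.vstack([np.eye(3), -np.eye(3)])

X, y = block1, simulate(block1)
for order, next_block in ((1, block2), (2, block3), (3, None)):
    beta, *_ = np.linalg.lstsq(model_matrix(X, order), y, rcond=None)
    print(f"{len(y)} runs, order-{order} model, coefficients: {np.round(beta, 1)}")
    if next_block is not None:
        X = np.vstack([X, next_block])
        y = np.concatenate([y, simulate(next_block)])
```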


Page 14

Why Use Design of Experiments (DOE)?

Quicker answers, lower costs, solve bigger problems


Why is Using DOE Important?

“One thing we have known for many months is that the spigot of defense funding opened by 9/11 is closing.”

“In the past, modernization programs have sought a 99 percent solution over a period of years, rather than a 75 percent solution over a period of weeks or months.”

Robert M. Gates, Secretary of Defense, January 27, 2009

Page 15

Why Use Design of Experiments (DOE)?

Quicker answers, lower costs, solve bigger problems

Real Data

• Get a ranking of the factors – pick a winner

• Get a predictive “picture” (with 95% limits) of the process

Simulation Data – used more and more in DoD and Industry

• Obtain a fast surrogate “metamodel” of the long-running simulation

Analysis benefits for both types of data:

• more rapidly answer “what if?” questions

• do sensitivity analysis of the control factors

• optimize multiple responses and make trade-offs

By running efficient subsets of all possible combinations, one can – for the same resources and constraints – solve bigger problems.

By running sequences of designs, one can be as cost-effective as possible and run no more trials than are needed to get a useful answer.


Page 16

Response Surface & Contour Plot

(four control variables)

[Figure: a 3-D response surface and a 2-D contour plot of the response tensile versus the controls rate and rpm, from a model in four control variables (t4 = 320, rate = 115, rpm = 255, viscosity = 80) with responses melt and tensile.]


Page 17

Response Surfaces & Contour Plots

(two responses and four control variables)

Page 18

1-D Prediction Profiles are a Way to View Higher Dimensionality as "Interactive Small Multiples" - Here 4 Controls & 2 Responses

[Figure: JMP Prediction Profiler showing 1-D profile traces for the responses melt (305.03 ± 4.86) and tensile (41080.7, interval [32758.7, 51516.8]) against the four controls at their current settings: t4 = 320, rate = 115.6, rpm = 254.3, viscosity = 80.]

Page 19

1-D Prediction Profiles are a Way to View Higher Dimensionality as “Interactive Small Multiples” - Here 4 Controls & 2 Responses

Page 20

Interaction Profiles are Another Way to View Higher Dimensionality - Here 4 Controls and 1 Response

[Figure: Interaction Profiles matrix for the response tensile versus the controls t4, rate, rpm, and viscosity. Each panel shows 1-D plots at the high and low settings of the other factors: parallel lines indicate NO interaction; lines that are NOT parallel indicate interaction.]

Page 21

Find Robust Operating Conditions that are Insensitive to Variability in Control Factors

Monte Carlo simulations can be run using known or assumed distributions of input variability to better assess transmitted variation about the model point estimate.
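The original slide shows this in JMP's profiler; as a rough stand-in, here is a minimal Python sketch that pushes assumed normal scatter in two control factors through a hypothetical fitted quadratic model and compares the transmitted variation at two candidate operating points.

```python
# Minimal sketch: Monte Carlo propagation of control-factor variability through a fitted
# model. The quadratic y_hat below is hypothetical, not the model from the talk.
import numpy as np

def y_hat(x1, x2):
    return 60 + 8 * x1 - 5 * x2 + 3 * x1 * x2 - 6 * x1 ** 2   # assumed fitted surface (coded units)

rng = np.random.default_rng(0)
n = 100_000

for set_point in [(0.0, 0.0), (0.7, -0.5)]:                    # two candidate operating points
    x1 = rng.normal(set_point[0], 0.10, n)                     # assumed input variability (sd = 0.10)
    x2 = rng.normal(set_point[1], 0.10, n)
    y = y_hat(x1, x2)
    print(f"set point {set_point}: point estimate {y_hat(*set_point):.1f}, "
          f"simulated mean {y.mean():.1f}, simulated std dev {y.std():.2f}")
# The operating point on the flatter part of the surface transmits less of the input variation.
```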

Page 22

Three Sections to Presentation

What is Design of Experiments (DOE)?

The power of predictive modeling

• Show how you can provide management with process knowledge that makes their decision making easier

Using a modern Custom Design tool

• Quickly create a design for a proposed model

• Review the Custom Design creation process, but answer the “why did you do that?” questions

• Make a Complex Custom DOE

− 10 factors, 4 types of factors, additional constraints


Page 23

Multiple Response Optimization: 3 responses and 4 control factors


Page 24

Three Sections to Presentation

What is Design of Experiments (DOE)?

The power of predictive modeling

• Show how you can provide management with process knowledge that makes their decision making easier

Using a modern Custom Design tool

• Quickly create a design for a proposed model

• Review the Custom Design creation process, but answer the “why did you do that?” questions

• Make a Complex Custom DOE

− 10 factors, 4 types of factors, additional constraints


Page 25

Many Design Choices – All the Classics – But Modern Custom Design Simplifies Use


The "real-world" DOE solution that's good to use even when the problem is simple.

Design methods for very specialized problem areas.

Page 26

Create a Custom DOE

Enter Factor and Response Information

• Responses – Speed, Contrast and Cost

• Factors and ranges (or levels):

− Sensitizer 1 50 to 90

− Sensitizer 2 50 to 90

− Dye 200 to 300

− Reaction Time 120 to 180

Propose Model

• 2nd order for prediction

Make Design
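JMP's Custom Design builds an optimal design for the proposed model; the original slides demo that dialog rather than code. As a rough stand-in, the sketch below hand-builds a classical face-centered central composite design for the four factors above and scales it to the stated ranges (25 unique trials: 16 corners, 8 face centers, 1 center point).

```python
# Minimal sketch: a classical face-centered central composite design (CCD) for the four
# factors above, scaled from coded units to the stated ranges. This is a hand-rolled
# stand-in for illustration; JMP's Custom Design builds an optimal design instead.
import numpy as np
from itertools import product

factors = {                       # name: (low, high), taken from the slide
    "Sensitizer 1": (50, 90),
    "Sensitizer 2": (50, 90),
    "Dye": (200, 300),
    "Reaction Time": (120, 180),
}
k = len(factors)

corners = np.array(list(product([-1, 1], repeat=k)), float)                 # 2^4 = 16 corners
axials = np.vstack([s * np.eye(k)[i] for i in range(k) for s in (-1, 1)])   # 8 face centers
center = np.zeros((1, k))
coded = np.vstack([corners, axials, center])                                # 25 unique trials

lows = np.array([lo for lo, hi in factors.values()])
highs = np.array([hi for lo, hi in factors.values()])
natural = lows + (coded + 1) / 2 * (highs - lows)                           # map [-1, 1] -> [low, high]

print(len(coded), "unique trials; first three:")
for row in natural[:3]:
    print(dict(zip(factors, row.tolist())))
```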


Page 27

"Minimum" is equal to the number of terms in the model.

When factors are all continuous, "Default" is the smallest power of 2 greater than the number of terms in the model.

If "Default" is not at least 5 more than "Minimum," then enter 5 + "Minimum" (or more if you can afford it) in "User Specified".

Increase degrees of freedom in model error estimate
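The rule of thumb above is easy to capture in a few lines; this is a sketch of that heuristic only (mirroring the slide's wording), not JMP's actual dialog logic.

```python
# Minimal sketch of the run-size heuristic described above.
def suggested_runs(n_model_terms, extra_df=5):
    minimum = n_model_terms                           # "Minimum" = number of terms in the model
    default = 1 << n_model_terms.bit_length()         # smallest power of 2 greater than the term count
    user = default if default >= minimum + extra_df else minimum + extra_df
    return minimum, default, user

# Example: 4 continuous factors -> full quadratic model has 15 terms
print(suggested_runs(15))   # (15, 16, 20): Default (16) is not Minimum + 5, so specify 20 runs
```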

Page 28

# Unique Trials for 3 Response-Surface Designs and # Quadratic Model Terms vs. # Continuous Factors


[Chart: number of unique trials versus number of continuous factors (2 through 9) for the Central Composite Design, the Box-Behnken Design, and a Custom Design with 5 df for model error, plotted alongside the number of terms in the quadratic model.]

Page 29

Increase degrees of freedom for pure error estimate


Value input is the actual number of trials added to the design.

Value input is the number of times the design is replicated. If the design has 20 unique trials, then a "2" here adds 2 × 20 = 40 more trials to the design, for a total of 60.

Having a model error estimate and a pure error estimate allows for a lack-of-fit test to be conducted.
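To show what that lack-of-fit test does, here is a minimal sketch on synthetic, replicated data (not from the talk): replicates supply the pure-error estimate, the leftover residual variation supplies the lack-of-fit estimate, and their ratio is an F statistic.

```python
# Minimal sketch of a lack-of-fit F-test on synthetic, replicated data (one factor,
# straight-line model); not data from the talk.
import numpy as np
from collections import defaultdict
from scipy import stats

x = np.array([0., 0., 1., 1., 2., 2., 3., 3.])              # replicated factor settings
y = np.array([1.1, 0.9, 2.8, 3.2, 9.1, 8.7, 19.5, 20.1])    # visibly curved response

X = np.column_stack([np.ones_like(x), x])                   # first-order (straight-line) model
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
sse = float(np.sum((y - X @ beta) ** 2))                    # total residual sum of squares

groups = defaultdict(list)                                  # pure error from replicates
for xi, yi in zip(x, y):
    groups[xi].append(yi)
ss_pe = sum(np.sum((np.array(v) - np.mean(v)) ** 2) for v in groups.values())
df_pe = sum(len(v) - 1 for v in groups.values())

df_lof = len(y) - X.shape[1] - df_pe                        # lack-of-fit degrees of freedom
F = ((sse - ss_pe) / df_lof) / (ss_pe / df_pe)
print(f"lack-of-fit F = {F:.1f}, p = {stats.f.sf(F, df_lof, df_pe):.4f}")
# A small p-value says the straight line is inadequate - add curvature terms.
```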

Page 30

Complex design combining 4 types of factors with additional constraints, including the mixture proportions summing to less than 1, i.e. some component(s) held constant in the blend.

Everything but the kitchen sink…

Factor Name      Range or Levels   Factor Type
1) Base          0.40 to 0.55      mixture
2) Filler        0.20 to 0.40      mixture
3) X-linker      0.01 to 0.03      mixture
4) A-Polymer     0.00 to 0.30      mixture
5) B-Polymer     0.00 to 0.30      mixture
6) C-Polymer     0.00 to 0.30      mixture
7) Cure Time     15 to 45          continuous
8) Temperature   140 to 160        continuous
9) Mixer Brand   A, B              categorical
10) Days         1 through 7       blocking
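A minimal sketch of what the mixture constraint means computationally: candidate blends must stay inside each component's range and sum to a fixed total below 1. The slide does not give that total, so the 0.97 used here is purely an assumed value for illustration; JMP's Custom Design handles such constraints directly.

```python
# Minimal sketch: rejection-sample candidate blends that respect the component ranges
# and an assumed fixed mixture total below 1 (TOTAL = 0.97 is an illustration only).
import numpy as np

ranges = {
    "Base": (0.40, 0.55), "Filler": (0.20, 0.40), "X-linker": (0.01, 0.03),
    "A-Polymer": (0.00, 0.30), "B-Polymer": (0.00, 0.30), "C-Polymer": (0.00, 0.30),
}
TOTAL = 0.97                      # assumed: held-constant ingredients take up the remaining 0.03

rng = np.random.default_rng(7)
lows = np.array([lo for lo, hi in ranges.values()])
highs = np.array([hi for lo, hi in ranges.values()])

candidates = rng.uniform(lows, highs, size=(200_000, len(ranges)))
feasible = candidates[np.isclose(candidates.sum(axis=1), TOTAL, atol=0.005)]
print(f"{len(feasible)} feasible blends out of {len(candidates)} sampled")
if len(feasible):
    print(dict(zip(ranges, np.round(feasible[0], 3).tolist())))
```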

Page 31

On-Demand Webcasts Available


http://www.jmp.com/about/events/webcasts/ondemand_detail.shtml?reglink=70130000000BobW

Page 32

Summary

Building predictive models of multiple responses allows one to provide management with the knowledge to make better business decisions.

A Design of Experiments (DOE) is a collection of trials built to support a proposed model.

Modern computer-based DOE tools can quickly build a design for your predictive model – and do it for virtually any real-world combination of factor types, additional constraints, and special models.


Page 33

Page 34

Topics covered in detail in JMP DOE Training - Assumptions, Caveats and Rules of Thumb

Ask good questions. DOE is only as good at answering questions as you are at asking them! Ask a silly question, you'll get a silly answer.

Assume errors are IIDN(0, σ²): independent and identically distributed in a normal distribution with mean zero and variance σ². JMP provides transformations that can frequently help make variance more uniform.

Sample size: If the signal is twice the noise, then life is easy. If the signal is half the noise, then life is tough. JMP has a sample size calculator (a rough sketch of the underlying calculation follows this list).

Boldness: The more you turn a "knob," the bigger its effect and the easier it is to detect with significance for a given sample size. Make the range as wide as you can without "breaking" the process. If the process/design breaks, then repair it with JMP's Augment Design.

Randomization is an insurance policy to prevent correlation of studied with unknown factors. A premium must be paid. Fortunately, JMP lowers the premium by providing split-plot designs for hard-to-change factors.

Checkpoints: Today, next month, at optima, to support next higher model…

22 questions to ask at start of any DOE…

“Purpose of the model is to sharpen the questions.” – Samuel Karlin

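As a rough illustration of that signal-to-noise rule of thumb (not JMP's calculator), here is a sketch of the standard two-group, normal-approximation sample-size formula at alpha = 0.05 and 90% power.

```python
# Minimal sketch: two-group sample size from the usual normal approximation, to put
# numbers on the signal-to-noise rule of thumb above (alpha = 0.05, power = 0.90).
from scipy.stats import norm

def n_per_group(signal_over_noise, alpha=0.05, power=0.90):
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return 2 * ((z_a + z_b) / signal_over_noise) ** 2

for ratio in (2.0, 1.0, 0.5):
    print(f"signal/noise = {ratio}: about {n_per_group(ratio):.0f} runs per group")
# signal twice the noise -> ~5 runs per group (easy); half the noise -> ~84 (tough)
```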

Page 35

Twenty-two Questions I Like to Ask at the Start of DOE Discussions:

1. What is the goal of the experimentation?

2. How do you measure success?

3. What response variables do you measure?

4. What are all the control factors that may affect these responses?

5. Over what ranges does it make sense to operate these variables?

6. Do any combinations of variable settings cause problems? (Safety? Cost? Breaks the equipment? Impossible to achieve?)

7. Do you currently run control samples for this process?

8. If you do exactly the same process on separate days do you ever get obviously/surprisingly different results?

9. How big is the variability (What is the standard deviation?) for each response?

10. Do you have past records of replicated trials for each response?

11. Are the replicate trials close together or spread out over time?

12. How tiny of a difference for each response is considered practically important?

13. Do you think we are looking for tiny differences in big variability (hard to do because lots of replication is needed) or big differences in small variability (easy)?

14. If more than one response needs to be characterized for your process, what is their relative importance?

15. Are you interested in identifying the best trade-off in performance of several responses?

16. Are you more interested in identifying important control factors or in ending up with a model that can predict your responses?

17. How many trials can be run in a day?

18. Are there any hard-to-change factors?

19. How many devices do you have of each type?

20. How hard is it to come back at a later time to run checkpoint trials?

21. What is your budget?

22. What is your deadline?

Page 36

The JMP Training Path

Some courses also available in half-day, Live-Web sessions

• Visit www.jmp.com

What I showed today used Custom Design.


Page 37

JMP DOE Help

The JMP DOE Tutorial under Help > Tutorials > DOE Tutorial

The JMP DOE Guide under Help > Books > JMP DOE Guide