
- 1 -

Overall procedure of validation

Figure 12.4 Validation, calibration, and prediction (Oberkampf and Barone, 2004). [Flowchart with boxes: Validation, Calibration (optional), New Experiment, Validated Model, Blind Prediction, Prediction.]

• Validation: model accuracy assessment by comparison of model outputs with experimental measurements.

• Calibration (optional): adjusting the physical modeling parameters in the model to improve agreement with experimental data.

• Prediction: if the model is calibrated by experiment, predict at untried conditions (a blind prediction against a new experiment) and validate again.


- 2 -

Approaches for calibration

• Traditional (deterministic) calibration
– Parameters are estimated as a single value that minimizes the squared error between the computer model and the experimental data.
– As a result, the model is given by a single function.

• Statistical calibration
– Also called calibration under uncertainty (CUU).
– Parameters are estimated using statistical inference techniques to incorporate the uncertainty due to observation error.
– As a result, the model is given by confidence bounds.


- 3 -

Approaches for statistical calibration

– Based on Section 13.5 of the Oberkampf textbook.

• Frequentist approach
– The parameter is constant, but unknown because of limited data.
– The most popular method follows two steps:
1. Point-estimate the parameters by maximum likelihood estimation (MLE).
2. Draw samples of the parameters by the bootstrap technique.
– Advantage: frequentist methods are simpler and easier to use than Bayesian methods, but they are less often applied to the calibration problem.

• Bayesian approach
– The parameter is treated as a random variable, characterized by a probability distribution conditional on the data.
– Also called Bayesian updating. Before updating, the distribution is the prior; after updating, it is the posterior.


- 4 -

Calibration in order of complexity

• Deterministic calibration
– Carry out parameter estimation using an optimization technique to obtain a single function as the calibrated model.
– The method is of limited use because uncertainty is not included; it is like using only the mean value of the uncertainty in the design decision.

• Statistical calibration without discrepancy
– Carry out parameter estimation using a statistical technique to obtain confidence bounds of the calibrated model.
– The Bayesian approach is common, with MCMC as the technique for estimating the parameters in a probabilistic way.
– Due to lack of knowledge, the model often differs inherently from reality. No matter how many data are used for calibration, they may fail to agree.
– Without accounting for this, i.e., assuming the model is correct, we end up with large errors and mistakenly attribute them to the experiments rather than to the model.


- 5 -

Calibration in order of complexity

• Statistical calibration with discrepancy
– How to model the discrepancy? Gaussian process regression (GPR) is employed to express the discrepancy in an approximate manner.
– Estimation includes not only the calibration parameters but also the associated GPR parameters.
– The discrepancy term has two purposes (a simple sketch follows below):
1. Close the gap between the model and reality, further improving the calibration.
2. Validate the model accuracy: if the discrepancy is small, the model is good.
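As a rough illustration of the idea, the MATLAB sketch below calibrates a simple model and then fits a GPR to the residuals as the discrepancy. It is a minimal two-step sketch, not the joint (KOH-style) estimation covered later; the data, model, and settings are hypothetical placeholders that anticipate the chemical-kinetics example introduced later in this lecture.

  % Two-step sketch: calibrate a simple model, then fit a GPR discrepancy
  % to the residuals. Data, model, and settings are placeholders.
  x    = linspace(0, 3, 20)';                           % input sites
  yF   = 1.5 + 3.5*exp(-1.7*x) + 0.3*randn(20, 1);      % field observations
  yM   = @(t, x) 5*exp(-t*x);                           % computer model
  tHat = fminsearch(@(t) sum((yF - yM(t, x)).^2), 0.5); % deterministic calibration

  gprMdl      = fitrgp(x, yF - yM(tHat, x));            % GPR on the residuals
  xs          = linspace(0, 3, 100)';
  [dHat, dSd] = predict(gprMdl, xs);                    % discrepancy mean and std
  yCorrected  = yM(tHat, xs) + dHat;                    % model + discrepancy

A small predicted discrepancy dHat (relative to dSd and the noise level) would support the model; a large, structured one signals model-form error.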


- 6 -

Calibration in order of complexity

• Statistical calibration with surrogate model
– During the MCMC, thousands of model evaluations are needed. If the model is expensive, a surrogate model should be introduced.
– GPR is employed for this purpose, where the design of computer experiments (DACE) is critical in the process.
– Estimation includes three parts: the calibration parameters, the GPR parameters for the surrogate model, and the GPR parameters for the discrepancy. Efficiency decreases quickly as the number of parameters increases.
– MLE plug-in approach: the surrogate GPR model is deterministic; its parameters are point-estimated, and only the others are estimated probabilistically.
– Full Bayesian approach: includes all the parameters in the estimation. This is the ultimate complexity in calibration, and the topic that Kennedy and O'Hagan (KOH) addressed.


- 7 -

Outline of the calibration lecture

• Motivating example
• Deterministic calibration
• Statistical calibration without discrepancy
– Bayesian approach
– Frequentist approach
• Statistical calibration with discrepancy
– GPR revisited
• Statistical calibration with surrogate model
– MLE plug-in approach
– Full Bayesian approach

• Applications


- 8 -

Motivating example

• Problem addressed in
– Loeppky, Jason L., Derek Bingham, and William J. Welch. "Computer model calibration or tuning in practice." Technometrics, submitted for publication (2006).
– Bayarri, Maria J., et al. "A framework for validation of computer models." Technometrics 49.2 (2007).
– Originally Fogler, H. S. (1999), Elements of Chemical Reaction Engineering, Prentice Hall.

• Chemical kinetics model
– Describes a chemical reaction process with initial chemical concentration 5 and reaction rate 1.7. The amount of chemical remaining at time x is investigated:

$y^T(x) = 1.5 + 3.5\exp(-1.7x)$


- 9 -

Motivating example

• Chemical kinetics model
– Make virtual experimental (or observation) data with noise; three replicates are made at 11 equally spaced points in [0, 3] (a data-generation sketch is given below):

$y^T(x) = 1.5 + 3.5\exp(-1.7x), \qquad y^F(x) = y^T(x) + \varepsilon, \quad \varepsilon \sim N(0, 0.3^2)$

– Repeated with the data for the right figure, given in the notes page.

• Objective
– Find a computer model that simulates the observations as closely as possible.

[Figure: two realizations of the virtual experimental data, y vs. x over [0, 3].]
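A minimal MATLAB sketch of this data generation, assuming the noise standard deviation is 0.3 (the slide writes N(0, 0.3)); the random seed is an arbitrary choice.

  % Generate the virtual experimental data of the motivating example.
  rng(1);                                    % arbitrary seed for reproducibility
  xGrid = linspace(0, 3, 11)';               % 11 equally spaced points in [0, 3]
  x  = repmat(xGrid, 3, 1);                  % three replicates at each point
  yT = 1.5 + 3.5*exp(-1.7*x);                % true (unknown) process
  yF = yT + 0.3*randn(size(x));              % noisy field observations

  plot(x, yF, 'o', xGrid, 1.5 + 3.5*exp(-1.7*xGrid), '-')
  xlabel('x'), ylabel('y'), legend('field data', 'true process')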


- 10 -

Simplest computer model

– The model is a somewhat wrong guess due to lack of knowledge:

$y^M = m(x \mid \theta) = 5\exp(-\theta x)$

– Calibrate $\theta$ such that it minimizes the SSE between model and data:

$\min_\theta f(\theta) = \sum_i \big(y_i^F - y_i^M\big)^2, \quad \text{where } y_i^M = m(x_i \mid \theta)$

• Optimum solution from MATLAB fminsearch (a sketch is given below):

$\theta^* = 0.6223$, SSE = 10.97, RMSE = 0.5766, where SSE $= \sum_i \big(y_i^F - y_i^M\big)^2$ and RMSE $= \sqrt{\mathrm{SSE}/n}$.

• Using nlinfit and nlparci with the second data set, but with $n-1$ (see notes page):

$\theta = 0.6271$, RMSE = 0.4855, 95% CI = [0.5556, 0.6986].

[Figure: calibrated model vs. data; q = 0.6223, sumsq = 10.97.]
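A minimal MATLAB sketch of this deterministic calibration, reusing x and yF from the data-generation sketch above; exact numbers depend on the random data, so the slide's values will not be reproduced exactly.

  % Deterministic calibration of the simplest computer model.
  model = @(theta, x) 5*exp(-theta*x);             % y^M = m(x|theta)
  sse   = @(theta) sum((yF - model(theta, x)).^2); % sum of squared errors

  thetaHat = fminsearch(sse, 0.5);                 % least-squares estimate
  RMSE     = sqrt(sse(thetaHat)/numel(yF));

  % Confidence interval via nlinfit/nlparci (Statistics and ML Toolbox).
  [beta, R, J] = nlinfit(x, yF, model, 0.5);
  ci = nlparci(beta, R, 'Jacobian', J);            % 95% CI for theta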


Discussion

• If error was due only to noise, we would have expected RMSE=0.3

• Given the true function, we can see that the model is not good. What are the clues without looking at the true function?

• What is the hierarchy of calibration methods without discrepancy?


- 12 -

Calibration with improved models

• Computer model and optimum solution (two-parameter optimization problem):

$y^M = m(x \mid \theta_1, \theta_2) = \theta_1 \exp(-\theta_2 x)$

$\theta_1 = 4.351, \ \theta_2 = 0.511$, SSE = 8.92, RMSE = 0.520

– The solution improved, but there is still a substantial gap.

• Computer model and optimum solution (three-parameter optimization problem):

$y^M = m(x \mid \theta_1, \theta_2, \theta_3) = \theta_1 + \theta_2 \exp(-\theta_3 x)$

$\theta_1 = 1.558, \ \theta_2 = 3.588, \ \theta_3 = 1.899$, SSE = 2.77, RMSE = 0.290

– Excellent match (true $\theta_1 = 1.5$, $\theta_2 = 3.5$, $\theta_3 = 1.7$).
– The model change was made on an ad-hoc basis. Besides, the close match is undoubtedly just luck. Is this possible in real practice? (A sketch of both fits follows below.)

[Figures: two-parameter fit (q1 = 4.351, q2 = 0.511, sumsq = 8.922) and three-parameter fit (q1 = 1.558, q2 = 3.588, q3 = 1.899, SSE = 2.774) vs. the data.]
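A MATLAB sketch of the two improved calibrations, reusing x and yF from the earlier sketches; the starting points are arbitrary guesses.

  % Calibrate the two- and three-parameter models by least squares.
  m2 = @(t, x) t(1)*exp(-t(2)*x);                  % two-parameter model
  m3 = @(t, x) t(1) + t(2)*exp(-t(3)*x);           % three-parameter model

  t2 = fminsearch(@(t) sum((yF - m2(t, x)).^2), [5; 0.5]);
  t3 = fminsearch(@(t) sum((yF - m3(t, x)).^2), [1; 4; 1]);

  rmse = @(m, t) sqrt(sum((yF - m(t, x)).^2)/numel(yF));
  fprintf('2-par: theta = [%.3f %.3f], RMSE = %.3f\n', t2, rmse(m2, t2))
  fprintf('3-par: theta = [%.3f %.3f %.3f], RMSE = %.3f\n', t3, rmse(m3, t3))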


- 13 -

Calibration under uncertainty

• Bayesian approach
– Assume that the model is an accurate representation of reality:

$y^M = m(x \mid \theta) = 5\exp(-\theta x)$

– Field data are given by

$y_i^F = y_i^M + \varepsilon_i = m(x_i \mid \theta) + \varepsilon_i, \quad \varepsilon_i \sim N(0, \sigma^2)$

– Posterior distribution of the unknown parameters $\theta$, $\sigma^2$ (with a non-informative prior $\propto 1/\sigma^2$):

$p(\theta, \sigma^2 \mid \mathbf{y}^F) \propto (\sigma^2)^{-(n/2+1)} \exp\!\left( -\frac{1}{2\sigma^2} \big[\mathbf{y}^F - m(\mathbf{x} \mid \theta)\big]^{\mathsf T} \big[\mathbf{y}^F - m(\mathbf{x} \mid \theta)\big] \right)$

(An MCMC sampling sketch is given below.)

[Figure: contour and surface plots of the joint posterior of $\theta$ and $\sigma$.]
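A minimal random-walk Metropolis sketch for sampling from this posterior; it is not the lecture's code, and the starting point and proposal scale are assumptions (the chain length N = 5e3 matches the next slide).

  % Random-walk Metropolis sampling of (theta, sigma^2) from the posterior
  % above, reusing x and yF from the earlier sketches.
  n       = numel(yF);
  sse     = @(t) sum((yF - 5*exp(-t*x)).^2);
  logpost = @(t, s2) -(n/2 + 1)*log(s2) - sse(t)/(2*s2);

  N   = 5e3;                        % chain length
  S   = zeros(N, 2);                % columns: theta, sigma^2
  cur = [0.6, 0.3];                 % starting point (assumed)
  for k = 1:N
      prop = cur + 0.05*randn(1, 2);                     % random-walk proposal
      if prop(2) > 0 && ...
         log(rand) < logpost(prop(1), prop(2)) - logpost(cur(1), cur(2))
          cur = prop;                                    % accept
      end
      S(k, :) = cur;                                     % otherwise keep current
  end
  thetaSamples = S(:, 1);
  sigmaSamples = sqrt(S(:, 2));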


- 14 -

Calibration under uncertainty

• Posterior samples after MCMC (N = 5e3)

– Means of $\theta$ and $\sigma$ from four repeated runs:

  run 1: 0.6295, 0.5921
  run 2: 0.6235, 0.5870
  run 3: 0.6249, 0.5866
  run 4: 0.6269, 0.5952

– Simple optimization: 0.6223, 0.5766.

[Figure: trace plots and histograms of the posterior samples of $\theta$ and $\sigma$.]

• Posterior prediction (a sampling sketch is given below)
– For each posterior sample $(\theta_i, \sigma_i)$:

$y_i^M = m(x \mid \theta_i) = 5\exp(-\theta_i x), \qquad y_i^P = y_i^M + \varepsilon_i, \quad \varepsilon_i \sim N(0, \sigma_i^2)$

[Figure: posterior prediction compared with the data over x in [0, 3] (three panels).]
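A sketch of the posterior prediction step, pushing each posterior sample through the model and adding observation noise; it continues from the Metropolis sketch above, and the prediction grid and 95% level are my choices.

  % Posterior predictive samples and pointwise 95% bounds.
  xs = linspace(0, 3, 50);
  yP = zeros(numel(thetaSamples), numel(xs));
  for k = 1:numel(thetaSamples)
      yP(k, :) = 5*exp(-thetaSamples(k)*xs) ...
               + sigmaSamples(k)*randn(1, numel(xs));
  end
  bounds = prctile(yP, [2.5 97.5]);            % pointwise 95% prediction bounds
  plot(xs, median(yP), '-', xs, bounds(1, :), '--', xs, bounds(2, :), '--')
  hold on, plot(x, yF, 'o')                    % field data for comparison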


- 15 -

Calibration under uncertainty

• Frequentist approach
– Likelihood of $\mathbf{Y}^F$:

$L(\mathbf{Y}^F \mid \theta, \sigma^2) = \left(2\pi\sigma^2\right)^{-n/2} \exp\!\left( -\frac{1}{2\sigma^2} \big[\mathbf{Y}^F - m(\mathbf{X} \mid \theta)\big]^{\mathsf T} \big[\mathbf{Y}^F - m(\mathbf{X} \mid \theta)\big] \right)$

• Maximum likelihood estimation (a sketch is given below):

$(\theta^*, \sigma^*) = \arg\max_{\theta, \sigma^2} L(\mathbf{Y}^F \mid \theta, \sigma^2)$

– Optimum solution: $\theta^* = 0.6223$, $\sigma^* = 0.5766$ (the same values as the simple optimization).

[Figure: contour and surface plots of the likelihood over $\theta$ and $\sigma$, with the optimum $(\theta^*, \sigma^*)$ marked.]
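A minimal MATLAB sketch of the MLE, minimizing the negative log-likelihood over (theta, log sigma) so the search stays in the valid range; the parameterization and starting point are my choices, and x, yF are reused from the earlier sketches.

  % MLE by minimizing the negative log-likelihood (constants dropped);
  % p = [theta, log(sigma)].
  n   = numel(yF);
  nll = @(p) n*p(2) + sum((yF - 5*exp(-p(1)*x)).^2)/(2*exp(2*p(2)));

  pHat     = fminsearch(nll, [0.5, log(0.5)]);
  thetaMLE = pHat(1);
  sigmaMLE = exp(pHat(2));              % MLE of sigma (equals the RMSE here)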


- 16 -

Calibration under uncertainty

• Bootstrap sampling
– Make virtual experimental data by applying the estimated parameters to the model and adding the noise:

$y^{F*} = 5\exp(-\theta^* x) + \varepsilon, \quad \varepsilon \sim N(0, \sigma^{*2})$

– Go through the MLE estimation using these data, and repeat this N times to obtain samples of the parameters (a parametric-bootstrap sketch is given below).

[Figure: three realizations of the bootstrap data over x in [0, 3].]

Meeker, William Q., and Luis A. Escobar. Statistical Methods for Reliability Data. Vol. 314. Wiley, 1998.
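A parametric-bootstrap sketch following the two steps above, reusing thetaMLE, sigmaMLE, and x from the MLE sketch; the number of resamples is an assumption.

  % Parametric bootstrap: simulate data from the fitted model, re-estimate,
  % and repeat to obtain samples of (theta, sigma).
  Nb = 1000;                                       % number of bootstrap samples
  B  = zeros(Nb, 2);
  for k = 1:Nb
      yStar = 5*exp(-thetaMLE*x) + sigmaMLE*randn(size(x));   % virtual data
      nllk  = @(p) numel(yStar)*p(2) ...
            + sum((yStar - 5*exp(-p(1)*x)).^2)/(2*exp(2*p(2)));
      pk    = fminsearch(nllk, [thetaMLE, log(sigmaMLE)]);     % re-estimate
      B(k, :) = [pk(1), exp(pk(2))];
  end
  ciTheta = prctile(B(:, 1), [2.5 97.5]);          % bootstrap 95% CI for theta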


- 17 -

Calibration under uncertainty

• Discussion
– The confidence bounds of the model are now obtained; e.g., at x = 1.5 the bound is (0.75, 3.19).
– Due to the incorrect model, we end up with a large bound. However, this is the best available solution under this condition.
– Within these large bounds, not only the measurement error but also the model error is included. We need to account for this by introducing a discrepancy function.

[Figure: confidence bounds of the calibrated model compared with the data over x in [0, 3].]