
Verified computation with probabilities
Scott Ferson, Applied Biomathematics

IFIP Working Conference on Uncertainty Quantification and Scientific Computing, Boulder, Colorado, 2 August 2011

© 2011 Applied Biomathematics

Euclid

• Given a line in a plane, how many parallel lines can be drawn through a point not on the line?

• For over 20 centuries, the answer was ‘one’

Relax one axiom

• Around 1850, Riemann and others developed non-Euclidean geometries in which the answer was either ‘zero’ or ‘many’

• Controversial, but eventually accepted

• Mathematics richer and applications expanded

• Used by Einstein in general relativity

Variability = aleatory uncertainty

• Arises from natural stochasticity

• Variability arises from
– spatial variation
– temporal fluctuations
– manufacturing or genetic differences

• Not reducible by empirical effort

Incertitude = epistemic uncertainty

• Arises from incomplete knowledge

• Incertitude arises from
– limited sample size
– mensurational limits (‘measurement uncertainty’)
– use of surrogate data

• Reducible with empirical effort

Suppose

A is in [2, 4]

B is in [3, 5]

What can be said about the sum A+B?


The right answer for risk analysis is [5,9]
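As a worked check, here is a minimal sketch of this interval addition in Python (illustrative only, not any of the software packages mentioned later):

# Interval arithmetic for the example above: add lower bounds and add upper bounds.
def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

A = (2, 4)
B = (3, 5)
print(interval_add(A, B))   # (5, 9): every possible value of A + B lies in [5, 9]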

Propagating incertitude

Simmer down…

• I’m not saying we should only use intervals

• Not all uncertainty is incertitude

• Maybe most uncertainty isn’t incertitude

• Sometimes incertitude is entirely negligible

• If so, probability theory is perfectly sufficient

What I am saying is…

• Some analysts face non-negligible incertitude

• Handling it with standard probability theory requires assumptions that may not be tenable
– unbiasedness, uniformity, independence

• Useful to know what difference it might make

• Can be discovered by bounding probabilities

Bounding probability is an old idea

• Boole and de Morgan

• Chebyshev and Markov

• Borel and Fréchet

• Kolmogorov and Keynes

• Dempster and Ellsberg

• Berger and Walley

Several closely related approaches

• Probability bounds analysis

• Imprecise probabilities

• Robust Bayesian analysis

• Second-order probability

Bounding approaches

Traditional uncertainty analyses

• Worst case analysis

• Taylor series approximations (delta method)

• Normal theory propagation (NIST; mean value method)

• Monte Carlo simulation

• Stochastic PDEs

• Two-dimensional Monte Carlo

Untenable assumptions

• Uncertainties are small

• Distribution shapes are known

• Sources of variation are independent

• Uncertainties cancel each other out

• Linearized models good enough

• Relevant science is known and modeled

Need ways to relax assumptions

• Hard to say what the distribution is precisely

• Non-independent, or unknown dependencies

• Uncertainties that may not cancel

• Possibly large uncertainties

• Model uncertainty

Probability bounds analysis (PBA)

• Sidesteps the major criticisms
– Doesn’t force you to make any assumptions
– Can use only whatever information is available

• Merges worst case and probabilistic analysis

• Distinguishes variability and incertitude

• Used by both Bayesians and frequentists

Probability box (p-box)

[Figure: a p-box plotted as cumulative probability (0 to 1) against X (0.0 to 3.0).]

Interval bounds on a cumulative distribution function

Envelope of ‘horsetail’ plots
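For concreteness, a minimal sketch of how a p-box might be stored, written in Python. The grid-based representation and the class name are illustrative assumptions, not the internals of any particular implementation.

from dataclasses import dataclass
from typing import Sequence

@dataclass
class PBox:
    xs: Sequence[float]       # grid of x values, increasing
    lower: Sequence[float]    # lower bound on P(X <= x), nondecreasing, in [0, 1]
    upper: Sequence[float]    # upper bound on P(X <= x), nondecreasing, >= lower

# The p-box for the bare interval [1, 2]: all we know is that X is between 1 and 2.
interval_12 = PBox(xs=[0.0, 1.0, 2.0, 3.0],
                   lower=[0.0, 0.0, 1.0, 1.0],
                   upper=[0.0, 1.0, 1.0, 1.0])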

Uncertain numbers

[Figure: three kinds of uncertain numbers, each plotted as cumulative probability against value: a probability distribution, a probability box, and an interval. Note that the interval is not a uniform distribution.]

Uncertainty arithmetic

• We can do math on p-boxes

• When inputs are distributions, the answers conform with probability theory

• When inputs are intervals, the results agree with interval (worst case) analysis

Calculations

• All standard mathematical operations
– Arithmetic (+, −, ×, ÷, ^, min, max)
– Transformations (exp, ln, sin, tan, abs, sqrt, etc.)
– Magnitude comparisons (<, ≤, >, ≥)
– Other operations (nonlinear ODEs, finite-element methods)

• Faster than Monte Carlo

• Guaranteed to bound the answer

• Optimal solutions often easy to compute

Probability bounds arithmetic

P-box for random variable A; p-box for random variable B

What are the bounds on the distribution of the sum A+B?

[Figure: the two input p-boxes plotted as cumulative probability; values of A run from 0 to 6, values of B from 0 to 14.]

Cartesian product

A+B under independence:

                      A = [1,3]       A = [2,4]       A = [3,5]
                      p1 = 1/3        p2 = 1/3        p3 = 1/3
B = [2,8]   q1 = 1/3  [3,11]  1/9     [4,12]  1/9     [5,13]  1/9
B = [6,10]  q2 = 1/3  [7,13]  1/9     [8,14]  1/9     [9,15]  1/9
B = [8,12]  q3 = 1/3  [9,15]  1/9     [10,16] 1/9     [11,17] 1/9

Each cell gives the interval for A+B and its probability (1/3 × 1/3 = 1/9).
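The same Cartesian-product calculation can be sketched in a few lines of Python (an illustration of the idea, not the presenter's software). Each p-box is discretized into (interval, probability) cells, and the cells are combined pairwise under independence.

# Discretized p-boxes as lists of (interval, probability) cells.
A = [((1, 3), 1/3), ((2, 4), 1/3), ((3, 5), 1/3)]
B = [((2, 8), 1/3), ((6, 10), 1/3), ((8, 12), 1/3)]

def pbox_add_independent(X, Y):
    # Cartesian product: add every pair of intervals, multiply their probabilities.
    return [((x[0] + y[0], x[1] + y[1]), p * q) for x, p in X for y, q in Y]

cells = pbox_add_independent(A, B)

def cdf_bounds(cells, z):
    # Bounds on P(sum <= z): cells surely below z versus cells possibly below z.
    lower = sum(p for (lo, hi), p in cells if hi <= z)
    upper = sum(p for (lo, hi), p in cells if lo <= z)
    return lower, upper

print(cdf_bounds(cells, 12))   # (0.222..., 1.0): bounds on P(A + B <= 12)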

A+B under independence

[Figure: the resulting bounds on the cumulative distribution of A+B under independence, plotted over values 0 to 18; the bounds are both rigorous and best possible.]

[Figure: distributions of X + Y for X, Y ~ uniform(1, 25), plotted as CDFs over 0 to 50. Precise answers result from assuming a particular dependence: independence, perfect dependence, opposite dependence, or some other given dependence. Imprecise answers (p-boxes) result from partial assumptions: no assumptions at all, uncorrelated, “linear” correlation, or positive dependence.]

Fréchet dependence bounds

• Guaranteed to enclose results no matter what correlation or dependence there may be between the variables

• Best possible (couldn’t be any tighter without saying more about the dependence)

• Can be combined with independence assumptions between other variables
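A minimal sketch of these dependence-free bounds for a sum, in Python/NumPy. It assumes the classical Frank-Nelsen-Schweizer formulas (as used by Williamson and Downs 1990) are what the slide refers to, and it uses the uniform(1, 25) marginals from the figure above.

# Bounds on the CDF of X + Y with NO assumption about dependence:
#   lower F(z) = sup_x max(FX(x) + FY(z - x) - 1, 0)
#   upper F(z) = inf_x min(FX(x) + FY(z - x), 1)
import numpy as np

def FX(x):                          # CDF of uniform(1, 25)
    return np.clip((np.asarray(x) - 1.0) / 24.0, 0.0, 1.0)

FY = FX
xs = np.linspace(1.0, 25.0, 4001)   # grid over the support of X

def frechet_sum_cdf_bounds(z):
    s = FX(xs) + FY(z - xs)
    return max(float(s.max()) - 1.0, 0.0), min(float(s.min()), 1.0)

print(frechet_sum_cdf_bounds(26.0))   # (0.0, 1.0): vacuous in the middle
print(frechet_sum_cdf_bounds(45.0))   # roughly (0.79, 1.0)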

Example: exotic pest establishment

F = A & B & C & D

Probability of arriving in the right season
Probability of having both sexes present
Probability there’s a suitable host
Probability of surviving the next winter

Imperfect information

• Calculate A & B & C & D, with partial information:
– A’s distribution is known, but not its parameters
– B’s parameters known, but not its shape
– C has a small empirical data set
– D is known to be a precise distribution

• Bounds assuming independence?

• Without any assumption about dependence?

A = {lognormal, mean = [.05, .06], variance = [.0001, .001]}
B = {min = 0, max = 0.05, mode = 0.03}
C = {sample data = 0.2, 0.5, 0.6, 0.7, 0.75, 0.8}
D = uniform(0, 1)

A = lognormal([.05,.06], sqrt([.0001,.001]))
B = minmaxmode(0, 0.05, .03)
B = max(B, 0.000001)
C = histogram(0.001, .9999, .2, .5, .6, .7, .75, .8)
D = uniform(0.0001, .9999)

f = A |&| B |&| C |&| D
f ~ (range=[9.48437e-14, 0.0109203], mean=[0.00006, 0.00119], var=[2.90243743e-09, 0.00000208])

fi = A & B & C & D
fi ~ (range=[0, 0.05], mean=[0, 0.04], var=[0, 0.00052])

show fi, f
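For intuition, a minimal sketch (plain Python, not Risk Calc) of the scalar analogue of the two conjunctions used above: the product rule under independence versus the Fréchet bounds when nothing is assumed about dependence. The probabilities 0.3 and 0.8 are arbitrary illustrative values; the Risk Calc calculation above does the corresponding operations on entire p-boxes rather than on scalar probabilities.

# Conjunction of two event probabilities a and b.
def and_independent(a, b):
    return a * b                                   # assumes independence

def and_frechet(a, b):
    return (max(0.0, a + b - 1.0), min(a, b))      # no dependence assumption

print(and_independent(0.3, 0.8))   # 0.24
print(and_frechet(0.3, 0.8))       # (0.1, 0.3): an interval, not a point value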

[Figure: the four input p-boxes plotted as CDFs: A (values up to about 0.3), B (up to about 0.06), C (0 to 1), and D (0 to 1).]

Resulting probability

[Figure: the resulting probability plotted as two p-boxes: one assuming all variables are independent, the other making no assumption about dependencies.]

Summary statistics

Independent:
Range               [0, 0.011]
Median              [0, 0.00113]
Mean                [0.00006, 0.00119]
Variance            [2.9×10⁻⁹, 2.1×10⁻⁶]
Standard deviation  [0.000054, 0.0014]

No assumptions about dependence:
Range               [0, 0.05]
Median              [0, 0.04]
Mean                [0, 0.04]
Variance            [0, 0.00052]
Standard deviation  [0, 0.023]

How to use the results

When uncertainty makes no difference (because results are so clear), bounding gives confidence in the reliability of the decision

When uncertainty swamps the decision:
(i) use results to identify inputs to study better, or
(ii) use other criteria within probability bounds

Justifying further empirical effort

• If incertitude is too wide for decisions, and bounds are best possible, more data is needed
– Strong argument for collecting more data

• Planning empirical efforts can be improved by doing sensitivity analysis of the model
– Sensitivity analysis can be done with p-boxes

Better than sensitivity studies

• Uncertainty about shape and dependence cannot be fully revealed by sensitivity studies
– Because the problems are infinite-dimensional

• Probability bounding lets you be comprehensive

• Intermediate knowledge can be exploited

• Uncertainties can have large or small effects

Where do input p-boxes come from?

• Prior modeling
– Uncertainty about dependence
– Robust Bayes analysis

• Constraint information
– Summary publications lacking original data

• Sparse or imprecise data
– Shallow likelihood functions
– Measurement uncertainty, censoring, missing data

Robust Bayes can make a p-box

A class of priors combined with a class of likelihoods yields a class of posteriors.

[Figure: the priors, likelihoods, and posteriors plotted over roughly -5 to 20; the envelope of the class of posteriors is the posterior p-box.]
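A minimal sketch of the idea in Python, using an assumed example rather than the case in the figure: a class of Beta priors for a binomial probability (Walley's imprecise beta model). Sweeping over the class of priors gives a class of posteriors whose envelope is a posterior p-box; here only the range of posterior means is reported.

# Robust Bayes sketch: Beta(s*t, s*(1-t)) priors with t ranging over [0, 1].
# The data and the prior weight s are hypothetical.
k, n = 3, 10          # observed: 3 successes in 10 trials (hypothetical)
s = 2.0               # total prior weight shared by every prior in the class

post_mean_lo = (k + s * 0.0) / (n + s)   # posterior mean under the prior with t = 0
post_mean_hi = (k + s * 1.0) / (n + s)   # posterior mean under the prior with t = 1
print(post_mean_lo, post_mean_hi)        # 0.25 to about 0.417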

Constraint propagation

[Figure: p-boxes implied by constraint information alone, plotted as CDFs. The panels include boxes built from combinations such as {min, max}, {min, max, median}, {min, max, mode}, {mean, sd}, {min, max, mean = mode}, {symmetric, mean, sd}, {min, max, median = mode}, and {min, max, mean, sd}.]
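As one concrete instance of a constraint p-box, here is a minimal sketch of the {mean, sd} case in Python. It assumes the usual distribution-free construction from Cantelli's one-sided Chebyshev inequality; the slides themselves do not spell out the formula.

# Bounds on P(X <= x) when only the mean mu and standard deviation sd are known:
#   P(X >= mu + t) <= sd^2 / (sd^2 + t^2)  and  P(X <= mu - t) <= sd^2 / (sd^2 + t^2)
def mean_sd_pbox(mu, sd):
    def lower(x):                         # lower bound on the CDF at x
        if x <= mu:
            return 0.0
        t = x - mu
        return t * t / (sd * sd + t * t)
    def upper(x):                         # upper bound on the CDF at x
        if x >= mu:
            return 1.0
        t = mu - x
        return sd * sd / (sd * sd + t * t)
    return lower, upper

lower, upper = mean_sd_pbox(mu=5.0, sd=1.0)
print(lower(7.0), upper(7.0))   # (0.8, 1.0): P(X <= 7) is at least 0.8
print(lower(3.0), upper(3.0))   # (0.0, 0.2): P(X <= 3) is at most 0.2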

Comparing p-boxes with maximum entropy distributions

[Figure: p-boxes compared with the corresponding maximum entropy distributions for several states of knowledge, including {min, max}, {min, max, mode}, {min, max, mean}, {min, mean}, {mean, std}, {range, quantile}, {min, max, mean, std}, and sample data with a known range.]

Maximum entropy’s problem

• Depends on the choice of scale

• Range for degradation rate
• Range for half-life
• Same information, but incompatible results

• P-boxes are the same whichever scale is used

North interprets Jaynes as saying that “two states of information that are judged to be equivalent should lead to the same probability assignments”. Maxent doesn’t do this! But PBA does.
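A minimal numerical illustration of the scale problem, with assumed (hypothetical) numbers for the range of the degradation rate:

# Same information, two different maximum entropy answers.
# Suppose all we know is that a degradation rate k lies in [0.01, 0.1] per day,
# which is exactly the same as knowing the half-life ln(2)/k lies in [6.93, 69.3] days.
import math

lo_k, hi_k = 0.01, 0.1                                # known range of the rate (assumed numbers)
lo_t, hi_t = math.log(2) / hi_k, math.log(2) / lo_k   # the same range expressed as half-life

median_k = (lo_k + hi_k) / 2      # median if we apply maxent (uniform) on the rate scale
median_t = (lo_t + hi_t) / 2      # median if we apply maxent (uniform) on the half-life scale

print(math.log(2) / median_k)     # about 12.6 days, implied by a uniform rate
print(median_t)                   # about 38.1 days, implied by a uniform half-life
# A p-box built from the range alone is the same on either scale; its endpoints
# simply transform with the change of variable.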

Incertitude in data measurements

• Periodic observations: when did the fish in my aquarium die during the night?

• Plus-or-minus measurement uncertainties: coarse measurements, measurements from digital readouts

• Non-detects and data censoring: chemical detection limits, studies prematurely terminated

• Privacy requirements: epidemiological or medical information, census data

• Theoretical constraints: concentrations, solubilities, probabilities, survival rates

• Bounding studies: presumed or hypothetical limits in what-if calculations

[Figure: the Skinny and Puffy interval data sets plotted along an axis from 0 to 10.]

Skinny data     Puffy data
[1.00, 2.00]    [3.5, 6.4]
[2.68, 2.98]    [6.9, 8.8]
[7.52, 7.67]    [6.1, 8.4]
[7.73, 8.35]    [2.8, 6.7]
[9.44, 9.99]    [3.5, 9.7]
[3.66, 4.58]    [6.5, 9.9]
                [0.15, 3.8]
                [4.5, 4.9]
                [7.1, 7.9]

Imprecise data

Empirical distribution of intervals

• Each side is cumulation of respective endpoints

• Represents both incertitude and variability
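A minimal sketch in Python of this cumulation of endpoints, using the Skinny data from the slide above (illustrative only, not the presenter's software):

# Empirical p-box for interval data: the upper bound on the CDF is the empirical
# CDF of the left endpoints; the lower bound is the empirical CDF of the right endpoints.
skinny = [(1.00, 2.00), (2.68, 2.98), (7.52, 7.67),
          (7.73, 8.35), (9.44, 9.99), (3.66, 4.58)]

def empirical_pbox(intervals):
    n = len(intervals)
    lefts = [lo for lo, hi in intervals]
    rights = [hi for lo, hi in intervals]
    def upper(x):                       # upper bound on P(X <= x)
        return sum(lo <= x for lo in lefts) / n
    def lower(x):                       # lower bound on P(X <= x)
        return sum(hi <= x for hi in rights) / n
    return lower, upper

lower, upper = empirical_pbox(skinny)
print(lower(8.0), upper(8.0))   # about 0.67 and 0.83: bounds on P(X <= 8)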

[Figure: empirical distributions of the Skinny and Puffy interval data, plotted as cumulative probability against X from 0 to 10.]

Uncertainty about the EDF

[Figure: the p-box for the Puffy data, expressing the uncertainty about its empirical distribution function; cumulative probability against X from 0 to 10.]

Fitted to normals

[Figure: normal distributions fitted to the Skinny and Puffy interval data, yielding p-boxes of fitted normals; cumulative probability against X from 0 to 20.]

Fitting can be done by the method of matching moments, regression approaches, or maximum likelihood.

Don’t have to specify the distribution

• Specifying distributions can be challenging

• Sensitivity analysis is very hard since it’s an infinite-dimensional problem

• Maximum entropy criterion erases uncertainty rather than propagates it

• Bounding is reasonable, but should reflect all available information

NOT interval or worst case analysis

• Yields fully probabilistic assessments

• Allows analysts to quantitatively describe means, dispersions, and tail risks of output variables, though the descriptions may not be scalars if information is sparse

• Morally equivalent to sensitivity analysis

Heresies

• Independence should not be the default

• Maximum entropy criterion is obsolete

• Monte Carlo is partially obsolete

• Variability should be modeled with probability

• Incertitude should be modeled as intervals

• Imprecise probabilities can do both at once

• Sensitivity analysis is grossly insufficient

So is this really practical?

• Not only is it practical, it is essential

• Must distinguish incertitude and variability, otherwise the results are misleading and confusing to humans

• Consider some fundamental observations…

Neuroscience of risk perception

Risk aversion

• You’d get $1000 if a random ball from the urn is red, or you can just get $500 instead

• Which prize do you want?

EU is the same, but most people take the sure $500

Ambiguity aversion

• Two urns, both with red or blue balls

• But now one urn is opaque

• Get $1000 if a randomly drawn ball is red

• Which urn do you wanna draw from?

A probabilist could explain your preference by saying your probability for red in the opaque urn is low

Everyone chooses the transparent urn

Ambiguity (incertitude)

• Ambiguity aversion is ubiquitous in human decision making

• Ellsberg showed it is utterly incompatible with Bayesian norms

• Humans are wired to process incertitude separately and differently from variability

Bayesian reasoning (poor)

12-18% correct

If a test to detect a disease whose prevalence is 0.1% has a false positive rate of 5%, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person’s symptoms or signs? ___%

Casscells et al. 1978 replicated in Cosmides and Tooby 1996

Bayesian reasoning (good)

If a test to detect a disease whose prevalence is 1/1000 has a false positive rate of 50/1000, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person’s symptoms or signs? ___ out of ___.

76-92% correct

Answer: 1 out of 51

Casscells et al. 1978 replicated in Cosmides and Tooby 1996
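For reference, the arithmetic behind the intended answer, assuming (as the classic puzzle does) a perfectly sensitive test:

# Bayes' rule for the diagnostic question above.
prevalence = 1 / 1000        # P(disease)
false_pos = 50 / 1000        # P(positive | no disease)
sensitivity = 1.0            # P(positive | disease), assumed by the puzzle

p_positive = prevalence * sensitivity + (1 - prevalence) * false_pos
print(prevalence * sensitivity / p_positive)   # about 0.0196, i.e. roughly 1 in 51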

Neuroscience of risk perception

Instead of being divided into rational and emotional sides, the human brain has many special-purpose calculators(Marr 1982; Barkow et al. 1992; Pinker 1997, 2002)

A calculator must be triggered

• Humans have an innate probability sense, but it is triggered by natural frequencies

• This calculator kicked in for the medical students who got the question in terms of natural frequencies, and they mostly solved it

• The mere presence of the percent signs in the question hobbled the other group

Multiple calculators can fire

• There are distinct calculators associated with
– Probabilities and risk (variability): the medical students
– Ambiguity and uncertainty (incertitude): Hsu et al.
– Trust and fairness: the Ultimatum Game

• Brain processes them differently– Different parts of the brain– Different chemical systems

• They can give conflicting responses

Neuroimagery

• Functional brain imaging suggests there is a general neural circuit that processes incertitude
– Localized in amygdala and orbitofrontal cortex

• People with brain lesions in these spots
– Are insensitive to the degree of incertitude
– Do not exhibit ambiguity aversion
– Make decisions according to Bayesian norms

Hsu et al. 2005

Ambiguity processing

• Neuroimagery and clinical psychometrics show humans distinguish incertitude and variability
– Normal feature of the human brain
– Especially focused on the worst case
– Makes us ask how bad it could be and neglect probability

• Probabilists traditionally use equiprobability to model incertitude, which confounds the two

Conclusions

• Probability has an inadequate model of ignorance

• Interval (worst case) analysis has an inadequate model of dependence

• Probability bounds analysis corrects both and does things sensitivity studies cannot

• PBA is much simpler computationally than IP

Take-home messages

• You don’t have to make a lot of assumptions to get quantitative results

• PBA merges worst case and probabilistic analyses in a way that’s faithful to both

• Calculations are guaranteed (you can still be wrong, but the method won’t be the reason)

Many applications

• ODE dynamics in a chemostat

• Global climate change forecasts

• Engineering design

• Safety of engineered systems (e.g., bridges)

• Human health: Superfund risk analyses

• Conservation biology: extinction/reintroduction

• Wildlife contaminant exposure analyses

Partners and sponsors for these applications include NASA, SNL, LANL, VT, UM, GT, EPA, PIK, and DEC.

Several software implementations

• UC add-in for Excel (NASA, beta 2011)

• RAMAS Risk Calc (EPRI, NIH, commercial)

• Statool (Dan Berleant, freeware)

• Constructor (Sandia and NIH, freeware)

• Pbox.r library for R

• PBDemo (freeware)

• Williamson and Downs (1990)

Acknowledgments

National Institutes of Health (NIH)

Electric Power Research Institute (EPRI)

Sandia National Laboratories

National Aeronautics and Space Administration (NASA)

End

Relax one axiom

• IP satisfies all the Kolmogorov axioms

• Avoids sure loss and is ‘rational’ in the Bayesian sense (except that agents are not forced to either buy or sell any offered gamble)

• Relaxes one decision theory axiom: the assumption that we can always compare any two gambles

Richer mathematics, wider array of applications

Comparing IP to Bayesian approach

• “Uncertainty of probability” is meaningful

• Operationalized as the difference between the max buying price and min selling price

• If you know all the probabilities (and utilities) perfectly, then IP reduces to Bayes

Bayes fails incertitude

• Traditional probability theory doesn’t account for gross uncertainty correctly

• Output precision depends strongly on the number of inputs and not so much on their shapes

• The more inputs, the tighter the answer

[Figure: the probability distribution of a result computed from a few grossly uncertain uniform inputs is wide, while the distribution computed from a lot of grossly uncertain uniform inputs is much tighter.]

Where does this surety come from? What justifies it?
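A minimal Monte Carlo sketch of this narrowing effect, with assumed inputs (averaging n independent uniform(0, 1) variables):

# The spread of the Monte Carlo result shrinks like 1/sqrt(n), even though
# nothing has been learned about any individual input.
import random

def spread_of_mean(n_inputs, trials=20000):
    means = sorted(sum(random.random() for _ in range(n_inputs)) / n_inputs
                   for _ in range(trials))
    return means[int(0.975 * trials)] - means[int(0.025 * trials)]   # central 95% width

print(spread_of_mean(4))    # roughly 0.57
print(spread_of_mean(40))   # roughly 0.18
# A p-box with vacuous inputs would still span [0, 1]; the tightening above comes
# entirely from the uniformity and independence assumptions.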

“Smoke and mirrors” certainty

• Probability makes certainty out of nothing

• It has an inadequate model of ignorance

• Probability bounds analysis gives a vacuous answer if all you give it are vacuous inputs

Wishful thinking

Risk analysts often make assumptions that are convenient but are not really justified:

1. All variables are independent of one another

2. Uniform distributions model incertitude

3. Distributions are stationary (unchanging)

4. Specifications are perfectly precise

5. Measurement uncertainty is negligible

You don’t have to do this

Removing wishful thinking creates a p-box

1. Don’t have to assume any dependence at all

2. An interval can be a better model of incertitude

3. P-boxes can enclose non-stationary distributions

4. Can handle imprecise specifications

5. Measurement data with plus-minus, censoring

Uncertainties expressible with p-boxes

• Sampling uncertainty
– Confidence bands, confidence structures

• Measurement incertitude
– Intervals

• Uncertainty about distribution shape
– Constraints (non-parametric p-boxes)

• Surrogacy uncertainty
– Modeling

Can uncertainty swamp the answer?

• Sure, if uncertainty is huge

• This should happen (it’s not “unhelpful”)

• If you think the bounds are too wide, then put in whatever information is missing

• If there is no more information, what justifies your belief? Who are you trying to fool?

Software demonstration

Assumes independence

Makes no dependence assumption

Risk Calc finds unit mistakes

Given the input

[2 m, 4 ft] * N(12 g, 300 cm) / 0.0025 sec + 56.2 N

Risk Calc detects all three errors
– inverted interval (the endpoints are out of order, since 4 ft is less than 2 m)
– [mass] and [length] don't conform
– [length][mass][time]⁻¹ and [force] don't conform

Risk Calc finishes calculations

Given 2 ft + 4 m, Risk Calc computes 4.61 m
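A quick arithmetic check of this conversion (plain Python, not Risk Calc):

# 2 ft + 4 m, expressed in meters.
FT_TO_M = 0.3048                  # international foot, exactly 0.3048 m
print(round(2 * FT_TO_M + 4, 2))  # 4.61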

Given

[4 ft, 2 m] * 2 wids * N(12 g, 300 cg) / 0.0025 sec² wid + 56.2 N

Risk Calc computes the right answer because it knows
– feet can be converted to meters
– distance·mass·time⁻² can be converted to newtons
– N means both “normal distribution” and “newtons”
– a “wid” and “wids” can cancel
– centigrams can be converted to grams