Algorithms for Uncertainty Quantification
Tobias Neckel, Ionuț-Gabriel Farcaș
Lehrstuhl Informatik V
Summer Semester 2017
Lecture 4: More advanced sampling techniques
Repetition from previous lecture
• Sampling methods → a popular technique for uncertainty propagation
• Most widely used sampling approach → Monte Carlo sampling
• Monte Carlo sampling → simple, robust, independent of the probability distribution, the number of random parameters, ...
• ... but slow convergence rate
• Model problem → damped linear oscillator
• Uncertainty in some input parameters → higher impact on the output uncertainty
Dr. rer. nat. Tobias Neckel | Algorithms for Uncertainty Quantification | Summer Semester 2017 4
Monte Carlo sampling error analysis
Short error analysis for standard Monte Carlo

Remember:
• for $N$ samples, the MCS error is $O(1/\sqrt{N})$

How did we get that?
• Monte Carlo sampling ↔ averaging
• let $f : [0,1] \to \mathbb{R}$ and $I := \int_0^1 f(x)\,dx$
• $I \approx I_f = \frac{1}{N} \sum_{i=1}^{N} f(U_i)$, with $U_i \sim \mathcal{U}(0,1)$
• if $\sigma_f^2 = \mathrm{Var}[f(X)]$, then $\mathbb{E}[(I - I_f)^2] = \sigma_f^2 / N$
• $\sigma_f^2 \approx \hat{\sigma}_f^2 = \frac{1}{N-1} \sum_{i=1}^{N} \left(f(U_i) - I_f\right)^2$
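The estimator and its error estimate above can be sketched in a few lines of Python; the integrand $f(x) = e^x$ and the seed are illustrative choices, not part of the lecture:

```python
import numpy as np

# Monte Carlo estimate of I = ∫_0^1 f(x) dx, following the slide's derivation.
def monte_carlo(f, N, rng):
    U = rng.uniform(0.0, 1.0, size=N)     # U_i ~ U(0,1)
    samples = f(U)
    I_f = samples.mean()                  # I_f = (1/N) * sum_i f(U_i)
    sigma2_hat = samples.var(ddof=1)      # sample variance, 1/(N-1) * sum (f(U_i) - I_f)^2
    rmse_est = np.sqrt(sigma2_hat / N)    # estimated RMS error, sigma_f / sqrt(N)
    return I_f, rmse_est

rng = np.random.default_rng(42)
f = lambda x: np.exp(x)                   # exact integral: e - 1
for N in (100, 10000):
    I_f, err = monte_carlo(f, N, rng)
    print(N, I_f, err)
```

Increasing $N$ by a factor of 100 shrinks the estimated error by roughly a factor of 10, matching the $O(1/\sqrt{N})$ rate.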
Improving standard Monte Carlo sampling
Towards more advanced sampling techniques
• How to increase the accuracy of standard Monte Carlo? ($\mathbb{E}[(I - I_f)^2] = \sigma_f^2 / N$)
− improve your code
• parallelize
• vectorize
• remove if statements
• use memory efficiently
• etc.
− increase $N$
• not desirable
− decrease $\sigma_f^2$
• desirable
− improve the random number generation
Improving standard Monte Carlo sampling
Variance-minimization techniques
• antithetic sampling
• importance sampling
• stratified sampling
• control variates
Alternative random number generation techniques
• Fibonacci generators
• Latin hypercube sampling
• Sobol’ sequences
• Halton sequences
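Of the alternative generation techniques listed above, Latin hypercube sampling is the simplest to sketch: place exactly one sample in each of $N$ equal strata, then shuffle. A minimal one-dimensional version (function name and seed are my own):

```python
import numpy as np

# Latin hypercube sampling on [0,1): one sample per stratum [k/N, (k+1)/N),
# uniformly placed within its stratum, then randomly permuted.
def latin_hypercube(N, rng):
    edges = np.arange(N) / N                         # left edge of each stratum
    samples = edges + rng.uniform(0.0, 1.0 / N, size=N)
    rng.shuffle(samples)
    return samples

rng = np.random.default_rng(0)
x = latin_hypercube(10, rng)
print(np.sort(x))   # exactly one point in each interval [k/10, (k+1)/10)
```

Compared to plain Monte Carlo, no stratum is left empty and none receives a cluster of points, which is the whole point of the construction.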
Variance reduction techniques
Antithetic sampling
• let $X$ denote a (continuous) random variable with pdf $f_X(x)$, $\mathrm{supp}(X) \subset \mathbb{R}$
• assume $f_X(x)$ is symmetric, with $c$ the center of symmetry
• let $t(X_1, \dots, X_n)$ denote an estimator
• let $x \in \mathrm{supp}(X)$; the reflection of $x$ w.r.t. $c$ is $\bar{x} = 2c - x$
• symmetry implies $f_X(x) = f_X(\bar{x})$
• draw $n/2$ samples $X_1, \dots, X_{n/2}$ from $f_X(x)$
• the remaining $n/2$ samples $\bar{X}_1, \dots, \bar{X}_{n/2}$ are obtained via reflection: $\bar{X}_i = 2c - X_i$
• for a mean-type estimator, $t(X_1, \dots, X_n) = \frac{1}{2}\left[t(X_1, \dots, X_{n/2}) + t(\bar{X}_1, \dots, \bar{X}_{n/2})\right]$
• example: if $U \sim \mathcal{U}(0,1)$, the antithetic samples are $\bar{U} = 1 - U$
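For the uniform case $\bar{U} = 1 - U$, the scheme above reduces to a few lines; the integrand $e^x$ is an illustrative assumption, not from the lecture:

```python
import numpy as np

# Antithetic estimate of I = ∫_0^1 f(x) dx: draw n/2 uniforms, reflect them
# as Ū = 1 - U, and average the two mean estimates.
def antithetic_estimate(f, n, rng):
    U = rng.uniform(0.0, 1.0, size=n // 2)
    U_bar = 1.0 - U                        # antithetic (reflected) samples
    return 0.5 * (f(U).mean() + f(U_bar).mean())

rng = np.random.default_rng(1)
f = lambda x: np.exp(x)                    # exact integral: e - 1
print(antithetic_estimate(f, 10000, rng))
```

For a monotone integrand like $e^x$, $f(U)$ and $f(1-U)$ are negatively correlated, so the variance of the paired average is well below that of $n$ independent samples.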
Stratified sampling
• let $X$ denote a (continuous) random variable with pdf $f_X(x)$, $\mathrm{supp}(X) \subset \mathbb{R}$
• assume, without loss of generality, that $\mathrm{supp}(X) = [0,1]$
• let $t(X_1, \dots, X_n)$ denote an estimator
• idea: prevent samples from clustering in a particular region of the interval
• select $\lambda \in (0,1)$
• then draw $n_1 = \lambda n$ samples in $[0, \lambda]$ and $n_2 = n - n_1 = (1-\lambda) n$ samples in $[\lambda, 1]$
• for a mean-type estimator, $t(X_1, \dots, X_n) = \lambda\, t(X_1, \dots, X_{n_1}) + (1-\lambda)\, t(X_{n_1+1}, \dots, X_n)$
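The two-strata scheme above can be sketched as follows; the integrand, $\lambda = 0.5$, and the seed are illustrative assumptions:

```python
import numpy as np

# Stratified estimate of I = ∫_0^1 f(x) dx with two strata [0, lam] and [lam, 1];
# each stratum mean is weighted by the stratum length.
def stratified_estimate(f, n, lam, rng):
    n1 = int(lam * n)
    n2 = n - n1
    U1 = rng.uniform(0.0, lam, size=n1)    # n1 samples in [0, lam]
    U2 = rng.uniform(lam, 1.0, size=n2)    # n2 samples in [lam, 1]
    return lam * f(U1).mean() + (1.0 - lam) * f(U2).mean()

rng = np.random.default_rng(2)
f = lambda x: np.exp(x)                    # exact integral: e - 1
print(stratified_estimate(f, 10000, 0.5, rng))
```

The weights $\lambda$ and $1-\lambda$ are essential: each stratum mean estimates the average of $f$ over its own subinterval, and the full integral is the length-weighted combination.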
Control variates
• remember Monte Carlo integration: estimate ∫₀¹ f(x) dx via sampling
• assume there exists φ : [0,1] → R that can be easily integrated
• therefore ∫₀¹ f(x) dx = ∫₀¹ (f(x) + φ(x) − φ(x)) dx = ∫₀¹ φ(x) dx + ∫₀¹ (f(x) − φ(x)) dx
• Var(f − φ) = Var(f) + Var(φ) − 2 Cov(f, φ)
• if Cov(f, φ) is high, i.e. f and φ are “similar”, Var(f − φ) < Var(f)
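A minimal sketch of this decomposition (the integrand e^x and the control φ(x) = 1 + x are illustrative choices): the known part ∫₀¹ φ(x) dx = 3/2 enters exactly, and only the low-variance difference f − φ is sampled.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
u = rng.uniform(0.0, 1.0, size=N)

f_vals = np.exp(u)    # integrand f(x) = e^x; true integral is e - 1
phi_vals = 1.0 + u    # control variate phi(x) = 1 + x; known integral 3/2

plain_estimate = f_vals.mean()                  # standard Monte Carlo
cv_estimate = 1.5 + (f_vals - phi_vals).mean()  # exact part + sampled remainder
```

Since e^x and 1 + x are strongly correlated on [0,1], the sample variance of f − φ is far below that of f, so the same N samples yield a tighter estimate.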
Importance sampling
• remember Monte Carlo integration: estimate ∫₀¹ f(x) dx via sampling
• standard Monte Carlo solution: ∫₍₀,₁₎ f(x) dx ≈ I = (1/N) ∑ᵢ₌₁ᴺ f(Uᵢ), Uᵢ ∼ U(0,1)
• however, the Uᵢ are spread all over the domain
• idea: sample from another distribution gX that better captures the structure of f
• ∫₀¹ f(x) dx = ∫₀¹ (f(x)/gX(x)) gX(x) dx = ∫₀¹ h(x) gX(x) dx
• therefore, instead of sampling from the uniform distribution, sample according to gX
• variance is reduced if f and gX have similar shapes
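A sketch with illustrative choices: f(x) = x² as integrand and the proposal density gX(x) = 2x (a Beta(2,1) density), whose shape roughly follows f, so the ratio h = f/gX varies little.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

def f(x):
    return x**2                 # integrand; true integral over [0, 1] is 1/3

def g_pdf(x):
    return 2.0 * x              # proposal density g_X(x) = 2x on [0, 1]

x = rng.beta(2.0, 1.0, size=N)  # samples drawn according to g_X
is_estimate = (f(x) / g_pdf(x)).mean()  # average of h(x) = f(x) / g_X(x)

u = rng.uniform(0.0, 1.0, size=N)       # plain Monte Carlo for comparison
mc_estimate = f(u).mean()
```

Here h(x) = x/2, so the importance-sampling estimator averages a nearly flat function instead of the strongly varying x², which is where the variance reduction comes from.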
Quasi-Monte Carlo: alternative sampling techniques
Quasi-Monte Carlo sampling
• standard Monte Carlo: pseudo-random samples
• quasi-Monte Carlo (QMC): deterministic samples
• in this lecture: QMC based on low-discrepancy sequences
• note: QMC methods are defined on [0,1]^d; for any other domain, we need transformations
Upper bound of integration error
Koksma-Hlawka inequality
Remember
• I := ∫₀¹ f(x) dx
• I_f = (1/N) ∑ᵢ₌₁ᴺ f(xᵢ)

Theorem
Koksma-Hlawka inequality: |I − I_f| ≤ V(f)·D_N
• V(f) → variation of f (in the sense of Hardy and Krause)
• D_N = sup over subsets A ⊂ [0,1] of |card({xᵢ} ∩ A)/N − vol(A)| → discrepancy of {xᵢ}ᵢ₌₁ᴺ
Low discrepancy sequences
Low discrepancy sequences
Basic idea
• in |I − I_f| ≤ V(f)·D_N, assume that V(f) is constant
• idea: minimize the error by reducing D_N, i.e.
• produce {xᵢ}ᵢ₌₁ᴺ that are “well” spaced
• in this way the convergence improves from O(1/√N) to O(log(N)^d / N), where d is the dimension
Low discrepancy sequences example: Halton sequences
Halton sequences
• start with a prime number p
• construct a sequence based on finer and finer p-based divisions of sub-intervals of [0,1]
• e.g. let p = 3
− break [0,1] into 3 equal subintervals
− break each sub-interval into 3 equal subintervals
− now, the sequence is 1/3, 2/3, 1/9, 4/9, 7/9, 2/9, 5/9, 8/9
− repeat until the desired length is reached
Halton sequences example
• 2D Halton grid with 100 elements
Quasi-Monte Carlo sampling: example
Model problem – damped linear oscillator

d²y/dt²(t) + c·dy/dt(t) + k·y(t) = f·cos(ωt)
y(0) = y₀
dy/dt(0) = y₁

• t ∈ [0,30]
• k = 0.035
• f = 0.100
• ω = 1.000
• y₀ = 0.500
• y₁ = 0.000
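The model can be integrated with a classical fourth-order Runge-Kutta scheme after rewriting the second-order ODE as a first-order system. The slide lists no deterministic value for the damping c; c = 0.10 (the midpoint of the U(0.08, 0.12) range used later) is an assumption.

```python
import numpy as np

k, f, omega = 0.035, 0.100, 1.000
y0, y1 = 0.500, 0.000
c = 0.10  # ASSUMED: not listed on the slide; midpoint of U(0.08, 0.12)

def rhs(t, state):
    """First-order form of y'' + c y' + k y = f cos(omega t)."""
    y, dy = state
    return np.array([dy, f * np.cos(omega * t) - c * dy - k * y])

def rk4(fun, state, t0, t1, steps):
    """Classical fourth-order Runge-Kutta with fixed step size."""
    h = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        k1 = fun(t, state)
        k2 = fun(t + h / 2, state + h / 2 * k1)
        k3 = fun(t + h / 2, state + h / 2 * k2)
        k4 = fun(t + h, state + h * k3)
        state = state + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return state

y_at_15 = rk4(rhs, np.array([y0, y1]), 0.0, 15.0, 3000)[0]
```

With this assumed c, the solver lands near the deterministic reference value reported on the next slide.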
Quasi-Monte Carlo – example
• t₀ = 15

Deterministic result
• y(t₀) = −1.51e−01

Stochastic results
• assume c ∼ U(0.08, 0.12)
• 100 samples, standard Monte Carlo → E[y(t₀)] = −1.61e−01, Var[y(t₀)] = 6.51e−04
• 100 samples, QMC, Halton sequences → E[y(t₀)] = −1.53e−01, Var[y(t₀)] = 7.78e−04
• 1000 samples, standard Monte Carlo → E[y(t₀)] = −1.52e−01, Var[y(t₀)] = 7.30e−04
• 1000 samples, QMC, Halton sequences → E[y(t₀)] = −1.52e−01, Var[y(t₀)] = 7.81e−04
• 10000 samples, standard Monte Carlo → E[y(t₀)] = −1.52e−01, Var[y(t₀)] = 7.84e−04
• 10000 samples, QMC, Halton sequences → E[y(t₀)] = −1.52e−01, Var[y(t₀)] = 7.80e−04
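The qualitative behaviour in the table (QMC settling onto the reference value with fewer samples) can be reproduced on a toy integral. The van der Corput sequence below is the one-dimensional building block of Halton sequences; the integrand x² and N = 1023 points are illustrative choices.

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the base-`base` van der Corput sequence."""
    pts = np.empty(n)
    for j in range(1, n + 1):
        i, result, frac = j, 0.0, 1.0 / base
        while i > 0:
            result += (i % base) * frac
            i //= base
            frac /= base
        pts[j - 1] = result
    return pts

f = lambda x: x**2   # smooth toy integrand; true integral over [0, 1] is 1/3
N = 1023             # 2**10 - 1: the base-2 points fill the dyadic grid exactly

qmc_err = abs(f(van_der_corput(N)).mean() - 1 / 3)
rng = np.random.default_rng(0)
mc_err = abs(f(rng.uniform(size=N)).mean() - 1 / 3)
```

For this point set qmc_err sits in the log(N)/N regime (order 1e−4), while the pseudo-random error typically hovers around the 1/√N scale of roughly 1e−2.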
Dr. rer. nat. Tobias Neckel | Algorithms for Uncertainty Quantification | Summer Semester 2017 26
![Page 86: Algorithms for Uncertainty Quantification...Repetition from previous lecture Sampling methods!a popular technique for uncertainty propagation Most widely used sampling approach!Monte](https://reader034.fdocuments.in/reader034/viewer/2022050218/5f6414801c7e351a7b79abe3/html5/thumbnails/86.jpg)
Quasi-Monte Carlo – example• t0 = 15
Deterministic result
• y(t0) =−1.51e−01
Stochastic results
• assume c ∼U (0.08,0.12)
• 100 samples, standard Monte Carlo→ E [y(t0)] =−1.61e−01, Var[y(t0)] = 6.51e−04
• 100 samples, QMC, Halton sequences→ E [y(t0)] =−1.53e−01, Var[y(t0)] = 7.78e−04
• 1000 samples, standard Monte Carlo→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.30e−04
• 1000 samples, QMC, Halton sequences→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.81e−04
• 10000 samples, standard Monte Carlo→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.84e−04
• 10000 samples, QMC, Halton sequences→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.80e−04
Dr. rer. nat. Tobias Neckel | Algorithms for Uncertainty Quantification | Summer Semester 2017 26
![Page 87: Algorithms for Uncertainty Quantification...Repetition from previous lecture Sampling methods!a popular technique for uncertainty propagation Most widely used sampling approach!Monte](https://reader034.fdocuments.in/reader034/viewer/2022050218/5f6414801c7e351a7b79abe3/html5/thumbnails/87.jpg)
Quasi-Monte Carlo – example• t0 = 15
Deterministic result
• y(t0) =−1.51e−01
Stochastic results
• assume c ∼U (0.08,0.12)
• 100 samples, standard Monte Carlo→ E [y(t0)] =−1.61e−01, Var[y(t0)] = 6.51e−04
• 100 samples, QMC, Halton sequences→ E [y(t0)] =−1.53e−01, Var[y(t0)] = 7.78e−04
• 1000 samples, standard Monte Carlo→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.30e−04
• 1000 samples, QMC, Halton sequences→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.81e−04
• 10000 samples, standard Monte Carlo→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.84e−04
• 10000 samples, QMC, Halton sequences→ E [y(t0)] =−1.52e−01, Var[y(t0)] = 7.80e−04
Dr. rer. nat. Tobias Neckel | Algorithms for Uncertainty Quantification | Summer Semester 2017 26
Summary

• the accuracy of standard Monte Carlo can be improved via
  − optimizing your code
  − increasing the number of samples
  − decreasing the variance of the estimators
  − changing the sampling technique
• variance reduction techniques
  − antithetic sampling
  − importance sampling
  − stratified sampling
  − control variates
• alternative random number generation techniques
  − Fibonacci generators
  − Latin hypercube sampling
  − Sobol' sequences
  − Halton sequences
• example of low-discrepancy sequences: Halton sequences

Dr. rer. nat. Tobias Neckel | Algorithms for Uncertainty Quantification | Summer Semester 2017 28
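As an illustration of the first variance reduction technique in the list above, antithetic sampling pairs each sample u with its mirror 1 − u; for a monotone integrand the two evaluations are negatively correlated, so their average has smaller variance. A minimal sketch with a hypothetical model function f (not from the lecture):

```python
import random

def f(u):
    """Hypothetical smooth model response on [0, 1]; E[f(U)] = 1/3."""
    return u ** 2

def mc_estimate(n, seed=0):
    """Plain Monte Carlo estimate of E[f(U)] with n samples."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

def antithetic_estimate(n, seed=0):
    """Antithetic estimate of E[f(U)] using n model evaluations.

    Each draw u is paired with 1 - u; since f is monotone on [0, 1],
    f(u) and f(1 - u) are negatively correlated and the variance of
    the pair average drops below that of two independent samples.
    """
    rng = random.Random(seed)
    pairs = n // 2
    total = 0.0
    for _ in range(pairs):
        u = rng.random()
        total += 0.5 * (f(u) + f(1.0 - u))
    return total / pairs
```

Both estimators use the same number of model evaluations, so the comparison is at equal cost; for this f the antithetic estimator's variance is several times smaller.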