Elton S. Smith HUGS Summer School June 2013
Examples using ROOT http://root.cern.ch
Ex1: Event Generation (Binomial Distribution)
Ex2: Generate various random distributions
Ex3: Linear fits
Ex4: Determination of the area under a Gaussian
Ex5: Bayesian inference
Elton S. Smith, Jefferson Lab
Exercise 1 – Generate Binomial Distribution
Exercise: Use the ROOT random number generator to generate 10 random numbers according to the Binomial distribution. Compare with the parent probability distribution function.
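In ROOT this would use TRandom3 (e.g. its Binomial method) and a TH1 to compare with the parent PMF. A ROOT-free sketch of the same idea in plain Python follows; the binomial parameters and sample size are illustrative assumptions, not values from the slides.

```python
import math
import random
from collections import Counter

def binomial_sample(n, p, rng):
    """One binomial draw, built as the sum of n Bernoulli trials."""
    return sum(1 for _ in range(n) if rng.random() < p)

def binomial_pmf(k, n, p):
    """Parent probability distribution function P(k; n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

rng = random.Random(12345)           # fixed seed for reproducibility
n, p, n_events = 10, 0.3, 100000     # illustrative parameters (assumed)
counts = Counter(binomial_sample(n, p, rng) for _ in range(n_events))

# Compare the observed frequencies with the parent distribution
for k in range(n + 1):
    observed = counts[k] / n_events
    expected = binomial_pmf(k, n, p)
    print(f"k={k:2d}  observed={observed:.4f}  expected={expected:.4f}")
```

The comparison prints the sampled frequency next to the analytic PMF for each k; with 10^5 events the two agree to a few parts per thousand.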
Exercise 2 – Various random distributions
Exercise: Use the ROOT random number generator to generate random numbers according to a
1. Uniform distribution
2. Exponential distribution
3. Gaussian distribution
4. Poisson distribution
Compare with the parent probability distribution function.
Extra credit: Generate random numbers according to an arbitrary input function.
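ROOT provides these generators directly (TRandom3::Uniform, Exp, Gaus, Poisson). A stdlib-Python sketch follows, with parameters chosen to match the plots on the next slide (exponential mean 4, Gaussian mean 3 and sigma 2, Poisson mean 3); the accept-reject sampler covers the extra credit for an arbitrary input function.

```python
import math
import random

rng = random.Random(1)
N = 100000  # sample size, matching the plots

uniform = [rng.random() for _ in range(N)]            # flat on [0, 1)
expo    = [rng.expovariate(0.25) for _ in range(N)]   # slope -0.25, mean 4
gauss   = [rng.gauss(3.0, 2.0) for _ in range(N)]     # mean 3, sigma 2

def poisson(mu, rng):
    """Poisson draw via Knuth's multiplication method."""
    L, k, prod = math.exp(-mu), 0, rng.random()
    while prod > L:
        k += 1
        prod *= rng.random()
    return k

pois = [poisson(3.0, rng) for _ in range(N)]

# Extra credit: accept-reject sampling from an arbitrary input function f,
# bounded above by fmax on [lo, hi]
def sample_arbitrary(f, lo, hi, fmax, rng):
    while True:
        x = lo + (hi - lo) * rng.random()
        if rng.random() * fmax <= f(x):
            return x

arb = [sample_arbitrary(lambda x: x * x, 0.0, 1.0, 1.0, rng) for _ in range(10000)]

print(sum(uniform)/N, sum(expo)/N, sum(gauss)/N, sum(pois)/N)
```

Histogramming each list and overlaying the parent PDF reproduces the comparison shown on the next slide.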
Random distributions
[Figure: four histograms, 100000 entries each, compared with fits to the parent distributions]
- uniform distribution ("uniform"): Mean = 0.4997, RMS = 0.2886; linear fit χ²/ndf = 94.51/98, p0 = 1001 ± 6.3, p1 = −4.053 ± 10.950
- exponential distribution ("decay"): Mean = 3.999, RMS = 4.002; exponential fit χ²/ndf = 88.19/91, Constant = 9.213 ± 0.004, Slope = −0.2507 ± 0.0008
- bell (Gaussian) distribution ("bell"): Mean = 3.001, RMS = 1.998; Gaussian fit χ²/ndf = 68.21/75, Constant = 3988 ± 15.4, Mean = 3.003 ± 0.006, Sigma = 1.999 ± 0.004
- Poisson distribution ("counts"): Mean = 2.995, RMS = 1.71; Gaussian fit χ²/ndf = 1704/7, Constant = 2.25e4 ± 96.8, Mean = 3.36 ± 0.01, Sigma = 1.793 ± 0.006
Exercise 3 - Linear Fits
Assume a parent distribution of the form y(x) = a + bx, with a = 5, b = 1.
Assume one experiment collects a data set of ten points of the form (x_i, y_i ± σ), i = 0, 1, 2, ..., 9, with the measurements y_i following a Gaussian distribution with a fixed width σ = 0.5.
Invent the data points y_i for one experiment.
Fit the data y_i to the form y = a + bx.
Determine y and the uncertainty of y as a function of x from the fit.
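In ROOT this is a TGraphErrors fit with "pol1"; the same fit has a closed-form solution via the normal equations, sketched below in plain Python for the exact setup of the exercise (a = 5, b = 1, σ = 0.5, x_i = 0..9).

```python
import math
import random

# True parent parameters and fixed measurement error from the exercise
a_true, b_true, sigma = 5.0, 1.0, 0.5
xs = list(range(10))                 # x_i = 0, 1, ..., 9

rng = random.Random(7)
ys = [a_true + b_true * x + rng.gauss(0.0, sigma) for x in xs]  # invent data

# Closed-form least-squares solution for y = a + b x with equal errors
n = len(xs)
Sx  = sum(xs);                Sy  = sum(ys)
Sxx = sum(x * x for x in xs); Sxy = sum(x * y for x, y in zip(xs, ys))
D = n * Sxx - Sx * Sx
a = (Sxx * Sy - Sx * Sxy) / D
b = (n * Sxy - Sx * Sy) / D

# Parameter variances and covariance (standard propagation for equal sigma)
var_a  = sigma * sigma * Sxx / D
var_b  = sigma * sigma * n / D
cov_ab = -sigma * sigma * Sx / D

def y_fit(x):
    return a + b * x

def sigma_y(x):
    """Uncertainty of the fitted y(x), including the correlation term."""
    return math.sqrt(var_a + x * x * var_b + 2 * x * cov_ab)

print(f"a = {a:.3f} +/- {math.sqrt(var_a):.3f}")
print(f"b = {b:.3f} +/- {math.sqrt(var_b):.3f}")
print(f"y(5) = {y_fit(5):.3f} +/- {sigma_y(5):.3f}")
```

For this setup the analytic uncertainties are σ_a ≈ 0.294 and σ_b ≈ 0.055, consistent with the Monte Carlo widths quoted on the later slide (0.303 and 0.05626).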
Exercise 3 - Linear Fits
Extra credit:
- Generate 1000 Monte Carlo experiments.
- For each experiment, fit the data set to the functional form given above.
- Plot the difference between the fitted function and the data.
- Histogram the difference between the fitted parameters and the true parameters a and b.
- Use the width of the distribution to determine the uncertainty in the parameters. Compare with the estimated uncertainties from the fit.
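The steps above can be sketched without ROOT by repeating the closed-form linear fit over 1000 pseudo-experiments and comparing the spread of the fitted parameters with the analytic estimates (same setup as the exercise: a = 5, b = 1, σ = 0.5, x_i = 0..9).

```python
import math
import random

a_true, b_true, sigma = 5.0, 1.0, 0.5
xs = list(range(10))
n = len(xs); Sx = sum(xs); Sxx = sum(x * x for x in xs)
D = n * Sxx - Sx * Sx

def fit(ys):
    """Closed-form least-squares (a, b) for y = a + b x, equal errors."""
    Sy = sum(ys); Sxy = sum(x * y for x, y in zip(xs, ys))
    return ((Sxx * Sy - Sx * Sxy) / D, (n * Sxy - Sx * Sy) / D)

rng = random.Random(42)
fits = []
for _ in range(1000):   # 1000 Monte Carlo "experiments"
    ys = [a_true + b_true * x + rng.gauss(0.0, sigma) for x in xs]
    fits.append(fit(ys))

# Histogram-style summary: difference between fitted and true parameters
da = [a - a_true for a, b in fits]
db = [b - b_true for a, b in fits]
mean_a = sum(da) / len(da)
rms_a = math.sqrt(sum(d * d for d in da) / len(da) - mean_a ** 2)
mean_b = sum(db) / len(db)
rms_b = math.sqrt(sum(d * d for d in db) / len(db) - mean_b ** 2)

# Compare the MC spread with the analytic error estimates
print(f"spread of a: {rms_a:.3f}  (analytic {math.sqrt(sigma**2 * Sxx / D):.3f})")
print(f"spread of b: {rms_b:.4f}  (analytic {math.sqrt(sigma**2 * n / D):.4f})")
```

The widths of the two difference distributions reproduce the analytic uncertainties, which is the point of the exercise.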
Linear Fit – one “experiment”
Fit for one “experiment” showing the fitted parameters. Repeat 1000 times.
Fitted Results to 1000 “experiments”
For each fit, plot the fitted value of the intercept and slope. Fit the distributions to Gaussian functions.
Intercept: Mean = 5.004 ± 0.010, Sigma = 0.303
Slope: Mean = 0.9997 ± 0.0018, Sigma = 0.05626
What is the relation between these two?
Plot difference between fitted and true values
Fit Gaussian to slices of y(x) − y_fit.
The uncertainty σ_y can be computed using σ_y² = σ_a² + x²σ_b² + 2xσ_ab.
The correlation term is important: σ_ab ≠ 0.
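A numeric illustration of this propagation formula. The variances and covariance below are computed for the Exercise-3 setup (x_i = 0..9, σ = 0.5) and hardcoded here; dropping the correlation term badly overestimates the band in the middle of the x range.

```python
import math

# Fit-parameter variances and covariance for the Exercise-3 linear fit:
# var_a = s^2*Sxx/D, var_b = s^2*n/D, cov_ab = -s^2*Sx/D with s = 0.5, x_i = 0..9
var_a, var_b, cov_ab = 0.0864, 0.00303, -0.01364

def sigma_y(x, include_corr=True):
    """Uncertainty band of the fitted line, with/without the correlation term."""
    corr = 2 * x * cov_ab if include_corr else 0.0
    return math.sqrt(var_a + x * x * var_b + corr)

for x in (0.0, 4.5, 9.0):
    print(f"x={x}: sigma_y={sigma_y(x):.3f}  (no corr: {sigma_y(x, False):.3f})")
```

At the center of the data (x = 4.5) the band with the correlation term is roughly half of what the uncorrelated formula would give.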
Exercise 4 - Fitted Gaussian area
Many measurements of a variable x have been accumulated in one experiment. The measurements are dominated by experimental resolution, so the distribution of measurements is Gaussian. Obtain the integral (area) of the distribution.
- Generate 1000 measurements (one experiment) according to a Gaussian distribution.
- Fit the distribution to a Gaussian and determine the area under the curve and its uncertainty.
Systematic study (extra credit):
- Generate 100 experiments with 1000 measurements each and empirically determine the areas and uncertainties.
- Use fit defaults and option = ‘L’ (log-likelihood fit).
- Compare and discuss.
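The systematic effect this study probes can be seen in a stripped-down toy (my construction, not from the slides): estimating a constant expectation µ from Poisson bin counts by minimizing the default-style χ² = Σ(n_i − µ)²/n_i yields the harmonic mean of the counts, which sits systematically below the likelihood answer (the arithmetic mean). Low-fluctuating bins get too much weight, which is the same mechanism that pulls the fitted Gaussian area low.

```python
import math
import random

def poisson(mu, rng):
    """Poisson draw via Knuth's multiplication method."""
    L, k, prod = math.exp(-mu), 0, rng.random()
    while prod > L:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(3)
mu_true = 50.0
# 1000 bins, all with expectation mu_true; mu is large enough that n_i = 0
# (which would break the 1/n_i weights) is essentially impossible
counts = [poisson(mu_true, rng) for _ in range(1000)]

# Minimizing sum((n_i - mu)^2 / n_i) over mu:
#   d/dmu: sum(-2 (n_i - mu)/n_i) = 0  =>  mu_hat = N / sum(1/n_i)
chi2_estimate = len(counts) / sum(1.0 / n for n in counts)
ml_estimate = sum(counts) / len(counts)   # maximum likelihood = arithmetic mean

print(f"chi2-style estimate:       {chi2_estimate:.2f}")
print(f"likelihood-style estimate: {ml_estimate:.2f}")
```

The χ²-style estimate comes out about one count low (bias ≈ −1 for Poisson statistics), while the likelihood estimate is unbiased, mirroring the default-vs-option-'L' comparison on the following slides.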
Integral = √(2π) A σ / b
Var(Integral) = 2π (A² ε₃₃ + σ² ε₁₁ + 2 A σ ε₁₃) / b²
where A and σ are the fitted Gaussian amplitude (Constant) and width (Sigma), ε_ij are elements of the fit covariance matrix (parameter 1 = Constant, 3 = Sigma), and b = bins/xunits.
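The area formula can be evaluated directly from the fit output. The fitted values and covariance entries below are illustrative assumptions (chosen to give an area near the 1000 generated events), not numbers read off the slide.

```python
import math

# Illustrative fitted Gaussian parameters and covariance entries (assumed)
A      = 322.0     # Constant (peak height, counts per bin)
sig    = 1.2       # Sigma (xunits)
b      = 1.0       # bins per xunit
eps_11 = 160.0     # Var(Constant)
eps_33 = 0.001     # Var(Sigma)
eps_13 = -0.25     # Cov(Constant, Sigma) -- note the negative correlation

# Area under the fitted Gaussian and its variance, including the
# Constant-Sigma correlation term
integral = math.sqrt(2 * math.pi) * A * sig / b
var_integral = 2 * math.pi * (A * A * eps_33 + sig * sig * eps_11
                              + 2 * A * sig * eps_13) / b**2

print(f"Integral = {integral:.1f} +/- {math.sqrt(var_integral):.1f}")
```

Without the (negative) correlation term the quoted uncertainty would be substantially too large; including it brings the error close to √N, as expected for a counting measurement.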
Fitted Gaussian area – one “experiment”
[Figure: histogram of variable x (xunits) with Gaussian fit; fitted area ≈ 969]
Fitting option = default
Fitted Gaussian area – 100 “experiments”
Events Generated: Mean = 1000 ± 3.7, Sigma = 34
Fitted Gaussian Sum: Mean = 976.5 ± 3.7, Sigma = 35
Fitted Gaussian area – one “experiment”
[Figure: histogram of variable x (xunits) with Gaussian fit; fitted area ≈ 997]
Fitting option = ‘L’
Fitted Gaussian area – 100 “experiments”
Events Generated: Mean = 1000 ± 3.7, Sigma = 34
Fitted Gaussian Sum: Mean = 998 ± 3.5, Sigma = 32
Summary of examples of fitting
Linear fit:
- Obtained the uncertainties of the parameters by Monte Carlo.
- Demonstrated that these uncertainties corresponded to the computed estimate.
- The correlation term must be included.
Determination of areas under Gaussian distributions:
- The area and its uncertainty were computed.
- The computation requires inclusion of the correlation term.
- Monte Carlo determination of the area shows that the use of least-squares fits leads to estimates for the area that are systematically low.
Bayesian inference - example
An experiment is interested in identifying pions in the presence of a large number of muon tracks. A detector is designed to identify pions, but has a 96% efficiency for tagging pions as pions and a 1% chance of misidentifying a muon and tagging it as a pion. Assume there are 10 times more muons than pions. Assume that a track has been tagged as a pion. What is the degree of belief that the track is a pion? How does this change if there are 100 times more muons than pions?
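Bayes' theorem answers this in one line; the small function below sketches the computation, with the muon-to-pion ratio entering as an unnormalized prior weight.

```python
def p_pion_given_tag(eff_pion, mistag_muon, ratio_mu_to_pi):
    """Degree of belief that a pion-tagged track is a pion, via Bayes' theorem.

    eff_pion:       P(tag | pion), efficiency for tagging pions as pions
    mistag_muon:    P(tag | muon), probability of mistagging a muon as a pion
    ratio_mu_to_pi: prior ratio P(muon)/P(pion), used as an unnormalized weight
    """
    num = eff_pion * 1.0
    den = eff_pion * 1.0 + mistag_muon * ratio_mu_to_pi
    return num / den

print(p_pion_given_tag(0.96, 0.01, 10))    # 10x more muons: about 0.91
print(p_pion_given_tag(0.96, 0.01, 100))   # 100x more muons: about 0.49
```

Even a 1% mistag rate halves the degree of belief once the muon background is 100 times larger, which is the point of the exercise.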
Bayesian inference
P(π|id) = P(id|π) P(π) / [ P(id|π) P(π) + P(id|µ) P(µ) ]
with P(id|π) = 0.96, P(id|µ) = 0.01, and relative priors P(π) = 1, P(µ) = 10.
[Figure: side view of the detector (Y vs Z in cm) showing BCAL, CDC, FDC, TOF, target, FCAL, solenoid, 1MWPC, Iron, 2MWPC, and the Muon Detector, with π and µ tracks]
Result: P(π|id) = 0.91.
With P(µ) = 100 (100 times more muons): P(π|id) = 0.49.