Searching for Gravitational Waves with LIGO (Laser Interferometer Gravitational-wave Observatory)
Introduction to Gravitational Wave Data Analysis
Larry Price
2010 International School on Numerical Relativity and Gravitational Waves
July 26-30, APCTP
References
• Basic Data Analysis: L. A. Wainstein and V. D. Zubakov, Extraction of Signals from Noise, Prentice-Hall, 1962
• Compact Binary Analysis: Finn, L. S. and Chernoff, D. F., Phys. Rev. D 47, 2198-2219 (1993); Blanchet et al., Class. Quantum Grav. 13, 575-584 (1996)
• Burst Analysis: Anderson et al., Phys. Rev. D 63, 042003 (2001)
• Continuous Waves Analysis: Jaranowski et al., Phys. Rev. D 58, 063001 (1998); Brady et al., Phys. Rev. D 57, 2101-2116 (1998)
• Stochastic Background: (LIGO) Allen and Romano, Phys. Rev. D 59, 102001 (1999); (Pulsar Timing) Anholm et al., Phys. Rev. D 79, 084030 (2009)
Overview
• Lecture 1: Brief introduction to the instrument and what it measures. Introduction to time series analysis.
• Lecture 2: Frequentist vs. Bayesian. Bayes’s Theorem. Decision Rules. The likelihood function.
• Lecture 3: Optimal statistics for detecting signals in noise.
Introduction to Gravitational Wave Data Analysis
Lecture 1: The basics
Gravitational Waves
• The spacetime interval can be written as
$ds^2 = (\eta_{\alpha\beta} + h_{\alpha\beta})\,dx^\alpha dx^\beta$
where $\eta_{\alpha\beta}$ is the Minkowski metric and $h_{\alpha\beta}$ is a metric perturbation
• For weak gravitational fields, $h \ll 1$
• Solve the wave equation for the trace-reversed perturbation $\bar{h}_{\alpha\beta} \equiv h_{\alpha\beta} - \tfrac{1}{2}\eta_{\alpha\beta}h$:
$\left(-\frac{\partial^2}{\partial t^2} + \nabla^2\right)\bar{h}_{\alpha\beta} = -16\pi T_{\alpha\beta}$
• In vacuum the solutions are plane waves:
$\bar{h}_{\alpha\beta} = A_{\alpha\beta}\exp(ik_\delta x^\delta)\,, \quad k_\alpha k^\alpha = 0$
• Gravitational waves propagate at the speed of light
• Gravitational waves stretch and squeeze space
• Quadrupolar in nature
• Like the EM field, there are two polarizations
[Figure: the + and × polarizations, shown relative to the GW propagation direction. Credit: Warren Anderson]
Gravitational Waves
• The strain $h = \delta L / L$ characterizes the GW signal
• With $h \sim 10^{-21}$, even an arm of length $L \sim 4$ light-years would change by only $\delta L \sim 10^{-5}$ m
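The arithmetic behind that comparison, as a quick sketch (the light-year conversion is a standard constant):

```python
# Rough arithmetic: strain h ~ 1e-21 acting on a 4-light-year arm
# changes its length by only ~1e-5 m (about the width of a human hair).
light_year_m = 9.4607e15          # meters in one light-year
L = 4 * light_year_m              # hypothetical 4-light-year arm
h = 1e-21                         # typical GW strain amplitude
print(h * L)                      # ~ 3.8e-5 m
```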
Schematic Detector
• As a wave passes, one arm stretches and the other shrinks…
• …causing the interference pattern to change at the photodiode
Some IFOs I know
Time series analysis
• A time series, x(t), is some (continuous or discrete) function of time.
• In practice they are discrete: $x_j \equiv x(t_j)$, where $t_j = t_0 + j\Delta t$
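For concreteness, a minimal NumPy sketch of such a discrete time series (the sampling rate and the 50 Hz sinusoid are arbitrary illustrative choices):

```python
import numpy as np

# Sample a continuous function x(t) at uniform intervals t_j = t_0 + j*dt.
fs = 1024.0          # sampling rate in Hz (arbitrary choice)
dt = 1.0 / fs        # sample spacing
t0 = 0.0             # start time
N = 4096             # number of samples
t = t0 + dt * np.arange(N)          # t_j = t_0 + j*dt
x = np.sin(2 * np.pi * 50.0 * t)    # a 50 Hz sinusoid as the sampled series
```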
Time series analysis
• The Fourier transform and its inverse:
$x(f) = \int_{-\infty}^{\infty} x(t)\,e^{-2\pi i f t}\,dt\,, \qquad x(t) = \int_{-\infty}^{\infty} x(f)\,e^{2\pi i f t}\,df$
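Numerically, the continuous transform is commonly approximated by a discrete FFT scaled by $\Delta t$; a sketch using numpy.fft, which follows the same $e^{-2\pi i f t}$ sign convention as above:

```python
import numpy as np

fs, N = 1024.0, 4096
dt = 1.0 / fs
t = dt * np.arange(N)
x = np.sin(2 * np.pi * 50.0 * t)

# Riemann-sum approximation of x(f) = \int x(t) e^{-2 pi i f t} dt:
xf = np.fft.fft(x) * dt                 # multiply by dt to approximate the integral
freqs = np.fft.fftfreq(N, d=dt)         # frequencies corresponding to each bin

# Inverse: x(t) = \int x(f) e^{+2 pi i f t} df, approximated by ifft / dt.
x_back = np.fft.ifft(xf) / dt
assert np.allclose(x, x_back.real)
```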
Parseval's theorem
• Relates a function and its Fourier transform:
$\int_{-\infty}^{\infty} dt\,|x(t)|^2 = \int_{-\infty}^{\infty} df\,|x(f)|^2$
• Energy spectral density: $E_x(f) \equiv |x(f)|^2$
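A quick numerical check of Parseval's theorem under the discrete approximation above (the signal and lengths are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
fs, N = 1024.0, 4096
dt = 1.0 / fs
x = rng.standard_normal(N)

xf = np.fft.fft(x) * dt              # approximate continuous Fourier transform
df = 1.0 / (N * dt)                  # frequency resolution

lhs = np.sum(np.abs(x) ** 2) * dt    # \int |x(t)|^2 dt
rhs = np.sum(np.abs(xf) ** 2) * df   # \int |x(f)|^2 df
print(lhs, rhs)                      # identical up to floating-point error
```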
Power spectral density
• Use Parseval to define the (one-sided) power spectral density:
$\text{Power} = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} dt\,|x(t)|^2$
$= \lim_{T\to\infty}\frac{1}{T}\int_{-\infty}^{\infty} df\,\left|\int_{-T/2}^{T/2} dt\,x(t)\,e^{-2\pi i f t}\right|^2$
$= \lim_{T\to\infty}\frac{1}{T}\int_{-\infty}^{\infty} df\,|x(f)|^2$
$\equiv \frac{1}{2}\int_{-\infty}^{\infty} df\,S_x(|f|)$
Convolution
• Definition (x is the signal, K the filter):
$(x \star K)(t) \equiv \int_{-\infty}^{\infty} dt'\,x(t')\,K(t - t')$
• Describes the effect of linear systems
• Essentially all we need for GW data analysis
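As an illustration, a small sketch of filtering by convolution in NumPy; the exponential kernel and its decay time are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1.0 / 1024.0
x = rng.standard_normal(4096)               # "signal": white noise for illustration

# A simple filter kernel K: exponentially decaying impulse response.
tau = 0.01                                  # decay time in seconds (arbitrary)
tk = dt * np.arange(256)
K = np.exp(-tk / tau)

# (x * K)(t) = \int x(t') K(t - t') dt'; the Riemann sum adds a factor dt.
y = np.convolve(x, K, mode="full")[: len(x)] * dt
```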
Convolution
• Theorem:
$(x \star K)(t) \leftrightarrow x(f)\,K(f)$
• $K(t)$ is called the impulse response
• $K(f)$ is called the frequency response
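The theorem can be checked numerically; a sketch where zero-padding makes the FFT product reproduce linear convolution:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(512)
K = rng.standard_normal(64)

# Time-domain (linear) convolution...
direct = np.convolve(x, K)                  # length 512 + 64 - 1

# ...equals the inverse FFT of the product of zero-padded FFTs.
n = len(x) + len(K) - 1
via_fft = np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(K, n)).real

assert np.allclose(direct, via_fft)
```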
The correlation theorem
• In the time domain:
$R_{xy}(t) \equiv \int_{-\infty}^{\infty} dt'\,x(t')\,y(t' + t)$
• Correlation measures how well two time series match up when shifted in time.
• Correlation theorem (a generalization of Parseval):
$R_{xy}(t) \leftrightarrow x^*(f)\,y(f)$
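A discrete (circular) check of the correlation theorem; the direct lag sum and the FFT route agree:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 256
x = rng.standard_normal(N)
y = rng.standard_normal(N)

# Circular correlation r[k] = sum_j x[j] * y[(j + k) % N], done directly...
direct = np.array([np.sum(x * np.roll(y, -k)) for k in range(N)])

# ...and via the correlation theorem: R_xy <-> x*(f) y(f).
via_fft = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(y)).real

assert np.allclose(direct, via_fft)
```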
Noise
• Noise is anything that isn’t the signal we’re interested in.
• Usually instrumental, but can be other signals! (cf. white dwarfs in LISA)
• Typically random in nature (If we knew what it looked like we’d take it out!)
• Characterized by its statistical properties
LIGO Noise
• The noise in the LIGO interferometers is dominated by three different processes depending on the frequency band: seismic noise at low frequencies, thermal noise at intermediate frequencies, and photon shot noise at high frequencies.
• White noise: power spectrum is independent of f. Colored noise: power spectrum depends on f.
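To see the distinction, a sketch that colors white noise with a low-pass filter and compares the resulting spectra (the Butterworth filter and its cutoff are arbitrary illustrative choices, using scipy.signal):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs, N = 1024.0, 1 << 16
white = rng.standard_normal(N)              # white: flat power spectrum

# Color the noise with a low-pass filter; its spectrum now depends on f.
b, a = signal.butter(4, 50.0, fs=fs)        # 4th-order Butterworth, 50 Hz cutoff
colored = signal.lfilter(b, a, white)

f, S_white = signal.welch(white, fs=fs)
f, S_colored = signal.welch(colored, fs=fs)
# S_white is roughly constant in f; S_colored falls off above 50 Hz.
```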
Stationarity
• A random process is stationary if its statistical properties are independent of time.
• According to the ergodic theorem, for stationary processes time averages over a single realization are equivalent to ensemble averages over many realizations, e.g.
$\mu = \langle x\rangle = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} dt\,x(t)$
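A quick numerical illustration of ergodicity for a stationary Gaussian process (the mean and variance are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)

# Ensemble average: mean over many realizations at a fixed time.
ensemble = rng.normal(loc=1.0, scale=2.0, size=10000)
ensemble_mean = ensemble.mean()

# Time average: mean over one long realization of the same process.
single = rng.normal(loc=1.0, scale=2.0, size=100000)
time_mean = single.mean()

print(ensemble_mean, time_mean)   # both approach the true mean, 1.0
```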
Auto-correlation function
• Definition:
$R(t) \equiv \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} dt'\,x(t')\,x(t' + t)$
• For a stationary process:
$R(t) = \langle x(t')\,x(t' + t)\rangle$
Auto-correlation function
• For a random variable with zero mean: $R(0) = \sigma^2 = \langle x^2\rangle - \langle x\rangle^2$ (the maximum value of $R$)
• For a periodic process with period $T$: $R(0) = R(nT)$
• For a white process: $R(t) = \sigma^2\delta(t)$
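These properties are easy to verify for simulated white noise; a short sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
N, sigma = 1 << 16, 2.0
x = sigma * rng.standard_normal(N)          # zero-mean white noise

# Estimate R(t) at a few discrete lags.
lags = range(5)
R = [np.mean(x[: N - k] * x[k:]) for k in lags]
print(R)   # R[0] ~ sigma**2 = 4; all other lags ~ 0, as for a white process
```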
Back to PSD
• The PSD and auto-correlation function are related by
$S_x(f) = 2\int_{-\infty}^{\infty} dt\,R(t)\,e^{-2\pi i f t}$
• It also follows that frequency components at different frequencies are statistically independent:
$\langle x^*(f)\,x(f')\rangle = \frac{1}{2}S_x(|f|)\,\delta(f - f')$
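A numerical sketch of this relation for white noise: estimate R at small lags, transform it, and compare against a direct spectral estimate (lag count, grid, and Welch parameters are arbitrary choices):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(11)
fs, N = 1024.0, 1 << 16
dt = 1.0 / fs
x = rng.standard_normal(N)        # unit-variance white noise

# Estimate R at small lags, then transform: S_x(f) = 2 \int R(t) e^{-2 pi i f t} dt.
K = 64
R = np.array([np.mean(x[: N - k] * x[k:]) for k in range(K)])
f = np.linspace(0.0, fs / 2.0, 200)
# R is even in t, so the transform reduces to a cosine sum over lags k*dt.
Sx = np.array(
    [2 * dt * (R[0] + 2 * np.sum(R[1:] * np.cos(2 * np.pi * fi * dt * np.arange(1, K))))
     for fi in f]
)

# Cross-check against a direct Welch estimate; both hover near 2/fs.
f_w, S_w = signal.welch(x, fs=fs, nperseg=1024)
print(Sx.mean(), S_w.mean())      # both ~ 2/fs ~ 0.002
```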
Simple PSD estimate
• The periodogram:
$P_x(f) = \frac{1}{T}|x(f)|^2$
• The mean of the periodogram is $\langle P_x(f)\rangle = S_x(f)$
• And the variance is $\langle P_x^2(f)\rangle - \langle P_x(f)\rangle^2 = S_x^2(f)$
• Note the variance equals the square of the mean and does not shrink with longer observation time, so in practice one averages many periodograms.
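This is why periodograms are averaged in practice; a sketch contrasting a single periodogram with Welch's averaged estimate (scipy conventions, arbitrary parameters):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(7)
fs, N = 1024.0, 1 << 16
x = rng.standard_normal(N)        # white noise, one-sided S_x ~ 2/fs

# Single periodogram: unbiased, but with variance ~ S_x^2 at each frequency.
f1, P1 = signal.periodogram(x, fs=fs)

# Averaging many short periodograms (Welch's method) shrinks the variance.
f2, P2 = signal.welch(x, fs=fs, nperseg=1024)

print(P1.std(), P2.std())         # the averaged estimate fluctuates far less
```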
Probability and random variables
• Real random variable: a function X that maps events ω to real numbers x such that the probability of {ω: X(ω) ≤ x} lies in [0,1]; in shorthand, P[X ≤ x] ∈ [0,1]
• Example: coin toss experiment. The events are ω ∈ {heads, tails}, with X(heads) = 1 and X(tails) = 0. The probability density over the real numbers is
$p_X(x) = \begin{cases} 0.5 & \text{if } x = 0 \\ 0.5 & \text{if } x = 1 \\ 0 & \text{otherwise} \end{cases}$
• The expectation value of a function of X is
$\langle f(X)\rangle = \int f(x)\,p_X(x)\,dx$
• If two random variables are independent, $\langle XY\rangle = \langle X\rangle\langle Y\rangle$
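A Monte Carlo sketch of these definitions using the coin-toss example (sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(8)

# Coin toss: X(heads) = 1, X(tails) = 0, each with probability 0.5.
X = rng.integers(0, 2, size=100000)
Y = rng.integers(0, 2, size=100000)         # an independent second coin

print(X.mean())                             # <X> -> 0.5
print((X * Y).mean(), X.mean() * Y.mean())  # <XY> -> <X><Y> = 0.25 for independent X, Y
```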
Gaussian distribution
• Some reasons for assuming a Gaussian distribution:
• It might actually be Gaussian
• The central limit theorem
• It only requires a mean and a variance
Gaussian distribution
• The probability density for a Gaussian random variable is
$p_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{x^2}{2\sigma^2}\right)$
• It generalizes to a random process as
$p_n(n) \propto \exp\left(-\frac{1}{2}\int\!\!\int n(t)\,Q(t - t')\,n(t')\,dt\,dt'\right)$
• where Q is the inverse of R:
$\int Q(t - t'')\,R(t'' - t')\,dt'' = \delta(t - t')$
Gaussian Random Process
• E.g., consider
$R(\tau) = \sigma^2\delta(\tau) \implies Q(\tau) = \sigma^{-2}\delta(\tau)$
• Then
$p_n(n) \propto \exp\left(-\frac{\int n^2(t)\,dt}{2\sigma^2}\right) \sim \prod_t \exp\left(-\frac{n^2(t)}{2\sigma^2}\right)$
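In discrete form this factorization means the log-likelihood is a sum over samples; a minimal sketch (σ and the sample count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(9)
sigma = 1.5
n = sigma * rng.standard_normal(1000)       # white Gaussian noise samples

# The joint density factorizes over samples, so the log-likelihood is a sum:
logp_sum = np.sum(-0.5 * n**2 / sigma**2)   # up to an n-independent normalization
# This is the discrete analogue of -\int n^2(t) dt / (2 sigma^2).
```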
Putting it together
• Rewrite
$p_n(n) \propto \exp\left(-\frac{1}{2}\int\!\!\int n(t)\,Q(t - t')\,n(t')\,dt\,dt'\right)$
• using the fact that Q is inverse to R as
$p_n(n) \propto \exp\left(-\int_{-\infty}^{\infty}\frac{n(f)\,n^*(f)}{S_n(|f|)}\,df\right) = \exp\left(-\frac{1}{2}(n, n)\right)$
• where the real inner product is
$(a, b) = 2\int_{-\infty}^{\infty}\frac{a(f)\,b^*(f)}{S_n(|f|)}\,df$
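A discrete sketch of this noise-weighted inner product; the folding onto positive frequencies assumes real time series, and the function name, normalization details, and white-noise test are illustrative assumptions, not code from the lecture:

```python
import numpy as np

def inner_product(a, b, Sn, dt):
    r"""Noise-weighted inner product (a, b) = 2 \int a(f) b*(f) / S_n(|f|) df.

    a, b: real time series of equal length N; Sn: one-sided PSD sampled at
    the rfft frequencies; dt: sample spacing.
    """
    N = len(a)
    af = np.fft.rfft(a) * dt                  # approximate continuous FTs
    bf = np.fft.rfft(b) * dt
    df = 1.0 / (N * dt)
    # Real signals: 2 * (integral over all f) = 4 * Re(integral over f > 0).
    return 4.0 * np.real(np.sum(af * np.conj(bf) / Sn)) * df

# Usage with white noise, whose one-sided PSD is 2 * sigma**2 * dt:
fs, N, sigma = 1024.0, 4096, 1.0
dt = 1.0 / fs
rng = np.random.default_rng(12)
n = sigma * rng.standard_normal(N)
Sn = np.full(N // 2 + 1, 2.0 * sigma**2 * dt)
print(inner_product(n, n, Sn, dt))            # roughly N on average
```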