Post on 03-Dec-2014
Chapter 1 Mathematical Methods
1.1 Time average vs. ensemble average

time average:
mean: $\bar{x} = \lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}x(t)\,dt$
mean-square: $\overline{x^2} = \lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}x(t)^2\,dt$
variance: $\sigma_x^2 = \overline{x^2} - \bar{x}^2$
auto-correlation: $\phi_x(\tau) = \lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}x(t)\,x(t+\tau)\,dt$
ensemble average:
mean: $\langle x(t)\rangle = \int x\,f_1(x,t)\,dx$
mean-square: $\langle x(t)^2\rangle = \int x^2\,f_1(x,t)\,dx$
variance: $\sigma_x^2 = \langle x^2\rangle - \langle x\rangle^2$
covariance: $\langle \Delta x(t_1)\,\Delta x(t_2)\rangle = \iint \Delta x_1\,\Delta x_2\, f_2(x_1,t_1;x_2,t_2)\,dx_1\,dx_2$
$f_1$ … first-order probability density function
$f_2$ … second-order probability density function
If time average = ensemble average
“ergodic ensemble”
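As a quick numerical illustration (not part of the original notes; all parameters are arbitrary): for an i.i.d. Gaussian process, which is stationary and ergodic, the time average over one long ensemble member agrees with the ensemble average taken at a fixed instant.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 2000, 2000                      # M ensemble members, N time samples
x = 2.0 + rng.standard_normal((M, N))  # stationary, ergodic process with mean 2

time_avg = x[0].mean()       # time average over a single member
ens_avg = x[:, 0].mean()     # ensemble average at a single instant

# ergodic ensemble: the two averages coincide up to sampling error
assert abs(time_avg - ens_avg) < 0.15
```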
1.2 Stationary vs. non-stationary processes
• If k-th order probability density function is invariant with respect to the shift of time origin,
stationary of order k
• If a stochastic process is stationary of any order k = 1, 2, …,
strictly stationary
wide-sense stationary (weakly stationary)
example 1: ergodic in both mean and autocorrelation
example 2: a basket full of batteries: stationary but not ergodic
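A minimal sketch of the battery example (numbers assumed): each battery's voltage is constant in time but random across the basket, so the process is stationary yet not ergodic, and the time average over one battery depends on which battery was picked.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 5000, 200                      # M batteries, N time samples each
V = rng.normal(1.5, 0.1, size=M)      # each battery's fixed voltage
x = np.repeat(V[:, None], N, axis=1)  # x_k(t) = V_k: constant waveforms

ens_avg = x[:, 0].mean()              # ensemble average: same at every t
time_avgs = x.mean(axis=1)            # time average: one value per battery

assert abs(ens_avg - 1.5) < 0.01      # ensemble mean is well defined
assert time_avgs.std() > 0.05         # time averages scatter: not ergodic
```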
1.3 Basic stochastic processes
1.3.1 Probability distribution functions and characteristic functions
probability mass function (PMF): P(x)
discrete random variable
z-transform: $\langle z^k\rangle = \sum_k P(k)\,z^k$
• If $\langle x(t)\rangle$ and $\sigma_x^2$ are independent of $t$, i.e. constants, and if $\langle x(t)\,x(t+\tau)\rangle$ depends only on $\tau$: wide-sense stationary
• If ergodic in the autocorrelation: $\langle x(t)\,x(t+\tau)\rangle = \phi_x(\tau)$
• If ergodic in the mean: $\langle x\rangle = \bar{x}$
(uniform dist.)
probability density function (PDF): f(x)
continuous random variable
s-transform: $\langle e^{-sx}\rangle = \int_{-\infty}^{\infty} f(x)\,e^{-sx}\,dx$
variance: $\sigma^2 = \langle x^2\rangle - \langle x\rangle^2$
third-order cumulant: $\langle \Delta x^3\rangle$
fourth-order cumulant: $\langle \Delta x^4\rangle - 3\sigma^4$
1.3.2 The Bernoulli process

A. Bernoulli trial
PMF: $P(1) = p$, $P(0) = 1 - p$
z-transform: $\langle z^x\rangle = 1 - p + pz$
B. Binomial distribution

A series of independent Bernoulli trials with the same probability of success produces $k_0$ successes.
z-transform: $\langle z^k\rangle = (1 - p + pz)^n$
PMF: $P(k) = \binom{n}{k}\,p^k (1-p)^{n-k}$
(expanding $(1-p+pz)^n$ gives the no-success, one-success, two-success, … terms; by the definition of the z-transform, the coefficient of $z^k$ is $P(k)$)
mean and variance (Bernoulli): $\bar{x} = p$, $\sigma_x^2 = p(1-p)$
mean and variance (binomial): $\bar{k} = np$, $\sigma_k^2 = np(1-p)$
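A quick numerical check of the binomial mean and variance (not from the notes; n and p are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 0.3
k = rng.binomial(n, p, size=200_000)  # k successes in n Bernoulli trials

assert abs(k.mean() - n * p) < 0.05           # mean = np
assert abs(k.var() - n * p * (1 - p)) < 0.15  # var  = np(1-p)
```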
A series of identical and independent Bernoulli trials, one every $\Delta t$, with a probability of success $p = \lambda\,\Delta t$ for a sufficiently small $\Delta t$.
C. Geometric distribution
Number of Bernoulli trials after any one success and up to and including the next success.
PMF: $P(k) = (1-p)^{k-1}\,p$ ($k-1$ successive failures, then the first success)
z-transform: $\langle z^k\rangle = \dfrac{pz}{1-(1-p)z}$
1.3.3 The Poisson process
A. Poisson distribution
mean and variance (geometric): $\bar{k} = \dfrac{1}{p}$, $\sigma_k^2 = \dfrac{1-p}{p^2}$
with $p = \lambda\,\Delta t$
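A numerical check of the geometric mean and variance (illustrative; p is arbitrary). NumPy's geometric sampler counts trials up to and including the first success, matching the definition above.

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.2
k = rng.geometric(p, size=500_000)  # trials up to and including first success

assert abs(k.mean() - 1 / p) < 0.03           # mean = 1/p
assert abs(k.var() - (1 - p) / p**2) < 0.4    # var  = (1-p)/p^2
```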
Number of successes over a time interval [0, t] ?
mutually exclusive histories
(continuous limit)
iterative solution for k = 0, 1, 2, … with an initial condition
PMF: $P(k) = \dfrac{\mu^k}{k!}\,e^{-\mu}$
z-transform: $\langle z^k\rangle = e^{\mu(z-1)}$
$\mu = \lambda t$ : average number of successful events
mean and variance: $\bar{k} = \mu$, $\sigma_k^2 = \mu$
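A numerical check that the Poisson mean and variance coincide (illustrative; the mean is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
mu = 7.5
k = rng.poisson(mu, size=500_000)

assert abs(k.mean() - mu) < 0.02  # mean = mu
assert abs(k.var() - mu) < 0.06   # variance = mu as well
```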
B. Erlang distribution
Time interval between any one success and the r-th success after that.
$(r-1)$ successful events in $[0, \tau]$
one successful event in $[\tau, \tau + d\tau]$
PDF: $f(\tau) = \dfrac{\lambda^r \tau^{\,r-1}}{(r-1)!}\,e^{-\lambda\tau}$
s-transform: $\langle e^{-s\tau}\rangle = \left(\dfrac{\lambda}{\lambda+s}\right)^{r}$
$\tau$ is the sum of $r$ independent random variables, each obeying the $r = 1$ case (exponential distribution).
mean and variance (exponential): $\bar{\tau} = \dfrac{1}{\lambda}$, $\sigma_\tau^2 = \dfrac{1}{\lambda^2}$
mean and variance (Erlang): $\bar{\tau} = \dfrac{r}{\lambda}$, $\sigma_\tau^2 = \dfrac{r}{\lambda^2}$
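A numerical sketch of the sum property (parameters arbitrary): adding r independent exponential intervals reproduces the Erlang mean r/&lambda; and variance r/&lambda;&sup2;.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, r = 2.0, 4
# waiting time to the r-th success = sum of r exponential intervals
tau = rng.exponential(1 / lam, size=(500_000, r)).sum(axis=1)

assert abs(tau.mean() - r / lam) < 0.01      # mean = r/lambda
assert abs(tau.var() - r / lam**2) < 0.02    # var  = r/lambda^2
```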
C. Addition and random deletion of Poisson process
i) w = x + y
two independent Poisson random variables
compound PMF
(due to independence of x and y)
w obeys a Poisson distribution (cf. the weak law of large numbers)
$\langle z^x\rangle = e^{\bar{x}(z-1)}$, $\langle z^y\rangle = e^{\bar{y}(z-1)}$
$\langle z^w\rangle = \langle z^x\rangle\langle z^y\rangle = e^{(\bar{x}+\bar{y})(z-1)} \;\Rightarrow\; P(w) = \dfrac{(\bar{x}+\bar{y})^w}{w!}\,e^{-(\bar{x}+\bar{y})}$, $w = 0, 1, 2, \dots$
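A numerical check of the addition property (means chosen arbitrarily): the sum of two independent Poisson variables is again Poisson, so its mean and variance both equal the sum of the means.

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.poisson(3.0, size=500_000)
y = rng.poisson(5.0, size=500_000)
w = x + y  # sum of two independent Poisson variables

# w is again Poisson: mean and variance both equal x_bar + y_bar = 8
assert abs(w.mean() - 8.0) < 0.02
assert abs(w.var() - 8.0) < 0.08
```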
ii) random deletion: each event independently survives with some fixed probability (binomial distribution)
y: initial Poisson distribution
n: final Poisson distribution (mean = survival probability times $\bar{y}$)

D. Binomial to Poisson distribution
Poisson distribution
A sequence of single Bernoulli trials with a constant and small probability of success produces a Poisson distribution.
$n$ independent Bernoulli trials with the probability of success $p = \mu/n$ ($\mu$: constant, $n \to \infty$) = definition of a Poisson process
Physically, it corresponds to a memoryless system with a very fast internal relaxation.
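The binomial-to-Poisson limit can be checked numerically (illustrative; $\mu$ and n assumed): with constant mean $\mu$ and per-trial probability $\mu/n$, the two PMFs become indistinguishable as n grows.

```python
from math import comb, exp, factorial

mu, n = 4.0, 100_000
p = mu / n  # constant mean, vanishing per-trial success probability

# total variation distance between Binomial(n, mu/n) and Poisson(mu);
# the tail beyond k = 40 is negligible for mu = 4
tv = 0.5 * sum(
    abs(comb(n, k) * p**k * (1 - p)**(n - k) - exp(-mu) * mu**k / factorial(k))
    for k in range(40)
)
assert tv < 1e-3
```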
1.3.4 The Gaussian process
binomial distribution:
n: very large
p, 1 – p: not very close to zero
A pronounced peak at $k_0 = np$
$P(k)$ can be considered as a function of a continuous variable $k$.
small deviation: $\delta = k - k_0$
binomial PMF:
Truncate the Taylor series expansion of $\ln P(k)$ about $k_0 = np$:
Gaussian distribution: $P(k) = \dfrac{1}{\sqrt{2\pi np(1-p)}}\exp\!\left[-\dfrac{(k-np)^2}{2np(1-p)}\right]$ (cf. the central limit theorem)
s-transform: $\langle e^{-sx}\rangle = \exp\!\left(-\bar{x}s + \tfrac{1}{2}\sigma^2 s^2\right)$
1.4 Burgess variance theorem
random deletion: each incident particle survives with a constant probability $\eta$ and is deleted with probability $1-\eta$
Regardless of the individual random variable PDF, the sum of n independent identically distributed random variables converges to the Gaussian PDF as $n \to \infty$.
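A minimal sketch of the central limit theorem (not from the notes; uniform summands and sizes are assumed): standardized sums of i.i.d. uniform variables reproduce the standard Gaussian probabilities.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 48  # number of i.i.d. uniform variables per sum
s = rng.uniform(0, 1, size=(200_000, n)).sum(axis=1)
z = (s - n / 2) / np.sqrt(n / 12.0)  # standardize: mean n/2, variance n/12

# empirical probabilities match the standard Gaussian
assert abs((z < 0).mean() - 0.5) < 0.01
assert abs(((z > -1) & (z < 1)).mean() - 0.6827) < 0.01
```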
binomial distribution
If the number of incident particles fluctuates, the final particle number does not obey a simple binomial distribution.
initial distribution binomial distribution
$\sigma_n^2 = \eta^2\sigma_m^2 + \eta(1-\eta)\,\bar{m}$ : Burgess variance theorem
($\eta^2$: attenuation factor, $\sigma_m^2$: initial variance, $\eta(1-\eta)\bar{m}$: partition noise)
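The Burgess variance theorem can be checked numerically (illustrative; the survival probability and the deliberately non-Poissonian initial distribution are assumed):

```python
import numpy as np

rng = np.random.default_rng(8)
eta = 0.4                                # survival probability
m = rng.integers(50, 151, size=500_000)  # non-Poissonian initial counts
n = rng.binomial(m, eta)                 # random deletion of each particle

# sigma_n^2 = eta^2 sigma_m^2 + eta (1 - eta) m_bar
predicted = eta**2 * m.var() + eta * (1 - eta) * m.mean()
assert abs(n.var() - predicted) / predicted < 0.02
assert abs(n.mean() - eta * m.mean()) / (eta * m.mean()) < 0.01
```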
1.5 Fourier transform (analysis)
If x(t) is absolutely integrable, $\int_{-\infty}^{\infty}|x(t)|\,dt < \infty$, the Fourier transform of x(t) exists and is defined by
$X(i\omega) = \int_{-\infty}^{\infty} x(t)\,e^{-i\omega t}\,dt$
The inverse relation is
$x(t) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} X(i\omega)\,e^{i\omega t}\,d\omega$
A statistically stationary process is not absolutely integrable, so strictly speaking, its Fourier transform does not exist.
absolutely integrable
1.5.1 Parseval theorem
$\int_{-\infty}^{\infty} x(t)\,y(t)\,dt = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} X(i\omega)^{*}\,Y(i\omega)\,d\omega$ : Parseval theorem
$\int_{-\infty}^{\infty} x(t)^2\,dt = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} |X(i\omega)|^2\,d\omega$ : energy theorem
For example, the gated function $x_T(t) = x(t)$ for $|t| \le T/2$ and $0$ otherwise is absolutely integrable.
1.5.2 Power spectral density
average power of $x_T(t)$: $\dfrac{1}{T}\int_{-T/2}^{T/2}x(t)^2\,dt = \dfrac{1}{2\pi T}\int_{-\infty}^{\infty}|X_T(i\omega)|^2\,d\omega$
ensemble average →
ensemble averaged power of $x_T(t)$: $\dfrac{1}{2\pi T}\int_{-\infty}^{\infty}\langle|X_T(i\omega)|^2\rangle\,d\omega$
power spectral density: $S_x(\omega) = \lim_{T\to\infty}\dfrac{2\,\langle|X_T(i\omega)|^2\rangle}{T}$ (unilateral)
: statistically stationary process
: statistically non-stationary process
$|X_T(i\omega)|^2$ : energy density of $x_T(t)$ at $\omega$
$X_T(i\omega)$ : complex amplitude of the $\omega$ component of $x_T(t)$
1.5.3 Wiener-Khintchine theorem
Parseval theorem → ensemble average → ensemble-averaged auto-correlation (covariance) → power spectral density
$S_x(\omega) = 4\int_{0}^{\infty}\langle\Delta x(t)\,\Delta x(t+\tau)\rangle\cos\omega\tau\,d\tau$ : stationary process (for a non-stationary process the correlation depends on $t$ as well as $\tau$)
inverse relation: $\langle\Delta x(t)\,\Delta x(t+\tau)\rangle = \dfrac{1}{2\pi}\int_{0}^{\infty}S_x(\omega)\cos\omega\tau\,d\omega$
The power spectral density and the ensemble-averaged auto-correlation form a Fourier transform pair.
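A numerical sketch of this pairing (illustrative; sizes and normalization are assumed): delta-correlated (white) noise has a flat ensemble-averaged periodogram, consistent with the Fourier-pair relation between a delta-like autocorrelation and a white spectrum.

```python
import numpy as np

rng = np.random.default_rng(9)
M, N = 1000, 1024
x = rng.standard_normal((M, N))        # white noise: delta autocorrelation

X = np.fft.fft(x, axis=1)
S = (np.abs(X) ** 2).mean(axis=0) / N  # ensemble-averaged periodogram

# delta-correlated noise -> flat (white) spectrum of height sigma^2 = 1
assert np.all(np.abs(S - 1.0) < 0.25)
```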
Example 1 White noise
Example 2 Wiener-Levy process
$\tau_c$ : correlation time (memory time)
$S_x(\omega) \propto \dfrac{1}{1+\omega^2\tau_c^2}$ : Lorentzian
If $\tau_c \to 0$ (infinitesimally short memory time), the power spectrum becomes white.
(Figure: statistically-stationary vs. statistically-nonstationary noisy waveforms x(t), plotted from t = 0.)
covariance
If x(t) is ergodic in the correlation,
(Wiener-Khintchine theorem)
(white noise)
If ,
diffusion constant
: cumulative process
: no correlation
physical systems x(t):
  laser: frequency $\omega(t)$ (no correlation) → phase $\phi(t)$ (cumulative process)
  Brownian particle: velocity $v(t)$ → position $x(t)$
  current-carrying resistor: current $i(t)$ → charge $q(t)$
(Figure: the autocorrelation function and unilateral power spectrum of a stationary noisy waveform.)
(Figure: the autocorrelation function and unilateral power spectrum of a nonstationary noisy waveform y(t).)
1.5.4 Cross-correlation
x(t), y(t): statistically stationary processes
cross-correlation function: $\phi_{xy}(\tau) = \langle x(t)\,y(t+\tau)\rangle$
cross-spectral density: $S_{xy}(\omega)$ : c-number (carries the amplitude and phase)
Parseval theorem → $S_{xy}(\omega) = 2\int_{-\infty}^{\infty}\phi_{xy}(\tau)\,e^{-i\omega\tau}\,d\tau$ : generalized Wiener-Khintchine theorem
coherence function: $\gamma_{xy}(\omega) = \dfrac{S_{xy}(\omega)}{\sqrt{S_x(\omega)\,S_y(\omega)}}$
1 : complete positive correlation; -1 : complete negative correlation; 0 : no correlation
1.6 Random pulse train

1.6.1 Carson theorem
random pulse train: $x(t) = \sum_k a_k\,f(t-t_k)$
Fourier transform → power spectral density
i) $k = m$ terms
$a_k$, $t_k$ : random variables
(Figure: a random pulse train with amplitudes $a_1, a_2, a_3, \dots, a_k$ emitted at times $t_1, t_2, t_3, \dots, t_k$.)
$\nu$ : average rate of pulse emission
$\overline{a^2}$ : mean-square of the pulse amplitude
ii) $k \ne m$ terms
If the pulse emission time is a Poisson point process and the pulse amplitude is completely independent,
$\bar{a}$ : mean of the pulse amplitude
$t_k$ is uniformly distributed in [0, T], so the cross terms $\langle e^{i\omega t_k}\rangle\langle e^{-i\omega t_m}\rangle$ vanish except at $\omega = 0$.
$S_x(\omega) = 2\nu\,\overline{a^2}\,|F(i\omega)|^2$ : Carson theorem ($F(i\omega)$: Fourier transform of the pulse shape $f(t)$)

1.6.2 Campbell's theorem
Wiener-Khintchine theorem + Parseval theorem →
$\langle\Delta x^2\rangle = \nu\,\overline{a^2}\int_{-\infty}^{\infty}f(t)^2\,dt$ : Campbell's theorem of mean-square
$\bar{x} = \nu\,\bar{a}\int_{-\infty}^{\infty}f(t)\,dt$ : Campbell's theorem of mean
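Both Campbell relations can be checked numerically (a sketch, not from the notes: the rate, exponential pulse shape, grid, and duration are all assumed) by building a Poisson pulse train on a grid via FFT convolution.

```python
import numpy as np

rng = np.random.default_rng(10)
nu, tau, dt, T = 50.0, 0.2, 0.001, 2000.0  # rate, pulse time constant, grid
nbins = int(T / dt)

arrivals = rng.poisson(nu * dt, size=nbins)    # Poisson point process
f = np.exp(-np.arange(0, 10 * tau, dt) / tau)  # pulse shape f(t)

# linear convolution of arrivals with the pulse shape, via FFT
L = nbins + f.size
x = np.fft.irfft(np.fft.rfft(arrivals, L) * np.fft.rfft(f, L), L)[:nbins]
x = x[f.size:]                                 # drop the startup transient

mean_pred = nu * f.sum() * dt      # Campbell mean: nu * integral of f dt
var_pred = nu * (f**2).sum() * dt  # Campbell mean-square fluctuation
assert abs(x.mean() - mean_pred) / mean_pred < 0.02
assert abs(x.var() - var_pred) / var_pred < 0.08
```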
1.6.3 Shot noise in a vacuum diode
(Figure: a vacuum diode. Cathode with surface charge $Q_C = -CV$, anode with surface charge $Q_A = CV$, an electron in transit, external circuit current i(t), bias voltage V.)
When an electron is thermionically emitted, this event creates an additional surface charge of +q on the cathode. This surface charge shields the electric field created by the electron and realizes charge neutrality inside the cathode conductor.
As the electron travels from the cathode to the anode, the surface charge on the cathode decreases and the surface charge on the anode increases. This change in the surface charge is achieved by an external circuit current.
Ramo theorem
If an external circuit has a negligible resistance, the voltage between the two electrodes is kept constant.
circuit relaxation current: $i(t) = \dfrac{q\,v(t)}{d}$ during the transit time ($v(t)$: electron velocity, $d$: cathode-anode spacing)
If each electron emission is independent, such a memoryless system obeys Carson's theorem. If the electron transit time is much shorter than any relevant time constants, we can assume the relaxation current pulse is an impulse with a constant area q.
Carson's theorem → white noise (infinite total noise power)
: Schottky formula of shot noise
constant voltage operation
energy gain by an electron
energy supply by a current
$S_i(\omega) = 2\nu\,\overline{a^2}$, with $\overline{a^2} = q^2$, so $S_i(\omega) = 2q\,\bar{I}$
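The Schottky formula can be checked by simulation (a sketch; the charge, emission rate, bin width, and sizes are illustrative): Poisson electron emission modeled as impulse current samples gives an averaged periodogram equal to $2q\bar{I}$ away from DC.

```python
import numpy as np

rng = np.random.default_rng(11)
q, nu = 1.6e-19, 1e12          # electron charge (C), emission rate (1/s)
dt, M, N = 1e-9, 200, 4096     # sample interval, realizations, samples

k = rng.poisson(nu * dt, size=(M, N))  # electrons emitted per time bin
i = q * k / dt                         # impulse-train current samples

# unilateral PSD estimate, DC removed per realization
I = np.fft.rfft(i - i.mean(axis=1, keepdims=True), axis=1)
S = 2 * dt * (np.abs(I) ** 2).mean(axis=0) / N

S_schottky = 2 * q * (q * nu)          # 2 q I_bar, with I_bar = q * nu
assert abs(S[1:].mean() / S_schottky - 1) < 0.05
```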
If the electron transit time is not negligible, the Fourier transform of i(t) provides the information about the cut-off of the shot noise component (finite total noise power).
If an external circuit has a finite resistance and thus a finite circuit relaxation time RC, the voltage between the two electrodes is no longer constant.
However, if the average inter-emission time of electrons is much longer than the circuit relaxation time, each electron emission process is still considered an independent process: constant voltage = memoryless.
If the average inter-emission time of electrons becomes shorter than the circuit relaxation time, the electron emission process becomes a self-regulated sub-Poissonian process: constant current operation.
(Figure: a vacuum diode with a finite external resistance R and capacitance C; current pulses i(t) and voltage V(t) versus t.)
ensemble-averaged autocorrelation : Campbell's theorem
Wiener-Khintchine theorem → power spectral density
full shot noise level below the cut-off frequency
$S_i(\omega) = 2q\,\overline{i(t)} + 2\pi\,\overline{i(t)}^2\,\delta(\omega)$