
Page 1: STOCHASTIC RESONANCE IN CONTINUOUS AND SPIKING …sipi.usc.edu/~kosko/LevyNoiseTNNpreprint_D08.pdf · Keywords: Levy noise, Signal detection, Stochastic resonance, Neuron models,

To appear in 2008 in the IEEE Transactions on Neural Networks

Current draft: 25 April 2008

STOCHASTIC RESONANCE IN CONTINUOUS AND SPIKING

NEURON MODELS WITH LEVY NOISE

Ashok Patel and Bart Kosko

Signal and Image Processing Institute

Department of Electrical Engineering

University of Southern California

Los Angeles, CA 90089-2564, USA

Keywords: Levy noise, Signal detection, Stochastic resonance, Neuron models, Mutual information

ABSTRACT

Levy noise can help neurons detect faint or subthreshold signals. Levy noise extends standard Brownian noise to many types of impulsive jump-noise processes found in real and model neurons as well as in models of finance and other random phenomena. Two new theorems and the Itô calculus show that white Levy noise will benefit subthreshold neuronal signal detection if the noise process's scaled drift velocity falls inside an interval that depends on the threshold values. These results generalize earlier 'forbidden interval' theorems of neuronal 'stochastic resonance' or noise-injection benefits. Global and local Lipschitz conditions imply that additive white Levy noise can increase the mutual information or bit count of several feedback neuron models that obey a general stochastic differential equation. Simulation results show that the same noise benefits still occur for some infinite-variance stable Levy noise processes even though the theorems themselves apply only to finite-variance Levy noise. The Appendix proves the two Itô-theoretic lemmas that underlie the new Levy noise-benefit theorems.


I. STOCHASTIC RESONANCE IN NEURAL SIGNAL DETECTION

Stochastic resonance (SR) occurs when noise benefits a system rather than harms it. Small amounts

of noise can often enhance some forms of nonlinear signal processing while too much noise degrades it

[12, 13, 22, 27, 44, 48, 57, 59, 60, 68, 70, 71, 82]. SR has many useful applications in physics, biology,

and medicine [5, 6, 7, 11, 14, 17, 18, 21, 23, 32, 40, 41, 42, 51, 52, 54, 55, 61, 69, 74, 81, 83, 87, 89].

SR in neural networks is itself part of the important and growing area of stochastic neural networks

[9, 10, 38, 84, 85, 86, 88]. We show that a wide range of general feedback continuous neurons and spiking

neurons benefit from a broad class of additive white Levy noise. This appears to be the first demonstration

of the SR effect for neuron models subject to Levy noise perturbations.


Figure 1: Stochastic resonance in the Kanizsa square illusion with symmetric α-stable noise (α = 1.9) in a thick-tailed bell curve with infinite variance but finite intensity or dispersion γ [30]. The Kanizsa square illusion improves as the noise dispersion γ increases from 0.047 to 0.3789 and then it degrades as the dispersion increases further. Each pixel represents the output of the noisy bistable potential neuron model (1)-(2) and (5) that uses the pixel values of the original Kanizsa square image as subthreshold input signals. The additive α-stable noise dispersions are (a) γ = 0.047, (b) γ = 0.1015, (c) γ = 0.3789, (d) γ = 1, and (e) γ = 3.7321.

Figure 1 shows how impulsive Levy noise can enhance the Kanizsa-square visual illusion in which four dark-corner figures give rise to an illusory bright interior square. Each pixel is the thresholded output of a noisy bistable neuron whose input signals are subthreshold and quantized pixel values of the original noise-free Kanizsa image. The outputs of the bistable neurons do not depend on the input signals if there is no additive noise because the input signals are subthreshold. Figure 1(a) shows that adding infinite-variance Levy noise induces a slight correlation between the pixel input and output signals. More intense Levy noise increases this correlation in Figures 1(b)-(c). Still more intense Levy noise degrades the image and undermines the visual illusion in Figures 1(d)-(e). Figure 2 shows typical sample paths from different types of Levy noise. Figure 3 shows the characteristic inverted-U or non-monotonic signature of SR for white Levy noise that perturbs a continuous bistable neuron.

Figure 2: Sample paths from one-dimensional Levy processes (each panel plots Lt against t): (a) Brownian motion with drift µ = 0.1 and variance σ = 0.15, (b) jump diffusion with µ = 0.1, σ = 0.225, Poisson jump rate λ = 3, and uniformly distributed jump magnitudes in the interval [−0.2, 0.2] (and so with Levy measure ν(dy) = (3/0.4)dy for y ∈ [−0.2, 0.2] and zero else), (c) normal inverse Gaussian (NIG) process with parameters α = 20, β = 0, δ = 0.1, and µ = 0, (d) infinite-variance α-stable process with α = 1.9 and dispersion κ = 0.0272 (µ = 0, σ = 0, and ν(dy) of the form k|y|^(−1−α) dy).
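The sample paths of Figure 2(a)-(b) follow directly from their Levy triplets. The Python sketch below is our own illustration (function names, step counts, and seeds are arbitrary choices, not from the paper): it generates a Brownian-with-drift path and a jump-diffusion path with the Figure 2(b) parameters µ = 0.1, σ = 0.225, λ = 3, and uniform jumps on [−0.2, 0.2].

```python
import numpy as np

def brownian_with_drift(mu, sigma, n_steps=1000, T=1.0, rng=None):
    """Sample path of L_t = mu*t + sigma*B_t on [0, T]."""
    rng = rng or np.random.default_rng(0)
    dt = T / n_steps
    dL = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    return np.concatenate(([0.0], np.cumsum(dL)))

def jump_diffusion(mu, sigma, lam, jump_lo, jump_hi, n_steps=1000, T=1.0, rng=None):
    """Brownian part plus a compound Poisson part with rate lam and
    uniform jump magnitudes on [jump_lo, jump_hi]."""
    rng = rng or np.random.default_rng(0)
    dt = T / n_steps
    diffusion = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    n_jumps = rng.poisson(lam * dt, size=n_steps)   # jumps arriving in each step
    jumps = np.array([rng.uniform(jump_lo, jump_hi, k).sum() for k in n_jumps])
    return np.concatenate(([0.0], np.cumsum(diffusion + jumps)))

path = jump_diffusion(mu=0.1, sigma=0.225, lam=3, jump_lo=-0.2, jump_hi=0.2)
```

With λ = 3 the path sees about three jumps on [0, 1] on average, which matches the visual character of Figure 2(b).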

We generalize the recent 'forbidden interval' theorems [49, 50, 60, 64, 65] for continuous and spiking neuron models to a broad class of finite-second-moment Levy noise that may depend on the neuron's membrane potential.

Figure 3: Mutual information Levy noise benefits in the continuous bistable neuron (1)-(2) and (5). Additive white Levy noise dLt increases the mutual information of the bistable potential neuron for the subthreshold input signals s1 = −0.3 and s2 = 0.4. The types of Levy noise dLt are (a) jump diffusion (Gaussian with uniformly distributed jumps), (b) pure-jump normal inverse Gaussian (NIG), and (c) symmetric α-stable noise with α = 1.9 (thick-tailed bell curve with infinite variance [62]). Each panel plots the mutual information I(S, R) in bits against the scale κ of the additive white Levy noise. The dashed vertical lines show the total min-max deviations of the mutual information in 100 simulation trials.

The original forbidden interval theorem [49, 50] states that simple threshold neurons will

have an SR noise benefit in the sense that noise increases the neuron’s mutual information or bit count if

and only if the noise mean or location parameter µ does not fall in a threshold-related interval: SR occurs

if and only if µ ∉ (T − A, T + A) for threshold T where −A < A < T for bipolar subthreshold signal ±A.

The theorems below show that such an SR noise benefit will occur if the additive white Levy noise process

has a bounded scaled drift velocity that does not fall within a threshold-based interval. This holds for

general feedback continuous neuron models that include common signal functions such as logistic sigmoids

or Gaussians. It also holds for spiking neurons such as the FitzHugh-Nagumo, leaky integrate-and-fire,

and reduced Type-I neuron models. We used the Itô stochastic calculus to prove our results under the assumption that the Levy noise has a finite second moment. But Figure 1 and the (c) sub-figures of Figures 3-8 all show that the SR noise benefit still occurs in the more general infinite-variance case of some types of α-stable Levy noise. So the SR effect is not limited to finite-second-moment Levy noise. We were not able to prove that these stable infinite-variance SR effects must occur as we did prove with simpler neuron models [49, 50, 64].
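The forbidden-interval statement for simple threshold neurons is easy to probe numerically with a memoryless threshold unit Y = 1{S + N > T}, a bipolar signal ±A, and Gaussian noise as the simplest Levy special case. The Python sketch below is our own toy illustration, not the paper's neuron model or simulation; all names and parameter values are our choices.

```python
import numpy as np

def empirical_mi_bits(s, y):
    """Plug-in estimate of I(S;Y) in bits for two binary sample arrays."""
    I = 0.0
    for sv in (0, 1):
        for yv in (0, 1):
            p_sy = np.mean((s == sv) & (y == yv))
            p_s, p_y = np.mean(s == sv), np.mean(y == yv)
            if p_sy > 0:
                I += p_sy * np.log2(p_sy / (p_s * p_y))
    return I

def threshold_neuron_mi(A, T, mu, scale, n=100_000, rng=None):
    """I(S;Y) for Y = 1{S + N > T} with S = +/-A equiprobable and
    N ~ Normal(mu, scale)."""
    rng = rng or np.random.default_rng(1)
    s = rng.integers(0, 2, n)                 # symbol 1 -> +A, symbol 0 -> -A
    signal = np.where(s == 1, A, -A)
    y = (signal + mu + scale * rng.standard_normal(n) > T).astype(int)
    return empirical_mi_bits(s, y)

# Subthreshold signal A = 0.5 with threshold T = 1: vanishing noise gives
# zero bits while a moderate noise scale gives strictly positive bits.
low = threshold_neuron_mi(A=0.5, T=1.0, mu=0.0, scale=1e-6)
mid = threshold_neuron_mi(A=0.5, T=1.0, mu=0.0, scale=0.5)
```

With near-zero noise both signal values stay below T so the output is constant and carries zero bits; moderate noise lets +A cross the threshold far more often than −A does, which creates the input-output dependence that the theorems formalize.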

Levy noise has advantages over standard Gaussian noise in neuron models despite its increased mathematical complexity. A Levy noise model more accurately describes how the neuron's membrane potential evolves than does a simpler diffusion model because the more general Levy model includes not only pure-diffusion and pure-jump models but jump-diffusion models as well [35, 73]. Neuron models with additive Gaussian noise are pure-diffusion models. These neuron models rely on the classical central limit theorem for their Gaussian structure and thus they rely on special limiting-case assumptions of incoming Poisson spikes from other neurons. These assumptions require at least that the number of impinging synapses is large and that the synapses have small membrane effects due to the small coupling coefficient or the synaptic weights [28, 46]. The Gaussian noise assumption may be more appropriate for signal inputs from dendritic trees because of the sheer number of dendrites. But often fewer inputs come from synapses near the post-synaptic neuron's trigger zone and these inputs produce impulses in noise amplitudes because of the higher concentration of voltage-sensitive sodium channels in the trigger zone [29, 45, 63]. Engineering applications also favor the more general Levy model because physical devices may be limited in their number of model-neuron connections [58] and because real signals and noise can often be impulsive [30, 62, 79].

II. NOISY FEEDBACK NEURON MODELS

We study Levy SR noise benefits in the noisy feedback neuron models of the general form

ẋ(t) = −x(t) + f(x(t)) + s(t) + n(t) (1)

y(t) = g(x(t)) (2)

with initial condition x(t0) = x0. Here s(t) is the additive net excitatory or inhibitory input forcing signal, either s1 or s2. The additive noise term n(t) is Levy noise with mean or location µ and intensity scale κ (or dispersion γ for symmetric α-stable noise where γ = κ^α with characteristic function φ(u) = e^(iuµ − γ|u|^α)). The neuron feeds its activation or membrane potential signal x(t) back to itself through −x(t) + f(x(t)) and emits the (observable) thresholded or spike signal y(t) as output. Here g is a static transformation function. We use the threshold

g(x) = 1 if x > 0 and g(x) = 0 else (3)

for continuous neuron models. We use a related threshold g in spiking neuron models where g determines

the spike occurrence. The neuronal signal function f(x) of (1) can be of quite general form for continuous

neuron models [65]:

• Logistic. The logistic signal function [47] is sigmoidal and strictly increasing:

f(x) = 1 / (1 + e^(−cx)) (4)

for scaling constant c > 0. We use c = 8. This signal function gives a bistable additive neuron model.

• Hyperbolic Tangent. This signal function is also sigmoidal and gives a bistable additive neuron model

[2, 15, 37, 47]:

f(x) = 2 tanh(x) (5)

• Linear Threshold. This linear-threshold signal has the form [47]:

f(x) = cx if |cx| < 1, 1 if cx > 1, and −1 if cx < −1 (6)

for constant c > 0. We use c = 2.

• Exponential. This signal function is asymmetric and has the form [47]

f(x) = 1 − e^(−cx) if x > 0 and 0 else (7)

for constant c > 0. We use c = 8.


• Gaussian. The Gaussian or ‘radial basis’ signal function [47] differs in form from the signal functions

above because it is nonmonotonic:

f(x) = exp(−cx²) (8)

for constant c > 0. We use c = 20.
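A discretized version of (1)-(2) makes the subthreshold behavior concrete. The Python sketch below is our own Euler-Maruyama discretization with Gaussian noise as the simplest Levy special case; the step size, horizon, and seed are arbitrary choices, not the paper's simulation settings. It integrates the bistable tanh neuron (5) and thresholds the final state with (3).

```python
import numpy as np

def simulate_neuron(s, sigma=0.0, f=lambda x: 2.0 * np.tanh(x),
                    x0=-1.0, dt=1e-3, n_steps=5000, rng=None):
    """Euler-Maruyama integration of dx = (-x + f(x) + s)dt + sigma dB_t.
    Returns the thresholded output y = 1{x > 0} at the final time."""
    rng = rng or np.random.default_rng(2)
    x = x0
    for _ in range(n_steps):
        x += (-x + f(x) + s) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return int(x > 0)

# Subthreshold inputs s1 = -0.3 and s2 = 0.4 as in Figure 3: with zero noise
# the neuron stays in its negative basin for both inputs, so the thresholded
# output carries no information about which input was applied.
y1 = simulate_neuron(-0.3, sigma=0.0)
y2 = simulate_neuron(0.4, sigma=0.0)
```

This is exactly the subthreshold situation the theorems address: only additive noise can occasionally push the state across the unstable fixed point so that the output starts to depend on the input.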

The above neuron models can have up to three fixed points depending on the input signal and the model parameters. The input signal is subthreshold in the sense that switching it from s1 to s2 or vice versa does not change the output Yt of (22). There exist θ1 and θ2 such that the input S is subthreshold when θ1 ≤ s1 < s2 ≤ θ2. The values of θ1 and θ2 depend on the model parameters. Consider the linear threshold neuron model (1)-(2) and (6) with c = 2. A simple calculation shows that if the input signal St ∈ {s1, s2} satisfies −0.5 < s1 < s2 < 0.5 then the linear threshold neuron has two stable fixed points (one positive and the other negative) and has one unstable fixed point between them. The Gaussian neuron model (1)-(2) and (8) has only one fixed point if 0 < s1 < s2. So the input is subthreshold because switching it from s1 to s2 or vice versa does not change the output Yt. Figure 3 shows the mutual information noise benefits in the bistable neuron model (1)-(2) and (5) for three different additive white Levy noise cases when the input signals are subthreshold. Note the signature nonmonotonic shape of all three SR noise-benefit curves in Figure 3.

The membrane potential dynamics (1) are one-dimensional for all our neuron models except for the

two-dimensional FHN spiking neuron model below. So next we briefly describe multi-dimensional Levy

processes and set up a general multi-dimensional Levy stochastic differential equation framework for our

feedback continuous and spiking neuron models.

III. LEVY PROCESSES AND STOCHASTIC DIFFERENTIAL EQUATIONS

Levy processes [67, 75] form a wide class of random processes that include Brownian motion, α-stable processes, compound Poisson processes, generalized inverse Gaussian processes, and generalized hyperbolic processes. Figure 2 shows some typical scalar Levy sample paths. Levy processes can account for the impulsiveness or discreteness of both signals and noise. Researchers have used Levy processes to model diverse phenomena in economics [4, 78], physics [77], electrical engineering [1, 4, 62, 66], biology [76], and seismology [80]. A Levy process Lt = (L^1_t, ..., L^m_t)′ for t ≥ 0 in a given probability space (Ω, F, (Ft)0≤t≤∞, P) is a stochastic process taking values in R^m with stationary and independent increments (we assume that L0 = 0 with probability 1). The Levy process Lt obeys three properties:

1. Lt − Ls is independent of the sigma-algebra Fs for 0 ≤ s < t ≤ ∞

2. Lt − Ls has the same distribution as Lt−s

3. Ls → Lt in probability as s → t.

The Levy-Khintchine formula gives the characteristic function φ of Lt as [3]

φ(u) = E(e^(i〈u,Lt〉)) = e^(tη(u)) for t ≥ 0 and u ∈ R^m (9)

where 〈·, ·〉 is the Euclidean inner product (so |u| = 〈u, u〉^(1/2)). The characteristic exponent or so-called Levy exponent is

η(u) = i〈µ, u〉 − (1/2)〈u, Ku〉 + ∫_{R^m_0} [e^(i〈u,y〉) − 1 − i〈u, y〉 χ_{|y|<1}(y)] ν(dy) (10)

for some µ ∈ R^m, a positive-definite symmetric m×m matrix K, and measure ν on Borel subsets of R^m_0 = R^m\{0} (so ν({0}) = 0). Then ν is a Levy measure such that

∫_{R^m_0} min{1, |y|²} ν(dy) < ∞. (11)

A Levy process Lt combines a drift component, a Brownian motion (Gaussian) component, and a jump component. The Levy-Khintchine triplet (µ, K, ν) completely determines these components. The Levy measure ν determines both the average number of jumps per unit time and the distribution of jump magnitudes in the jump component of Lt. Jumps of any size in a Borel set B form a compound Poisson process with rate ∫_B ν(dy) and jump density ν(dy)/∫_B ν(dy) if the closure of B does not contain 0. µ gives the velocity of the drift component. K is the covariance matrix of the Gaussian component. If K = 0 and ν = 0 then (9) becomes E(e^(i〈u,Lt〉)) = e^(it〈µ,u〉). Then Lt = µt is a simple m-dimensional deterministic motion (drift) with velocity vector µ. If K ≠ 0 and ν = 0 then Lt is an m-dimensional Brownian motion with drift because (9) takes the form E(e^(i〈u,Lt〉)) = e^(t[i〈µ,u〉 − (1/2)〈u,Ku〉]) and because this exponential is the characteristic function of a Gaussian random vector with mean vector µt and covariance matrix tK. If K ≠ 0 and ν(R^m) < ∞ then Lt is a jump-diffusion process while K = 0 and ν(R^m) < ∞ give a compound Poisson process. If K = 0 and ν(R^m) = ∞ then Lt is a purely discontinuous jump process and has an infinite number of small jumps in any time interval of positive length. We consider only the Levy processes whose components L^k_t have finite second moments: E[(L^k_t)²] < ∞. This excludes the important family of infinite-variance α-stable processes (including the α = 0.5 Levy stable case) where α ∈ (0, 2] measures the tail thickness and where symmetric α-stable distributions have characteristic functions φ(u) = e^(iuµ − γ|u|^α) [30, 49, 62, 79]. But a finite-moment assumption does not itself imply that the Levy measure is finite (ν(R) < ∞). Normal inverse Gaussian NIG(α, β, δ, µ) distributions are examples of semi-thick-tailed pure-jump Levy processes that have infinite Levy measure and yet have finite moments of all orders [33, 72]. They have characteristic functions of the form φ(u) = e^(iuµ + δ(√(α² − β²) − √(α² − (β + iu)²))) where 0 ≤ |β| < α and δ > 0.
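The symmetric α-stable noise that falls outside our finite-moment assumption can still be sampled easily with the standard Chambers-Mallows-Stuck method. The Python sketch below handles only the symmetric case (β = 0, location 0); the function name, seed, and scale convention are our own choices, not from the paper.

```python
import numpy as np

def symmetric_alpha_stable(alpha, scale=1.0, size=1, rng=None):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable noise
    (beta = 0, location 0) with alpha in (0, 2]. alpha = 1 gives Cauchy
    samples and alpha = 2 gives Gaussian samples (with variance 2*scale^2)."""
    rng = rng or np.random.default_rng(3)
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform phase
    W = rng.exponential(1.0, size)                 # unit-mean exponential
    X = (np.sin(alpha * V) / np.cos(V) ** (1.0 / alpha)
         * (np.cos((1.0 - alpha) * V) / W) ** ((1.0 - alpha) / alpha))
    return scale * X

samples = symmetric_alpha_stable(alpha=1.9, scale=0.1, size=10_000)
```

For α = 1.9 the samples look nearly Gaussian but the occasional extreme outlier reveals the power-law tail that makes the variance infinite, which is what drives the (c) panels of Figures 3-8.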

Let Lt(µ, K, ν) = (L^1_t, ..., L^m_t)′ be a Levy process that takes values in R^m where the L^j_t(µ^j, σ^j, ν^j) are real-valued independent Levy processes for j = 1, ..., m. We denote the Levy-Itô decomposition [3] of L^j_t for each j = 1, ..., m and t ≥ 0 as

L^j_t = µ^j t + σ^j B^j_t + ∫_{|y^j|<1} y^j Ñ^j(t, dy^j) + ∫_{|y^j|≥1} y^j N^j(t, dy^j) (12)

= µ^j t + σ^j B^j_t + ∫_0^t ∫_{|y^j|<1} y^j Ñ^j(ds, dy^j) + ∫_0^t ∫_{|y^j|≥1} y^j N^j(ds, dy^j). (13)

Here µ^j determines the velocity of the deterministic drift process µ^j t while the B^j_t are real-valued independent standard Brownian motions. Then µ = (µ^1, ..., µ^m)′ and K = diag[(σ^1)², ..., (σ^m)²]. The N^j are independent Poisson random measures on R+ × R0 with compensated (mean-subtracted) Poisson random measures Ñ^j and intensity/Levy measures ν^j.

Define the Poisson random measure as

N^j(t, B) = #{∆L^j_s ∈ B : 0 ≤ s ≤ t} (14)

for each Borel set B in R0. The Poisson random measure gives the random number of jumps of Lt in the time interval [0, t] with jump size ∆Lt in the set B. N^j(t, B) is a Poisson random variable with intensity ν^j(B) if ν^j(B) < ∞ and if we fix t and B. But N^j(t, ·)(ω) is a measure if we fix ω ∈ Ω and t ≥ 0. This measure is not a martingale. But the compensated Poisson random measure

Ñ^j(t, B) = N^j(t, B) − tν^j(B) (15)

is a martingale and gives the compensated Poisson integral in (12) (the small-jump term on the right-hand side of (12)) as

∫_{|y^j|<1} y^j Ñ^j(t, dy^j) = ∫_{|y^j|<1} y^j N^j(t, dy^j) − t ∫_{|y^j|<1} y^j ν^j(dy^j) for j = 1, ..., m. (16)

We assume again that each L^j_t has a finite second moment (E|L^j_t|² < ∞). But if L^j_t is a Levy process with triplet (µ^j, σ^j, ν^j) then L^j_t has a finite pth moment for p ∈ R+ if and only if ∫_{|y^j|>1} |y^j|^p ν^j(dy^j) < ∞ [75]. The drift velocity µ^j relates to the expected value of the Levy process L^j_t by E(L^j_1) = µ^j + ∫_{|y^j|>1} y^j ν^j(dy^j) and E(L^j_t) = tE(L^j_1). So if L^j_t is a standard Brownian motion then ν^j = 0, E(L^j_t) = 0, and Var(L^j_t) = t(σ^j)².

The variance of the Levy process in (12) is

Var(L^j_t) = Var(σ^j B^j_t) + Var(∫_{|y^j|<1} y^j Ñ^j(t, dy^j)) + Var(∫_{|y^j|≥1} y^j N^j(t, dy^j)) (17)

because the underlying processes are independent. The variance terms on the right-hand side of (17) have the following form [3]:

Var(σ^j B^j_t) = t(σ^j)² (18)

Var(∫_{|y^j|≥1} y^j N^j(t, dy^j)) = t ∫_{|y^j|≥1} |y^j|² ν^j(dy^j) (19)

Var(∫_{|y^j|<1} y^j Ñ^j(t, dy^j)) ≤ E(∫_{|y^j|<1} y^j Ñ^j(t, dy^j))² = t ∫_{|y^j|<1} |y^j|² ν^j(dy^j). (20)

The last equality follows from the Itô isometry identity (Proposition 8.8 in [19]). Then (17) and (18)-(20) imply that Var(L^j_t) → 0 if and only if σ^j → 0 and ν^j → 0.
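The decomposition (17)-(20) is easy to sanity-check by Monte Carlo for the zero-drift jump diffusion of Figure 2(b), where the Gaussian and jump terms combine to give Var(L_t) = t(σ² + λE[Y²]) = t(σ² + λa²/3) for Poisson rate λ and uniform jumps on [−a, a]. The Python sketch below is our own check with our own parameter values and function names.

```python
import numpy as np

def jump_diffusion_endpoints(sigma, lam, a, t=1.0, n_paths=100_000, rng=None):
    """Endpoint L_t of a zero-drift jump diffusion: sigma*B_t plus a
    compound Poisson sum with rate lam and uniform jumps on [-a, a]."""
    rng = rng or np.random.default_rng(4)
    gaussian = sigma * np.sqrt(t) * rng.standard_normal(n_paths)
    n_jumps = rng.poisson(lam * t, n_paths)
    jump_sums = np.array([rng.uniform(-a, a, k).sum() for k in n_jumps])
    return gaussian + jump_sums

L1 = jump_diffusion_endpoints(sigma=0.225, lam=3, a=0.2)
# Theory from (17)-(19) at t = 1: sigma^2 + lam * E[Y^2] = sigma^2 + lam * a^2 / 3
theory = 0.225**2 + 3 * 0.2**2 / 3
```

The sample variance of the simulated endpoints should match the triplet-level prediction to within Monte Carlo error, which illustrates why driving σ^j and ν^j to zero kills the variance of the noise.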

We can rewrite (1)-(2) as a more general Itô stochastic differential equation (SDE) [3]

dXt = b(Xt−)dt + c(Xt−)dLt (21)

Yt = g(Xt) (22)

with initial condition X0. Here b(Xt−) = −Xt− + f(Xt−) + St is a Lipschitz continuous drift term, c(Xt−) is a bounded Levy diffusion term, and dLt is a white Levy noise with noise scale κ.

Our continuous neuron models are again one-dimensional but the spiking FHN neuron model is two-dimensional. So consider the general d-dimensional SDE in matrix form with m-dimensional Levy noise Lt = (L^1_t, ..., L^m_t)′:

dXt = b(Xt−)dt + c(Xt−)dLt (23)

which is shorthand for the system of SDEs

dX^i_t = b^i(Xt−)dt + Σ_{j=1}^m c_{ij}(Xt−)dL^j_t for i = 1, ..., d (24)

with initial condition X^i_0. Here Xt = (X^1_t, ..., X^d_t)′, b(Xt) = (b^1(Xt), ..., b^d(Xt))′, and c is a d × m matrix with rows c_i(Xt) = (c_{i1}(Xt), ..., c_{im}(Xt)). The functions b^i: R^d → R are locally or globally Lipschitz measurable functions. The functions c_{ij}: R^d → R are bounded globally Lipschitz measurable functions such that |c_{ij}|² ≤ H_{ij} ∈ R+. The L^j_t terms are independent Levy processes as in (13) with µ^j = 0 for j = 1, ..., m. Then

dX^i_t = b^i(Xt−)dt + Σ_{j=1}^m [c_{ij}(Xt−)µ^j]dt + Σ_{j=1}^m [c_{ij}(Xt−)σ^j]dB^j_t
+ Σ_{j=1}^m ∫_{|y^j|<1} [c_{ij}(Xt−)y^j] Ñ^j(dt, dy^j) + Σ_{j=1}^m ∫_{|y^j|≥1} [c_{ij}(Xt−)y^j] N^j(dt, dy^j)

= b^i(Xt−)dt + Σ_{j=1}^m µ_{ij}(Xt−)dt + Σ_{j=1}^m σ_{ij}(Xt−)dB^j_t
+ Σ_{j=1}^m ∫_{|y^j|<1} F_{ij}(Xt−, y^j) Ñ^j(dt, dy^j) + Σ_{j=1}^m ∫_{|y^j|≥1} G_{ij}(Xt−, y^j) N^j(dt, dy^j) (25)

where µ_{ij}(Xt−) = c_{ij}(Xt−)µ^j = 0, σ_{ij}(Xt−) = c_{ij}(Xt−)σ^j, F_{ij}(Xt−, y^j) = c_{ij}(Xt−)y^j, and G_{ij}(Xt−, y^j) = c_{ij}(Xt−)y^j are all globally Lipschitz functions. This equation has the integral form with initial condition X^i_0:

X^i_t = X^i_0 + ∫_0^t b^i(Xs−)ds + Σ_{j=1}^m ∫_0^t σ_{ij}(Xs−)dB^j_s + Σ_{j=1}^m ∫_0^t ∫_{|y^j|<1} F_{ij}(Xs−, y^j) Ñ^j(ds, dy^j)
+ Σ_{j=1}^m ∫_0^t ∫_{|y^j|≥1} G_{ij}(Xs−, y^j) N^j(ds, dy^j). (26)

IV. LEVY NOISE BENEFITS IN CONTINUOUS NEURON MODELS

We now prove that Levy noise can benefit the noisy continuous neurons (21)-(22) with signal functions (4)-(8) and subthreshold input signals. We assume that the neuron receives a constant subthreshold input signal St ∈ {s1, s2} for time T. Let S denote the input signal and let Y denote the output signal Yt for a sufficiently large randomly chosen time t ≤ T.

Noise researchers have used various system performance measures to detect SR noise benefits [8, 17, 44,

51, 57, 59, 60, 64, 71]. These include the output signal-to-noise ratio, cross-correlation, error probability,

and Shannon mutual information between input and output signals. We use Shannon mutual information

to measure the Levy noise benefits. Mutual information measures the information that the neuron’s output

conveys about the input signal. It is a common detection performance measure when the input signal is

random [8, 39, 60, 82].

Define the Shannon mutual information I(S, Y) of the discrete input random variable S and the output random variable Y as the difference between the output's unconditional and conditional entropy [20]:

I(S, Y) = H(Y) − H(Y|S) (27)

= −Σ_y P_Y(y) log P_Y(y) + Σ_s Σ_y P_{SY}(s, y) log P_{Y|S}(y|s) (28)

= −Σ_y P_Y(y) log P_Y(y) + Σ_s P_S(s) Σ_y P_{Y|S}(y|s) log P_{Y|S}(y|s) (29)

= Σ_{s,y} P_{SY}(s, y) log [P_{SY}(s, y) / (P_S(s)P_Y(y))]. (30)

So the mutual information is the expectation of the random variable log[P_{SY}(s, y)/(P_S(s)P_Y(y))]:

I(S, Y) = E[log (P_{SY}(s, y) / (P_S(s)P_Y(y)))]. (31)

Here P_S(s) is the probability density of the input S, P_Y(y) is the probability density of the output Y, P_{Y|S}(y|s) is the conditional density of the output Y given the input S, and P_{SY}(s, y) is the joint density of the input S and the output Y. An SR noise benefit occurs in a system if and only if an increase in the input noise variance or dispersion increases the system's mutual information (31).
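Equation (30) translates directly into a few lines of code. The Python sketch below is our own helper, not the paper's code: it computes I(S, Y) in bits from a joint probability table.

```python
import numpy as np

def mutual_information_bits(p_sy):
    """I(S;Y) in bits from a joint pmf matrix p_sy[s, y], per equation (30):
    sum of p(s,y) * log2( p(s,y) / (p(s) * p(y)) ) over nonzero cells."""
    p_sy = np.asarray(p_sy, dtype=float)
    p_s = p_sy.sum(axis=1, keepdims=True)          # marginal of S
    p_y = p_sy.sum(axis=0, keepdims=True)          # marginal of Y
    mask = p_sy > 0                                # skip zero-probability cells
    return float((p_sy[mask] * np.log2(p_sy[mask] / (p_s @ p_y)[mask])).sum())

# Independent S and Y give zero bits; a noiseless binary channel gives one bit.
independent = [[0.25, 0.25], [0.25, 0.25]]
noiseless   = [[0.5, 0.0], [0.0, 0.5]]
```

The two test tables bracket the SR setting: a subthreshold neuron with no noise sits at the independent (zero-bit) extreme, and noise benefits appear as movement away from it.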

We need the following lemma to prove that noise improves the continuous neuron's mutual information or bit count. The Appendix gives the proof of Lemma 1.

Lemma 1: Let b^i: R^d → R and c_{ij}: R^d → R in (23)-(24) be measurable functions that satisfy the global Lipschitz conditions

||b^i(x1) − b^i(x2)|| ≤ K1 ||x1 − x2|| (32)

||c_{ij}(x1) − c_{ij}(x2)|| ≤ K2 ||x1 − x2|| (33)

and |c_{ij}|² ≤ H_{ij} for all x1, x2 ∈ R^d and for i = 1, ..., d and j = 1, ..., m. (34)

Suppose dXt = b(Xt−)dt + c(Xt−)dLt (35)

dX̃t = b(X̃t)dt (36)

where dLt is a Levy noise with µ = 0 and finite second moments. Then for every T ∈ R+ and for every ε > 0:

E[sup_{0≤t≤T} ||Xt − X̃t||²] → 0 as σ^j → 0 and ν^j → 0 for all j = 1, ..., m, (37)

and hence

P[sup_{0≤t≤T} ||Xt − X̃t||² > ε] → 0 as σ^j → 0 and ν^j → 0 for all j = 1, ..., m (38)

since mean-square convergence implies convergence in probability.

We prove the Levy SR theorem with the stochastic calculus and a special limiting argument. This avoids trying to solve for the process Xt in (21). The proof strategy follows that of the 'forbidden interval' theorems [49, 50, 64]: what goes down must go up. Jensen's inequality implies that I(S, Y) ≥ 0 [20]. Random variables S and Y are statistically independent if and only if I(S, Y) = 0. Hence I(S, Y) > 0 implies some degree of statistical dependence. So the system must exhibit the SR noise benefit if I(S, Y) > 0 and if I(S, Y) → 0 when the noise parameters σ → 0 and ν → 0. Theorem 1 uses Lemma 1 to show that I(S, Y) → 0 when the noise parameters σ → 0 and ν → 0. So some increase in the noise parameters must increase the mutual information.

Theorem 1: Suppose that the continuous neuron models (21)-(22) and (4)-(8) have a bounded globally Lipschitz Levy diffusion term c(Xt−) ≤ H and that the additive Levy noise has drift velocity µ. Suppose also that the input signal S(t) ∈ {s1, s2} is subthreshold (θ1 ≤ s1 < s2 ≤ θ2) and that there is some statistical dependence between the input random variable S and the output random variable Y so that I(S, Y) > 0. Then the neuron models (21)-(22) with signal functions including (4)-(8) exhibit the nonmonotone SR effect in the sense that I(S, Y) → 0 as the Levy noise parameters σ → 0 and ν → 0 if θ1 − s1 ≤ Hµ ≤ θ2 − s2.

Proof: Let {(σ_k, ν_k)}_{k=1}^∞ be any decreasing sequence of Levy noise parameters such that σ_k → 0 and ν_k → 0 as k → ∞. Define X(t)_k and Y(t)_k as the solution processes of the continuous neuron models with Levy noise parameters σ_k and ν_k instead of σ and ν.

Suppose that µ ≠ 0. We can absorb the drift c(Xt−)µ into the input signal S because the Levy noise Lt is additive in the neuron models. Then the new input signal is S′ = S + c(Xt−)µ and it does not affect the Lipschitz continuity of b(Xt−) in equation (21). Note that S′ is subthreshold (θ1 ≤ S′ ≤ θ2) if θ1 − s1 ≤ Hµ ≤ θ2 − s2. So we lose no generality if we consider the noise dLt with µ = 0 and let S ∈ {s1, s2} be subthreshold in the continuous neuron models (21). This allows us to use Lemma 1.

Let the symbol '0' denote the input signal S = s1 and the output signal Y = 0. Let the symbol '1' denote the input signal S = s2 and the output signal Y = 1. Assume that 0 < P_S(s) < 1 to avoid triviality when P_S(s) = 0 or 1. We show that S and Y are asymptotically independent by using the fact that I(S, Y) = 0 if and only if S and Y are statistically independent [20]. So we need to show only that P_{SY}(s, y) = P_S(s)P_Y(y) or P_{Y|S}(y|s) = P_Y(y) as σ_k → 0 and ν_k → 0 as k → ∞ for signal symbols s ∈ S and y ∈ Y. The theorem of total probability and the two-symbol alphabet set S give

P_Y(y) = Σ_s P_{Y|S}(y|s)P_S(s)

= P_{Y|S}(y|0)P_S(0) + P_{Y|S}(y|1)P_S(1)

= P_{Y|S}(y|0)P_S(0) + P_{Y|S}(y|1)(1 − P_S(0))

= (P_{Y|S}(y|0) − P_{Y|S}(y|1))P_S(0) + P_{Y|S}(y|1).

So we need to show only that P_{Y_k|S}(y|0) − P_{Y_k|S}(y|1) → 0 as σ_k → 0 and ν_k → 0 for y ∈ {0, 1}. We prove only the case y = 0, lim_{k→∞} [P_{Y_k|S}(0|0) − P_{Y_k|S}(0|1)] = 0, since the proof for y = 1 is similar. Then the desired limit goes to zero because

lim_{k→∞} [P_{Y_k|S}(0|0) − P_{Y_k|S}(0|1)]

= lim_{k→∞} P_{Y_k|S}(0|0) − lim_{k→∞} P_{Y_k|S}(0|1)

= lim_{k→∞} P[Y_k = 0|S = 0] − lim_{k→∞} P[Y_k = 0|S = 1]

= lim_{k→∞} P[X(t)_k < 0|S = 0] − lim_{k→∞} P[X(t)_k < 0|S = 1] for large t

= lim_{k→∞} {P[X(t)_k < 0, X̃(t) < 0|S = 0] + P[X(t)_k < 0, X̃(t) > 0|S = 0]}
− lim_{k→∞} {P[X(t)_k < 0, X̃(t) < 0|S = 1] + P[X(t)_k < 0, X̃(t) > 0|S = 1]} for large t

= lim_{k→∞} {P[X(t)_k < 0|X̃(t) < 0, S = 0] P[X̃(t) < 0|S = 0] + P[X(t)_k < 0|X̃(t) > 0, S = 0] P[X̃(t) > 0|S = 0]}
− lim_{k→∞} {P[X(t)_k < 0|X̃(t) < 0, S = 1] P[X̃(t) < 0|S = 1] + P[X(t)_k < 0|X̃(t) > 0, S = 1] P[X̃(t) > 0|S = 1]} for large t

= 1 · (1/2) + 0 · (1/2) − [1 · (1/2) + 0 · (1/2)] by Lemma 1 and the assumption that P[X̃(t) < 0|S = s_i] = P[X̃(t) > 0|S = s_i] = 1/2 for i = 1, 2

= 0. Q.E.D.

Sub-figures (a) and (b) of Figures 4-6 show simulation instances of Theorem 1 for finite-variance jump-

diffusion and pure-jump additive white Levy noise in logistic, linear-threshold, and Gaussian neuron models.


[Figure 4 appears here. Panels (a)-(c) plot the mutual information I(S,R) in bits against the noise scale κ for (a) uniform jump-diffusion, (b) NIG, and (c) α-stable additive white Levy noise.]

Figure 4: Mutual information SR Levy noise benefits in the logistic continuous neuron (1)-(2) and (4). Additive

white Levy noise dLt increases the mutual information of the logistic neuron for the subthreshold input signal s1 =

−0.6, s2 = −0.4, and c = 8. The types of Levy noise dLt are (a) Gaussian with uniformly distributed jumps, (b)

pure-jump normal inverse Gaussian (NIG), and (c) symmetric α-stable noise with α = 1.95 (thick-tailed bell curve

with infinite variance [62]). The dashed vertical lines show the total min-max deviations of the mutual information

in 100 simulation trials.

Small amounts of additive Levy noise in continuous neuron models produce the SR effect by increasing the Shannon mutual information I(S, Y) between realizations of a random (Bernoulli) subthreshold input signal S and the neuron's thresholded output random variable Y. The SR effects in Figures 2(c) and 4-6(c) lie outside the scope of the theorem because they occur for infinite-variance α-stable noise. Thus the SR effect in continuous neurons is not limited to finite-second-moment Levy noise.

V. LEVY NOISE BENEFITS IN SPIKING NEURON MODELS

We next demonstrate Levy SR noise benefits in three popular spiking neuron models: the leaky integrate-and-fire model [17, 28], the reduced Type I neuron model [53], and the FitzHugh-Nagumo (FHN) model [26, 16]. This requires the use of Lemma 2 as we discuss below. These neuron models have a one- or two-dimensional form of (1). A spike occurs when the membrane potential x(t) crosses a threshold value from below. We measure the mutual information I(S, R) between the input signal s(t) and the output spike-rate response R of these spiking neuron models. We define the average output spike-rate response


[Figure 5 appears here. Panels (a)-(c) plot the mutual information I(S,R) in bits against the noise scale κ for (a) uniform jump-diffusion, (b) NIG, and (c) α-stable additive white Levy noise.]

Figure 5: Mutual information Levy noise benefits in the linear-threshold continuous neuron (1)-(2) and (6). Additive

white Levy noise dLt increases the mutual information of the linear-threshold neuron for the subthreshold input signal

s1 = −0.4, s2 = 0.4, and c = 2. The types of Levy noise dLt are (a) Gaussian with uniformly distributed jumps, (b)

pure-jump normal inverse Gaussian (NIG), and (c) symmetric α-stable noise with α = 1.95 (thick-tailed bell curve

with infinite variance [62]). The dashed vertical lines show the total min-max deviations of the mutual information

in 100 simulation trials.

R in the time interval (t1, t2] as

R = N(t1, t2] / (t2 − t1)    (39)

where N(t1, t2] is the number of spikes in the time interval (t1, t2].
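Definition (39) can be computed directly from a sampled membrane-potential path by counting upward threshold crossings. A minimal sketch (the function name and argument conventions are illustrative):

```python
def spike_rate(x, theta, t1, t2, dt):
    """Average spike rate R = N(t1, t2] / (t2 - t1) of Eq. (39):
    count upward crossings of the threshold theta by the sampled
    path x (sample spacing dt) inside the interval (t1, t2]."""
    i1, i2 = int(t1 / dt), int(t2 / dt)
    n_spikes = sum(1 for i in range(max(i1, 1), i2 + 1)
                   if x[i - 1] < theta <= x[i])
    return n_spikes / (t2 - t1)
```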

The Leaky Integrate-and-Fire Neuron Model

The leaky integrate-and-fire neuron model has the form [17]

v̇ = −av + a − δ + S + n    (40)

where v is the membrane voltage, a and δ are constants, δ/a is the barrier height of the potential, S is an input signal, and n is Gaussian white noise in the neural literature but is white Levy noise here. The input signal S is subthreshold when S < δ. The neuron emits a spike when the membrane voltage v crosses the threshold value of 1 from below. The membrane voltage v resets to 1 − δ/a just after the neuron emits a spike.
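A minimal Euler-Maruyama sketch of (40) shows the qualitative noise-benefit mechanism: the subthreshold voltage cannot reach the firing threshold 1 without noise, while a small zero-mean jump-diffusion term produces occasional crossings. The parameters a = 0.5, δ = 0.02, and S = 0.012 follow the Figure 7 caption; the noise scale kappa, the jump rate, and the uniform jump amplitudes below are illustrative assumptions.

```python
import random

def simulate_lif(S, a=0.5, delta=0.02, kappa=0.1, jump_rate=5.0,
                 T=50.0, dt=0.001, seed=0):
    """Euler-Maruyama path of the leaky integrate-and-fire model (40),
    v' = -a*v + a - delta + S + n, with additive jump-diffusion noise:
    Gaussian increments plus uniform jumps at Poisson arrival times.
    Returns the number of spikes (crossings of the threshold 1)."""
    rng = random.Random(seed)
    v, spikes = 1.0 - delta / a, 0
    for _ in range(int(T / dt)):
        dn = kappa * rng.gauss(0.0, dt ** 0.5)   # Brownian increment
        if rng.random() < jump_rate * dt:        # Poisson jump arrival
            dn += kappa * rng.uniform(-1.0, 1.0)
        v += (-a * v + a - delta + S) * dt + dn
        if v >= 1.0:                             # spike and reset
            spikes += 1
            v = 1.0 - delta / a
    return spikes
```

With S = 0.012 < δ the noiseless voltage relaxes to 1 + (S − δ)/a = 0.984 and never fires; with kappa > 0 the neuron fires at a noise-dependent rate.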


[Figure 6 appears here. Panels (a)-(c) plot the mutual information I(S,R) in bits against the noise scale κ for (a) uniform jump-diffusion, (b) NIG, and (c) α-stable additive white Levy noise.]

Figure 6: Mutual information Levy noise benefits in the Gaussian or ‘radial basis’ continuous neuron (1)-(2) and

(8). Additive white Levy noise dLt increases the mutual information of the Gaussian neuron for the subthreshold

input signal s1 = 0.2, s2 = 0.4, and c = 20. The types of Levy noise dLt are (a) Gaussian with uniformly distributed

jumps, (b) pure-jump normal inverse Gaussian (NIG), and (c) symmetric α-stable noise with α = 1.95 (thick-tailed

bell curve with infinite variance [62]). The dashed vertical lines show the total min-max deviations of the mutual

information in 100 simulation trials.

The Reduced Type I Neuron Model

The reduction procedure in [31, 36] gives a simple one-dimensional normal form [53] of the multi-dimensional

dynamics of Type I neuron models:

v̇ = β + v² + σn    (41)

where v is the membrane potential, β is the input signal value, and n is Gaussian white noise with standard deviation σ in the neural literature but is white Levy noise here. This reduced model (41) operates in a subthreshold or excitable regime when the input β < 0.
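Since solutions of (41) can escape to +∞ in finite time, simulations usually impose a large cutoff that marks a spike together with a reset value, as in the quadratic integrate-and-fire convention. A minimal Euler-Maruyama sketch with Gaussian white noise standing in for the Levy term (the cutoff, reset, and noise values are illustrative assumptions):

```python
import random

def simulate_type1(beta, sigma, v_th=10.0, v_reset=-10.0,
                   T=100.0, dt=0.0005, seed=0):
    """Euler-Maruyama path of the reduced Type I model (41),
    v' = beta + v**2 + sigma*n. For beta < 0 the model is excitable:
    a stable rest state at -sqrt(-beta) and an unstable point at
    +sqrt(-beta). Noise can push v past the unstable point, after
    which v blows up; crossing v_th marks a spike and v resets."""
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        v += (beta + v * v) * dt + sigma * rng.gauss(0.0, dt ** 0.5)
        if v >= v_th:
            spikes += 1
            v = v_reset
    return spikes
```

Without noise the excitable regime produces no spikes; with noise the spike count grows with the noise scale, which is the SR mechanism the simulations measure.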

The FitzHugh-Nagumo (FHN) Neuron Model

The FHN neuron model [16, 26, 28] is a two-dimensional simplification of the Hodgkin and Huxley neuron

model [34]. It describes the response of a so-called Type II excitable system [28, 54] that undergoes a

Hopf bifurcation. The system first resides in the stable rest state for subthreshold inputs as do multistable

systems. Then the system leaves the stable state in response to a strong input but returns to it after passing


through firing and refractory states in a manner that differs from the behavior of multistable systems. The

FHN neuron model is a limit-cycle oscillator of the form

εv̇ = −v(v² − 1/4) − w + A + s(t) + n(t)    (42)

ẇ = v − w    (43)

where v(t) is a fast (voltage) variable, w(t) is a slow (recovery) variable, A is a constant (tonic) activation signal, and ε = 0.005. The term n(t) is white Levy noise and s(t) is a subthreshold input signal, either s1 or s2.

We measure the neuron’s response to the input signal s(t) in terms of the transition (firing) rate r(t).

We can rewrite (42)-(43) as

εv̇ = −v(v² − 1/4) − w + A_T − (B − s(t)) + n(t)    (44)

ẇ = v − w    (45)

where B is a positive constant parameter that corresponds to the distance that the input signal s(t) must overcome to cross the threshold. Then B − s(t) is the signal-to-threshold distance and so s(t) is subthreshold when B − s(t) > 0. Our simulations used B = 0.007 and hence A = −(5/(12√3) + 0.007).
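A minimal Euler-Maruyama sketch of the pair (42)-(43) with Gaussian increments standing in for the white Levy noise; the noise scale kappa, the time step, and the initial state near rest are illustrative assumptions, and A approximates −(5/(12√3) + 0.007) as in the text.

```python
import random

def simulate_fhn(s, A=-0.2476, kappa=0.002, eps=0.005,
                 T=2.0, dt=0.0001, seed=0):
    """Euler-Maruyama path of the FHN model (42)-(43):
    eps*v' = -v*(v**2 - 1/4) - w + A + s + n,  w' = v - w.
    Returns the number of upward crossings of the firing
    threshold theta = 0 by the fast variable v."""
    rng = random.Random(seed)
    v, w, spikes = -0.3, -0.3, 0          # start near the rest state
    for _ in range(int(T / dt)):
        dn = kappa * rng.gauss(0.0, dt ** 0.5)
        v_new = v + (-v * (v * v - 0.25) - w + A + s) * dt / eps + dn / eps
        w += (v - w) * dt
        if v < 0.0 <= v_new:              # upward threshold crossing
            spikes += 1
        v = v_new
    return spikes
```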

The deterministic FHN model (n(t) ≡ 0 in (44)) performs relaxation oscillations and has an action potential v(t) that lies between −0.6 and 0.6. The system emits a spike when v(t) crosses the threshold value θ = 0. We use a lowpass-filtered version of v(t) to avoid false spike detections due to the additive noise. The lowpass filter is a 100-point moving-average smoother with a 0.001-second time-step.
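The smoother can be written in a few lines; a sketch of the 100-point moving average in running-sum form:

```python
def smooth(x, window=100):
    """Moving-average lowpass filter: each output sample is the mean of
    the current input sample and the (window - 1) samples before it,
    with a shorter average during the initial ramp-up."""
    out, acc = [], 0.0
    for i, v in enumerate(x):
        acc += v
        if i >= window:
            acc -= x[i - window]
        out.append(acc / min(i + 1, window))
    return out
```

Spike detection then tests the smoothed trace, rather than the raw noisy voltage, against the threshold θ = 0.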

We rewrite equations (42)-(43) as

ẋ1 = −(x1/ε)((x1)² − 1/4) − x2/ε + A/ε + s(t)/ε + n(t)/ε    (46)

ẋ2 = x1 − x2.    (47)

Here x1 = v and x2 = w. The corresponding matrix Ito stochastic differential equation is

dX_t = b(X_{t−}) dt + c(X_{t−}) dL_t    (48)

where X_t = (X^1_t, X^2_t)^T, L_t = (L^1_t, L^2_t)^T,


b(X_{t−}) = ( b1(X^1_{t−}, X^2_{t−}), b2(X^1_{t−}, X^2_{t−}) )^T
          = ( −(X^1_{t−}/ε)((X^1_{t−})² − 1/4) − X^2_{t−}/ε + A/ε + s_t/ε,  X^1_{t−} − X^2_{t−} )^T,

and c(X_{t−}) = ( (σ/ε, 0), (0, 0) ) so that only the fast variable receives noise.

Thus all of the above spiking neuron models have the SDE form (23). Note that the drift term of

the leaky integrate-and-fire neuron model is globally Lipschitz while the drift term of the reduced Type I

neuron model is locally Lipschitz. The Lipschitz condition is not easy to verify in the FHN model.

We now show that the drift term b1(X_t) in the preceding equation does not satisfy the global Lipschitz condition. Note that b1(X_t) is differentiable on R² because the partial derivatives of b1(X^1_{t−}, X^2_{t−}) exist and are continuous on R². Suppose that b1(X_t) satisfies the following global Lipschitz condition: there exists a constant K > 0 such that

|b1(Z_t) − b1(Y_t)| ≤ K ‖Z_t − Y_t‖ for all Z_t and Y_t ∈ R² and t ∈ [0, T].

Then the mean-value theorem gives

b1(Z_t) − b1(Y_t) = [∂b1(ζ)/∂X^1_t  ∂b1(ζ)/∂X^2_t] · [Z_t − Y_t] for some ζ between Z_t and Y_t in R²    (49)
                 = (∂b1(ζ)/∂X^1_t)(Z^1_t − Y^1_t) + (∂b1(ζ)/∂X^2_t)(Z^2_t − Y^2_t).

Then

|b1(Z_t) − b1(Y_t)| ≥ |∂b1(ζ)/∂X^1_t| |Z^1_t − Y^1_t| − |∂b1(ζ)/∂X^2_t| |Z^2_t − Y^2_t|
  = |∂b1(ζ)/∂X^1_t| |Z^1_t − Y^1_t| − (1/ε)|Z^2_t − Y^2_t|   because ∂b1/∂X^2_t = −1/ε
  = |∂b1(ζ)/∂X^1_t| |Z^1_t − Y^1_t|   choosing Z_t and Y_t such that Z^2_t = Y^2_t
  = |∂b1(ζ)/∂X^1_t| ‖Z_t − Y_t‖
  > K ‖Z_t − Y_t‖   for some Z_t ∈ R² and Y_t ∈ R²

because |∂b1/∂X^1_t| = |−3(X^1_t)²/ε + 1/(4ε)| is unbounded and continuous on R² and so there is a domain D ⊂ R² such that |∂b1(ζ)/∂X^1_t| > K for all ζ ∈ D. Thus b1(X_t) is not globally Lipschitz. So we cannot use Lemma 1 to prove the sufficient condition for the SR effect in the FHN neuron model (44)-(45).

But b1(X_t) is locally Lipschitz. The partial derivatives of b1(X^1_{t−}, X^2_{t−}) exist and are continuous on R². So ∂b1/∂X^1_t and ∂b1/∂X^2_t achieve their respective maxima on the compact set {ζ ∈ R² : ‖ζ‖ ≤ n}. Then (49) gives the required local Lipschitz condition:

|b1(Z_t) − b1(Y_t)| ≤ max{ sup_ζ |∂b1(ζ)/∂X^1_t|, sup_ζ |∂b1(ζ)/∂X^2_t| } ‖Z_t − Y_t‖ = K′_n ‖Z_t − Y_t‖

for all Z_t and Y_t ∈ R² such that ‖Z_t‖ ≤ n, ‖Y_t‖ ≤ n, and ‖ζ‖ ≤ n. Lemma 2 extends the conclusion of Lemma 1 to the locally Lipschitz drift terms bi(X_t).

Theorem 2 below gives a ‘forbidden-interval’ sufficient condition for Levy SR noise benefits in spiking neuron models such as the leaky integrate-and-fire model [17, 28], the reduced Type I neuron model [53], and the FitzHugh-Nagumo (FHN) model [26, 16]. It shows that these neuron models enjoy SR noise benefits if the noise mean µ falls to the left of a bound. Theorem 2 requires Lemma 2 to extend the conclusion of Lemma 1 to the locally Lipschitz drift terms bi(X_t). The Appendix gives the proof of Lemma 2.

Lemma 2: Let bi : R^d → R and cij : R^d → R in (23)-(24) be measurable functions that satisfy the respective local and global Lipschitz conditions

|bi(z) − bi(y)| ≤ C_n ‖z − y‖  when ‖z‖ ≤ n and ‖y‖ ≤ n    (50)

|cij(z) − cij(y)| ≤ K_1 ‖z − y‖  for all z and y ∈ R^d    (51)

and |cij|² ≤ H^i_j  for i = 1, ..., d and j = 1, ..., m.    (52)

Suppose

dX_t = b(X_t) dt + c(X_{t−}) dL_t    (53)

dX̃_t = b(X̃_t) dt    (54)

where dL_t is a Levy noise with µ = 0 and finite second moments. Then for every T ∈ R_+ and for every ε > 0:

E[ sup_{0≤t≤T} ‖X_t − X̃_t‖² ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m,    (55)

and hence

P[ sup_{0≤t≤T} ‖X_t − X̃_t‖² > ε ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m    (56)

since mean-square convergence implies convergence in probability.

We can now state and prove Theorem 2.

Theorem 2: Suppose that the spiking neuron models (40)-(41) and (42)-(43) have the form of the Levy SDE (23) with a bounded globally Lipschitz Levy diffusion term c(X_{t−}) ≤ H and that the additive Levy noise has drift velocity µ. Suppose that the input signal S(t) ∈ {s1, s2} is subthreshold: S(t) < B. Suppose there is some statistical dependence between the input random variable S and the output spike-rate random variable R so that I(S, R) > 0. Then the spiking neuron models (40)-(41) and (42)-(43) exhibit the SR effect in the sense that I(S, R) → 0 as the Levy noise parameters σ → 0 and ν → 0 if Hµ < B − s2.

Proof: Let {σ_k, ν_k}_{k=1}^∞ be any decreasing sequence of Levy noise parameters such that σ_k → 0 and ν_k → 0 as k → ∞. Define X(t)_k and R_k as the respective solution process and spiking-rate process of the FHN spiking neuron model (48) with Levy noise parameters σ_k and ν_k instead of σ and ν.

Suppose that µ ≠ 0. We can absorb the drift c(X_{t−})µ into the input signal S because the Levy noise L_t is additive in all the neuron models. Then the new input signal is S′ = S + c(X_{t−})µ and this does not affect the Lipschitz continuity of b(X_{t−}) in (21). S′ is subthreshold (S′ < B) because c(X_{t−})µ ≤ Hµ < B − s2 where s2 = max{s1, s2}. So we lose no generality if we consider the noise dL_t with µ = 0 and let S ∈ {s1, s2} be subthreshold in the continuous neuron models (21). This allows us to use Lemma 2.

Recall that I(S, R) = 0 if and only if S and R are statistically independent [20]. So we need to show only that f_{SR}(s, r) = P_S(s) f_R(r) or f_{R|S}(r|s) = f_R(r) as σ → 0 and ν → 0 for signal symbols s ∈ {s1, s2} and for all r ≥ 0. Here f_{SR} is the joint probability density function and f_{R|S} is the conditional density function. This is logically equivalent to F_{R|S} = F_R as σ_k → 0 and ν_k → 0 with k → ∞ where F_{R|S} is the conditional distribution function [25]. Again the theorem of total probability and the two-symbol alphabet set {s1, s2} give

F_R(r) = Σ_s F_{R|S}(r|s) P_S(s)    (57)
       = F_{R|S}(r|s1) P_S(s1) + F_{R|S}(r|s2) P_S(s2)
       = F_{R|S}(r|s1) P_S(s1) + F_{R|S}(r|s2)(1 − P_S(s1))
       = (F_{R|S}(r|s1) − F_{R|S}(r|s2)) P_S(s1) + F_{R|S}(r|s2)    (58)

So we need to show that lim_{k→∞} F_{R_k|S}(r|s1) − F_{R_k|S}(r|s2) = 0 for all r ≥ 0. This holds if and only if

lim_{k→∞} P[R_k > r | S = s1] − P[R_k > r | S = s2] = 0    (59)

We prove that lim_{k→∞} P[R_k > r | S = s_i] = 0 for i = 1 and i = 2. Note that if r > 0 for (48) then X^1(t)_k must cross the firing or spike threshold θ. Then

P[R_k > r | S = s_i] ≤ P[ sup_{t1≤t≤t2} X^1(t)_k > θ | S = s_i ].

Then Lemma 2 shows that the required limit goes to zero:

lim_{k→∞} P[R_k > r | S = s_i] ≤ lim_{k→∞} P[ sup_{t1≤t≤t2} X^1(t)_k > θ | S = s_i ]
  = lim_{k→∞} P[ sup_{t1≤t≤t2} X^1(t)_k > θ, X̃^1(t) < θ | S = s_i ]
  because X̃^1(t) converges to the FHN fixed point Z_{F_i} < θ for large t
  = 0  by Lemma 2. Q.E.D.

Sub-figures (a) and (b) of Figures 7-8 show simulation instances of Theorem 2 for finite-variance diffusion

and jump-diffusion white Levy noise in the leaky integrate-and-fire and the FHN neuron models. Small


[Figure 7 appears here. Panels (a)-(c) plot the mutual information I(S,R) in bits against the noise scale κ for (a) Gaussian, (b) uniform jump-diffusion, and (c) α-stable additive white Levy noise.]

Figure 7: Mutual information Levy noise benefits in the leaky integrate-and-fire spiking neuron model (40). Additive

white Levy noise dLt increases the mutual information of the IF neuron with parameters a = 0.5 and δ = 0.02 for

the subthreshold input signal s1 = 0.005 and s2 = 0.012. The types of Levy noise dLt are (a) Gaussian, (b) Gaussian

with uniformly distributed jumps, and (c) symmetric α-stable noise with α = 1.95 (thick-tailed bell curve with

infinite variance [62]). The dashed vertical lines show the total min-max deviations of the mutual information in 100

simulation trials.

amounts of additive Levy noise in these spiking neuron models produce the SR effect in terms of the noise-enhanced Shannon mutual information I(S, R) between realizations of a random (Bernoulli) subthreshold input signal S and the neuron's output spike-rate random variable R. The SR effects in Figures 7-8(c) again lie outside the scope of Theorem 2 because they occur for infinite-variance α-stable noise and because Theorem 2 requires noise with finite second moments. Thus the SR effect in spiking neurons is not limited to finite-second-moment Levy noise.
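The symmetric α-stable samples in the (c) sub-figures can be drawn with the Chambers-Mallows-Stuck method; a minimal sketch (the scale convention with kappa is an illustrative assumption):

```python
import math
import random

def alpha_stable(alpha, kappa=1.0, rng=random):
    """One draw of symmetric alpha-stable noise with scale kappa via the
    Chambers-Mallows-Stuck transform of a uniform angle and an
    exponential variate. The variance is infinite for alpha < 2, and
    alpha = 2 recovers a Gaussian with variance 2*kappa**2."""
    u = rng.uniform(-math.pi / 2, math.pi / 2)
    w = rng.expovariate(1.0)
    if alpha == 1.0:
        return kappa * math.tan(u)   # Cauchy special case
    x = (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
         * (math.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))
    return kappa * x
```

Such draws, summed over each time step, stand in for the dL_t increments in the α-stable simulation runs.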

VI. CONCLUSIONS

Levy noise processes can benefit several continuous and spiking neuron models because general forms of

the SR ‘forbidden interval’ theorem hold for several types of Levy noise. The generality of Levy noise

extends simple Brownian models of noise to more complex and realistic Poisson jump models of noise that

can affect biological and model neurons. But both Levy SR theorems require the finite second-moment

restrictions of the two lemmas. This rules out the important class of stable noise distributions in all but


[Figure 8 appears here. Panels (a)-(c) plot the mutual information I(S,R) in bits against the noise scale κ for (a) Gaussian, (b) uniform jump-diffusion, and (c) α-stable additive white Levy noise.]

Figure 8: Mutual information Levy noise benefits in the FHN spiking neuron (42)-(43). Additive white Levy noise

dLt increases the mutual information of the FHN neuron for the subthreshold input signal s1 = −0.0045 and s2

= 0.0045. The types of Levy noise dLt are (a) Gaussian, (b) Gaussian with uniformly distributed jumps, and (c)

symmetric α-stable noise with α = 1.9 (thick-tailed bell curve with infinite variance [62]). The dashed vertical lines

show the total min-max deviations of the mutual information in 100 simulation trials.

the Gaussian or pure-diffusion case.

Relaxing the second-moment assumption may produce stochastic differential equations that are not

mathematically tractable. Yet the simulation evidence of Figure 1 and sub-figure (c) of Figures 3-8 shows

that the SR noise benefit continues to hold for several stable models where the noise has infinite variance

and infinite higher-order moments. It is an open research question whether a more general Levy SR result

can include these and other observed noise benefits in continuous and spiking neuron models.

APPENDIX: PROOF OF LEMMAS

The proof of Lemma 2 relies on the proof technique of Lemma 1 in which we bound a mean-squared term

by four additive terms and then show that each of the four terms goes to zero in the limit.

Lemma 1: Let bi : R^d → R and cij : R^d → R in (23)-(24) be measurable functions that satisfy the global Lipschitz conditions

|bi(x1) − bi(x2)| ≤ K_1 ‖x1 − x2‖    (60)

|cij(x1) − cij(x2)| ≤ K_2 ‖x1 − x2‖    (61)

and |cij|² ≤ H^i_j  for all x1 and x2 ∈ R^d and for i = 1, ..., d and j = 1, ..., m.    (62)

Suppose

dX_t = b(X_{t−}) dt + c(X_{t−}) dL_t    (63)

dX̃_t = b(X̃_t) dt    (64)

where dL_t is a Levy noise with µ = 0 and finite second moments. Then for every T ∈ R_+ and for every ε > 0:

E[ sup_{0≤t≤T} ‖X_t − X̃_t‖² ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m,    (65)

and hence

P[ sup_{0≤t≤T} ‖X_t − X̃_t‖² > ε ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m    (66)

since mean-square convergence implies convergence in probability.

Proof:

The Lipschitz conditions (60) and (61) ensure that the process X_t exists [3] for t ≥ 0 in (63). Then the proof commences with the inequality

sup_{0≤t≤T} ‖X_t − X̃_t‖² ≤ Σ_{i=1}^d sup_{0≤t≤T} (X^i_t − X̃^i_t)²    (67)

which implies that

E[ sup_{0≤t≤T} ‖X_t − X̃_t‖² ] ≤ Σ_{i=1}^d E[ sup_{0≤t≤T} (X^i_t − X̃^i_t)² ].    (68)

Thus the result follows if E[ sup_{0≤t≤T} (X^i_t − X̃^i_t)² ] → 0 as σ_j → 0 and ν_j → 0 for i = 1, ..., d. Equations (26) and (63)-(64) imply

X^i_t − X̃^i_t = ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})] ds + Σ_{j=1}^m ∫_0^t σij(X_{s−}) dB^j_s
  + Σ_{j=1}^m ∫_0^t ∫_{|y_j|<1} F^i_j(X_{s−}, y_j) Ñ^j(ds, dy_j)
  + Σ_{j=1}^m ∫_0^t ∫_{|y_j|≥1} G^i_j(X_{s−}, y_j) N^j(ds, dy_j).    (69)


This gives an upper bound on the squared difference as

(X^i_t − X̃^i_t)² ≤ (3m + 1) ( ( ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})] ds )²
  + Σ_{j=1}^m ( ∫_0^t σij(X_{s−}) dB^j_s )²
  + Σ_{j=1}^m ( ∫_0^t ∫_{|y_j|<1} F^i_j(X_{s−}, y_j) Ñ^j(ds, dy_j) )²
  + Σ_{j=1}^m ( ∫_0^t ∫_{|y_j|≥1} G^i_j(X_{s−}, y_j) N^j(ds, dy_j) )² )    (70)

because (u_1 + ... + u_n)² ≤ n(u_1² + ... + u_n²). The Cauchy-Schwarz inequality gives

( ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})] ds )² ≤ ( ∫_0^t ds )( ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})]² ds ).    (71)

Now put (71) in the first term of (70) and then take expectations of the supremum on both sides to get four additive terms as an upper bound:

E[ sup_{0≤t≤T} (X^i_t − X̃^i_t)² ] ≤ (3m + 1) ( E[ sup_{0≤t≤T} t ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})]² ds ]
  + Σ_{j=1}^m E[ sup_{0≤t≤T} ( ∫_0^t σij(X_{s−}) dB^j_s )² ]
  + Σ_{j=1}^m E[ sup_{0≤t≤T} ( ∫_0^t ∫_{|y_j|<1} F^i_j(X_{s−}, y_j) Ñ^j(ds, dy_j) )² ]
  + Σ_{j=1}^m E[ sup_{0≤t≤T} ( ∫_0^t ∫_{|y_j|≥1} G^i_j(X_{s−}, y_j) N^j(ds, dy_j) )² ] ).    (72)

We next show that each of the four terms goes to zero. Consider the first term on the right-hand side of (72):

E[ sup_{0≤t≤T} t ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})]² ds ] ≤ T E[ sup_{0≤t≤T} ∫_0^t [bi(X_{s−}) − bi(X̃_{s−})]² ds ]
  ≤ T K_1² E[ sup_{0≤t≤T} ∫_0^t ‖X_{s−} − X̃_{s−}‖² ds ]  by the Lipschitz condition (60)
  ≤ T K_1² ∫_0^T E[ sup_{0≤u≤s} ‖X_{u−} − X̃_{u−}‖² ] ds.    (73)

The second term obeys

E[ sup_{0≤t≤T} ( ∫_0^t σij(X_{s−}) dB^j_s )² ] ≤ 4 E[ ( ∫_0^T σij(X_{s−}) dB^j_s )² ]

because ∫_0^t σij(X_{s−}) dB^j_s is a martingale and so we can apply Doob's L^p inequality [56]: E[ sup_{a≤t≤b} |U_t|^p ] ≤ (p/(p−1))^p E|U_b|^p if {U_t}_{t≥0} is a real-valued martingale, [a, b] is a bounded interval of R_+, U_t ∈ L^p(Ω, R), and p > 1 (p = 2 in our case). But

4 E[ ( ∫_0^T σij(X_{s−}) dB^j_s )² ] = 4 ∫_0^T E[ σij(X_{s−})² ] ds

by the Ito isometry [3]: E( ( ∫_0^T f(t, y) dB_s )² ) = ∫_0^T E( |f(t, y)|² ) ds if f ∈ H_2([0, T], R) where H_2([0, T], R) is the space of all real-valued measurable F_t-adapted processes such that E( ∫_0^T |f(t, y)|² ds ) < ∞. Then

4 ∫_0^T E[ σij(X_{s−})² ] ds ≤ 4 (σ_j)² ∫_0^T E[ cij(X_{s−})² ] ds  by definition of σij(X_{s−})
  ≤ 4 (σ_j)² T H^i_j  because |cij|² ≤ H^i_j.    (74)

Note that

E[ sup_{0≤t≤T} ( ∫_0^t ∫_{|y_j|<1} F^i_j(X_{s−}, y_j) Ñ^j(ds, dy_j) )² ]
  ≤ 4 E[ ( ∫_0^T ∫_{|y_j|<1} F^i_j(X_{s−}, y_j) Ñ^j(ds, dy_j) )² ]  by Doob's L^p inequality
  = 4 E[ ( ∫_0^T ∫_{|y_j|<1} cij(X_{s−}) y_j Ñ^j(ds, dy_j) )² ]  by definition of F^i_j(X_{s−}, y_j)
  ≤ 4 H^i_j E[ ( ∫_0^T ∫_{|y_j|<1} y_j Ñ^j(ds, dy_j) )² ]  because |cij|² ≤ H^i_j
  ≤ 4 H^i_j E[ ( ∫_{|y_j|<1} y_j Ñ^j(T, dy_j) )² ]
  = 4 H^i_j T ∫_{|y_j|<1} |y_j|² ν_j(dy_j)  by the Ito isometry and (20).    (75)

Similar arguments and (19) give

E[ sup_{0≤t≤T} ( ∫_0^t ∫_{|y_j|≥1} G^i_j(X_{s−}, y_j) N^j(ds, dy_j) )² ] ≤ 4 H^i_j T ∫_{|y_j|≥1} |y_j|² ν_j(dy_j).    (76)

Substituting the above estimates (73), (74), (75), and (76) in inequality (72) gives

E[ sup_{0≤t≤T} (X^i_t − X̃^i_t)² ] ≤ (3m + 1) ( T K_1² ∫_0^T E[ sup_{0≤u≤s} ‖X_{u−} − X̃_{u−}‖² ] ds
  + 4m T H^i_j max_j [ (σ_j)² + ∫_R |y_j|² ν_j(dy_j) ] ).    (77)


Rewrite this inequality as

z(T) ≤ A + Q ∫_0^T z(s) ds    (78)

where z(T) = E[ sup_{0≤t≤T} (X^i_t − X̃^i_t)² ], A = (3m + 1) 4m T H^i_j max_j [ (σ_j)² + ∫_R |y_j|² ν_j(dy_j) ], and Q = (3m + 1) T K_1². Then we get z(T) ≤ A e^{QT} by Gronwall's inequality [24]: φ(t) ≤ α e^{βt} for all t ∈ [0, T] and for real continuous φ(t) on [0, T] such that φ(t) ≤ α + β ∫_0^t φ(τ) dτ where t ∈ [0, T] and β > 0.
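Gronwall's inequality can be sanity-checked numerically by iterating the extremal case of its hypothesis, φ(t) = α + β ∫_0^t φ(τ) dτ, and comparing against α e^{βt}; the step size and tolerance below are illustrative choices.

```python
from math import exp

def gronwall_check(alpha, beta, T=5.0, dt=0.001):
    """Iterate phi(t) = alpha + beta * integral_0^t phi(tau) dtau with a
    left-endpoint Riemann sum and verify phi(t) <= alpha * exp(beta*t)
    at every step, as Gronwall's inequality asserts."""
    phi, integral, t = alpha, 0.0, 0.0
    while t < T:
        integral += phi * dt      # left-endpoint rule underestimates
        t += dt
        phi = alpha + beta * integral
        if phi > alpha * exp(beta * t) + 1e-9:
            return False
    return True
```

The same monotone bound drives the proof's final step: the bound A e^{QT} vanishes with A as the noise parameters go to zero.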

Note that A → 0 as σ_j → 0 and ν_j → 0. Hence

E[ sup_{0≤t≤T} (X^i_t − X̃^i_t)² ] → 0  as σ_j → 0 and ν_j → 0    (79)

for each j = 1, ..., m. Applying (79) to inequality (68) implies (65) and hence implies (66). Q.E.D.

Lemma 2: Let bi : R^d → R and cij : R^d → R in (23)-(24) be measurable functions that satisfy the respective local and global Lipschitz conditions

|bi(z) − bi(y)| ≤ C_n ‖z − y‖  when ‖z‖, ‖y‖ ≤ n    (80)

|cij(z) − cij(y)| ≤ K_1 ‖z − y‖  for all z, y ∈ R^d    (81)

and |cij|² ≤ H^i_j  for i = 1, ..., d and j = 1, ..., m.    (82)

Suppose

dX_t = b(X_t) dt + c(X_{t−}) dL_t    (83)

dX̃_t = b(X̃_t) dt    (84)

where dL_t is a Levy noise with µ = 0 and finite second moments. Then for every T ∈ R_+ and for every ε > 0:

E[ sup_{0≤t≤T} ‖X_t − X̃_t‖² ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m,    (85)

and hence

P[ sup_{0≤t≤T} ‖X_t − X̃_t‖² > ε ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m    (86)

since mean-square convergence implies convergence in probability.

Proof:

First define the function b^i_r such that

(i) b^i_r(x) = bi(x) for ‖x‖ ≤ r

(ii) b^i_r(x) = 0 for ‖x‖ ≥ 2r

(iii) b^i_r(x) = ((2r − ‖x‖)/r) bi(rx/‖x‖) for r ≤ ‖x‖ ≤ 2r.

We then show that the function b^i_r is globally Lipschitz:

‖b^i_r(x) − b^i_r(y)‖ ≤ C′_r ‖x − y‖ for x, y ∈ R^d.

Consider the function b^i_r(x). Write

b^i_r(x) = { bi(x) if ‖x‖ ≤ r;  f(x) g^i(x) if r ≤ ‖x‖ ≤ 2r }    (87)

where

f(x) = (2r − ‖x‖)/r  and  g^i(x) = bi(rx/‖x‖).    (88)

The definition of b^i_r implies that it is Lipschitz continuous on the region D1 = {‖x‖ ≤ r}:

‖b^i_r(x) − b^i_r(y)‖ ≤ C_r ‖x − y‖ for all x, y ∈ D1.    (89)

We next show that b^i_r(x) is Lipschitz continuous on the region D2 = {r ≤ ‖x‖ ≤ 2r}. For x, y ∈ D2:

‖f(x) − f(y)‖ = |‖y‖ − ‖x‖| / r  by definition of f    (90)
  ≤ ‖x − y‖ / r

and

‖g^i(x) − g^i(y)‖ = ‖bi(rx/‖x‖) − bi(ry/‖y‖)‖  by definition of g^i    (91)
  ≤ C_r ‖rx/‖x‖ − ry/‖y‖‖    (92)
  because rs/‖s‖ ∈ D1 for all nonzero s ∈ R^d and bi is Lipschitz continuous on D1
  ≤ 2 C_r ‖x − y‖  because r ≤ ‖x‖, ‖y‖ ≤ 2r.    (93)

Hence

‖b^i_r(x) − b^i_r(y)‖ = ‖f(x)g^i(x) − f(y)g^i(y)‖    (94)
  ≤ ‖f(x)g^i(x) − f(z)g^i(z)‖ + ‖f(z)g^i(z) − f(y)g^i(y)‖    (95)
  = ‖f(x)g^i(x) − f(x)g^i(y)‖ + ‖f(x)g^i(y) − f(y)g^i(y)‖    (96)
  by choosing z on the line segment between 0 and y such that ‖z‖ = ‖x‖
  = ‖f(x)‖ ‖g^i(x) − g^i(y)‖ + ‖g^i(y)‖ ‖f(x) − f(y)‖    (97)
  ≤ ‖f‖_{∞,2} ‖g^i(x) − g^i(y)‖ + ‖g‖_{∞,2} ‖f(x) − f(y)‖    (98)
  where we define ‖v‖_{∞,i} = sup{ ‖v(s)‖ : s ∈ Di }
  ≤ 2 C_r ‖f‖_{∞,2} ‖x − y‖ + ‖g‖_{∞,2} ‖x − y‖ / r    (99)
  ≤ C′_r ‖x − y‖  where C′_r = 2 C_r ‖f‖_{∞,2} + ‖g‖_{∞,2} / r.    (100)

So b^i_r(x) is Lipschitz continuous on D2.

We next show that b^i_r(x) is Lipschitz continuous across D1 and D2. Choose x ∈ D1, y ∈ D2, and a point z of ∂D1 on the line segment between x and y. Then

‖b^i_r(x) − b^i_r(y)‖ ≤ ‖b^i_r(x) − b^i_r(z)‖ + ‖b^i_r(z) − b^i_r(y)‖    (101)
  ≤ C_r ‖x − z‖ + C′_r ‖z − y‖    (102)
  ≤ C′_r ‖x − y‖  because C′_r ≥ C_r and ‖x − z‖ + ‖z − y‖ = ‖x − y‖.    (103)

So b^i_r(x) is Lipschitz continuous with coefficient C′_r on {‖x‖ ≤ 2r}. We now choose x ∈ (D1 ∪ D2), y ∈ (D1 ∪ D2)^c, and a point z of ∂(D1 ∪ D2) on the line segment between x and y. Then

‖b^i_r(x) − b^i_r(y)‖ ≤ ‖b^i_r(x) − b^i_r(z)‖ + ‖b^i_r(z) − b^i_r(y)‖    (104)
  ≤ C′_r ‖x − z‖ + 0    (105)
  ≤ C′_r ‖x − z‖ + C′_r ‖z − y‖    (106)
  = C′_r ‖x − y‖.    (107)

Then (89), (100), (103), and (107) show that bir(x) is Lipschitz continuous with coefficient C ′r on Rd.

Consider next the SDE

dX̂_t = b_r(X̂_t) dt + c(X̂_{t−}) dL_t.    (108)

Lemma 1 holds for (108) and so we can write

P[ sup_{0≤t≤T} ‖X̂_t − X̃_t‖² > ε ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m.    (109)

Now define T_r = inf{ t ≥ 0 : ‖X_t‖ > r }, T̂_r = inf{ t ≥ 0 : ‖X̂_t‖ > r }, and τ_r = inf{ t : ‖X_t‖ > r or ‖X̂_t‖ > r } = min{ T_r, T̂_r }. Then X_t and X̂_t satisfy (83) on [0, τ_r]. Note that T_r and T̂_r are stopping times and thus τ_r is also a stopping time. So arguments similar to those of the proof of Lemma 1 ((68)-(76) with appropriate modifications) give

E[ sup_{0≤u≤min{t,τ_r}} (X^i_u − X̂^i_u)² ] ≤ Q′ ∫_0^t E[ sup_{0≤u≤min{s,τ_r}} (X^i_u − X̂^i_u)² ] ds    (110)

Then

E[ sup_{0≤u≤min{t,τ_r}} (X^i_u − X̂^i_u)² ] ≤ 0  by Gronwall's inequality.    (111)

Hence X^i_t = X̂^i_t holds almost surely on [0, τ_r] for i = 1, ..., d. So X_t = X̂_t holds almost surely on [0, τ_r]. This result and (109) give

P[ sup_{0≤t≤τ_r} ‖X_t − X̃_t‖² > ε ] → 0  as σ_j → 0 and ν_j → 0 for all j = 1, ..., m.    (112)

We now need to show only that τ_r → ∞ almost surely as r → ∞ to prove (86). So we need to prove that T_r → ∞ and T̂_r → ∞ almost surely as r → ∞. Chebyshev's inequality implies that

P(T_r ≤ T) = P( sup_{0≤t≤T} ‖X_t‖ ≥ r ) ≤ E[ sup_{0≤t≤T} ‖X_t‖² ] / r².    (113)


Again arguments similar to those of the proof of Lemma 1 ((68)-(76) with appropriate modifications) give

E[ sup_{0≤t≤T} ‖X_t‖² ] ≤ A″ + Q″ ∫_0^T E[ sup_{0≤u≤s} ‖X_u‖² ] ds.    (114)

Thus

E[ sup_{0≤t≤T} ‖X_t‖² ] ≤ A″ e^{Q″T}  by Gronwall's inequality    (115)

where A″ and Q″ do not depend on r because we do not use the Lipschitz condition in the derivation of (114). Then (113) and (115) imply that

P(T_r ≤ T) ≤ E[ sup_{0≤t≤T} ‖X_t‖² ] / r² → 0  as r → ∞.    (116)

Thus T_r → ∞ almost surely as r → ∞. We can similarly show that T̂_r → ∞ almost surely as r → ∞. This implies the claim (86). Q.E.D.

References

[1] H. Ahn and R.E. Feldman, “Optimal Filtering of Gaussian Signals in the Presence of Levy Noise,” SIAM Journal on Applied Mathematics, vol. 60, No. 1, pp. 359-369, November 1999.

[2] S. Amari, “Neural Theory of Association and Concept Formation,” Biological Cybernetics, vol. 26, pp. 175-185,

May 1977.

[3] D. Applebaum, Levy Processes and Stochastic Calculus. Cambridge University Press, 2004.

[4] O. Barndorff-Nielsen and N. Shephard, “Non-Gaussian Ornstein-Uhlenbeck-based Models and Some of Their

Uses in Financial Economics,” Journal of the Royal Statistical Society, Series B, vol. 63, Issue 2, pp. 167-241,

2001.

[5] S.M. Bezrukov and I. Vodyanoy, “Noise-Induced Enhancement of Signal Transduction Across Voltage-Dependent Ion Channels,” Nature, vol. 378, Issue 6555, pp. 362-364, November 1995.

[6] A.R. Bulsara, E.W. Jacobs, T. Zhou, F. Moss, and L. Kiss, “Stochastic Resonance in a Single Neuron Model:

Theory and Analog Simulation,” Journal of Theoretical Biology, vol. 152, No. 4, pp. 531-555, October 1991.

[7] A.R. Bulsara, A.J. Maren, and G. Schmera, “Single Effective Neuron: Dendritic Coupling Effect and Stochastic Resonance,” Biological Cybernetics, vol. 70, No. 2, pp. 145-156, December 1993.


[8] A.R. Bulsara and A. Zador, “Threshold Detection of Wideband Signals: A Noise Induced Maximum in the

Mutual Information,” Physical Review E, vol. 54, No. 3, R2185-R2188, September 1996.

[9] J. Cao and J. Lu, “Adaptive Synchronization of Neural networks with and without Time-Varying Delays,”

Chaos, vol. 16, 013133-1 – 013133-6, March 2006.

[10] J. Cao, Z. Wang, and Y. Sun, “Synchronization in an Array of Linearly Stochastically Coupled Networks with

Time Delays,” Physica A, vol. 385, pp. 718-728, November 2007.

[11] F. Chapeau-Blondeau, X. Godivier, and N. Chambet, “Stochastic Resonance in a Neuron Model that Transmits

Spike Trains,” Physical Review E, vol. 53, No. 1, pp. 1273-1275, January 1996.

[12] F. Chapeau-Blondeau and D. Rousseau, “Noise-Enhanced Performance for an Optimal Bayesian Estimator,”

IEEE Transactions on Signal Processing, vol. 52, pp. 1327-1334, May 2004.

[13] F. Chapeau-Blondeau, S. Blanchard, and D. Rousseau, “Noise-Enhanced Fisher Information in Parallel Arrays of Sensors with Saturation,” Physical Review E, vol. 74, 031102-1 – 031102-10, September 2006.

[14] M. Chatterjee and M.E. Robert, “Noise Enhances Modulation Sensitivity in Cochlear Implant Listeners:

Stochastic Resonance in a Prosthetic Sensory System?” Journal of the Association for Research in Otolaryn-

gology, vol. 2, No. 2, pp. 159-171, August 2001.

[15] M.A. Cohen and S. Grossberg, “Absolute Stability of Global Pattern Formation and Parallel Memory Storage

by Competitive Neural Networks,” IEEE Transactions on System, Man, and Cybernetics, vol. SMC-13, no. 5,

pp. 815-826, September/October 1983.

[16] J.J. Collins, C.C. Chow, A.C. Capela, and T.T. Imhoff, “Aperiodic Stochastic Resonance in Excitable Systems,”

Physical Review E, vol. 52, No. 4, pp. R3321-R3324, October 1995.

[17] J.J. Collins, C.C. Chow, A.C. Capela, and T.T. Imhoff, “Aperiodic Stochastic Resonance,” Physical Review E,

vol. 54, No. 5, pp. 5575-5584, November 1996.

[18] J.J. Collins, T.T. Imhoff, and P. Grigg, “Noise Enhanced Information Transmission in Rat SA1 Cutaneous

Mechanoreceptors via Aperiodic Stochastic Resonance,” Journal of Neurophysiology, vol. 76, 642-645, July

1996.

[19] R. Cont and P. Tankov, Financial Modelling with Jump Processes. Chapman and Hall/CRC, 2003.

[20] T.M. Cover and J.A. Thomas, Elements of Information Theory. New York: Wiley, 1991.

[21] G. Deco and B. Schurmann, “Stochastic Resonance in the Mutual Information Between Input and Output Spike

Trains of Noisy Central Neurons”, Physica D, vol. 117, pp. 276-282, June 1998.


[22] T. Ditzinger, M. Stadler, D. Struber, and J.A. Kelso, “Noise Improves Three-Dimensional Perception: Stochastic Resonance and other Impacts of Noise to the Perception of Autostereograms,” Physical Review E, vol. 62, No. 2, pp. 2566-2575, August 2000.

[23] J.K. Douglass, L. Wilkens, E. Pantazelou, and F. Moss, “Noise Enhancement of Information Transfer in Crayfish

Mechanoreceptors by Stochastic Resonance,” Nature, vol. 365, pp. 337-340, September 1993.

[24] R. Durrett, Stochastic Calculus: A Practical Introduction. CRC Press, 1996.

[25] R. Durrett, Probability: Theory and Examples. Duxbury Press, 2004.

[26] R.A. FitzHugh, “Impulses and Physiological States in Theoretical Models of Nerve Membrane,” Biophysical

Journal, vol. 1, pp. 445-466, July 1961.

[27] L. Gammaitoni, “Stochastic Resonance and the Dithering Effect in Threshold Physical Systems,” Physical

Review E, vol. 52, No. 5, pp. 4691-4698, November 1995.

[28] W. Gerstner and W.M. Kistler, Spiking Neuron Models: Single Neurons, Populations, Plasticity. Cambridge

University Press, 2002.

[29] M.T. Giraudo and L. Sacerdote, “Jump-Diffusion Processes as Models of Neuronal Activity,” BioSystems, vol.

40, No. 1-2, pp. 75-82, 1997.

[30] M. Grigoriu, Applied Non-Gaussian Processes. Englewood Cliffs, NJ: Prentice Hall, 1995.

[31] B.S. Gutkin and G.B. Ermentrout, “Dynamics of Membrane Excitability Determine Interspike Interval Variabil-

ity: A Link Between Spike Generation Mechanism and Cortical Spike Train Statistics,” Neural Computation,

vol. 10, No. 5, pp. 1047-1065, July 1998.

[32] P. Hanggi, “Stochastic Resonance in Biology: How Noise Can Enhance Detection of Weak Signals and Help

Improve Biological Information Processing,” Chemphyschem, vol. 3, pp. 285-290, March 2002.

[33] A. Hanssen and T.A. Oigard, “The Normal Inverse Gaussian Distribution: A Versatile Model For Heavy-tailed

Stochastic Processes,” IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP’01,

pp. 3985-3988, May 2001.

[34] A.L. Hodgkin and A.F. Huxley, “A Quantitative Description of Membrane Current and its Application to

Conduction and Excitation in Nerve,” Journal of Physiology, vol. 117, pp. 500-540, August 1952.

[35] N. Hohn and A.N. Burkitt, “Shot Noise in Leaky Integrate-and-Fire Neuron,” Physical Review E, vol. 63,

031902-1 – 031902-11, March 2001.


[36] F.C. Hoppensteadt and E.M. Izhikevich, Weakly Connected Neural Networks. New York: Springer, pp. 111-142, 1997.

[37] J.J. Hopfield, “Neural Networks with Graded Response Have Collective Computational Properties Like Those of Two-State Neurons,” Proceedings of the National Academy of Sciences, vol. 81, pp. 3088-3092, May 1984.

[38] H. Huang and J. Cao, “Exponential Stability Analysis of Uncertain Stochastic Neural Networks with Multiple

Delays,” Nonlinear Analysis: Real World Applications, vol. 8, pp. 646-653, April 2007.

[39] M.E. Inchiosa, J.W.C. Robinson, and A.R. Bulsara, “Information-Theoretic Stochastic Resonance in Noise-

Floor Limited Systems: The Case for Adding Noise,” Physical Review Letters, vol. 85, pp. 3369-3372, October

2000.

[40] F. Jaramillo and K. Wiesenfeld, “Mechanoelectrical Transduction Assisted by Brownian Motion: A Role for

Noise in the Auditory System,” Nature Neuroscience, vol. 1, pp. 384-388, 1998.

[41] P. Jung, A. Cornell-Bell, F. Moss, S. Kadar, J. Wang, and K. Showalter, “Noise Sustained Waves in Subexcitable

Media: From Chemical Waves to Brain Waves,” Chaos, vol. 8, Issue 3, pp. 567-575, September 1998.

[42] G.G. Karapetyan, “Application of Stochastic Resonance in Gravitational-Wave Interferometer,” Physical Review

D, vol. 73, 122003, 1-9, June 2006.

[43] S.A. Kassam, Signal Detection in Non-Gaussian Noise. Springer-Verlag, 1988.

[44] S. Kay, “Can Detectability be Improved by Adding Noise?” IEEE Signal Processing Letters, vol. 7, pp. 8-10,

January 2000.

[45] E.R. Kandel, J.H. Schwartz, and T.M. Jessell, Principles of Neural Science. McGraw-Hill, 4th Edition, p. 224, 2000.

[46] A.F. Kohn, “Dendritic Transformations on Random Synaptic Inputs as Measured from Neuron’s Spike Train–

Modeling and Simulation,” IEEE Transactions on Biomedical Engineering, vol. 36, No. 1, pp. 44-54, January

1989.

[47] B. Kosko, Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence. En-

glewood Cliffs, NJ: Prentice-Hall, 1991.

[48] B. Kosko, Noise. Viking/Penguin, 2006.

[49] B. Kosko and S. Mitaim, “Stochastic Resonance in Noisy Threshold Neurons,” Neural Networks, vol. 14, no.

6-7, pp. 755-761, June 2003.


[50] B. Kosko and S. Mitaim, “Robust Stochastic Resonance for Simple Threshold Neurons,” Physical Review E,

vol. 70, 031911-1 – 031911-10, 27 September 2004.

[51] I. Y. Lee, X. Liu, C. Zhou, and B. Kosko, “Noise-Enhanced Detection of Subthreshold Signals with Carbon

Nanotubes,” IEEE Transactions on Nanotechnology, vol. 5, no. 6, pp. 613 - 627, November 2006.

[52] J.E. Levin and J.P. Miller, “Broadband Neural Encoding in the Cricket Cercal Sensory System Enhanced by Stochastic Resonance,” Nature, vol. 380, pp. 165-168, March 1996.

[53] B. Lindner, A. Longtin, and A.R. Bulsara, “Analytic Expressions for Rate and CV of a Type I Neuron Driven by White Gaussian Noise,” Neural Computation, vol. 15, pp. 1761-1788, August 2003.

[54] B. Lindner, J. Garcia-Ojalvo, A. Neiman, and L. Schimansky-Geier, “Effects of Noise in Excitable Systems,”

Physics Reports, vol. 392, pp. 321-424, March 2004.

[55] A. Longtin, “Autonomous Stochastic Resonance in Bursting Neurons,” Physical Review E, vol. 55, no. 1, pp.

868-876, January 1997.

[56] X. Mao, Stochastic Differential Equations and their Applications. Horwood Publishing, Chichester, 1997.

[57] M.D. McDonnell, N. Stocks, C.E.M. Pearce, and D. Abbott, “Quantization in the Presence of Large Amplitude Threshold Noise,” Fluctuation and Noise Letters, vol. 5, No. 3, L457-L468, September 2005.

[58] C. Mead, Analog VLSI and Neural Systems. Addison Wesley, 1989.

[59] S. Mitaim and B. Kosko, “Adaptive Stochastic Resonance,” Proceedings of the IEEE: Special Issue on Intelligent

Signal Processing, vol. 86, no. 11, pp. 2152-2183, November 1998.

[60] S. Mitaim and B. Kosko, “Adaptive Stochastic Resonance in Noisy Neurons Based on Mutual Information,”

IEEE Transactions on Neural Networks, vol. 15, no. 6, pp. 1526-1540, November 2004.

[61] F. Moss, L.M. Ward, and W.G. Sannita, “Stochastic Resonance and Sensory Information Processing: A Tutorial

and Review of Applications,” Clinical Neurophysiology, vol. 115, No. 2, pp. 267-281, February 2004.

[62] C.L. Nikias and M. Shao, Signal Processing with Alpha-Stable Distributions and Applications. John Wiley and

Sons, New York, 1995.

[63] A. Pacut, “Separation of Deterministic and Stochastic Neurotransmission,” International Joint Conference on

Neural Networks, IJCNN’01, vol. 1, pp. 55-60, July 2001.

[64] A. Patel and B. Kosko, “Stochastic Resonance in Noisy Spiking Retinal and Sensory Neuron Models,” Neural

Networks, vol. 18, pp. 467-478, August 2005.


[65] A. Patel and B. Kosko, “Mutual-Information Noise Benefits in Brownian Models of Continuous and Spiking

Neuron Models,” International Joint Conference on Neural Networks, IJCNN’06, pp. 2347-2354, July 2006.

[66] O. V. Poliannikov, Y. Bao and H. Krim, “Levy Processes for Image Modelling,” Proceedings of IEEE Signal

Processing Workshop on Higher-Order Statistics, June 14-16, Caesarea, Israel, 1999.

[67] S.I. Resnick, Heavy-Tail Phenomena: Probabilistic and Statistical Modeling. Springer, 2007.

[68] D. Rousseau, J. Rojas Varela, and F. Chapeau-Blondeau, “Stochastic Resonance for Nonlinear Sensors with Saturation,” Physical Review E, vol. 67, No. 2, 021102-1 – 021102-16, February 2003.

[69] D. Rousseau and F. Chapeau-Blondeau, “Neuronal Signal Transduction Aided by Noise at Threshold and at

Saturation,” Neural Processing Letters, vol. 20, No.2, pp. 71-83, November 2004.

[70] D. Rousseau and F. Chapeau-Blondeau, “Constructive Role of Noise in Signal Detection From Parallel Array

of Quantizers,” Signal Processing, vol. 85, No. 3, pp. 571-580, March 2005.

[71] D. Rousseau, G. V. Anand, and F. Chapeau-Blondeau, “Noise-enhanced Nonlinear Detector to Improve Signal

Detection in Non-Gaussian noise,” Signal Processing, vol. 86, No. 11, pp. 3456-3465, November 2006.

[72] T. Rydberg, “The Normal Inverse Gaussian Levy Processes: Simulation and Approximation,” Communications

in Statistics: Stochastic Models, vol. 13, pp. 887-910, 1997.

[73] L. Sacerdote and R. Sirovich, “Multimodality of the Interspike Interval Distribution in a Simple Jump-Diffusion

Model,” Scientiae Mathematicae Japonicae, vol. 58, No. 2, pp. 307-322, September 2003.

[74] Y. Sakumura and K. Aihara, “Stochastic Resonance and Coincidence Detection in Single Neurons,” Neural Processing Letters, vol. 16, pp. 235-242, December 2002.

[75] K-I. Sato, Levy Processes and Infinitely Divisible Distributions. Cambridge University Press, 1999.

[76] R. Segev, M. Benveniste, E. Hulata, N. Cohen, A. Palevski, E. Kapon, Y. Shapira, and E. Ben-Jacob, “Long Term Behavior of Lithographically Prepared In Vitro Neuronal Networks,” Physical Review Letters, vol. 88, No. 11, 118102-1 – 118102-4, March 2002.

[77] M.F. Shlesinger, G. M. Zeslavsky, and U. Frisch, Eds., “Levy Flights and Related Topics in Physics: Proceedings

of the International Workshop Held at Nice, France, June 27-30,” Lecture Notes in Physics, vol. 450, Springer-

Verlag, Berlin, 1995.

[78] W. Schoutens, Levy Processes in Finance: Pricing Financial Derivatives. John Wiley & Sons, 2003.

[79] G. Samorodnitsky and M.S. Taqqu, Stable Non-Gaussian Random Processes: Stochastic Models with Infinite Variance. Chapman & Hall/CRC, 1994.


[80] O. Sotolongo-Costa, J.C. Antoranz, A. Posadas, F. Vidal, and A. Vazquez, “Levy Flights and Earthquakes,” Geophysical Research Letters, vol. 27, pp. 1965-1967, July 2000.

[81] W.C. Stacey and D.M. Durand, “Stochastic Resonance Improves Signal Detection in Hippocampal CA1 Neu-

rons,” Journal of Neurophysiology, vol. 83, pp. 1394-1402, September 2001.

[82] N.G. Stocks, “Information Transmission in Parallel Threshold Arrays: Suprathreshold Stochastic Resonance,”

Physical Review E, vol. 63, No. 4, 041114-1 – 041114-9, March 2001.

[83] N.G. Stocks, D. Allingham, and R.P. Morse, “The Application of Suprathreshold Stochastic Resonance to Cochlear Implant Coding,” Fluctuation and Noise Letters, vol. 2, No. 3, L169-L181, September 2002.

[84] Y. Sun and J. Cao, “Adaptive Lag Synchronization of Unknown Chaotic Delayed Neural Networks with Noise

Perturbations,” Physics Letters A, vol. 364, pp. 277-285, April 2007.

[85] Z. Wang, Y. Liu, M. Li, and X. Liu, “Stability Analysis for Stochastic Cohen-Grossberg Neural Networks with

Mixed Time Delays,” IEEE Transactions on Neural Networks, vol. 17, no. 3, pp. 814-820, May 2006.

[86] Z. Wang, Y. Liu, K. Fraser, and X. Liu, “Stochastic Stability of Uncertain Hopfield Neural Networks with

Discrete and Distributed Delays,” Physics Letters A, vol. 354, no. 4, pp. 288-297, June 2006.

[87] K. Wiesenfeld and F. Moss, “Stochastic Resonance and the Benefits of Noise: From Ice Ages to Crayfish and

SQUIDS,” Nature, vol. 373, pp. 33-36, January 1995.

[88] W. Yu and J. Cao, “Synchronization Control of Stochastic Delayed Neural Networks,” Physica A, vol. 373, pp. 252-260, January 2007.

[89] F.G. Zeng, Q.J. Fu, and R. Morse, “Human Hearing Enhanced by Noise,” Brain Research, vol. 869, pp. 251-255, June 2000.
