Statistical analysis of spike trains in neuronal networks

Bruno Cessac

NeuroMathComp Team, INRIA Sophia Antipolis, France.

Mathematical Modeling and Statistical Analysis in Neuroscience, 02-07-14

Visual system

Multi-Electrode Array

Figure: Multi-Electrode Array.

Encoding a visual scene

Do Ganglion cells act as independent encoders?

Or do their dynamical (spatio-temporal) correlations play a role in encoding a visual scene (population coding)?

Let us measure (instantaneous pairwise) correlations

E. Schneidman, M.J. Berry, R. Segev, and W. Bialek, "Weak pairwise correlations imply strongly correlated network states in a neural population", Nature, 440(7087):1007-1012, 2006.

Constructing a statistical model handling measured correlations

Assume stationarity.

Measure empirical correlations.

Select the probability distribution which maximizes the entropy and reproduces these correlations.

Spike events

Spike state: $\omega_k(n) \in \{0,1\}$.

Spike pattern: $\omega(n) = \left(\omega_k(n)\right)_{k=1}^{N}$, e.g. $\bigl(\begin{smallmatrix}1\\0\\0\\1\end{smallmatrix}\bigr)$.

Spike block: $\omega_m^n = \{\omega(m)\,\omega(m+1)\dots\omega(n)\}$, e.g. $\bigl(\begin{smallmatrix}1&1&0&1&0\\0&1&0&1&0\\0&0&0&1&0\\0&1&0&1&1\end{smallmatrix}\bigr)$.

Raster plot: $\omega \overset{\text{def}}{=} \omega_0^T$.

Figure: Raster plot / spike train.
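To make the notation concrete, here is a minimal sketch (mine, not part of the talk) of a raster stored as an N x T binary NumPy array: spike states are entries, spike patterns are columns, and spike blocks are column slices. Sizes and firing probability are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 4, 10                      # 4 neurons, 10 time bins (illustrative sizes)

# Raster omega: omega[k, n] = omega_k(n) in {0, 1}
omega = (rng.random((N, T)) < 0.3).astype(int)

pattern_n = omega[:, 3]           # spike pattern omega(3), one column
block_m_n = omega[:, 2:6]         # spike block omega_2^5 = {omega(2) ... omega(5)}

print(pattern_n)
print(block_m_n)
```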

Let $\pi^{(T)}_\omega$ be the empirical measure:
\[
\pi^{(T)}_\omega[f] = \frac{1}{T}\sum_{t=1}^{T} f\circ\sigma^t(\omega),
\]
e.g. $\pi^{(T)}_\omega[\omega_i] = \frac{1}{T}\sum_{t=1}^{T}\omega_i(t)$: firing rate; $\pi^{(T)}_\omega[\omega_i\,\omega_j] = \frac{1}{T}\sum_{t=1}^{T}\omega_i(t)\,\omega_j(t)$.

Find the (stationary) probability distribution $\mu$ that maximizes the statistical entropy under the constraints:
\[
\pi^{(T)}_\omega[\omega_i] = \mu(\omega_i), \qquad \pi^{(T)}_\omega[\omega_i\,\omega_j] = \mu(\omega_i\,\omega_j).
\]
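A small numerical illustration (my own, on synthetic data) of the empirical averages above: firing rates $\pi^{(T)}_\omega[\omega_i]$ and instantaneous pairwise moments $\pi^{(T)}_\omega[\omega_i\,\omega_j]$ computed from a binary raster.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 5, 10_000
omega = (rng.random((N, T)) < 0.2).astype(float)   # synthetic raster

# Empirical firing rates: pi_T[omega_i] = (1/T) sum_t omega_i(t)
rates = omega.mean(axis=1)

# Empirical instantaneous pairwise moments: pi_T[omega_i * omega_j]
pairwise = (omega @ omega.T) / T

print(rates)
print(pairwise)
```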

There is a unique probability distribution which satisfies these conditions.

This is the Gibbs distribution with potential
\[
H(\omega(0)) = \sum_{i=1}^{N} h_i\,\omega_i(0) + \sum_{i,j=1}^{N} J_{ij}\,\omega_i(0)\,\omega_j(0)
\]
(Ising model).
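For a handful of neurons the maximum-entropy problem can be solved by brute force: enumerate all $2^N$ spike patterns and adjust $(h_i, J_{ij})$ by gradient ascent until the model moments match the measured ones. The sketch below is an illustrative fit of this kind (my own, with arbitrary learning rate and iteration count), not the procedure used in the cited work.

```python
import numpy as np
from itertools import product

def fit_ising(target_mean, target_corr, n_iter=5000, lr=0.1):
    """Fit h, J of H(w) = sum_i h_i w_i + sum_{i<j} J_ij w_i w_j by moment matching."""
    N = len(target_mean)
    states = np.array(list(product([0, 1], repeat=N)), dtype=float)  # all 2^N patterns
    h = np.zeros(N)
    J = np.zeros((N, N))
    for _ in range(n_iter):
        energy = states @ h + np.einsum('si,ij,sj->s', states, np.triu(J, 1), states)
        p = np.exp(energy); p /= p.sum()                 # Gibbs distribution
        mean = p @ states                                # model <w_i>
        corr = states.T @ (p[:, None] * states)          # model <w_i w_j>
        h += lr * (target_mean - mean)                   # gradient ascent on log-likelihood
        J += lr * np.triu(target_corr - corr, 1)
    return h, J

# Example use: match the empirical moments of a raster `omega` (see the snippet above)
# h, J = fit_ising(omega.mean(axis=1), (omega @ omega.T) / omega.shape[1])
```

For larger N the enumeration would be replaced by Monte Carlo estimates of the model moments; the moment-matching logic stays the same.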

End of the story?

The Ising potential
\[
H(\omega(0)) = \sum_{i=1}^{N} h_i\,\omega_i(0) + \sum_{i,j=1}^{N} J_{ij}\,\omega_i(0)\,\omega_j(0)
\]
does not consider time correlations between neurons. It is therefore bad at predicting spatio-temporal patterns!

Which correlations?

Spike correlations seem to play a role in spike coding, although this statement depends on several assumptions that could bias the statistics:

Stationarity;
Binning;
Stimulus dependence?

Modulo these remarks, Maximum Entropy seems to be a relevant setting to study the role of spatio-temporal spike correlations in retina coding.

OK. So let us consider spatio-temporal constraints.

Easy! Euh... in fact not so easy.

\[
H(\omega_0^D) = \sum_{i=1}^{N} h_i\,\omega_i(0) + \sum_{i,j=1}^{N} J^{(0)}_{ij}\,\omega_i(0)\,\omega_j(0) + \sum_{i,j=1}^{N} J^{(1)}_{ij}\,\omega_i(0)\,\omega_j(1) + \sum_{i,j,k=1}^{N} J^{(2)}_{ijk}\,\omega_i(0)\,\omega_j(1)\,\omega_k(2) + \;????
\]

Two "small" problems.

Handling temporality and memory: the Ising model considers successive times as independent.

Probability of characteristic spatio-temporal patterns: the probability of a spike pattern ... depends on the network history (transition probabilities).

Given a set of hypotheses on transition probabilities, there exists a mathematical framework to solve the problem.

Handling memory.

Markov chains
Variable length Markov chains
Chains with complete connections
...
Gibbs distributions.

Mathematical setting

Probability distribution on (bi-infinite) rasters: $\mu[\omega_m^n]$, $\forall m < n \in \mathbb{Z}$.

Conditional probabilities with memory depth $D$: $P_n\!\left[\omega(n)\,\middle|\,\omega_{n-D}^{n-1}\right]$, e.g. the last spike pattern of a block conditioned on the preceding ones: $\bigl(\begin{smallmatrix}1&1&0&1&\vert&0\\0&1&0&1&\vert&0\\0&0&0&1&\vert&0\\0&1&0&1&\vert&1\end{smallmatrix}\bigr)$.

Generating block probabilities of arbitrary depth:
\[
\mu\!\left[\omega_m^{m+D}\right] = P_{m+D}\!\left[\omega(m+D)\,\middle|\,\omega_m^{m+D-1}\right]\,\mu\!\left[\omega_m^{m+D-1}\right],
\]
\[
\mu[\omega_m^n] = \prod_{l=m+D}^{n} P_l\!\left[\omega(l)\,\middle|\,\omega_{l-D}^{l-1}\right]\,\mu\!\left[\omega_m^{m+D-1}\right], \qquad \forall m<n\in\mathbb{Z}
\]
(Chapman-Kolmogorov relation).
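A toy illustration (mine, with memory depth D = 1 and a single neuron, so patterns are just 0/1) of the Chapman-Kolmogorov factorisation above: the probability of a block is the probability of its first pattern times the product of transition probabilities.

```python
import numpy as np

def block_probability(block, P, mu0):
    """mu[omega_m^n] for memory depth D = 1.

    block : list of pattern indices (each pattern in {0, ..., 2^N - 1})
    P     : P[a, b] = P(omega(l) = b | omega(l-1) = a), row-stochastic
    mu0   : mu0[a]  = probability of the first pattern of the block
    """
    prob = mu0[block[0]]
    for prev, cur in zip(block[:-1], block[1:]):
        prob *= P[prev, cur]          # one Chapman-Kolmogorov step
    return prob

# Toy example with N = 1 neuron (patterns 0 and 1):
P = np.array([[0.9, 0.1],
              [0.6, 0.4]])
mu0 = np.array([6/7, 1/7])            # stationary distribution of P
print(block_probability([0, 0, 1, 0], P, mu0))
```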

Taking logarithms of the transition probabilities defines the (normalized) potential
\[
\phi_l\!\left(\omega_{l-D}^{l}\right) = \log P_l\!\left[\omega(l)\,\middle|\,\omega_{l-D}^{l-1}\right],
\]
so that
\[
\mu[\omega_m^n] = \exp\!\left(\sum_{l=m+D}^{n} \phi_l\!\left(\omega_{l-D}^{l}\right)\right)\mu\!\left[\omega_m^{m+D-1}\right],
\qquad
\mu\!\left[\omega_m^n \,\middle|\, \omega_m^{m+D-1}\right] = \exp\sum_{l=m+D}^{n} \phi_l\!\left(\omega_{l-D}^{l}\right).
\]

Gibbs distribution

In statistical physics:
\[
\forall \Lambda \subset \mathbb{Z}^d, \qquad \mu\left(\{S\} \mid \partial\Lambda\right) = \frac{1}{Z_{\Lambda,\partial\Lambda}}\,e^{-\beta H_{\Lambda,\partial\Lambda}(\{S\})},
\]
\[
f(\beta) = -\frac{1}{\beta}\lim_{\Lambda\uparrow\infty}\frac{1}{|\Lambda|}\log Z_{\Lambda,\partial\Lambda} \qquad \text{(free energy density)}.
\]

Normalized potential:
\[
\forall m, n, \qquad \mu\!\left[\omega_m^n \,\middle|\, \omega_m^{m+D-1}\right] = \exp\sum_{l=m+D}^{n} \phi_l\!\left(\omega_{l-D}^{l}\right).
\]

Non-normalized potential $H$: there are constants $A, B > 0$ such that
\[
\forall m < n, \qquad A < \frac{\mu[\omega_m^n]}{\exp\!\left(\sum_{l=m+D}^{n} H\!\left(\omega_{l-D}^{l}\right) - (n-m)\,P(H)\right)} < B.
\]

$P(H)$ is called the "topological pressure" and is formally equivalent to the free energy density.

This does not require time-translation invariance (stationarity).

In the stationary case (+ assumptions) a Gibbs state is also an equilibrium state:
\[
\sup_{\nu\in\mathcal{M}_{\rm inv}} \left(h(\nu) + \nu(H)\right) = h(\mu) + \mu(H) = P(H).
\]

This formalism allows us to handle the spatio-temporal case
\[
H(\omega_0^D) = \sum_{i=1}^{N} h_i\,\omega_i(0) + \sum_{i,j=1}^{N} J^{(0)}_{ij}\,\omega_i(0)\,\omega_j(0) + \sum_{i,j=1}^{N} J^{(1)}_{ij}\,\omega_i(0)\,\omega_j(1) + \sum_{i,j,k=1}^{N} J^{(2)}_{ijk}\,\omega_i(0)\,\omega_j(1)\,\omega_k(2) + \dots
\]
even numerically (a toy evaluation of such a potential is sketched below).

J.C. Vasquez, A. Palacios, O. Marre, M.J. Berry II, B. Cessac, J. Physiol. Paris, Vol. 106, Issues 3-4, 2012.
H. Nasser, O. Marre, and B. Cessac, J. Stat. Mech. (2013) P03006.
H. Nasser, B. Cessac, Entropy (2014), 16(4), 2244-2277.
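To make the shape of such a spatio-temporal potential concrete, here is a sketch (my own, with arbitrary random coefficients and only the terms written above, i.e. depth D = 2) that evaluates $H(\omega_0^D)$ on a spike block.

```python
import numpy as np

def H_spatiotemporal(block, h, J0, J1, J2):
    """Evaluate H(omega_0^D) for the truncated expansion above (D = 2).

    block : (N, 3) binary array, columns are omega(0), omega(1), omega(2)
    """
    w0, w1, w2 = block[:, 0], block[:, 1], block[:, 2]
    return (h @ w0
            + w0 @ J0 @ w0                                  # J^(0)_ij w_i(0) w_j(0)
            + w0 @ J1 @ w1                                  # J^(1)_ij w_i(0) w_j(1)
            + np.einsum('i,ijk,j,k->', w0, J2, w1, w2))     # J^(2)_ijk w_i(0) w_j(1) w_k(2)

# Toy usage with random coefficients and a random block
N = 3
rng = np.random.default_rng(2)
block = (rng.random((N, 3)) < 0.5).astype(float)
print(H_spatiotemporal(block, rng.normal(size=N), rng.normal(size=(N, N)),
                       rng.normal(size=(N, N)), rng.normal(size=(N, N, N))))
```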

Two small problems.

Exponential number of possible terms.

Contrary to what usually happens in physics, we do not know what the right potential should be.

Can we have a reasonable idea of what the spike statistics could be by studying a neural network model?

An Integrate-and-Fire neural network model with chemical and electric synapses

R. Cofré, B. Cessac, "Dynamics and spike trains statistics in conductance-based Integrate-and-Fire neural networks with chemical and electric synapses", Chaos, Solitons and Fractals, 2013.

Sub-threshold dynamics:
\[
C_k\,\frac{dV_k}{dt} = -g_{L,k}\left(V_k - E_L\right) - \sum_j g_{kj}(t,\omega)\left(V_k - E_j\right) - \sum_j \bar{g}_{kj}\left(V_k - V_j\right) + i^{(ext)}_k(t) + \sigma_B\,\xi_k(t).
\]

Figure: post-synaptic potential (PSP) profiles $g(x)$ as a function of time, shown for several spike times ($t = 1$, $1.2$, $1.6$, $3$).
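As an illustration only, the following Euler-Maruyama integration of the sub-threshold equation for a small network uses made-up parameter values, frozen conductances $g_{kj}$, and no spiking/reset mechanism; it is a sketch of the dynamics above, not the model implementation from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N, dt, steps = 3, 1e-4, 5000            # illustrative sizes (time step in seconds)
C = 1e-9                                 # capacitance (F), illustrative
gL, EL = 10e-9, -70e-3                   # leak conductance (S) and reversal (V)
E = np.full(N, 0.0)                      # chemical reversal potentials (V), illustrative
g = 2e-9 * rng.random((N, N))            # chemical conductances g_kj, here held constant
gbar = 1e-9 * rng.random((N, N))         # electric (gap-junction) conductances gbar_kj
np.fill_diagonal(g, 0); np.fill_diagonal(gbar, 0)
i_ext = 0.1e-9                           # external current (A)
sigma_B = 5e-12                          # noise amplitude

V = np.full(N, EL)
for _ in range(steps):
    chem = (g * (V[:, None] - E[None, :])).sum(axis=1)      # sum_j g_kj (V_k - E_j)
    elec = (gbar * (V[:, None] - V[None, :])).sum(axis=1)   # sum_j gbar_kj (V_k - V_j)
    drift = (-gL * (V - EL) - chem - elec + i_ext) / C
    V += drift * dt + (sigma_B / C) * np.sqrt(dt) * rng.standard_normal(N)
print(V)
```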

Sub-threshold regime

\[
C\,\frac{dV}{dt} + \left(G(t,\omega) - \bar{G}\right)V = I(t,\omega),
\]
\[
G_{kl}(t,\omega) = \Big[\,g_{L,k} + \sum_{j=1}^{N} g_{kj}(t,\omega)\,\Big]\,\delta_{kl} \overset{\text{def}}{=} g_k(t,\omega)\,\delta_{kl},
\]
\[
I(t,\omega) = I^{(cs)}(t,\omega) + I^{(ext)}(t) + I^{(B)}(t),
\qquad
I^{(cs)}_k(t,\omega) = \sum_j W_{kj}\,\alpha_{kj}(t,\omega), \quad W_{kj} \overset{\text{def}}{=} G_{kj}\,E_j.
\]

In vector form, the sub-threshold dynamics is the non-autonomous stochastic differential equation
\[
\begin{cases}
dV = \left(\mathcal{A}(t,\omega)\,V + f(t,\omega)\right)dt + \dfrac{\sigma_B}{c}\,I_N\,dW(t),\\[4pt]
V(t_0) = v,
\end{cases}
\]
with
\[
\mathcal{A}(t,\omega) = C^{-1}\!\left(\bar{G} - G(t,\omega)\right),
\qquad
f(t,\omega) = C^{-1}I^{(cs)}(t,\omega) + C^{-1}I^{(ext)}(t).
\]

Homogeneous Cauchy problem

\[
\begin{cases}
\dfrac{dV(t,\omega)}{dt} = \mathcal{A}(t,\omega)\,V(t,\omega),\\[4pt]
V(t_0) = v.
\end{cases}
\]

Theorem. Let $\mathcal{A}(t,\omega)$ be a square matrix with bounded elements. The sequence
\[
M_0(t_0,t,\omega) = I_N, \qquad
M_k(t_0,t,\omega) = I_N + \int_{t_0}^{t}\mathcal{A}(s,\omega)\,M_{k-1}(s,t)\,ds, \quad t \leq t_1,
\]
converges uniformly in $[t_0,t_1]$.
(Brockett, R. W., "Finite Dimensional Linear Systems", John Wiley and Sons, 1970.)

Flow:
\[
\Gamma(t_0,t,\omega) \overset{\text{def}}{=} \lim_{k\to\infty} M_k(t_0,t,\omega).
\]
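A numerical sketch (mine) of the iteration defining the flow. It uses the standard Peano-Baker form $M_k(t_0,t) = I + \int_{t_0}^{t} \mathcal{A}(s)\,M_{k-1}(t_0,s)\,ds$ with the integral discretised by a Riemann sum; the 2x2 time-dependent matrix used as an example is arbitrary.

```python
import numpy as np

def picard_flow(A, t0, t1, n_grid=2000, n_iter=30):
    """Approximate the flow Gamma(t0, t1) of dV/dt = A(t) V by Picard iteration:
    M_k(t0, t) = I + int_{t0}^{t} A(s) M_{k-1}(t0, s) ds (left Riemann sum)."""
    ts = np.linspace(t0, t1, n_grid)
    dt = ts[1] - ts[0]
    As = np.array([A(t) for t in ts])                    # A(s) sampled on the grid
    dim = As.shape[1]
    M = np.broadcast_to(np.eye(dim), (n_grid, dim, dim)).copy()
    for _ in range(n_iter):
        integrand = As @ M                               # A(s) M_{k-1}(t0, s)
        cumint = np.concatenate([np.zeros((1, dim, dim)),
                                 np.cumsum(integrand[:-1], axis=0) * dt])
        M = np.eye(dim) + cumint                         # M_k on the whole grid
    return M[-1]                                         # Gamma(t0, t1)

# Example: arbitrary time-dependent 2x2 generator
A = lambda t: np.array([[-1.0, 0.5 * np.sin(t)], [0.0, -2.0]])
print(picard_flow(A, 0.0, 1.0))
```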

If $\mathcal{A}(t,\omega)$ and $\mathcal{A}(s,\omega)$ commute,
\[
\Gamma(t_0,t,\omega) = \sum_{k=0}^{\infty}\frac{1}{k!}\left(\int_{t_0}^{t}\mathcal{A}(s,\omega)\,ds\right)^{k} = e^{\int_{t_0}^{t}\mathcal{A}(s,\omega)\,ds}.
\]

This holds only in two cases:

$\bar{G} = 0$, in which case
\[
\Gamma(t_0,t,\omega) = e^{-\frac{1}{c}\int_{t_0}^{t} G(s,\omega)\,ds}
\]
(B. Cessac, J. Math. Neuroscience, 2011);

$G(t,\omega) = \gamma(t,\omega)\,I_N$.

In general:
\[
\Gamma(t_0,t,\omega) = I_N + \sum_{n=1}^{+\infty}\;
\sum_{\substack{X_1 \in \{B,\,A(s_1,\omega)\}\\ \dots\\ X_n \in \{B,\,A(s_n,\omega)\}}}
\int_{t_0}^{t}\!\cdots\!\int_{t_0}^{s_{n-1}} \prod_{k=1}^{n} X_k \;ds_1\cdots ds_n,
\]
with $B = C^{-1}\bar{G}$ and $A(t,\omega) = -C^{-1}G(t,\omega)$.

Exponentially bounded flow

Definition: an exponentially bounded flow is a two-parameter $(t_0,t)$ family $\{\Gamma(t_0,t,\omega)\}_{t \geq t_0}$ of flows such that, $\forall \omega \in \Omega$:

1. $\Gamma(t_0,t_0,\omega) = I_N$ and $\Gamma(t_0,t,\omega)\,\Gamma(t,s,\omega) = \Gamma(t_0,s,\omega)$ whenever $t_0 \leq t \leq s$;

2. for each $v \in \mathbb{R}^N$ and $\omega \in \Omega$, $(t_0,t) \mapsto \Gamma(t_0,t,\omega)\,v$ is continuous for $t_0 \leq t$;

3. there are $M > 0$ and $m > 0$ such that
\[
\|\Gamma(s,t,\omega)\| \leq M\,e^{-m(t-s)}, \qquad s \leq t.
\]

Proposition: let $\lambda_1$ be the largest eigenvalue of $\bar{G}$. If
\[
\lambda_1 < g_L,
\]
then the flow $\Gamma$ in our model has the exponentially bounded flow property.

Remark: the typical electrical conductance values are of order 1 nano-Siemens, while the leak conductance of retinal ganglion cells is of order 50 micro-Siemens. Therefore, this condition is compatible with the biophysical values of conductances in the retina.
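A tiny numeric check (illustrative values only) of the condition $\lambda_1 < g_L$ in the proposition, taking $\bar{G}$ to be a symmetric matrix of electric-synapse conductances of order 1 nS and $g_L$ the leak value quoted in the remark.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 10
gbar = 1e-9 * rng.random((N, N))            # electric conductances ~ 1 nS (illustrative)
gbar = 0.5 * (gbar + gbar.T)                # symmetrise
np.fill_diagonal(gbar, 0.0)

lambda_1 = np.linalg.eigvalsh(gbar).max()   # largest eigenvalue of Gbar
g_L = 50e-6                                 # leak conductance quoted in the remark (50 uS)
print(lambda_1, g_L, lambda_1 < g_L)        # condition of the proposition
```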

Theorem. If $\Gamma(t_0,t,\omega)$ is an exponentially bounded flow, there is a unique strong solution for $t \geq t_0$, given by:
\[
V(t_0,t,\omega) = \Gamma(t_0,t,\omega)\,v + \int_{t_0}^{t}\Gamma(s,t,\omega)\,f(s,\omega)\,ds + \frac{\sigma_B}{c}\int_{t_0}^{t}\Gamma(s,t,\omega)\,dW(s).
\]

R. Wooster, "Evolution systems of measures for non-autonomous stochastic differential equations with Lévy noise", Communications on Stochastic Analysis, vol. 5, 353-370, 2011.

Membrane potential decomposition

\[
V(t,\omega) = V^{(d)}(t,\omega) + V^{(noise)}(t,\omega), \qquad
V^{(d)}(t,\omega) = V^{(cs)}(t,\omega) + V^{(ext)}(t,\omega),
\]
\[
V^{(cs)}(t,\omega) = \frac{1}{c}\int_{-\infty}^{t}\Gamma(s,t,\omega)\,I^{(cs)}(s,\omega)\,ds,
\qquad
V^{(ext)}(t,\omega) = \frac{1}{c}\int_{-\infty}^{t}\Gamma(s,t,\omega)\,I^{(ext)}(s,\omega)\,ds,
\]
\[
V^{(noise)}(t,\omega) = \frac{\sigma_B}{c}\int_{\tau_k(t,\omega)}^{t}\Gamma(s,t,\omega)\,dW(s).
\]

Transition probabilities

Problem: determine $P\!\left[\omega(n)\,\middle|\,\omega_{-\infty}^{n-1}\right]$.

Fix $\omega$, $n$ and $t < n$. Set:
\[
\hat{\theta}_k(t,\omega) = \theta - V^{(d)}_k(t,\omega).
\]
Neuron $k$ emits a spike at integer time $n$ ($\omega_k(n) = 1$) if:
\[
\exists t \in [n-1,n], \qquad V^{(noise)}_k(t,\omega) = \hat{\theta}_k(t,\omega).
\]
This is a "first passage" problem, in $N$ dimensions, with a time-dependent boundary $\hat{\theta}_k(t,\omega)$ (general form unknown).

Conditional probability

Without electric synapses the probability of $\omega(n)$ conditionally to $\omega_{-\infty}^{n-1}$ can be approximated by:
\[
P\!\left[\omega(n)\,\middle|\,\omega_{-\infty}^{n-1}\right] = \prod_{k=1}^{N} P\!\left[\omega_k(n)\,\middle|\,\omega_{-\infty}^{n-1}\right],
\]
with
\[
P\!\left[\omega_k(n)\,\middle|\,\omega_{-\infty}^{n-1}\right] = \omega_k(n)\,\pi\!\left(X_k(n-1,\omega)\right) + \left(1-\omega_k(n)\right)\left(1 - \pi\!\left(X_k(n-1,\omega)\right)\right),
\]
where
\[
X_k(n-1,\omega) = \frac{\theta - V^{(det)}_k(n-1,\omega)}{\sigma_k(n-1,\omega)},
\qquad
\pi(x) = \frac{1}{\sqrt{2\pi}}\int_{x}^{+\infty} e^{-\frac{u^2}{2}}\,du.
\]
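A minimal sketch (mine) of the conditional spike probability above: $\pi(x)$ is the standard Gaussian tail, and $X_k$ measures the distance of the deterministic part to the threshold in units of the noise standard deviation. The numerical values are arbitrary.

```python
from math import erfc, sqrt

def gaussian_tail(x):
    """pi(x) = (1/sqrt(2*pi)) * integral_x^inf exp(-u^2/2) du."""
    return 0.5 * erfc(x / sqrt(2.0))

def p_spike(omega_k_n, theta, V_det_k, sigma_k):
    """P(omega_k(n) | past) from X_k = (theta - V_det_k) / sigma_k."""
    X_k = (theta - V_det_k) / sigma_k
    p1 = gaussian_tail(X_k)                  # probability of a spike
    return p1 if omega_k_n == 1 else 1.0 - p1

# Example: deterministic potential 2 mV below threshold, 1 mV noise std
print(p_spike(1, theta=-0.050, V_det_k=-0.052, sigma_k=0.001))
```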

$\phi(\omega) = \log P\!\left[\omega(n)\,\middle|\,\omega_{-\infty}^{n-1}\right]$ defines an (infinite-range) normalized potential, which defines a unique Gibbs distribution.

It depends explicitly on the network parameters and the external stimulus.

Its definition holds for a time-dependent stimulus (non-stationary).

It is similar to the so-called Generalized Linear Model used for retina analysis, although with a more complex structure.

The general form (with electric synapses) is yet unknown.

Back to our second "small" problem

Is there a Maximum Entropy potential corresponding to $\phi$ (in the stationary case)?

One can make a Taylor expansion of $\phi(\omega)$.

Using $\omega_i(n)^k = \omega_i(n)$ for $k \geq 1$, one ends up with a potential of the form:
\[
\phi(\omega) = \sum_{i=1}^{N} h_i\,\omega_i(0) + \sum_{i,j=1}^{N} J^{(0)}_{ij}\,\omega_i(0)\,\omega_j(0) + \dots
\]

The expansion is infinite, although one can approximate the infinite-range potential $\phi$ by a finite-range approximation (finite memory), giving rise to a finite expansion.

The coefficients of the expansion are nonlinear functions of the network parameters and of the stimulus. They are therefore somewhat redundant.

Rodrigo Cofré, Bruno Cessac, "Exact computation of the maximum-entropy potential of spiking neural-network models", Phys. Rev. E 89, 052117 (2014).

Given a set of stationary transition probabilities $P\!\left[\omega(D)\,\middle|\,\omega_0^{D-1}\right] > 0$, there is a unique (up to a constant) Maximum Entropy potential, written as a linear combination of spike interaction terms with a minimal number of terms (normal form). This potential can be explicitly (and algorithmically) computed.

Hints: using a change of variables one can eliminate terms in the potential ("normal" form). The construction is based on the equivalence between Gibbs potentials (cohomology) and periodic orbit expansions.

However, the number of terms still grows exponentially with the number of neurons and the memory depth. These terms are generically non-zero.

Back to the retina

Neuromimetic models typically have $O(N^2)$ parameters, where $N$ is the number of neurons.

The equivalent MaxEnt potential generically has a number of parameters growing exponentially with $N$ (a rough count is sketched below), which are nonlinear and redundant functions of the network parameters (synaptic weights, stimulus).

$\Rightarrow$ intractable determination of parameters; stimulus-dependent parameters; overfitting.

BUT real neural networks are not generic.
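A back-of-the-envelope count (my own numbers and my own way of counting, taking one possible term per non-empty product of spike events in a block of $N$ neurons over $D+1$ time steps) of why the generic MaxEnt expansion is intractable compared to the $O(N^2)$ network parameters.

```python
N, D = 20, 1                            # illustrative: 20 neurons, memory depth 1
network_params = N * N                  # O(N^2) synaptic weights
maxent_terms = 2 ** (N * (D + 1)) - 1   # non-empty products of spike events in a block
print(network_params, maxent_terms)     # 400 vs roughly 1.1e12
```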

The MaxEnt approach might be useful if there is some hidden law of nature / symmetry which cancels most terms in the expansion.

Acknowledgments

NeuroMathComp team
Rodrigo Cofré (PhD, September 2014)
Dora Karvouniari (M2)
Pierre Kornprobst (CR1 INRIA)
Slim Kraria (IR)
Gaia Lombardi (M2 → Paris)
Hassan Nasser (PhD → startup)
Daniela Pamplona (postdoc)
Geoffrey Portelli (postdoc)
Vivien Robinet (postdoc → MCF Kourou)
Horacio Rostro (PhD → docent, Mexico)
Wahiba Taouali (postdoc → postdoc, INT Marseille)
Juan-Carlos Vasquez (PhD → postdoc, Bogota)

Princeton University
Michael J. Berry II

ANR KEOPS
Maria-José Escobar (CN Valparaiso)
Adrian Palacios (CN Valparaiso)
Cesar Ravelo (CN Valparaiso)
Thierry Viéville (INRIA Mnemosyne)

Renvision FP7 project
Luca Berdondini (IIT Genova)
Matthias Hennig (Edinburgh)
Alessandro Maccione (IIT Genova)
Evelyne Sernagor (Newcastle)

Institut de la Vision
Olivier Marre
Serge Picaud

Can we hear the shape of a Maximum Entropy potential?

Two distinct potentials $H^{(1)}, H^{(2)}$ of range $R = D + 1$ correspond to the same Gibbs distribution (are "equivalent") if and only if there exists a range-$D$ function $f$ such that (Chazottes-Keller, 2009):
\[
H^{(2)}\!\left(\omega_0^D\right) = H^{(1)}\!\left(\omega_0^D\right) - f\!\left(\omega_0^{D-1}\right) + f\!\left(\omega_1^D\right) + \Delta,
\]
where $\Delta = P\!\left(H^{(2)}\right) - P\!\left(H^{(1)}\right)$.

Summing over periodic orbits, we get rid of the function $f$:
\[
\sum_{n=1}^{R} \phi\!\left(\sigma^n \ell\right) = \sum_{n=1}^{R} H\!\left(\sigma^n \ell\right) - R\,P(H),
\]
where $\ell$ is a periodic orbit of period $R$.

We eliminate equivalent constraints.
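A small numeric check (mine) of the mechanism used above: the coboundary terms $f(\omega_1^D) - f(\omega_0^{D-1})$ telescope to zero when summed along a periodic orbit, so periodic-orbit sums see two equivalent potentials as identical up to the constant $\Delta$; here with a single neuron and D = 1.

```python
import numpy as np

rng = np.random.default_rng(5)
R = 7
orbit = rng.integers(0, 2, size=R)          # periodic raster of one neuron, period R
f = {0: rng.normal(), 1: rng.normal()}      # an arbitrary range-D function (here D = 1)

# Coboundary term f(omega_1^D) - f(omega_0^{D-1}) summed along the periodic orbit:
coboundary_sum = sum(f[orbit[(n + 1) % R]] - f[orbit[n]] for n in range(R))
print(coboundary_sum)                        # telescopes to 0 (up to float rounding)
```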

Conclusion

Given a set of transition probabilities $P\!\left[\omega(D)\,\middle|\,\omega_0^{D-1}\right] > 0$, there is a unique, up to a constant, MaxEnt potential, written as a linear combination of constraints (averages of spike events) with a minimal number of terms. This potential can be explicitly (and algorithmically) computed.