
Scientific Python Tutorial – CCN Course 2013
How to code a neural network simulation

Malte J. Rasch

National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University

China

July 10, 2013


Goal of tutorial

We will program a neural network simulation together.


We will practice on the way:

Writing scripts

Usage of array notation

How to integrate ODEs

How to plot results

How to simulate neurons and synapses

How to program a quite realistic network simulation


What has to be done in principle

There are n neurons, excitatory and inhibitory, that are inter-connected with synapses.

The network gets some input.

Each neuron and each synapse follows a particular dynamics over time.

The simulation solves the interplay of all components and yields, e.g., the spiking activity of the network for given inputs, which can then be further analyzed (e.g. plotted).


We will proceed in 5 successive steps

1 Simulate a single neuron with current step input

2 Simulate a single neuron with Poisson input

3 Simulate 1000 neurons (no recurrent connections)

4 Simulate a recurrent network

5 Simulate a simple orientation column

Result plots

[Figure: four example result plots: a single qIF neuron with current step input (membrane voltage [mV] vs. time [ms]), a single qIF neuron with 100 Poisson inputs (voltage trace), an unconnected network of 1000 qIF neurons, and a recurrent network of 1000 qIF neurons (spike rasters, neuron number [#] vs. time [ms], excitatory and inhibitory neurons marked separately).]


Which neuron model to use?

Biophysical model (e.g. the Hodgkin-Huxley model)

C_m dV_m/dt = −(1/R_m) (V_m − V_L) − Σ_i g_i(t) (V_m − E_i) + I

Including non-linear dynamics of many channels in g_i(t).

Mathematical simplification (Izhikevich, book chapter 8)

if v < 35:
    dv/dt = (0.04 v + 5) v + 140 − u + I
    du/dt = a (b v − u)
if v ≥ 35:
    v ← c
    u ← u + d

With b = 0.2, c = −65, and d = 8, a = 0.02 for excitatory neurons and d = 2, a = 0.1 for inhibitory neurons.
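For orientation, a minimal, self-contained sketch (pure Python, illustrative only; the complete solution scripts follow in the later steps) of how these update and reset rules translate into a forward-Euler loop for one excitatory neuron with a constant input current:

dt = 0.5                          # time step [ms]
a, b, c, d = 0.02, 0.2, -65., 8.  # regular-spiking (excitatory) parameters
v, u = -70., -14.                 # initial conditions
I = 7.                            # constant input current for this sketch

for step in range(int(1000 / dt)):
    if v < 35:
        dv = (0.04 * v + 5) * v + 140 - u + I
        du = a * (b * v - u)
        v = v + dt * dv
        u = u + dt * du
    else:
        v = c                     # spike: reset the membrane potential
        u = u + d                 # and increment the recovery variable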


Neuron model

[Figure (after Izhikevich): example firing patterns (regular spiking RS, intrinsically bursting IB, chattering CH, fast spiking FS, low-threshold spiking LTS, thalamo-cortical TC, resonator RZ) and their locations in the parameter planes of a, b (sensitivity, decay rate of u) and c, d (reset values), together with the model equations v' = 0.04v^2 + 5v + 140 − u + I, u' = a(bv − u), and the reset rule: if v = 30 mV, then v ← c, u ← u + d.]


Step 1: Simulate a single neuron with injected current

Exercise 1

Simulate one excitatory neuron for 1000 ms and plot the resulting voltage trace. Apply a current step (Iapp = 7 pA) between time 200 ms and 700 ms.

Neuron model:

if v < 35:
    dv/dt = (0.04 v + 5) v + 140 − u + Iapp
    du/dt = a (b v − u)
if v ≥ 35:
    v ← c
    u ← u + d

with d = 8, a = 0.02, b = 0.2, c = −65

[Figure: membrane voltage [mV] vs. time [ms] of a single qIF neuron with current step input.]


Step 1 in detail: Open Spyder and create a new file (script) that will simulate the neuron. Import the necessary modules (from pylab import *).

Proceed as follows:

1 Initialize parameter values (∆t = 0.5ms, a = 0.02, d = 8, · · · )

2 Reserve memory for voltage trace v and u (of length T = 1000/∆t) and set the first element to −70 and −14, respectively.

3 Loop over T − 1 time steps and do for each step t

1 set Iapp ← 7 if t∆t is between 200 and 700 (otherwise 0)

2 if vt < 35: update element t + 1 of v and u according to

vt+1 ← vt +∆t {(0.04 vt + 5) vt − ut + 140 + Iapp}

ut+1 ← ut +∆t a (b vt − ut)

3 if vt ≥ 35: set the variables, vt ← 35, vt+1 ← c , and ut+1 ← ut + d .

4 Plot the voltage trace v versus t


Solution to step 1 (Python)

from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
a = 0.02   # RS, IB: 0.02, FS: 0.1
b = 0.2    # RS, IB, FS: 0.2
c = -65    # RS, FS: -65, IB: -55
d = 8.     # RS: 8, IB: 4, FS: 2

# 1.2) Input pars
Iapp = 7
tr = array([200., 700]) / dt   # stim time (in steps)

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros(T)
u = zeros(T)
v[0] = -70   # resting potential
u[0] = -14   # steady state

# 3) for-loop over time
for t in arange(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        I = Iapp
    else:
        I = 0

    if v[t] < 35:
        # 3.2) update ODE
        dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
        v[t + 1] = v[t] + (dv + I) * dt
        du = a * (b * v[t] - u[t])
        u[t + 1] = u[t] + dt * du
    else:
        # 3.3) spike!
        v[t] = 35
        v[t + 1] = c
        u[t + 1] = u[t] + d

# 4) plot voltage trace
figure()
tvec = arange(0., tmax, dt)
plot(tvec, v, 'b', label='Voltage trace')
xlabel('Time [ms]')
ylabel('Membrane voltage [mV]')
title("""A single qIF neuron
with current step input""")
show()


Synapse model

Conductance based synaptic input

A simple synaptic input model would be

I_syn = Σ_j w_j s_j (v − E_j)

where w_j is the weight of the j-th synapse and E_j its reversal potential (for instance 0 mV for excitatory and −85 mV for inhibitory synapses).

The variable s_j implements the dynamics of the j-th synapse:

ds_j/dt = −s_j/τ_s

s_j ← s_j + 1, if the pre-synaptic neuron spikes

Optional: Synaptic depression. Change the update to

s_j ← s_j + h_j,   h_j ← 1 − (1 + (U − 1) h_j) e^(−∆t_j/τ_d),

with e.g. U = 0.5, τ_d = 500 ms. ∆t_j is the interval between the current and the previous spike of neuron j.
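For concreteness, a small sketch of this depression update for a single synapse (plain Python; the helper function and the example spike times are illustrative placeholders, only U = 0.5 and τ_d = 500 ms are taken from the slide):

from math import exp

stp_u = 0.5        # U, utilization parameter
tau_d = 500.       # depression time constant [ms]
h = 1.0            # depression factor of the synapse
s = 0.0            # synaptic activation s_j
last_spike = -1e9  # time of the previous pre-synaptic spike [ms]

def on_presynaptic_spike(t, h, s, last_spike):
    # apply the depression update at a pre-synaptic spike at time t [ms]
    dt_j = t - last_spike                               # interval since the previous spike
    h = 1 - (1 + (stp_u - 1) * h) * exp(-dt_j / tau_d)  # recovers towards 1, drops towards U
    s = s + h                                           # depressed increment instead of +1
    return h, s, t

# example: two spikes 20 ms apart; the second increment is roughly U = 0.5
h, s, last_spike = on_presynaptic_spike(100., h, s, last_spike)
h, s, last_spike = on_presynaptic_spike(120., h, s, last_spike)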


Step 2: Single neuron with synaptic input

Exercise 2

Simulate the neuron model for 1000 ms and plot the resulting voltage trace. Assume that 100 synapses are attached to the neuron, with each pre-synaptic neuron firing with a Poisson process of rate frate = 2 Hz between time 200 ms and 700 ms.

Synaptic input model:

I_syn = Σ_j w_in_j s_in_j(t) (E_in_j − v(t))

ds_in_j/dt = −s_in_j/τ_s

s_in_j ← s_in_j + h_j, if input synapse j spikes

with h_j = 1 (i.e. without the optional synaptic depression), τ_s = 10, w_in_j = 0.07, E_in_j = 0, j = 1 … 100.

Poisson input: input synapse j spikes in step t if r_j(t) < frate ∆t, where the r_j(t) ∈ [0, 1] are uniform random numbers drawn for each step t.
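To make the Poisson criterion and the array notation concrete, a sketch of one time step (parameter values as in the exercise; s_in would carry over from the previous step in the real loop):

from pylab import *   # uniform, dot, zeros, ones

dt, tau_s, frate, n_in = 0.5, 10., 2.0, 100
w_in = 0.07 * ones(n_in)   # input weights
E_in = zeros(n_in)         # reversal potentials (0 mV, excitatory)
s_in = zeros(n_in)         # synaptic variables
v_t = -70.                 # current membrane voltage

# each input spikes with probability frate*dt (rate in Hz, dt in ms, hence the 1e-3)
p = uniform(size=n_in) < frate * dt * 1e-3
s_in = (1 - dt / tau_s) * s_in + p                      # decay plus spike increments
I_syn = dot(w_in, s_in * E_in) - dot(w_in, s_in) * v_t  # = sum_j w_j s_j (E_j - v)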

[Figure: voltage trace (membrane voltage [mV] vs. time [ms]) of a single qIF neuron with 100 Poisson inputs.]


Step 2 in detail:

Use the last script, save it under a new file name, and add the necessary lines.

Proceed as follows:

1 Initialize new parameter values (τ_s = 10, frate = 0.002 ms^-1)

2 Reserve memory and initialize the vectors s_in = (s_in_j), w_in = (w_in_j), and E = (E_j) with n_in = 100 constant elements (same values as in Step 1)

3 Inside the for-loop change/add the following:

1 Set p_j = 1 if r_j ≤ frate ∆t (otherwise 0) during times of applied input. r_j is a uniform random number between 0 and 1. Use array notation to set the input for all n_in input synapses. Hint

2 before the v_t update: implement the conductance dynamics s and set Iapp according to the input. Use array notation with the dot "·" product and the element-wise "⊙" product. Hint

s_in_j ← s_in_j + p_j

Iapp ← w_in · (s_in ⊙ E_in) − (w_in · s_in) ⊙ v_t

s_in_j ← (1 − ∆t/τ_s) s_in_j


Solution to step 2 (Python)

from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
a = 0.02
b = 0.2
c = -65
d = 8.
tau_s = 10          # decay of synapses [ms]

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 2         # input rate [Hz]
n_in = 100          # number of inputs
w_in = 0.07         # input weights
W_in = w_in * ones(n_in)   # vector

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros(T)
u = zeros(T)
v[0] = -70                     # resting potential
u[0] = -14                     # steady state
s_in = zeros(n_in)             # synaptic variable
E_in = zeros(n_in)             # rev potential
p_rate = dt * rate_in * 1e-3   # abbrev

# 3) for-loop over time
for t in arange(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        # NEW: get input Poisson spikes
        p = uniform(size=n_in) < p_rate
    else:
        p = 0   # no input

    # NEW: calculate input current
    s_in = (1 - dt / tau_s) * s_in + p
    I = dot(W_in, s_in * E_in)
    I -= dot(W_in, s_in) * v[t]

    if v[t] < 35:
        # 3.2) update ODE
        dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
        v[t + 1] = v[t] + (dv + I) * dt
        du = a * (b * v[t] - u[t])
        u[t + 1] = u[t] + dt * du
    else:
        # 3.3) spike!
        v[t] = 35
        v[t + 1] = c
        u[t + 1] = u[t] + d

# 4) plot voltage trace
figure()
tvec = arange(0., tmax, dt)
plot(tvec, v, 'b', label='Voltage trace')
xlabel('Time [ms]')
ylabel('Membrane voltage [mV]')
title("""A single qIF neuron
with %d Poisson inputs""" % n_in)
show()


Solution to step 2 with STP (Python)

from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
a = 0.02
b = 0.2
c = -65
d = 8.
tau_s = 10     # decay of synapses [ms]
tau_d = 500    # synaptic depression [ms]
stp_u = 0.5    # STP parameter U

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 10   # input rate [Hz]
n_in = 1       # number of inputs
w_in = 0.03    # input weights
W_in = w_in * ones(n_in)   # vector

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros(T)
u = zeros(T)
v[0] = -70                     # resting potential
u[0] = -14                     # steady state
s_in = zeros(n_in)             # synaptic variable
E_in = zeros(n_in)             # rev potential
p_rate = dt * rate_in * 1e-3   # abbrev
h = ones(n_in)                 # depression factors
last_sp = -inf * ones(n_in)    # last spike times (in steps)

# 3) for-loop over time
for t in arange(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        # NEW: get input Poisson spikes
        p = uniform(size=n_in) < p_rate

        # update synaptic depression
        tmp = exp(dt * (last_sp[p] - t) / tau_d)
        h[p] = 1 - (1 + (stp_u - 1) * h[p]) * tmp
        last_sp[p] = t
    else:
        p = 0   # no input

    # NEW: calculate input current
    s_in = (1 - dt / tau_s) * s_in + p * h
    I = dot(W_in, s_in * E_in)
    I -= dot(W_in, s_in) * v[t]

    if v[t] < 35:
        # 3.2) update ODE
        dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
        v[t + 1] = v[t] + (dv + I) * dt
        du = a * (b * v[t] - u[t])
        u[t + 1] = u[t] + dt * du
    else:
        # 3.3) spike!
        v[t] = 35
        v[t + 1] = c
        u[t + 1] = u[t] + d

# 4) plot voltage trace
figure()
tvec = arange(0., tmax, dt)
plot(tvec, v, 'b', label='Voltage trace')
xlabel('Time [ms]')
ylabel('Membrane voltage [mV]')
title("""A single qIF neuron
with %d Poisson inputs""" % n_in)
show()


Step 3: Simulate 1000 neurons (not inter-connected)

Exercise 3

Simulate 1000 neurons for 1000 ms and plot the resulting spikes. Assume that each neuron receives (random) 10% of the 100 Poisson spike trains of rate frate = 2 Hz between time 200 ms and 700 ms. Note that the neurons are not yet inter-connected.

Excitatory and inhibitory neurons:

A neuron is, with probability p_inh = 0.2, a (fast-spiking) inhibitory neuron (a = 0.1, d = 2); the others are (regular-spiking) excitatory neurons (a = 0.02 and d = 8).

The input weight of input synapse j to neuron i is set to w_in = 0.07 if connected (otherwise 0).
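A short set-up sketch (illustrative only; it uses numpy.where, which is equivalent to the choose idiom from the Hints section):

from pylab import *

n, n_in = 1000, 100
p_inh, pconn_in, w_in = 0.2, 0.1, 0.07

inh = uniform(size=n) < p_inh       # True marks a (fast-spiking) inhibitory neuron
exc = logical_not(inh)
a = where(inh, 0.1, 0.02)           # fast-spiking vs. regular-spiking
d = where(inh, 2., 8.)

# each neuron is connected to ~10% of the n_in Poisson inputs
connected = uniform(size=(n, n_in)) < pconn_in
W_in = where(connected, w_in, 0.)   # n x n_in input weight matrix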

[Figure: spike raster (neuron number [#] vs. time [ms]) of an unconnected network of 1000 qIF neurons; excitatory (Exc.) and inhibitory (Inh.) spikes marked separately.]


Step 3 in detail:

Modify the last script (after saving it under new name).

Proceed as follows:

1 Initialize new parameter values (n = 1000)

2 Initialize 2 logical vectors k_inh and k_exc of length n, where k_inh(i) is True with probability p = 0.2 (marking an inhibitory neuron) and False otherwise, and k_exc = ¬k_inh.

3 Reserve memory and initialize v_{i,t}, u_{i,t} (now being T × n matrices). Set parameter vectors a and d according to k_exc and k_inh.

4 The weights w_in_ij = 0.07 now form an n × n_in matrix. Set 90% of the elements, chosen at random, to 0 to account for the connection probability.

5 Inside the for-loop change/add the following:

1 Same update equations (for vi ,t+1 and ui ,t+1) but use array notation.

6 Plot the spike raster. Plot black dots at {(t, i) | v_{i,t} ≥ 35} for excitatory neurons i. Use red dots for inhibitory neurons.


Solution to step 3 (Python)

from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
n = 1000                          # number of neurons
p_inh = 0.2                       # prob of inh neuron
inh = (uniform(size=n) < p_inh)   # whether inh.
exc = logical_not(inh)
a = inh.choose(0.02, 0.1)         # exc=0.02, inh=0.1
b = 0.2
c = -65
d = inh.choose(8, 2)              # exc=8, inh=2
tau_s = 10

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 2
n_in = 100
w_in = 0.07
pconn_in = 0.1                    # input conn prob
C = uniform(size=(n, n_in)) < pconn_in
W_in = C.choose(0, w_in)          # matrix

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros((T, n))                 # now matrix
u = zeros((T, n))                 # now matrix
v[0] = -70                        # set 1st row
u[0] = -14
s_in = zeros(n_in)
E_in = zeros(n_in)
p_rate = dt * rate_in * 1e-3

# 3) for-loop over time
for t in arange(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        p = uniform(size=n_in) < p_rate
    else:
        p = 0

    s_in = (1 - dt / tau_s) * s_in + p
    I = W_in.dot(s_in * E_in)
    I -= W_in.dot(s_in) * v[t]

    # NEW: handle all neurons
    fired = v[t] >= 35

    # 3.2) update ODE, simply update all
    dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
    v[t + 1] = v[t] + (dv + I) * dt
    du = a * (b * v[t] - u[t])
    u[t + 1] = u[t] + dt * du

    # 3.3) spikes!
    v[t][fired] = 35
    v[t + 1][fired] = c
    u[t + 1][fired] = u[t][fired] + d[fired]

# 4) plotting
# NEW: get spikes and plot
tspk, nspk = nonzero(v == 35)
idx_i = in1d(nspk, nonzero(inh)[0])   # find inh
idx_e = logical_not(idx_i)            # all others are exc

figure()
plot(tspk[idx_e] * dt, nspk[idx_e], 'k.',
     label='Exc.', markersize=2)
plot(tspk[idx_i] * dt, nspk[idx_i], 'r.',
     label='Inh.', markersize=2)
xlabel('Time [ms]')
ylabel('Neuron number [#]')
xlim((0, tmax))
title("""An unconnected network
of %d qIF neurons""" % n)
legend(loc='upper right')
show()


Step 4: Simulate recurrent network

Exercise 4

Simulate 1000 neurons as before but with added recurrent connections.

Recurrent synaptic activations

A neuron i is sparsely connected to a neuron j (with probability pconn = 0.1). Thus neuron i receives an additional current I_syn_i of the form:

I_syn_i = Σ_{j=1}^{n} w_ij s_j(t) (E_j − v_i(t))

Weights are Gamma distributed (w_avg = 0.005 and g_sc = 0.002). Set the inhibitory-to-excitatory connections twice as strong on average.
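A minimal sketch of drawing such weights (illustrative; names follow the solution script, and the shape parameter w_avg/g_sc makes the mean of the Gamma distribution equal to w_avg):

from pylab import *
from scipy.sparse import csr_matrix

n = 1000
pconn, w_avg, g_sc, scale_EI = 0.1, 0.005, 0.002, 2
inh = uniform(size=n) < 0.2
exc = logical_not(inh)

W = zeros((n, n))
idx = nonzero(uniform(size=(n, n)) < pconn)                  # ~10% of all entries
W[idx] = gamma(w_avg / g_sc, scale=g_sc, size=idx[0].size)   # mean = shape*scale = w_avg
W[ix_(exc, inh)] *= scale_EI                                 # inh -> exc twice as strong
W = csr_matrix(W)                                            # sparse for fast dot products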

[Figure: spike raster (neuron number [#] vs. time [ms]) of a recurrent network of 1000 qIF neurons; excitatory (Exc.) and inhibitory (Inh.) spikes marked separately.]


Step 4 in detail:

Modify the last script (after saving it under new name).

Proceed as follows:

1 Initialize and allocate memory for the new variables (s = (s_j), E = (E_j)). Set E_j = −85 if j is an inhibitory neuron (otherwise 0).

2 Reserve memory and initialize weights W = (w_ij) to zero. Randomly choose 10% of the matrix elements.

3 Set the chosen weight matrix elements to values drawn from a Gamma distribution of scale g_sc = 0.002 and shape g_sh = w_avg/g_sc, with w_avg = 0.005. Hint

4 Make the weight matrix "sparse" to speed up computations. Hint

5 Scale the weights from inh. to exc. neurons by a factor of 2. Hint

6 Inside the for-loop change/add the following:

1 add the equations for the recurrent synaptic dynamics s_j and add I_syn to the total applied current (a small check of this vectorization follows below):

s_j ← s_j + 1, if v_j(t − 1) ≥ 35

I_syn ← W · (s ⊙ E) − (W · s) ⊙ v

s_j ← (1 − ∆t/τ_s) s_j
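As a small consistency check of this vectorization (illustrative only, tiny random values), the two dot products indeed reproduce I_syn,i = Σ_j w_ij s_j (E_j − v_i):

from pylab import *

n = 4
W = uniform(size=(n, n))
s = uniform(size=n)
E = where(uniform(size=n) < 0.5, -85., 0.)
v = uniform(low=-70., high=35., size=n)

I_vec = W.dot(s * E) - W.dot(s) * v
I_loop = array([sum(W[i, j] * s[j] * (E[j] - v[i]) for j in range(n))
                for i in range(n)])
print(allclose(I_vec, I_loop))   # -> True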


Solution to step 4 (Python)

from pylab import *
from scipy.sparse import csr_matrix

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
n = 1000
p_inh = 0.2
inh = (uniform(size=n) < p_inh)
exc = logical_not(inh)
a = inh.choose(0.02, 0.1)
b = 0.2
c = -65
d = inh.choose(8, 2)
tau_s = 10

# NEW recurrent parameters
w = 0.005        # average recurrent weight
pconn = 0.1      # recurrent connection prob
scaleEI = 2      # scale I->E
g_sc = 0.002     # scale of gamma
E = inh.choose(0, -85)
# NEW: make weight matrix
W = zeros((n, n))
C = uniform(size=(n, n))
idx = nonzero(C < pconn)                # sparse connectivity
W[idx] = gamma(w / g_sc, scale=g_sc, size=idx[0].size)
W[ix_(exc, inh)] *= scaleEI             # submatrix indexing
W = csr_matrix(W)                       # make row sparse

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 2
n_in = 100
w_in = 0.07
pconn_in = 0.1
C = uniform(size=(n, n_in)) < pconn_in
W_in = C.choose(0, w_in)

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros((T, n))
u = zeros((T, n))
v[0] = -70
u[0] = -14
s_in = zeros(n_in)
E_in = zeros(n_in)
p_rate = dt * rate_in * 1e-3
s = zeros(n)   # rec synapses

# 3) for-loop over time
for t in arange(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        p = uniform(size=n_in) < p_rate
    else:
        p = 0

    s_in = (1 - dt / tau_s) * s_in + p
    I = W_in.dot(s_in * E_in)
    I -= W_in.dot(s_in) * v[t]

    fired = v[t] >= 35

    # NEW: recurrent input
    s = (1 - dt / tau_s) * s + fired
    Isyn = W.dot(s * E) - W.dot(s) * v[t]
    I += Isyn   # add to input vector

    # 3.2) update ODE
    dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
    v[t + 1] = v[t] + (dv + I) * dt
    du = a * (b * v[t] - u[t])
    u[t + 1] = u[t] + dt * du

    # 3.3) spikes!
    v[t][fired] = 35
    v[t + 1][fired] = c
    u[t + 1][fired] = u[t][fired] + d[fired]

# 4) plotting
tspk, nspk = nonzero(v == 35)
# ... (raster plot as in the solution to step 3)


Step 5: Simulate an orientation column

Exercise 5

Restructure the connection matrix and the input to simulate an orientation column. That is, E-E connections only exist between neighboring neurons, and the network resembles a 1D ring.

Ring structure

A neuron i is still sparsely connected to a neuron j, but now with probability pconn = 0.4. However, if all neurons are arranged on a ring from 0 to 2π, exc-to-exc connections are only possible if two neurons are nearer than π/4. Input is only delivered to half of the neurons (e.g. from 0 to π). Use the same input connection probability as before.

[Figure: spike raster (neuron number [#] vs. time [ms]) of the recurrent ring network of 1000 qIF neurons; excitatory (Exc.) and inhibitory (Inh.) spikes marked separately.]


Step 5 in detail:

Modify the last script (after saving it under new name).

Proceed as follows:

1 Set those entries of the weight matrix to zero that belong to exc-exc connections further apart than π/4. Hint: one can use scipy.linalg.circulant (see the sketch after this list).

2 Change the input so that only half of the neurons (e.g. from 0 to π) receive input (again with connection probability 0.2).
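A short sketch (illustrative; names as in the solution below) of how circulant can generate the ring-neighbourhood mask:

from pylab import *
from scipy.linalg import circulant

n = 1000
width = pi / 4                           # half-width of the allowed neighbourhood
theta = linspace(0, 2 * pi, n)           # position of each neuron on the ring
R = circulant(cos(theta)) > cos(width)   # True where two neurons are closer than width
# R can then be used to zero out the excitatory connections outside the neighbourhood,
# e.g. W[:, exc] = where(R[:, exc], W[:, exc], 0)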


Solution to step 5 (Python)

from pylab import *
from scipy.sparse import csr_matrix
from scipy.linalg import circulant

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
n = 1000
p_inh = 0.2
inh = (uniform(size=n) < p_inh)
exc = logical_not(inh)
a = inh.choose(0.02, 0.1)
b = 0.2
c = -65
d = inh.choose(8, 2)
tau_s = 10

width = pi / 4    # half-width of the orientation tuning
w = 0.005
pconn = 0.4       # set a bit higher
scaleEI = 2
g_sc = 0.002
E = inh.choose(0, -85)
W = zeros((n, n))
C = uniform(size=(n, n))
idx = nonzero(C < pconn)
W[idx] = gamma(w / g_sc, scale=g_sc, size=idx[0].size)
W[ix_(exc, inh)] *= scaleEI
theta = linspace(0, 2 * pi, n)                  # NEW
R = circulant(cos(theta)) > cos(width)          # NEW
W[:, exc] = where(R[:, exc], W[:, exc], 0)      # NEW
W = csr_matrix(W)

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 2
in_width = pi / 2
w_in = 0.07
pconn_in = 0.2
n_in = 100
C = uniform(size=(n, n_in)) < pconn_in
W_in = C.choose(0, w_in)
W_in[n // 2:, :] = 0   # NEW: only half of the neurons receives input

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros((T, n))
u = zeros((T, n))
v[0] = -70
u[0] = -14
s_in = zeros(n_in)
E_in = zeros(n_in)
p_rate = dt * rate_in * 1e-3
s = zeros(n)   # rec synapses

# 3) for-loop over time
for t in arange(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        p = uniform(size=n_in) < p_rate
    else:
        p = 0

    s_in = (1 - dt / tau_s) * s_in + p
    I = W_in.dot(s_in * E_in)
    I -= W_in.dot(s_in) * v[t]

    fired = v[t] >= 35

    # NEW: recurrent input
    s = (1 - dt / tau_s) * s + fired
    Isyn = W.dot(s * E) - W.dot(s) * v[t]
    I += Isyn   # add to input vector

    # 3.2) update ODE
    dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
    v[t + 1] = v[t] + (dv + I) * dt
    du = a * (b * v[t] - u[t])
    u[t + 1] = u[t] + dt * du

    # 3.3) spikes!
    v[t][fired] = 35
    v[t + 1][fired] = c
    u[t + 1][fired] = u[t][fired] + d[fired]
    # ... (raster plot as in the solution to step 3)


Congratulations!

You have just coded and simulated a quite realistic network model!


Hint for generating random number

Use the uniform function to generate arrays of random numbers between 0 and 1.

from numpy.random import uniform
n_in = 100
r = uniform(size=n_in)

To set the elements of a vector v to a with probability p and otherwise to b, one can use the choose method:

r = uniform(size=n_in)
v = (r < p).choose(b, a)



Hint for generating gamma distributed random number

Use the numpy.random.gamma function to generate arrays of random numbers which are gamma distributed.

from numpy.random import gamma

g_shape, g_scale, n = 0.003, 2, 1000
r = gamma(g_shape, g_scale, n)



Hint for making sparse matrices

There are several forms of sparse matrices in the module scipy.sparse. The one which is interesting for our purposes is the "row-wise" sparse matrix (see the documentation of scipy.sparse for more information).

from pylab import *
from scipy.sparse import csr_matrix

R = uniform(size=(100, 100))   # example matrix
W = where(R < 0.1, R, 0)       # mostly 0
W2 = csr_matrix(W)             # make sparse matrix

v = uniform(size=100)          # example vector
x = W2.dot(v)                  # matrix dot product



Hint for submatrix indexing

numpy.array provides a shortcut for (MATLAB-like) submatrix indexing. Assume one has a matrix W and wants to add 1 to a selection of rows and columns. One can use the convenient function ix_ and write

from pylab import *

W = uniform(size=(10, 10))       # example matrix
irow = uniform(size=10) < 0.5    # select some rows
icol = uniform(size=10) < 0.5    # select some cols

W[ix_(irow, icol)] += 1          # add 1 to the selected elements



Hint for using dot product

Use the dot method of a numpy array to compute the dot product. Caution: the operator * yields an element-wise multiplication!

from pylab import *
a = array([1., 2, 3, 4])
b = array([1., 1, 5, 5])
c = a * b        # element-wise!
c.size           # -> 4
d = a.dot(b)     # scalar product
d.size           # -> 1
