Transcript of “Optimizing online learning capacity in a biologically-inspired neural network,” Janelia Farm, March 30, 2008

Page 1

Optimizing online learning capacity in a biologically-inspired neural network

Xundong Wu
Neuroscience Graduate Program
University of Southern California

Advisor: Bartlett Mel

Page 2

Synaptic plasticity, online learning

Lee, Huang et al. (2005)

Synaptic basis of learning and memory: Hebb (1949); Bliss & Lomo (1973); Bliss & Gardner-Medwin (1973); Levy & Steward (1983); Lynch, Larson, Kelso, Barrionuevo & Schottler (1983); Lynch & Baudry (1984); Morris, Anderson, Lynch & Baudry (1986); Malenka (1988); Ito (1989); Bliss & Collingridge (1993); Malenka (1994); Frey & Morris (1997)

Online learning models: Nadal, Toulouse, Changeux & Dehaene (1986); Amit & Fusi (1994); Henson & Willshaw (1995); Norman & O'Reilly (2003); Fusi, Drew & Abbott (2005); Fusi & Abbott (2007); Greve, Sterratt, Donaldson, Willshaw & van Rossum (2008)

point neuron

Page 3

The network: axons crossing through dendrites, making synapses

Page 4

“Pattern”

A pattern is a set of activated axons

Page 5

“Pattern”

A pattern is a set of activated axons
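As a concrete illustration (a minimal sketch, not code from the talk), a pattern can be represented as the set of axon indices that are currently active. The axon count below is a placeholder, and the 3% density is taken loosely from the activation-density range discussed later in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N_AXONS = 10_000          # hypothetical axon count (not specified on this slide)
DENSITY = 0.03            # ~3% active, within the 1.5%-6% range studied later

def make_pattern(n_axons=N_AXONS, density=DENSITY):
    """A 'pattern' is simply the set of axons that are active."""
    n_active = rng.binomial(n_axons, density)                      # each axon active independently
    return set(rng.choice(n_axons, size=n_active, replace=False))

pattern = make_pattern()
print(len(pattern), "active axons out of", N_AXONS)
```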

Page 6

Dendrites are the unit of learning

Page 7

We assume neurons have separately thresholded dendritic “subunits”

Compartmentalized Firing

Compartmentalized Plasticity

Herreras (1990); Kim & Connors (1993); Schiller et al (1997); Kamondi et al (1998); Larkum et al (1999); Helmchen et al (1999); Golding et al (2002); Schiller et al (2000); Losonczy & Magee (2006); Sobczyk & Svoboda (2007); Major et al (2008); Remy et al (2009); Larkum et al (2009)

Golding et al (2002); Frey & Morris (1997); Harvey & Svoboda (2007); Bollmann & Engert (2009); Govindarajan, Israely et al (2011)

Larkum et al (1999)

Losonczy & Magee (2006)

Harvey & Svoboda (2007)
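A minimal sketch of what a separately thresholded "subunit" means computationally (illustrative only; the weights, the fraction of strong synapses, and the threshold value below are placeholders, not the talk's parameters):

```python
import numpy as np

def subunit_response(weights, active, theta_F):
    """One dendritic subunit: sum the weights of its active synapses and
    apply the dendrite's own firing threshold theta_F."""
    drive = weights[active].sum()
    return drive >= theta_F

# toy example
K = 256                                                # synapses on this dendrite
weights = (np.random.rand(K) < 0.25).astype(float)     # some synapses strong (1), the rest weak (0)
active = np.random.rand(K) < 0.03                      # ~3% of synapses driven by the pattern
print(subunit_response(weights, active, theta_F=2))
```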

Page 8

Compartmentalized Firing

Compartmentalized Plasticity

Larkum et al (1999)

Losonczy & Magee (2006)

Harvey & Svoboda (2007)

We assume neurons have separately thresholded dendritic “subunits”

Page 9

Page 10

Compartmentalized Firing

Compartmentalized Plasticity

Larkum et al (1999)

Losonczy & Magee (2006)

Harvey & Svoboda (2007)

θL

We assume neurons have separately thresholded dendritic “subunits”

Learning

Page 11

Compartmentalized Firing

Compartmentalized Plasticity

Larkum et al (1999)

Losonczy & Magee (2006)

Harvey & Svoboda (2007)

θL

We assume neurons have separately thresholded dendritic “subunits”

Page 12

Page 13

Page 14

Compartmentalized Firing

Compartmentalized Plasticity

Larkum et al (1999)

Losonczy & Magee (2006)

Harvey & Svoboda (2007)

θL

Trained pattern θF

We assume neurons have separately thresholded dendritic “subunits”

Page 15

Compartmentalized Firing

Compartmentalized Plasticity

Larkum et al (1999)

Losonczy & Magee (2006)

Harvey & Svoboda (2007)

θL

Trained pattern θF

We assume neurons have separately thresholded dendritic “subunits”

Page 16

θL

Trained pattern θF

recognition threshold

θR no yes

untrained responses

trained responses

1% false negatives

1% false positives

We assume neurons have separately thresholded dendritic “subunits”
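A sketch of the recognition readout this slide describes: count how many dendritic subunits fire and compare against the recognition threshold θR, set between the untrained and trained response distributions. The numbers below are illustrative, taken loosely from later slides (20,000 dendrites, ~0.1% background firing, ~35 dendrites trained per pattern); θR = 28 is just a placeholder between the two distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

def recognized(dendrite_fired, theta_R):
    """Readout: report 'familiar' if enough dendritic subunits fire."""
    return dendrite_fired.sum() >= theta_R

M = 20_000
untrained = rng.random(M) < 0.001    # ~0.1% background firing -> ~20 spuriously active dendrites
print("untrained pattern:", recognized(untrained, theta_R=28))   # usually False
# a trained pattern would also re-activate its ~35 trained dendrites, pushing the count above theta_R
```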

Page 17

We previously showed storage capacity is maximized when:

• Synaptic plasticity is extremely sparse

• Synaptic plasticity is dendrite-specific

• Patterns are stored by strengthening synapses
  Synaptic potentiation should be governed by both presynaptic (θLpre) and postsynaptic (θLpost) learning thresholds

• Patterns are forgotten by weakening synapses
  Synaptic depression should occur in the least recently strengthened (i.e. “oldest”) synapses
  (A sketch of these two rules follows below.)

Wu, X. E. and B. W. Mel (2009). “Capacity-enhancing synaptic learning rules in a medial temporal lobe online learning model.” Neuron 62(1): 31-41.
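The sketch below restates the two rules in code. The helper names and threshold values are hypothetical; the sets stand for one dendrite's synapse indices.

```python
def maybe_potentiate(active_syn, strong_syn, theta_L_pre, theta_L_post):
    """Dual-gated LTP: a dendrite may learn only if the pattern drives enough of its
    synapses (presynaptic gate) AND enough of the driven synapses are already strong
    (postsynaptic gate)."""
    pre_ok = len(active_syn) >= theta_L_pre                   # presynaptic learning threshold (theta_Lpre)
    post_ok = len(active_syn & strong_syn) >= theta_L_post    # postsynaptic learning threshold (theta_Lpost)
    return pre_ok and post_ok

def depress_one(strong_by_age):
    """Forgetting: weaken the least recently strengthened ('oldest') strong synapse."""
    return strong_by_age.pop()        # assumes a list ordered youngest-first, oldest last

# toy usage: synapses 2 and 3 are active and already strong
print(maybe_potentiate({1, 2, 3, 4}, {2, 3, 9}, theta_L_pre=3, theta_L_post=2))   # True
```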

Page 18

What dendritic morphology maximizes the ability of this type of network to learn?

• Why do neurons in memory areas of the brain grow dendrites of particular sizes?

• Knowing how memory capacity depends on dendritic size will help us understand which changes in morphology (e.g. spine density and/or dendritic length) are most disruptive to memory function

Dierssen, Benavides-Piccione, et al (2003)

Control Down syndrome

Irwin & Patel et al (2000)

Control Fragile X

Page 19

# of synapses per dendrite (K): 10 to 10,000

The central question

Morphology

Capacity

???

Given N total synapses, how does capacity depend on dendrite size?

Page 20

# of synapses per dendrite (K): 10 to 10,000

The central question

Capacity

???

Given N total synapses, how does capacity depend on dendrite size?

Page 21

The parameters of my study

1. Pattern activation density: from 1.5% to 6%

2. Noise level: low, medium, high
   (low noise vs. high noise)

3. Correlations: small, medium, large
   Each axon activates from 200 to 10,000 synapses, introducing correlations between dendrites (a sketch of this follows below)
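A sketch of how multi-contact axons induce correlations between dendrites. The sizes are illustrative, and the 512-contacts-per-axon figure is chosen simply so that every synapse receives an axon; it sits within the 200-to-10,000 range listed above.

```python
import numpy as np

rng = np.random.default_rng(1)
N_AXONS, M, K = 10_000, 20_000, 256   # axons, dendrites, synapses per dendrite (illustrative)
CONTACTS_PER_AXON = 512               # 10,000 x 512 = M x K synapses in total

# assign every synapse an axon; because each axon touches many synapses,
# different dendrites end up reading overlapping (correlated) input
synapse_axon = np.repeat(np.arange(N_AXONS), CONTACTS_PER_AXON)
rng.shuffle(synapse_axon)
synapse_axon = synapse_axon.reshape(M, K)

shared = len(set(synapse_axon[0]) & set(synapse_axon[1]))
print("axons shared by dendrites 0 and 1:", shared)
```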

Page 22

256 synapses × 3% ≈ 8 active synapses (on average)
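The arithmetic behind this slide, together with the spread around the mean that the next slide illustrates. This assumes SciPy and models each synapse as independently active.

```python
from scipy.stats import binom

K, p = 256, 0.03
dist = binom(K, p)          # number of active synapses on one dendrite
print(dist.mean())          # 256 * 0.03 = 7.68  ->  "about 8 active synapses"
print(dist.std())           # ~2.7, so counts like 5 or 12 (next slide) are routine fluctuations
print(dist.interval(0.95))  # central 95% range of the per-dendrite count
```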

Page 23

8 active synapses (on average)
5 active synapses

# of synapses

256 synapses 12 active synapses

P

K = 256

Responses to Untrained Patterns

Page 24

[Figure: binary (1/0) values marked on the network diagram]

# of synapses

P

pre-synaptic activation

K = 256

Responses to Untrained Patterns

Page 25

[Figure: binary (1/0) values marked on the network diagram]

pre-synaptic activation

# of synapses

P

post-synaptic activation

K = 256

Responses to Untrained Patterns

Page 26

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

K = 256

θR trained

untrained

Responses to Untrained Patterns

Page 27

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

Responses to Untrained Patterns

K = 256

Page 28

M = 20,000 dendrites

20,000 dendrites x 0.1% = 20 active dendrites

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

K = 256

Responses to Untrained Patterns
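A hedged sketch of where the 0.1% and the "20 active dendrites" come from: treat a dendrite's untrained response as a binomial count of active strong synapses and take the tail beyond θF. The fraction of strong synapses F and the threshold value below are placeholders, not parameters quoted on the slide; SciPy is assumed.

```python
from scipy.stats import binom

K, p_active = 256, 0.03
F = 0.25          # fraction of strong synapses -- placeholder
theta_F = 8       # dendritic firing threshold -- placeholder

# untrained response of one dendrite: number of strong synapses a random pattern
# happens to activate, modeled here as Binomial(K, p_active * F)
P_F = binom.sf(theta_F - 1, K, p_active * F)     # P(count >= theta_F)

M = 20_000
print(P_F, M * P_F)   # with these placeholders this lands near the slide's ~0.1% and ~20 dendrites
```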

Page 29

20,000 dendrites x 0.1% = 20 active dendrites

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

M = 20,000 dendrites

K = 256

Responses to Untrained Patterns

Page 30

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

untrained responses

K = 256

Responses to Untrained Patterns

Page 31

recognition threshold

θR

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

trained responses

untrained responses

training cost: 35 dendrites

K = 256

Responses to Untrained Patterns

Page 32

θF

PP

# of synapses

PF = 0.1% (Probability of dendrite firing)

pre-synaptic activation

post-synaptic activation

PF = 0.02% (Probability of dendrite firing)

recognition threshold

θR

trained responses

untrained responses

K = 256

training cost: 35 dendrites

Responses to Untrained Patterns

Page 33

recognition threshold

θR

θF

PP

# of synapses

pre-synaptic activation

post-synaptic activation

PF = 0.02% (Probability of dendrite firing)

cost: 18

Responses to Untrained Patterns

K = 256

Page 34

θF

PP

# of synapses

pre-synaptic activation

post-synaptic activation

PF = 0.02% (Probability of dendrite firing)

θR

K = 256

Responses to Untrained Patterns

Page 35

θF

PP

# of synapses

pre-synaptic activation

post-synaptic activation

PF = 0.02% (Probability of dendrite firing)

PF = 5×10⁻⁷ (Probability of dendrite firing)

θR

PL · M = 5

K = 256

Responses to Untrained Patterns

Page 36

θLpre

θF

θLpost

PP

# of synapses

pre-synaptic activation

post-synaptic activation

θR

K = 256

Responses to Untrained Patterns

PL · M = 5

Page 37

Strengthening (or refreshing) a synapse makes it young; depression targets oldest synapses

Weak synapses (unordered)

Strong synapses: ordered from “Young” to “Old”

[Figure: row of synaptic weights, 1 = strong, 0 = weak]
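A minimal sketch of the age-queue bookkeeping this figure depicts. The deque-based implementation and the 3-synapse budget are illustrative choices, not the model's actual data structure.

```python
from collections import deque

class AgeQueue:
    """Strong synapses kept in age order: strengthening (or refreshing) a synapse
    moves it to the 'young' end; homeostatic depression evicts from the 'old' end."""
    def __init__(self, max_strong):
        self.q = deque()               # left = young, right = old
        self.max_strong = max_strong   # fixed budget of strong synapses per dendrite

    def strengthen(self, syn_id):
        if syn_id in self.q:           # refresh: an existing strong synapse becomes young again
            self.q.remove(syn_id)
        self.q.appendleft(syn_id)
        if len(self.q) > self.max_strong:
            return self.q.pop()        # depress the least recently strengthened synapse
        return None

d = AgeQueue(max_strong=3)
for s in [5, 7, 9, 5, 11]:             # refreshing synapse 5 protects it; 7 becomes oldest and is evicted
    evicted = d.strengthen(s)
print(list(d.q), "evicted:", evicted)  # [11, 5, 9] evicted: 7
```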

Page 38

Page 39

Page 40

Distribution of synapse ages is determined by a geometric decay process
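One simple way to see where the geometric form comes from, as a sketch under an independence assumption rather than the talk's own derivation: suppose each learning event in a dendrite refreshes a given strong synapse with probability $q$, independently across events. Then

$$P(\mathrm{age} \ge t) = (1-q)^{t}, \qquad P(\mathrm{age} = t) = q\,(1-q)^{t},$$

so the number of strong synapses of age $t$ falls off geometrically with $t$, and the typical time before an un-refreshed synapse reaches the “old” end of the queue scales like $1/q$.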

Page 41

K = 256

Capacity = 26,500 patterns

Distribution of synapse ages is determined by a geometric decay process

naive calculation: 16,500 patterns

Page 42

??? K=256

Page 43

And now for the capacity curve...

K=256

Page 44

Does the simple model predict the simulations?

Now:
Simulation capacity

Analytical capacity

Why are the predicted and actual capacities different?

1. Simulations assume a soft dendritic firing threshold.

2. In simulations, each axon makes multiple synaptic contacts – leads to correlations.

3. In simulations, variance in dendrite decay times leads to premature response failures

Page 45

Model

Page 46

Model

Simulation

Page 47

Simulation

Page 48

Floor effect leads to reduced capacity for neurons with large dendrites

dendrite usage

θR

untrained responses

trained responses

dendrite usage

Response distribution of K=1024

capacity

Dendrite usage

Page 49

Floor effect leads to reduced capacity for neurons with large dendrites

dendrite usage

Response distribution of K=1024

θR

untrained responses

trained responses

synapse usage

capacity

Page 50

θF

θLpre

PF = 0.000000125 (probability a dendrite fires)

PL = 0.00001 (probability a dendrite learns)

Area under the tail

K=64

pre-synaptic activation

post-synaptic activation

θLpre

θF

number of dendrites

PL PF

Probability
untrained responses

trained responses

Page 51

PF = 0.000000125

PL = 0.00001

Area under the tail

K=64

pre-synaptic activation

post-synaptic activation

θLpre

θF

number of dendrites

PL PF

Probability
untrained responses

trained responses

45% false negatives!

Page 52

θF

θLpre

Area under the tail

K=64

pre-synaptic activation

post-synaptic activation

number of dendrites

PL PF

Probability
untrained responses

Back to 1% error rate!
trained responses

θLpre

θF

30% readout failure

Page 53

θF

θLpre

Area under the tail

K=64

pre-synaptic activation

post-synaptic activation

number of dendrites

PL PF

Probability
untrained responses

too high error rate

trained responses

θLpre

θF

includes readout failures

Page 54

θF

θLpre

Area under the tail

K=64

pre-synaptic activation

post-synaptic activation

PL PF

Probability
untrained responses

trained responses

θLpre

θF

Page 55

Area under the tail

K=64

pre-synaptic activation

post-synaptic activation

PL PF

Probability
untrained responses

trained responses

θLpre

θF

20 dendrites trained

θF

θLpre

Page 56

Short dendrites suffer from (1) the dendrite availability problem, and (2) high readout failure rates, both of which increase the number of dendrites needed to store a pattern.

dendrite usage

dendrite usage

capacity

Page 57

Dendrite usage (UD):

K=64:  minimum requirement 7, +4 to increase dendrite availability, +9 to compensate for readout error = 20

K=256: minimum requirement 7, +1 to compensate for readout error = 8

Page 58

K=64

K=256

capacity

dendrite usage

Page 59

prediction

synapse usage

capacity

dendrite usage

Page 60

actual
prediction

synapse usage

capacity

dendrite usage

Page 61

K=64

K=256

θLpre

θLpre

average number of synapses activated (K × 3%)

post-synaptic activation

pre-synaptic activation

post-synaptic activation

pre-synaptic activation

8

F=0.14

F=0.08

2

Page 62

prediction

synapse usage

capacity

F factor

Page 63

actual
prediction

synapse usage

capacity

F factor

Page 64

synapse usage

capacity

Page 65

US=160, N=K×M =5,120,000

US=160

synapse usage

capacity

Page 66

K=256, K=64

Assume constant synapse usage Us

K=256

K=64

synapse usage

capacity

Page 67

Summary of results

synapse usage

dendrite usage

Long dendrites are more expensive than short dendrites

Short dendrites suffer from:

Dendrite availability problem

Readout failure

High F factor

capacity

Page 68

Higher activation density decreases capacity and favors small dendrites

Page 69

Short dendrites are more sensitive to noise

low noise

high noisemedium noise

Capacity (×1000)

Page 70

Correlations lead to increased deviations in dendritic activation

small

medium

high

correlation

linear logarithm

K=1024

Page 71

Correlations reduce memory capacity

small correlation

medium correlation

high correlation

Page 72

Duplication avoidance reduces the deviation of the within-dendrite summation

P

In-dendrite summation
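A sketch of why avoiding duplicate contacts tightens the within-dendrite sum (illustrative sizes; a small axon pool is used so that sampling with replacement produces many duplicate contacts):

```python
import numpy as np

rng = np.random.default_rng(2)
N_AXONS, K, P_ACTIVE, TRIALS = 500, 256, 0.03, 20_000   # small axon pool -> many duplicates possible

def dendrite_sum(allow_duplicates):
    # which axon drives each of this dendrite's K synapses
    axons = rng.choice(N_AXONS, size=K, replace=allow_duplicates)
    active = rng.random(N_AXONS) < P_ACTIVE               # which axons the pattern activates
    return active[axons].sum()                            # within-dendrite summation

with_dup = [dendrite_sum(True) for _ in range(TRIALS)]
no_dup   = [dendrite_sum(False) for _ in range(TRIALS)]
print(np.std(with_dup), np.std(no_dup))   # avoiding duplicates gives a tighter distribution
```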

Page 73

Duplication avoidance rescues the impairment caused by signal correlation

without decorrelation

small correlation

medium correlation

high correlation

Page 74

Duplication avoidance rescues capacity from the effects of signal correlation

without decorrelation

with decorrelation

small correlation

medium correlation

high correlation

Page 75

Branco, T. and K. Staras (2009). "The probability of neurotransmitter release: variability and feedback control at single synapses." Nat Rev Neurosci 10(5): 373-83.

Evidence for duplication avoidance?

Page 76

What we learned about online learning in a neural context

Conclusion

Page 77

For an online (sequential, one-shot) recognition memory containing

• a large number of synapses formed onto
• 2-layer neurons containing separately thresholded dendritic subunits

Storage capacity is maximized by

• turning up the dendritic firing threshold very high to

keep background firing rates extremely low, which allows

stored traces to be extremely weak with

very few synapses consumed per pattern so that

each synapse is very rarely used and can grow “old” before it is deleted

and

• storing patterns by an LTP-like operation gated by

dual learning thresholds requiring both

1. strong pre-synaptic activation of the dendrite

i.e. having many axons driving the synapses and

2. strong post-synaptic activation of the dendrite

i.e. where many of the activated synapses are already strong, and

• protecting strengthened (or refreshed) synapses by a tag that ages so that

• homeostatic depression is limited to the least-recently trained synapses

Page 78

and

• choosing a dendritic morphology with dendrites of “medium” size, that is

as short as possible to minimize synapse usage per dendrite but

as long as possible to reduce within-dendrite variability that

1. forces dendrites to have excessively high thresholds, leading to

availability problems, that in turn requires

lowering firing thresholds which

raises background firing rates which

lowers capacity

2. leads to excessively high readout failure rates that again require

lowering dendritic thresholds to

increase memory trace strength which again

increases storage costs per pattern which

lowers capacity

3. leads to high synapse usage ratio (F) per dendrite producing

unfavorable conditions for geometric decay of age queues, leading to

shorter information survival times in dendrites which

lowers capacity


Page 79

How this will be helpful

Page 80

1. Scientific contribution

We now understand better how properties of dendrites, including
• their sizes
• their learning and firing thresholds
• their plasticity rules, including
  the rules and mechanisms governing “LTP”
  the rules and mechanisms governing “LTD”
relate to memory capacity.

2. Practical/translational contribution

Our improved understanding will
• Help us to propose experiments to test a variety of predictions that arise from this model
• Help us to interpret the significance of changes to neurons that disrupt memory function in aging, stress, and neurological disorders

Page 81

Acknowledgment

Committee members
Dr. Bartlett Mel

Dr. Michel Baudry

Dr. Manbir Singh

Dr. Fritz Sommer

Dr. Li Zhang

Lab members
Bardia Behabadi and DJ Strouse
Monika Jadi, Rishabh Jain, Chaithanya Ramachandra, Yichun Wei

Page 82

The end


Page 87

Normalized Pre/Post synaptic summation

B(x; K, PA)
B(x × μ; K, PA) / Max(B(x; K, PA)),   μ = K × PA

B(x; KS, PA)
B(x × μ; KS, PA) / Max(B(x; KS, PA)),   μS = KS × PA,   KS = K / 2
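A small sketch that evaluates the normalized curves defined above and compares the relative spread for K versus KS = K/2. The K and PA values are illustrative, and SciPy is assumed.

```python
import numpy as np
from scipy.stats import binom

K, P_A = 1024, 0.03          # illustrative; the slide only fixes KS = K/2
K_S = K // 2

def normalized_curve(k, p):
    """B(x; k, p) rescaled: count axis divided by the mean mu = k*p, height divided by the peak."""
    counts = np.arange(k + 1)
    pmf = binom.pmf(counts, k, p)
    return counts / (k * p), pmf / pmf.max()

x_full, y_full = normalized_curve(K, P_A)     # full-size dendrite
x_half, y_half = normalized_curve(K_S, P_A)   # half-size dendrite

# relative spread (std/mean) grows as K shrinks, so the K/2 curve is wider
for k in (K, K_S):
    print(k, binom.std(k, P_A) / binom.mean(k, P_A))
```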

Page 88

Simulation capacity

Analytical capacity

Page 90

Dendrites must cross two thresholds to learn

(DA )

(DB)

Wu & Mel 2009

# of strong active synapses (DB)

Page 91

Retrieval failure further favors long dendrites

Page 93

Theoretical calculation of capacity under different activation densities

Page 94

Online learning memory is a sequential process

Page 95

Memory trace saturation

Page 96

Theoretical analysis

Page 97

K=2048

K=128

K=32

Page 100

Second pressure on short dendrites

U≈160

Page 101

Smaller dendrite size is associated with a shorter synapse age queue depth

Assume constant synapse usage US

K=32

RT=θLpre/K F

K=100

K=50
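A hedged back-of-envelope consistent with this slide (the quantities $s$ and $F\,K$ below are assumptions, not values from the talk): if a dendrite of size $K$ keeps roughly $F\,K$ of its synapses strong and each learning event strengthens or refreshes about $s$ of them, then the age queue it supports has depth

$$\mathrm{depth} \approx \frac{F\,K}{s},$$

so at constant synapse usage, halving $K$ roughly halves how many subsequent learning events a stored synapse survives before reaching the “old” end of the queue.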

Page 103

Now:
Simulation capacity

Analytical capacity

Page 104

Short dendrites are more sensitive to noise

low noise

high noise

medium noise